Archive for the ‘General’ Category

Ethical Implications of GIScience

Monday, October 5th, 2015

In his article, “GIS, Indigenous Peoples, and Epistemological Diversity,” Rundstrom expounds upon the power hierarchies that contextualize doing GIS. He boldly asserts that doing GIS as a “technoscience” reinforces and perpetuates narratives of dominance that disenfranchise indigenous ways of thinking. If GIS adheres to Western epistemology, then is it really appropriate to apply these systematic ways of knowing to indigenous ways of being? Any process that involves structuring and classifying the world around us is inherently exclusive in nature. How, then, can we ethically claim that our way of mapping the world could encompass the entirety of non-Western ontologies? I’m not sure if these questions will ever be answered fully, but as GIS users we must entertain the possibility that how we do GIS may have extreme ethical implications. For example, if certain geographic knowledge is privileged in indigenous societies, then do we have the right to map it for the sake of the long-term preservation of knowledge and culture?

In addition, generalizing indigenous communities suggests that there is an inherent nature to indigenous epistemologies. However, indigenous communities in Northern Quebec have very different ways of perceiving and managing their environment compared to indigenous communities in Central America. Conflating indigenous epistemologies does a disservice to the complexity and diversity of how space and processes of thought engage with one another.

I hope that more community participation in GIS and the geospatial web may tackle some of these problems. Such work will be crucial for understanding the ethical and practical implications of doing GIS in the future.

-GeoBloggerRB

Rundstrom 1995

Monday, October 5th, 2015

For me the most striking aspect of this article relates to differing attitudes toward uncertainty. I have done extensive reading on uncertainty for my upcoming seminar, and uncertainty is regarded in textbooks as something to be “aware of” and “open about”, as if it were an affliction. By contrast, Rundstrom points out that many indigenous cultures consider uncertainty (in particular ambiguity) to be a key enriching element of existence. Of course, this idea is not alien to western cultures, as we also find that the deepest of meanings are intangible. Furthermore, we are theoretically aware that in reality most strict boundaries and definitions are socially constructed. However, GIScience still seems to be fundamentally incapable of helping us to view uncertainty positively, because ultimately a GIS must work with either objects or pixels. Taking this into consideration, we should certainly heed Rundstrom’s warning that the effort to promote GIS in indigenous communities is likely to further suffocate indigenous worldviews. On the other hand, we must take care not to look at indigenous communities as passive entities with no agency. In many situations indigenous people may feel that in order for their communities to thrive while surrounded by non-indigenous civilization, they must forge a connection with us so that they can “manage” us. GIS could easily fit into this type of strategy. Ultimately, I think that as GIS practitioners we will have to scrutinize every application of GIS to indigenous cultures so as to discern whether it is truly a decision made by indigenous communities to use GIS or if it is imposed on them from outside with a colonial mindset.

– Yojo

 

GIS: Just another means of colonization?

Monday, October 5th, 2015

Rundstrom’s 1995 article “GIS, Indigenous Peoples and Epistemological Diversity” is an insightful critique of how geospatial technologies and Western science are fundamentally incompatible, exclusive and oppressive to indigenous epistemologies. For me, this has been the most thought-provoking topic yet. It made me reflect on just how pervasive and deeply-rooted colonialism is, how indigenous epistemologies have survived, and how that implicates me as a student of GIScience.

Rundstrom states that he understands GIS as a “technoscience,” one of the technologies that “modify and transform the worlds which are revealed through them” (46). Rundstrom thereby highlights the division between GIS as a science and as a tool. As a science, GIScience is fundamentally incompatible with indigenous worldviews. For centuries, Western science has actively invalidated indigenous ways of knowing. The legacy of colonization lives on through our settler society, which continues to inhabit stolen indigenous land. Western science’s desire to know more, to represent more, to describe more of our world is the means to exploit more, expand more and take more. As a tool, GIS is a technology, and technologies have historically been used for assimilation and continued colonization. The technical capability, language (jargon) and education required to participate in the use of technologies also exclude indigenous people and their ways of knowing. Undeniably, our tools hold power over other people.

Where does this leave GIS, and indigenous ways of knowing and describing geography? I think Rundstrom would argue that indigenous knowledge should not be incorporated into GIS for the sake of taking what is “useful” to us and leaving the rest – which is historically what has been done, again and again, to indigenous groups through colonialism. Instead, indigenous groups could use it for their own aims, because GIS is likely to be believed by empirically-minded policymakers. For example, Operation Thunderbird uses crowdsourced mapping to display information on missing and murdered indigenous women: http://www.giswatch.org/en/country-report/womens-rights-gender/canada. Although GIS still has a long way to go before it can be at all compatible with indigenous epistemologies, it has potential to be an advantageous political tool.

-denasaur

Geospatial Agents, Agents Everywhere

Monday, September 28th, 2015

The first reading, “So go downtown”, gave an introduction to agent-based modelling. As mentioned in the article, one of the large limitations of the model is that pedestrians are generated according to a Poisson distribution. Similar to the train example, I would propose that this limits the model’s usefulness on campuses, where large numbers of students are released at once at regular time intervals. That being said, this article is more than 10 years old and I’m sure agent-based modelling has progressed rapidly since then. Advances in CPU capabilities likely allow researchers to simulate far more agents with a more complex set of behaviors and landscapes.
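The contrast here, a Poisson trickle versus a campus-style scheduled release, can be sketched in a few lines. This is a toy illustration (the rate, interval, and counts are invented, and the Poisson draw uses Knuth’s stdlib-only method), not the STREETS implementation:

```python
import random

def poisson_draw(lam, rng):
    """Knuth's algorithm: one Poisson-distributed count, stdlib only."""
    limit = 2.718281828459045 ** -lam
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def poisson_arrivals(rate, steps, rng):
    """Pedestrians trickle in: one Poisson count per time step."""
    return [poisson_draw(rate, rng) for _ in range(steps)]

def burst_arrivals(per_burst, interval, steps):
    """Campus-style release: a crowd appears all at once, on a schedule."""
    return [per_burst if t % interval == 0 else 0 for t in range(steps)]

rng = random.Random(1)
smooth = poisson_arrivals(rate=5, steps=60, rng=rng)           # gentle trickle
bursty = burst_arrivals(per_burst=300, interval=60, steps=60)  # class lets out
```

A model calibrated against the smooth stream would badly underestimate peak congestion in the bursty case, which is exactly the limitation noted above.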
Reading Prof. Sengupta and Prof. Sieber’s article Geospatial Agents, Agents Everywhere, I was excited to learn that the models have progressed and been applied to several scenarios, from movement in alpine environments to shopping behavior. One of the most interesting applications mentioned in the article was a system that could vary highway tolls based on traffic density. This immediately reminded me of the ride-hailing service Uber, which currently varies its fares based on demand. Uber would likely be interested in traffic-predicting geospatial agent models, so that its cars could both avoid traffic and be well located to pick up passengers before they even request a lift. For example, when a large event ends, traditional taxis may have exclusive rights to park right outside the venue, forcing Uber cars to linger a couple of blocks away. By using geospatial agent modelling, Uber could predict crowd behavior leaving the concert and distribute its cars to better compete with traditional taxis.
Fares could even become geofenced, so that zones with a high predicted agent density receive a higher fare bracket than low density zones. In this scenario Uber could entice more cars into specific areas before they are needed, and influence crowd behavior by encouraging thrifty pedestrians to enter zones of low predicted density.
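The geofenced-fare idea could be sketched as a simple mapping from predicted agent density to a fare multiplier. The zone names, threshold, step, and cap below are purely illustrative assumptions, not any real pricing rule:

```python
def surge_multiplier(predicted_density, base=1.0, threshold=50, step=0.25, cap=3.0):
    """Map a zone's predicted agent density to a fare multiplier.

    Threshold, step, and cap are illustrative guesses, not a real
    pricing scheme.
    """
    if predicted_density <= threshold:
        return base
    # Each additional full 'threshold' worth of agents bumps the bracket.
    excess_blocks = (predicted_density - threshold) // threshold
    return min(base + step * (1 + excess_blocks), cap)

# Predicted agent counts per geofenced zone as a large event lets out.
zones = {"venue_exit": 400, "two_blocks_out": 60, "quiet_side_street": 10}
fares = {name: surge_multiplier(d) for name, d in zones.items()}
```

Publishing lower multipliers for low-density zones is what would nudge thrifty pedestrians toward them, while higher brackets pull drivers toward the predicted crowds.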
-anontarian

Model Citizens: Haklay et al, “So Go Downtown”

Monday, September 28th, 2015

Haklay et al.’s article “So Go Downtown” describes an intricate model of pedestrian movement, STREETS. I began the article somewhat skeptical of the need for such a model (is it not good enough to simply collect enough data on pedestrians?) and of the model’s capacity to think of everything; for example, agents deviating from their agenda, socioeconomic status, etc. Nearly every “but what about…” was answered in the article, and I was surprised by the complexity of the model and how much it takes into account. I also found the combination of raster, vector and network data to be fascinating: often in our education, these data models are taught as disparate and we do not use them in conjunction. This article started to give me an idea of the ways that these data models can, in fact, be used together.

One problem with the model that the authors raise is that the town is “spatially closed” – the town in the model is a bubble, with no competing towns or suburbs nearby. The authors recognize that adding further complexity by expanding what is a closed model would be an incredible task. It is difficult to place boundaries on what should and should not be included in a model – it requires making serious choices about what is significant enough to be included in the micro world of the model.

Clearly, there is room for expansion and improvement in the modeling and simulation realm of GIScience, as existing models are modified and new ones are created.

– denasaur

Modular, spatial ABMs: Haklay et al., 2001

Monday, September 28th, 2015

In the intervening fourteen years since “So go downtown” (Haklay et al., 2001) was published, agent-based modeling has, unsurprisingly, been harnessed for an ever-expanding number of applications. In the wake of the late-2000s recession, which appeared to discredit the economistic assumption of equilibrium, the influential science journal Nature published an editorial calling for the synthesis of existing ABM techniques into a modular representation of the existing economy. Spatial ABMs (such as the STREETS model) have surfaced in mainstream news as potential predictors of crowd behaviour. Needless to say, Haklay et al. were on to something very important with the development of their modular, multi-scalar representation of pedestrian behaviour. Avoidable catastrophes such as the 2010 Love Parade disaster, in which 21 people were killed in a crush due to a dynamic feedback phenomenon now known as “crowd turbulence,” have provided fodder for the study of the effects of interrelated psychological and physical forces on large crowds.

In general, ABMs appear to be one of the most promising intersections of social science and computer science, due to their ability to model situations of staggering complexity, involving thousands or millions of agents whose dynamic interactions produce highly unpredictable results. Our last discussion about geolocated SNA produced some interesting conjecture about what could be done — for better or worse — with the datasets of Google or Facebook, which contain geolocated information on billions of real individuals. Haklay’s observation that ABM research in the 1990s was hindered by the lack of “sufficiently powerful computers and suitably rich data sets” points to the potential that this information has to expand human knowledge, as well as to enable much more effective control of human populations.

I would venture that combinations of current-day iterations of modular ABMs like STREETS, combined with these ever-growing, dynamic sources of socioeconomic data, hold the potential to create very well-informed models that capture the dynamism of emergence with the power of immense and ever-evolving observations of real people. With so much relevant research now being conducted behind closed doors at intelligence agencies and in corporations whose business is selling data, the current and future possibilities of spatial ABMs remain both fascinating and frightening.

 

-grandblvd

Haklay et al 2001

Monday, September 28th, 2015

I imagine that agent-based modeling is much more complex than most models in the natural sciences, such as climate models or forest growth models. While for now agent-based modeling is applicable to simpler aspects of human behavior such as commuting, further application in economics or sociology would probably require significant advances in fields such as artificial intelligence, which would improve our ability to simulate human decision-making. Since the writing of this article, however, I imagine vast advances have been made. Such advances would allow computer models to complement or perhaps replace some survey-based research. Choice experiments, for example, represent a survey-based approach used to understand how subsistence farmers and herders use ecosystem services based on environmental and socio-economic factors. I would be intrigued to see computer models simulate such scenarios.

I wish that the “planner” module had been functional and applicable at the time this article was written. Perhaps it would be representative of people having multiple, completely different modes of behavior. For example, would a student or worker have a “weekday” plan and a “weekend” plan that the planner module would alternate between? Also, I was very intrigued by the term “cognitive map”, but the paper did not expand on it. Furthermore, the discussion of emergence was difficult to grasp. I believe it was asking whether we should try to look for clear behavior patterns and systems at aggregate scales or simply accept ambiguity or lack of patterns as they are.

 
-yojo

More than ‘plausible’: pedestrian simulations and the future.

Monday, September 28th, 2015

In their 2001 article, Haklay et al. present a model of impressive complexity – STREETS – to simulate pedestrian movements in central urban areas, relying on the use of several different ‘modules’ to control individual agents as well as interactions with the environment and crowd dynamics. The authors outline a number of shortcomings to their methodology, notably the assumption that the town centre in the simulation is ‘spatially closed’ (p.10).

Initially skeptical of the model’s use beyond simply confirming or denying existing ideas about pedestrian behaviour, I was reminded (as in previous articles) of our in class discussion about the quantification bias that can legitimize numerical/computational work over more qualitative approaches, and has indisputably helped maintain the relevance of GIS (and now modelling) in contemporary geography. This made me question the relevance of modelling human behaviour; I felt the assumptions in the STREETS model were too damning, and the implied complexity would never be adequately abstracted. To my surprise, this was boldly addressed by the authors through a fascinating discussion of ‘bottom up emergence’ (p.25-26).

In discussing the role of agent-based modelling and its ‘one-way notion of emergence’ (p.26), the authors detach themselves from the notion that inductive research is possible in the STREETS model, and the jump from describing pedestrian movement as ‘plausible’ to ‘self-organizing’ (p. 25) is significant. This discussion piqued my interest, for it suggests that there is more to modelling than increasing its complexity every time advances in computational power allow for it. Clearly, the addition of dozens of modules or parameters is not enough to allow reliable inductive research to be conducted. Nevertheless, the power of modelling hundreds of thousands of agents at a time far exceeds the current possibilities of qualitative research on pedestrian movement, suggesting that modelling will remain highly relevant in the study of pedestrian movement into the future.

At what point will models transition into ‘self-organizing emergent structures’ (p.26)? I honestly cannot say, and my level of understanding doesn’t even allow for an educated guess – all I know for sure is that it won’t be exclusively dependent on computational power. In any case, I look forward to seeing how the field develops.

-XYCoordinator

 

Are all Trip Generators Created Equal? (ABMs)

Monday, September 28th, 2015

In their article “So go downtown: simulating pedestrian movement in town centres”, Mordechai Haklay et al. describe ways in which “agent-based modelling” has produced superior models of pedestrian behaviour, by taking into account variability in the preferences and behaviour of pedestrians based on the purpose of their trip, their demographic characteristics, and a variety of other considerations. However, one aspect of earlier pedestrian traffic modelling–from which the assumptions of agent-based modelling are derived–underlines some of the limitations of the agent-based modelling approach. Haklay et al. indicate that pedestrian models typically incorporate two elements of a place (typically a city block, tract, or some similar defined area) to predict the volume of pedestrian activity: the “population at [the] location” and the “measure of the attraction of facilities at [the] location” (Haklay et al. 7). However, this begs the question: are all attractors created equal? In less abstract terms, can the number of trips generated by commercial and employment nodes be given the same weight as a trip generator as the relative permanence of a residential population at a particular location? In my opinion, they surely cannot. The variability of pedestrian trips–particularly to retail–cannot be overlooked. While the “attraction of facilities” (7) at a location can vary on an hourly basis, residential populations fluctuate significantly only over several years at a time.
Some factors that affect pedestrian trips to facilities at a location–particularly retail facilities–include: variability in seasonal commerce (e.g.: Christmas shopping, tourism season, etc.); variable personal preferences from person to person in different weather conditions (e.g.: a shop may see fewer clients during inclement weather, while a movie theatre might benefit); personal preferences in walking speed and environment (e.g.: some people may prefer quieter streets so they can walk faster, while others prefer busier, slower streets); and variable tolerance to environmental conditions, such as the urban heat island effect. Although incorporating these elements into agent-based modelling would be arduous and expensive, the potential benefits to countless urban environments are immense. For instance, pedestrian modelling which incorporates pedestrian behavioural response to changing weather conditions could be coupled with a public transit network, deploying more or fewer vehicles during periods of weather-induced demand (e.g.: many people seeking bus service during a rain storm). Models which considered variability in tourist traffic could help business owners make educated decisions about their investments (e.g.: where to locate, what hours to keep, etc.). But perhaps most intriguingly of all, pedestrian models could actually show what factors in the environment affect pedestrian behaviour adversely, allowing for targeted investments that enhance the walkability of an area and maintain the vitality of pedestrian-oriented neighbourhoods.
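One way to picture this argument is a toy trip generator in which the residential term is stable while the facility term swings with hour and weather. Every modifier below (hourly windows, rain effects, the 0.1 population coefficient) is invented for illustration, not a calibrated value:

```python
def trip_volume(population, base_attraction, hour, weather, venue="shop"):
    """Toy trip generator: the residential term is stable, while the
    facility term swings with hour and weather. All modifiers invented."""
    # Shops draw during business hours; cinemas peak in the evening.
    hourly = {"shop": 1.0 if 9 <= hour < 18 else 0.2,
              "cinema": 1.5 if 18 <= hour < 23 else 0.3}[venue]
    # Rain suppresses shopping trips but nudges people toward films.
    weather_mod = {"shop": {"clear": 1.0, "rain": 0.6},
                   "cinema": {"clear": 1.0, "rain": 1.2}}[venue][weather]
    return population * 0.1 + base_attraction * hourly * weather_mod

midday_shop_clear = trip_volume(1000, 200, hour=12, weather="clear", venue="shop")
midday_shop_rain = trip_volume(1000, 200, hour=12, weather="rain", venue="shop")
evening_cinema_rain = trip_volume(1000, 200, hour=20, weather="rain", venue="cinema")
```

Even this crude sketch has the facility term swinging widely within a single day while the population term never moves, which is the asymmetry between the two trip generators described above.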

-CRAZY15

Role of Geospatial Agents in GIScience

Monday, September 28th, 2015

In their article, Geospatial Agents, Agents Everywhere…, Sengupta and Sieber (2007) demonstrate how the paradigm of agents in AI both serves and benefits from research in GIScience. I found it interesting that Artificial Life Geospatial Agents (ALGAs) are relevant to our previous discussion about the importance of spatializing social networks. ALGAs are relevant to spatial social networks in that they model “rational-decision making behavior as impacted by a social network” (486-487). Therefore, applying our knowledge about spatial social networks (as opposed to just social networks) to ALGA development could perhaps help us better understand and model social interactions and information passing between individual agents.

In addition, the interoperability of Software Geospatial Agents (SGAs) across software and hardware platforms informs us about ontology, representation, and semantics in GIScience. SGAs might therefore unlock answers to key questions surrounding geospatial ontologies and semantics, because SGAs bear the key responsibility of determining which standards to interpret semantically. These standards may help with the important GIScience tasks of expressing topology and geospatial data in GIS. Therefore, the fact that SGAs are “geospatial” in nature will shape how we “do GIS” as geographers.

I am interested to know the extent to which ALGAs are able to incorporate temporal dimensions within their frameworks. I suspect that adopting the added dimension of time into these platforms and models will be a crucial challenge for ALGA research in GIScience.

-GeoBloggerRB

On Geospatial Agents

Monday, September 28th, 2015

Firstly, I can see why ALGAs are dominating the GIScience literature on agents. Modeling complex social relationships and migration patterns as well as predator-prey interactions (and more) has much more compelling and interesting implications for geography (at least on the surface) than does information mining. Even with my limited knowledge of AI agents, I find my mind flooded with scenarios in which I could apply ALGAs; I grasp the concept of using a computer to model intelligent systems easily. With that said, I certainly do not wish to undervalue the potential of SGAs. The implications of the ability to work across multiple platforms are somewhat lost on me, and I will attempt to explore them in my upcoming lecture with the authors of this piece.

I find most of my difficulty in understanding SGAs and their potential applications lies in what is said by Sieber and Sengupta on page 492. The authors describe how SGAs are divided by tasks while ALGAs are divided by themes. What I gather from this is the following statement: ALGAs are defined by an application, while SGAs are defined within an application.

It seems that these agents certainly have a place in geography. I have been in more than one situation in which I felt like a robot, data mining unsuccessfully and re-running nearly identical interpolations. Another potential use for SGAs crossed my mind: identifying patterns between z-spectrum graphs in remote sensing. My experience with these graphs was that they are very data-intensive and difficult to interpret.

Smitty_1

7:45 pm, 9/28/12

 

Geospatial Agents, Agents Everywhere…

Saturday, September 26th, 2015

Sengupta and Sieber’s review of artificial intelligence (AI) agent research history and its current landscape sought to define and ponder the legitimacy of ‘geospatial’ agents within GIScience.

The discussion of artificial life agents, often used for modeling human interactions and other dynamic populations, complemented my current research into complexity theory and agent-based modeling of chaotic systems that are sensitive to initial conditions, as it holistically related them back to GIScience.

However, ‘software’ agents, defined as agents that mediate human-computer interactions, were an unfamiliar notion to me. I found it easier to understand these types of agents if I mentally replaced the term with ‘computer program’, ‘process’, or ‘application’.

As a student familiar with software development, the article made me question a lot of the computational theory I’ve learned thus far, and raised some big questions: What does it truly take for an agent or program to be characterized as autonomous? If an agent or program engages in recursive processes, does that count as being autonomous, as it essentially calls itself to action? And when is a software agent considered to be ‘rational’?

I wonder if rationality in decision making should even be included in the definition of an agent. Humans often make irrational decisions. Our decision-making processes and socialization patterns are highly complex and difficult to model, issues that quickly become apparent even when attempting to analyze static representations of spatial social networks.

I look forward to seeing how this conversation evolves.

-ClaireM

Spatializing Social Networks

Monday, September 21st, 2015

Radil, Flint, and Tita’s “Spatializing Social Networks: Using Social Network Analysis to Investigate Geographies of Rivalry, Territoriality, and Violence” (2010) demonstrates a promising integration of social network and spatial analysis in a study of gang violence occurrences and intergang rivalry in an LA neighbourhood with an above-average rate of violent crime.

The article highlights the fundamental interdependence of relationships and space, and thereby the fruitfulness of analyzing both “network space” and “geographic space” at the same time. While the article speculates as to why certain areas with particular network roles may experience higher rates of violence – interstitial or “brokerage” areas in particular – it stops short of musing on the potential predictive power that such analyses may one day hold.

Discussions of GIScience in 2015 inevitably seem to gravitate toward questions of power, which is fitting, as we are at a moment of paradigmatic change in the amount of geolocated information that is collected on all digitally-active individuals on a regular basis. On a fundamental level, the article’s attempt at developing a synthesized geographic and social network analysis method points to a future where individuals’ and groups’ positions in network and geographic space can be studied simultaneously and automatically.

This in turn has considerable implications for questions of both collective and individual freedom. If Facebook, using data derived from the frequency of message exchanges, can predict a breakup between two romantically involved individuals, as CRAZY15 noted, what could it do with geolocational data synthesized with all of its existing (and future) network/relational indicators?

-grandblvd

Radil et al 2010

Monday, September 21st, 2015

This study takes on quite a difficult task, in that it attempts to quantitatively analyze a social system while simultaneously using two distinct concepts of space. In this case, the gang rivalries correlated for the most part with the geographic proximity of the gangs. I think the utility of this approach would have been more obvious if the spatiality of rivalries and geographic proximity had been much more divergent. The fact that gangs were usually embedded with and structurally equivalent to neighboring gangs makes the results appear underwhelming. However, certain exceptions, such as the existence of a center-periphery geography in the northern part of Hollenbeck, as well as the condition of “betweenness” being associated with more violence, exemplify the exceptions to the norm that would be difficult to discern without this kind of analysis. I struggled to grasp at which stage social networks and geographic space were combined. When using CONCOR to make the dendrogram, did both location and gang rivalry influence the position in which each gang was placed? An aim of this study was to quantify the interaction between the two spatialities, but it seems to me that the network positions are themselves quantified, while their comparison to the geographic positions is only qualitative, i.e. “north-south” and “center-periphery”, and that these characteristics were determined visually. Nevertheless, these types of qualitative characterizations should still be immensely useful in predicting gang behavior. This type of analysis could potentially be applied to any combination of spatialities, including those in which neither of the two spatialities is geographic space.
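For readers likewise puzzled by what CONCOR actually does: its core step is an iterated correlation of the matrix’s columns, which drives entries toward +1 or -1 and thereby defines a two-block split of structurally equivalent nodes. This stdlib-only sketch uses a made-up four-gang rivalry matrix and a fixed iteration count, simplifying the real procedure (which splits recursively and can incorporate multiple relations):

```python
def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def concor_split(matrix, iterations=25):
    """Iterate column-by-column correlations, then split nodes into two
    structural positions by the sign of the converged first row."""
    m = [row[:] for row in matrix]
    n = len(m)
    for _ in range(iterations):
        cols = list(zip(*m))
        m = [[pearson(cols[i], cols[j]) for j in range(n)] for i in range(n)]
    return ([i for i in range(n) if m[0][i] > 0],
            [i for i in range(n) if m[0][i] <= 0])

# Toy rivalry matrix: gangs 0 and 1 feud with 2 and 3, not with each other.
rivalries = [[0, 0, 1, 1],
             [0, 0, 1, 1],
             [1, 1, 0, 0],
             [1, 1, 0, 0]]
blocks = concor_split(rivalries)
```

Note that only the rivalry ties enter the computation here; in the study, comparing the resulting positions to geographic location is a separate, visual step, which is exactly the gap raised above.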

-yojo

Spatializing Social Networks

Monday, September 21st, 2015

In Radil et al.’s Spatializing Social Networks (2010), the authors introduce an innovative method, called ‘structural equivalence’ (2010:308), for integrating the social network concepts of closeness and space with those of proximity and location. A case study of rivalry and territoriality in the Hollenbeck Policing Area of Los Angeles was used to demonstrate how social network analysis can go beyond mapping the spatial networks (‘relational embeddedness’) of gangs in this area to also capture the social positions of the gangs (their ‘structural position in network space’) within the Hollenbeck social network of gangs (2010:309).

Radil et al.’s publication achieves its goal of presenting to readers thoughtful (for 2010 at least) methods of incorporating the fundamental ideas behind sociological constructs of human interaction and social networks into spatial network analysis. I found the publication to have a thorough literature review of past forays into structural equivalence and concepts of spatial and social types of embeddedness, albeit difficult to understand at times, for readers unfamiliar with this geographic information science subdomain.

I found Radil et al.’s Spatializing Social Networks to be an intriguing exercise in the harmonizing of social and geographical sciences. Most of all, I appreciated the authors’ obvious endeavour to use as much scientific terminology as possible (and very little tool-talk), in an effort to elevate geographic information science away from the simplistic ‘GIS is only a tool, not a science worthy of funding’ label.

The authors addressed a question that arose in my mind as I was reading the article: that of the temporal dynamism of spatial and social networks. I would be very interested to see how the CONCOR (convergence of iterated correlations) positional analysis would fare if a third, temporal dimension were added. How would social constructs of space change over time? How would changes to the temporal resolution (i.e. scale) affect the magnitude of these changes? How could these results sway our understanding of Hollenbeck and the structural positions of gang network space?

-ClaireM

Spatializing Social Networks: Quantification Gone Too Far?

Monday, September 21st, 2015

In their 2010 article, Radil, Flint and Tita attempt to combine spatial analysis techniques with a social networks approach to tease out spatial patterns in gang activity across Hollenbeck, CA. While successful in presenting a three-tiered distribution of gang activity with interesting spatial phenomena, including the effect of ‘relational betweenness’ on violent crime (p.321) and elements of ‘north-south’ (317) and ‘core-periphery’ (p.320) territoriality, the authors highlight the potential of the technique to be used ‘in concert with other ways of knowing’ (p. 322) and suggest that the static nature of their study is an important limitation considering the ‘dynamism’ (p.321) of gang activity across space and time.

While I do appreciate the authors’ efforts to quantify a traditionally qualitative area of geography, their detachment from the subject of study leaves me very uneasy. Gang violence affects the day-to-day experience of hundreds of thousands of Angelenos, and to ‘focus on methodology’ (p.322) and engage so little with its repercussions leaves the door wide open for criticism from a cultural geography standpoint.

Could these patterns have been identified through a qualitative approach? How do testimonials acquired ‘in the field’ stack up to the matrices, network diagrams and spatial analyses in this paper? As discussed in class, quantification (quite like the term ‘science’ as a qualifier for the ‘S’ in GIS) has traditionally been associated with a higher level of recognition and funding in academia. While there are undoubtedly benefits to this, the value of the study assigned this week is limited so long as we agree that it uncovers little new information and only bolsters (or at least claims to bolster) the legitimacy of known patterns and distributions.

The First Law of Geography was derived from a similarly complex attempt to model urban sprawl: a simple message with enormous repercussions, drawn from a paper riddled with number crunching and model making (see Tobler, 1970). The degree to which we can use quantitative methods for inductive reasoning in social geography is, in my opinion, an interesting debate that I would love to expand on in class. Let’s remember that Los Angeles is more than a collection of statistics…

-XYCoordinator

Spatializing Social Networks

Monday, September 21st, 2015

In this week’s article, Spatializing Social Networks, researchers looked at gang violence in a section of Los Angeles. To understand the context of the violence they looked at two spatialities of gangs: first, their geographic position, and second, their position within a network of rivalries. It was important for the researchers to start with a Moran’s I test. This showed them that there was no significant spatial autocorrelation of gang violence, meaning that a purely spatial analysis would be unhelpful for understanding gang violence. This was a good, clear rationale for moving their study beyond location alone and including network analysis as well. Using GIS they mapped the positions of gang territory and overlaid the network of rivalries. In their analysis they found a first split divided geographically between north and south of the freeway, and then a second split within each region, divided between a core of dense rivalry linkages creating violence and a more peaceful periphery.

This study was interesting and shows how, as discussed in class, GIS can be used to improve knowledge for law enforcement. In this specific case, nobody outside of the LA gangs would argue this knowledge is a bad thing. The data was collected by survey from willing law enforcement and gang informants/experts. However, it presents an interesting question for scientists who are developing methods of simultaneously analyzing social networks and geographic space. What are the implications of all the data that can now be gleaned from Twitter or Facebook users? Could law enforcement be made more efficient by predicting spaces of rivalry, or could these methods be used as an authoritarian tool? In a Ukrainian or Syrian uprising scenario, to what extent could governments use these same techniques to quickly quell dissent?

-anontarian

Spatializing Social Networks

Monday, September 21st, 2015

The subject of social network analysis is fascinating; however, I found the article by Radil, Flint and Tita (2010) to be somewhat difficult reading. The article was full of jargon, such as “spatializing,” “spatialities,” “betweenness,” and “positional analysis,” and the authors often needed to translate themselves by writing “in other words…” Nevertheless, the topic and its application to rival gangs in Hollenbeck were very interesting. The authors discuss the idea of embeddedness, the notion that social behavior is produced by and inextricably connected to space, and use spatial statistics such as Moran’s I to examine the social networks and splits between gangs. The example of gang territory is an excellent one, because turf and territory have a significant geographical element that manifests itself in gang rivalries and behavior.

While reading the article, I became interested in other applications of social network analysis. I found myself thinking, “How could GIS be used to consider the spatial networks of things more positive than gang territory?” For example, one could explore the spatiality of activist social networks or a network analysis of the use of health centers. Social media use is also a relevant example because it is, of course, social, but it also has an important spatial element. One could use a spatial network analysis to learn how information is distributed through social media across space and time.

This article explores some of the recurring issues and questions of GIScience. For example, the authors struggle to incorporate both space and time in their analysis, acknowledging that their static model doesn’t capture the dynamism of constantly changing social networks. The authors also speak to the multidisciplinary aspect of GIScience by encouraging that their results be strengthened by other ways of knowing from other disciplines.

-denasaur

Wright et al 1997

Tuesday, September 15th, 2015

The article is a good follow-up to Goodchild’s 1992 article. Perhaps it represents the first real “victory” for proponents of the scientific view of GIS, in which they manage to stand their ground in a debate with those who dismiss their viewpoint. What I get out of this article is insight into processes of technological and scientific discovery. People may invent an ingenious tool or technique, but it could take them a long time to realize that they have unearthed something much bigger than a tool. Perhaps almost any given science can be thought of as an iceberg whose tip represents day-to-day application, but whose vast underwater body of understanding buoys that application. The authors’ mention of applied sciences as a challenge to the simple tool/science dichotomy is very apt. It reminds me that agricultural science, as an applied science, would probably not meet the conservative definitions of science used by the “tool” side in this article, yet it has certainly obtained prestige and support in the academy, and rightly so, because agriculture is so widely practiced and fundamental to our survival. It is possible that only after a technology has been used for long enough and accumulated enough “reps,” or a record of use, can it finally be analyzed scientifically. The case for GIScience has therefore grown stronger over time. Finally, the article’s questioning of the very definition of science is important not only for GIS but for any important new field of study that is hindered by academic conservatism.

-yojo

Goodchild 1992

Tuesday, September 15th, 2015

In this article we can see that Goodchild is an instrumental figure in GIS who wants to ensure that GIS ends up being a truly valuable contribution to society. The mere fact that a technology is developing rapidly doesn’t ensure that people will know how to put it to revolutionary use. An impressive new tool may get its time in the limelight, but if during that brief moment it is merely put to innocuous uses, then people may see it as a gimmick, in which case it will soon be forgotten and overtaken by the next new technology. It reminds me a bit of Tamim Ansary’s book about Islamic civilization, in which he describes a steam engine invented in Persia before the one in Britain, but used by a king merely to spin a rotisserie for a mutton banquet. Seeing it from this angle, I can’t be too critical of the fact that I found the paper quite dense and hard to read. When papers are hard to read, it’s often because the writer doesn’t fully comprehend the subject matter. That’s entirely forgivable in this case because Goodchild is doing his best to envision a new field of science that had not yet been developed at the time of his writing. Despite the difficulty, he goes about his task well: arguing for the uniqueness of GIScience, identifying common questions to make it a cohesive science, and imagining how best it can establish itself among other sciences.

-yojo