Posts Tagged ‘goodchild’

Changes in thought and perceptions of science, tools and GIS

Thursday, January 17th, 2013

Wright, Goodchild, and Proctor, in their article “GIS: Tool or Science?”, outline the varying cases for GIS being a tool or a science. The article may point toward opening up our definition of science, shifting it from one quantified by results to a broader definition of applied and practical use. However, one can ask how a person’s perception of what a tool is and what a science is influences how GIS is viewed, and thus how it is defined. Can GIS not be both a tool and a science? Does not every science include tools and equations to understand the variability in nature and our world, and does not every tool rely on science to have a use? I believe GIS is both a tool and a science. One only has to think of mathematics or physics, where equations originally developed as tools to answer a question have themselves become a science. Quantum mechanics, for example, was once considered only a tool for understanding the atom but has since become a field of science and scientific research, although it can still serve as a tool. Any tool can become a science, and any science can become a tool.

Science derives from the Latin word for knowledge, so it can be considered the pursuit and modification of knowledge, and is that not what GIS allows people to accomplish through the gathering and modification of information? Yet GIS is still a tool because it offers a means to an end (i.e., it allows a person to modify data to get a result). The way a person thinks influences how they perceive GIS. If GIS is a means to an end, it is a tool, like a surveyor’s station to a surveyor who is plotting a map. Conversely, if GIS is used to gather and study, it is a science, like a surveyor’s station to a geologist who is gathering data to understand the relationships of rocks at a point. At present, technology and science are at a crossroads where both are intertwined, yet they keep the definitions given centuries ago and are perceived in that same old fashion. Maybe it is time for a new definition to be created, one that integrates GIS as both a means and an end, as development in our world advances the tools and sciences we use as humans.

C_N_Cycles

Coarse-grained data issues in low-resource settings

Friday, March 16th, 2012

Despite the technical components of Goodchild et al.’s (1998) article, it did make me think about uncertainty regarding boundaries and coarse-grained satellite imagery. Exploring low-resource settings on Google Earth is one such example. Although an incomplete geolibrary, I consider Google Earth to be effective in its user-friendly interface, its features (layers and photographs), and, of course, its ubiquity. It’s a start. With this in mind, I remember ‘flying’ over towns in Colombia on Google Earth and the terrible, terrible satellite imagery that was available. (The low-quality imagery remains unchanged since I last checked it half a year ago.) One of the towns/districts is Puerto Gaitan. How do we account for the lack of resources given to collecting fine-grained, or even medium-grained, visualizations?

According to Goodchild et al., alternative methods for displaying fuzzy regions must be applied where cartographic techniques are not enough. “A dashed region boundary would be easy to draw, but it would not communicate the amount of positional uncertainty or anything about the form of the z(X) surface” (208). What do we do, then, when the data cannot even be analyzed because it is too coarse? For low-resource settings, we are just going back to where we started. No financial incentives to improve data (from coarse to fine) = continuation of coarse-grained data = poor visualization = cannot be utilized in studies = no advancements in research = back to the start, no financial incentives to improve the quality of data. How do we break this cycle?
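To make the z(X) idea a little more concrete, here is a minimal Python sketch (my own illustration, not the authors’ method) that renders an invented fuzzy region as a membership surface rather than as a single dashed line; the circular region, its radius, and the uncertainty value are all assumptions made purely for the example.

```python
# Illustrative sketch (not from the article): showing a fuzzy region as a
# membership surface z(X) instead of a crisp or dashed boundary.
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical region: nominal boundary is a circle of radius 10 km,
# with roughly 2 km of positional uncertainty around it (assumed values).
x, y = np.meshgrid(np.linspace(-20, 20, 400), np.linspace(-20, 20, 400))
dist_from_center = np.sqrt(x**2 + y**2)
radius = 10.0   # nominal boundary (km), assumed
sigma = 2.0     # positional uncertainty (km), assumed

# z(X): probability that location X lies inside the region,
# modelled here as a logistic falloff across the nominal boundary.
z = 1.0 / (1.0 + np.exp((dist_from_center - radius) / sigma))

fig, ax = plt.subplots(figsize=(5, 5))
im = ax.imshow(z, extent=(-20, 20, -20, 20), origin="lower", cmap="viridis")
# Crisp 0.5 contour drawn only for comparison with the usual single boundary.
ax.contour(x, y, z, levels=[0.5], colors="white", linestyles="dashed")
fig.colorbar(im, label="z(X): probability of being inside the region")
ax.set_xlabel("km")
ax.set_ylabel("km")
ax.set_title("Fuzzy region shown as a surface, not a single boundary")
plt.show()
```

The point of the picture is simply that a graded surface carries the positional uncertainty that a dashed line cannot; of course, if the underlying imagery or data is too coarse, there is nothing meaningful to feed into such a surface in the first place.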

-henry miller

Uncertainty: management and confidence

Friday, March 9th, 2012

Hunter and Goodchild’s article presented critical questions regarding spatial databases and uncertainty. One pressing question stood out: “How can the results be used in practice?” In the spatial databases produced, keeping track of errors is vital because it encourages transparency and initiates a chain reaction. This management feature can lead to an increase in the quality of products: a better product is created through the improved value judgements made, which in turn influences (and hopefully diminishes) the level of uncertainty it carries. However, we must not get ahead of ourselves; “…while users at first exhibit very little knowledge of uncertainty this is followed by a considerable rise once they learn to question their products…” (59). Even though this spike in knowledge occurs, over time knowledge tends to decrease once a threshold of familiarity and experience is reached. Users must therefore stay on top of their game and remain critical of and attentive to the products they make use of.

This is all relative to the type of decision being made. Because decisions span such vast ranges and opposite ends of the spectrum of implications – from non-political to political, minimal risk to high risk – uncertainty is more apparent than ever. Heterogeneity, from data to decision implications, will always exist; “…regardless of the amount by which uncertainty is reduced the task can never be 100 percent successful simply because no model will ever perfectly reflect the real world” (59). Therefore, we should settle for what seems to be an unsettling thought. This makes me think of sah’s post. I also believe that we can find an element of comfort in uncertainty.

As Hunter and Goodchild suggest, “educating users to understand and apply error visualization tools” will hopefully lead to improved decision-making about suitable data quality components and visualization techniques for products (61). Thus, a “lack of confidence in the system” (58) can be diminished; better still, confidence may be gained. The vexing characteristics of uncertainty need to be addressed in order for a clearer segue and a stronger link between theory and practice to exist.

-henry miller

GIScience and uncertainty

Monday, January 23rd, 2012

The article was thought-provoking, addressing numerous accomplishments, research agendas, and challenges. I appreciated the author’s self-awareness and frank statements when addressing his own limitations. At times it was overwhelming, as it covered a great many points spanning 20 years of theoretical and empirical accounts of GIScience.

Of the challenges mentioned, I found the notion of uncertainty intriguing: a concept that is highly influential yet largely ignored. Goodchild’s conceptual framework for GIScience elucidates how the human, society, and the computer are interlinked by many variables (e.g., spatial cognition, public participation GIS, data modelling). “Uncertainty” dominates the middle of the triangle, yet of the 19 papers analyzed by Fisher, the most cited and considered classics of the last 20 years, only 3 “report work on uncertainty” (9).

The article notes Tobler’s Law and its implication that “relative errors over short distances will almost always be less than absolute errors” (12). According to Goodchild, this has significant implications for the modeling of uncertainty. From this it can be inferred that we have more confidence in addressing an issue over short distances, where a relative error is less intimidating than an absolute one. Goodchild further notes the transition in our thinking about GIScience from “the accurate processing of maps to the entire process of representing and characterizing the geographic world” (11). The emphasis has shifted away from accuracy at a micro geographic scale, tied to maps, towards characterization and representation at a macro, global geographic scale. Moving from a micro to a macro scale will entail more uncertainty, and while the aim is to increase accuracy, the two pull in opposite directions.
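As a small aside, the relative-versus-absolute point can be checked numerically. The sketch below is my own toy simulation, not something from the article: it draws spatially autocorrelated errors along a transect (the exponential covariance model and the parameter values are assumed) and shows that the error in the difference between neighbouring points is much smaller than the error at any single point.

```python
# Minimal simulation (my own, not Goodchild's) of why spatial autocorrelation
# makes relative errors over short distances smaller than absolute errors.
import numpy as np

rng = np.random.default_rng(0)

# 1-D transect of points 1 km apart; errors follow a zero-mean Gaussian
# process with an exponential covariance (assumed model and parameters).
n, spacing = 200, 1.0        # number of points, spacing in km
sigma, rho = 5.0, 20.0       # error std dev (m) and correlation range (km), assumed
coords = np.arange(n) * spacing
dists = np.abs(coords[:, None] - coords[None, :])
cov = sigma**2 * np.exp(-dists / rho)

# Draw many realisations of the correlated error field.
errors = rng.multivariate_normal(np.zeros(n), cov, size=2000)

absolute = errors.std()                             # typical error at a single point
relative = (errors[:, 1:] - errors[:, :-1]).std()   # typical error of a neighbour-to-neighbour difference

print(f"absolute error  ~ {absolute:.2f} m")              # close to sigma
print(f"relative error (1 km apart) ~ {relative:.2f} m")  # much smaller: neighbours share most of their error
```

Because nearby points inherit almost the same error, the difference between them cancels most of it out, which is exactly why relative error stays small over short distances even when absolute error is large.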

Despite uncertainty being seen as an obstacle to GIScience progress, Goodchild also notes it as a salient factor in “potential undiscoveries” (6). The process of governments’ adoption and application of GIScience, further work on third, fourth, and fifth dimensions, and the role of the citizen through neo-geography and VGI are all very exciting and revolutionary.

Goodchild, M. F. (2010). Twenty years of progress: GIScience in 2010. Journal of Spatial Information Science, 3–20.

Henry_Miller

A changing definition for “science”?

Sunday, January 22nd, 2012

20 years of progress: GIScience in 2010 (Goodchild)

I thought it was interesting that 2 of the 3 participants Goodchild interviewed had an issue with the word “discovery” when asked about “the ten most important discoveries of GIScience to date” (7). Marc Armstrong replaces “discovery” with “transformations”, namely from one medium (paper) to another (computer), while Sara Fabrikant replaces the word with “rediscovery”; to her, GIScience is more about seeing the world in a new light. Further, both participants emphasize the idea that GIScience is the combination of many disciplines and that its research is performed in “… a variety of scientific paradigms” (9). Both seem to value GIScience as a field that takes an amalgamation of existing knowledge and applies it to spatial information to access new knowledge that we otherwise could not reach. They acknowledge GIScience not as a “new” science per se but as a new science born from previous fields of study.

At this point, Network Science springs to mind. Much about the relatively recent development of network science resembles that of GIScience. Network science, like GIScience, is interdisciplinary; it draws from and has relevance to many fields. Although scholars have studied networks for a long time, they had few unifying theories to show for it, which motivated the formation of a Network Science. The National Research Council writes:

“Despite the tremendous variety of complex networks in the natural, physical, and social worlds, little is known scientifically about the common rules that underlie all networks. This is even truer for interacting networks. Ideas put forth by scientists, technologists, and researchers in a wide variety of fields have been coalescing over the past decade, creating a new field of thinking—the science of networks…
Does a science of networks exist? Opinions differ” (p. 7).

Perhaps these developments in Network Science and GIScience support the idea, mentioned by Wright et al., of a change in the understanding of what constitutes “science” in the modern world.

National Research Council. (2005). Network Science. Washington, DC: National Academies Press.

-Ally_Nash

Twenty Years of Debate

Saturday, January 21st, 2012

Posted by sah:

Twenty Years of Progress… to me, this translated to Twenty Years of Debate. While reading Goodchild’s article on the evolution of GIScience, the question that came to mind was really, “Why are we still debating?” GIScience, as Goodchild defines it, has evolved as a technology, and perhaps a discipline, but also largely as a debate over the last twenty years, and it would appear that debate has dominated this field for most of its recent history. In class we came up with some interesting reasons why the debate may still be raging: legitimacy as a field and a science, and thus funding and prestige for practitioners, being a large part of it. That may be all well and good, albeit a topic for another post, but as the substance of Goodchild’s article it left me a little disappointed.

The debate is surely interesting, but it was not, according to the abstract and introduction, what the article was supposed to be about: history AND accomplishments and future advancements. There could have been much more emphasis on the successes and the evolution, and not just on who deems a success a success. Goodchild’s personal reflections and the institutional accomplishments were most interesting, as well as the final section, Looking to the Future. This encapsulated what I anticipated of the article and highlighted critical thoughts, most interestingly the proper education of users of such a rapidly evolving and increasingly popular [tool, technique, science], and the way it can be used by the public. The proposed advancements raise a lot of questions about how GIS can be applied in the future and what challenges this may present. In my mind, this could in fact be a reason to continue the debate: will we consider this a tool to be properly taught, or a science above the everyday use and understanding of the citizen?

Goodchild, Michael F. “Twenty Years of Progress: GIScience in 2010.” Journal of Spatial Information Science (2010): 3–20. Print.