Archive for March, 2012

Conservation drones

Sunday, March 11th, 2012

Repurposing the tools of the hobbyist and, later, the war machine for conservation: the conservation drone.

Note: flying drones is no trivial matter. Often you need professional pilots to obtain good images.

Uncertainty and confidence: walking a fine line

Friday, March 9th, 2012

Foody states that important end-users tend to underestimate uncertainty, and their perceptions of uncertainty, in turn, can prompt greater problems. Yet, according to the article, “uncertainty is ubiquitous and is inherent in geographical data” (115). The blame game is therefore hard to play when uncertainty can be argued to be ‘inherent’. The topic becomes more convoluted when geographic information (GI) systems are assessed: “…the ease of use of many GI systems enables users with little knowledge or appreciation of uncertainty to derive polished, but flawed, outputs” (116). It is thus all the more important to think twice about generated outputs. Foody also mentions data-mining methods that do not consider the “uncertainty that is inherent in spatial data” (111). We have a tendency to find patterns in data; we could therefore construct the wrong patterns, yet treat the outcomes as accurate or next to flawless because of the overconfidence produced by the systems and databases we create. In addition, these systems create slippery slopes with potentially irreversible outcomes. It reminds me of the infamous collapse of the northern cod fishery, where the problems in the science of the northern cod assessment included an overestimation of biomass and an underestimation of fishing mortality (by 50 percent!). I believe that the overestimation and/or underestimation of variables is important to the way uncertainty is perceived. As the article notes, awareness of uncertainty matters: the higher the level of awareness, the better.

ClimateNYC mentions social errors in the ‘visualizing uncertainty’ post, which made me think of representation and scale issues – in particular, the decisions Google Maps makes with regard to representing marginalized communities. The Ambedkar Nagar slum in Mumbai, India, is labelled in Google Maps as ‘Settlement’. However, the ‘Settlement’ label disappears at 200 m altitude yet is visible at 500 m. Adding to the ambiguity, at 50 m altitude the Organization for Social Change is mapped, but this point is not visible at any other scale. Who makes these decisions when labelling such areas? Are they political? Or do they simply not care?

-henry miller

Uncertainty: management and confidence

Friday, March 9th, 2012

Hunter and Goodchild’s article presents critical questions regarding spatial databases and uncertainty. One pressing question stood out: “How can the results be used in practice?” In the spatial databases produced, keeping track of errors is vital because it encourages transparency and initiates a chain reaction: this management feature can raise the quality of products, and a better product emerges from the improved value judgements that are made, which in turn influences (and hopefully diminishes) the product’s level of uncertainty. However, we must not get ahead of ourselves; “…while users at first exhibit very little knowledge of uncertainty this is followed by a considerable rise once they learn to question their products…” (59). Even though this spike in knowledge occurs, knowledge tends to decline over time once a threshold of familiarity and experience is reached. So users must stay on top of their game and remain critical and aware of the products they make use of.

This is all relative to the type of decisions being made. Because decisions span such a vast range of implications – from non-political to political, from minimal risk to high risk – uncertainty is more apparent than ever. Heterogeneity, from the data to the implications of decisions, will always exist; “…regardless of the amount by which uncertainty is reduced the task can never be 100 percent successful simply because no model will ever perfectly reflect the real world” (59). Therefore, we should settle for what seems to be an unsettling thought. This makes me think of sah’s post; I also believe that we can find an element of comfort in uncertainty.

“Educating users to understand and apply error visualization tools” will hopefully lead to improved decision-making about suitable data-quality components and visualization techniques for products (61). Thus, a “lack of confidence in the system” (58) can be diminished; better still, confidence may be gained. The vexing characteristics of uncertainty need to be addressed in order for a clearer segue, and a stronger link, between theory and practice to exist.

-henry miller

Rethink Uncertainty in GIS research

Friday, March 9th, 2012

The first step in uncertainty management is the identification or quantitative measurement of uncertainty. In the paper published by Foody (2003), uncertainty is generally categorized into two types: ambiguity and vagueness. The former is the problem of making a choice, while the latter is the challenge of making sharp distinctions. The author then surveys contemporary research on uncertainty and analyzes the failure to recognize it. But I think the most important challenge in this field is how to quantitatively measure uncertainty – with probability theory and statistical tools, to mention a few options. Later, the author mentions data mining with uncertainty analysis; we still need a systematic approach to measuring the uncertainty in GIS data mining.
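To make “quantitative measurement” concrete: one standard, simple measure is the Shannon entropy of a pixel’s class-membership probabilities. The Python sketch below is my own illustration (the probability values are invented), not something from Foody’s paper:

```python
import math

def classification_entropy(probs):
    """Shannon entropy (bits) of one pixel's class-membership probabilities:
    0 means fully certain; log2(k) means maximally uncertain over k classes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical per-pixel outputs from a land-cover classifier.
confident_pixel = [0.95, 0.03, 0.02]   # almost surely class 0
ambiguous_pixel = [0.40, 0.35, 0.25]   # genuinely ambiguous

print(classification_entropy(confident_pixel))  # ~0.34 bits
print(classification_entropy(ambiguous_pixel))  # ~1.56 bits
```

A map of this quantity over a classified image would be one way of exposing where the data mining is least trustworthy.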

If we consider the process of geospatial data collection, storage, analysis and visualization, uncertainty analysis can be applied to each stage. Sometimes, transferring a large number of files across the Internet can introduce significant errors, owing to uncertainty in the network and in storage. This kind of uncertainty in the uncertainty itself should therefore also be taken into account, especially when high reliability is required. Of course, how to analyze uncertainty in uncertainty is another great challenge.
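The file-transfer case, at least, has a standard remedy: compare a checksum computed before and after the transfer. A minimal sketch (the file paths are hypothetical):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file from disk and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical scenario: the data provider publishes a digest with each file.
published = sha256_of("original/parcels.shp")    # computed before transfer
received = sha256_of("downloaded/parcels.shp")   # computed after transfer

if received != published:
    raise ValueError("file corrupted in transit; request a re-transfer")
```

This catches corruption, though it says nothing about the uncertainty already inside the data, which is the harder problem.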

Uncertainty can change with respect to different study areas or different research projects. For example, slight vagueness in the classification process may cause unpredictable problems, while its impact on visualization can be handled well. In any case, uncertainty should be handled very carefully, as ClimateNYC mentioned in his blog posts.

–cyberinfrastructure

The Right Tool for the Job

Friday, March 9th, 2012

The theme of having specific applications of multidisciplinary tools and concepts has stayed with me so far in this course. (I refer to the debates over GIS as a tool or a science, and later, particularly, to the debate over ontologies being designed in a subject-specific manner.) I was pleased again to find mention of the need to identify the various domains using a tool and of how a tool can and should be adapted for use in each domain. The article by Hunter and Goodchild articulated this quite well, both in stating that “One parameter in managing uncertainty is the variation that occurs in the use and application of spatial databases, as a result of different types of data, procedures and models requiring different approaches to visualizing error,” and through their Table 1. With respect to visualization, every domain sees something different when looking at an image or a map, and draws different conclusions about what is presented or what the displayed data could mean. Dealing with error is no different: it is necessary to visualize error in a way most relevant to those collecting, analyzing, presenting and making decisions based on a dataset. As such, the right tool must be selected for the job, although both the program designer and those working in the domain in question must contribute to creating such a tool, so that its uses and limitations can be fully understood.

-Outdoor Addict

The Precautionary Principle

Friday, March 9th, 2012

In discussing uncertainty, particularly with respect to climate change and methane emissions as Foody does, the precautionary principle was not mentioned in the article. I find this rather curious, as it is an important concept that exists largely because of uncertainty about the impact an action will have. Having discussed the precautionary principle recently in another class, Analyzing Sustainability, I also see it as key to Hunter and Goodchild’s discussion of decision-making under recognized uncertainty, and it would have been good to see a little more information on it.

In relation to sustainability issues, this principle comes up often as one school of thought: because uncertainty exists and the consequences of a decision could be detrimental, an approach that would either not be detrimental in outcome or would be estimated to be less detrimental should be followed. We discussed fisheries in particular, as fish populations are unknown and only catch data are available, leaving biologists guessing at the actual population and needing to make recommendations on fishing policy and allowable catch based largely on their estimates of how viable, large, healthy, etc. the population actually is.

-Outdoor Addict

Scientific advising

Friday, March 9th, 2012

I don’t know why, but I didn’t find the Foody article that useful. Of course, it was labelled as a progress report, but it had little depth and little to add.

The Hunter article also glosses over some details that would have been appreciated. The idea of ‘meta-uncertainty’ was very interesting, though, and I think it is one of the main reasons we need experts and education in geography and uncertainty. The development of methods to determine error is very important for future projects. However, the communication of error is the other key problem, especially when submitting results and conclusions to political bodies. This is the very reason that governments appoint advisors from the scientific community to help interpret its findings. I don’t, however, think that developing a good way of communicating and displaying uncertainty will do away with scientific advisors. There must be more to knowing about uncertainty than just looking at a table or graph like the examples below. I think part of truly being able to understand error in results and conclusions is knowing about the processes and methods used to reach them. How can we measure the potential loss of accuracy, or other problems, arising from a workflow? I don’t think that will be fully possible with just a graph or table. Decision makers still need to know, or be told about, the processes that were followed.
The kinds of systems mentioned in the papers seem to be geared towards researchers. This doesn’t necessarily mean they will be translatable into ‘plain language’ for decision makers (the visualisations being developed at http://slvg.soe.ucsc.edu/unvis.html certainly seem complicated to understand, especially with multidimensional data), so perhaps another set of systems and visualisations needs to be developed. Maybe we will need one set of visualisations and metrics for understanding, and another set for communicating results.
Finally, would making decisions necessarily hinge upon “this conclusion is more accurate, so we’ll choose this one”? Would a decision maker, presented with the finding deemed to have the fewest errors, always choose to support that conclusion?
Either way, having a better way of understanding and communicating error and uncertainty will surely be beneficial to the scientific community, by making conclusions a little more transparent and understandable to the public.
-?

Transparency is convincing

Thursday, March 8th, 2012

Climate NYC’s post raises some good points. The skeptics who question climate change are probably in the minority in today’s trendy world (although perhaps that is just my opinion because I’m a geography and environment major). Although these opinions are for the most part unhelpful exaggerations, the usefulness and necessity of the skeptic are undeniable. Without the skeptics, politicians and others would be able to convince the public of anything. Al Gore exaggerates in the film An Inconvenient Truth and it hurts his reputation; skeptics call him out on this and bring his claims back down to earth.

The initial predictions that the IPCC reported were not very accurate. Their climate models did not successfully predict the rise in CO2 and its impacts; in fact, the IPCC’s models underestimated these variables. That being said, the public does not want vague estimates; the public wants precise numbers. The IPCC therefore feels compelled to provide the public with quantitative information that may help break bad habits.

More recently, as shown by Climate NYC, the IPCC offers multiple climate change scenarios. Not only do they provide the graphs shown below, but they also give future estimates of temperature rise under different levels of CO2 reduction. These methods of visualizing error and uncertainty have a positive impact on the public. The transparency of these reports counteracts the skeptics’ critiques and benefits the IPCC’s research more than if they chose to exaggerate as a scare tactic.

Although perhaps not necessary in all academic reports, it would be interesting to see more visualization of error and uncertainty. Granted, some error and uncertainty cannot be quantified and can only be described qualitatively, but including it gives me the impression of a more honest report.

Andrew

 

Climate Models, Uncertainty and Unmade Policy Decisions

Thursday, March 8th, 2012

I think we’d be remiss to cover the topic of uncertainty without thinking about the role that it plays when scientific research or other forms of data are transmitted from the academic or research realms into the public world of policy debates. As Giles M. Foody notes, problems with uncertainty “can lead to a resistance against calls for changes because of the uncertainties involved” (114). I think he’s right but this is a vast understatement.

In the climate change debate now swirling through most of the globe, uncertainty could be described as one of the main factors propelling so-called climate skeptics, naysayers and those generally unwilling to acknowledge that human energy consumption might be influencing the global climate. Just take a look at this skeptic’s blog post: he names uncertainty in climate science as the number one reason not to move too fast on this global, vexing issue. Much of the opportunity for debate on the issue stems from varying points of view on just how certain the hundreds, even thousands, of climate models out there might be in predicting a warming world. In fact, just try Googling “climate” and “uncertainty” and you’ll find an avalanche of information – some of it more scientific than the rest.

Foody does a nice job of summarizing this paradigm when he writes about how “end-users and decision-makers often fail to appreciate uncertainty fully” (115). I couldn’t agree more. What most climate scientists will tell you is that while their models contain a great deal of uncertainty – which varies depending on what type of model you’re discussing or how it’s been parameterized – the overall trends are pretty clear. Most of the work done in this field concludes that a relationship does, in fact, exist between CO2 emissions and a warming global climate. Yet the importance of uncertainty here lies not within the scientific community but with publicly debated policy decisions, where uncertainty and error can conveniently become a political football. I mean, just look at some of the variation in predictions from climate models in the IPCC’s 2001 report:

Figure 1. A selection of climate models and their predictions of globally averaged surface air temperature change in response to emissions scenario A2 of the IPCC Special Report on Emissions Scenarios. CO2 roughly doubles from present concentrations by the year 2100. Figure reproduced from Cubasch et al. (2001).

Yes, there’s some definite variation between models, a degree of uncertainty. But how does this compare with the idea we discussed in class about scale? Can we ever expect complete accordance and certainty amongst climate models when the issue operates on such a vast, global scale? Should we expect it on smaller, regional scales, with something as complex as the atmosphere’s inputs and outputs and the sun’s radiation?

–ClimateNYC

The hunt for knowledge and bear Foody

Thursday, March 8th, 2012

Foody mentions uncertainty alongside knowledge and discovery. I like how, although seemingly opposite, these elements work together. Before science and empirical reasoning became the norm (I’m talking wayyy back), uncertainty was often explained through legends, ghosts or religious anecdotes. These were ways to calm people: sea monsters lived far out at sea, and if you sailed too far you would fall off the edge of the earth. Without the resources available to men (and women) today, these explanations were all that was available. There were, however, some people curious enough to explore the uncertain and unknown.

Today, although uncertainty still causes anxiety among many people (like when the next zombie attack will occur and where the best place to hide is), it seems that it now fuels the hunt for knowledge and discovery. In academia, the focus is on the acquisition of good, sound information. We as students often take for granted that the information given to us is without error. It has recently been revealed to me that some of the information provided to students can be simply false.

I was once told that if I ran into a grizzly bear, that I should not climb a tree—instead I should run down a steep slope, because bears can’t run down hills. I was amazed by this fact and told many people about this escape technique; unfortunately, I was later informed by an expert that this is not the case. Unknowingly, I had spread what I thought to be valid information when in fact the escape technique was false. I hope that my gossip will not cause anyone harm in the future!

The uncertainty touched on in both articles (and in bear-escape techniques) is not this type of uncertainty/curiosity, but I cannot help seeing how related the concepts are. When looking at uncertainty in a study, I can see how awareness of error and uncertainty could inspire further research. I think it is human nature to be curious and to strive for perfection; knowledge of uncertainties and error pushes us to do better research in the future.

Andrew

 

Visualizing Uncertainty for the Layman

Thursday, March 8th, 2012

I want to respond to the ideas put forth by “SAH” and expanded on by “Madskiier_JWong” in relation to visualizing error. I agree with “SAH” that most people don’t think about the error in their geospatial data (although they certainly acknowledge its existence), nor even ask about it. I also think “Madskiier” is on point when he talks about built-in “structural guidelines” for programs to show people how to build error in visually. In fact, I think this point is something the authors Gary J. Hunter and Michael F. Goodchild talk a bit about in relation to Y. Bedard’s 1987 work on strengthening geodetic control networks and defining and standardizing technical procedures – all in an effort to cut down on error. Conversely, I agree more with Giles M. Foody’s perspective on uncertainty for the layman. He does a good job of covering this topic when he writes that “it is, therefore, most unfortunate that the ease of use of many GI systems enables users with little knowledge or appreciation of uncertainty to derive polished, but flawed, outputs” (116).

But I want to take this discussion a step further and, perhaps, out of the cyber-realm of the practitioner or researcher, to think about what error, or uncertainty, might actually mean for the layman who uses the end product of these practices. Hunter and Goodchild talk a bit about this in their article under the heading of “Progress in Error Visualization.” In particular, they talk about adding a layer to a map to convey uncertainty, having given cells display multiple possible classes, or even varying data displays to convey a level of uncertainty (55-56). Yet these practices seem far from common, at least to me, in most of the products that average people rely on in their daily lives. The authors note that their colleagues “would like to take this testing further and establish laboratories in which practitioners would be invited to use spatial databases for real-life problems while under observation” (57). Well, not to be rude, but it’s about time.

I realize I may harp on my GPS a bit too much as part of our class (hey, it’s my only form of LBS), but what happens when the data programmed into it contains errors such as changed street names, landmarks or objects at the wrong grid points, or restaurants that no longer exist? I’ll tell you what happens – I get lost or I go hungry. I realize most people accept and know about the uncertainty inherent in their GPS’s navigation – and usually compensate for it with street wisdom. Furthermore, these days most people can simply plug their GPS into a computer program that updates its geographic data (and some of these programs even have ways for users to help correct errors). But, in attempting to move us away from the technical side of Hunter and Goodchild, I can’t say I’ve ever seen cells with multiple designations or overlaid error maps on my GPS. Heck, in talking to friends, I was trying to figure out whether I’d seen any programs in common consumption that do this. I mean, wouldn’t it be novel to have my GPS tell me that the restaurant might be at that location, give or take a 10% chance of error?
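To sketch what I’m imagining (a made-up heuristic of my own, not any vendor’s actual method), a navigation product could attach a simple confidence score to every listing it returns:

```python
from dataclasses import dataclass

@dataclass
class POIResult:
    name: str
    lat: float
    lon: float
    last_verified_days: int  # days since the listing was last confirmed

def existence_confidence(poi, half_life_days=365.0):
    """Toy heuristic: confidence that a listing still exists decays with
    time since verification (here, a half-life of one year)."""
    return 0.5 ** (poi.last_verified_days / half_life_days)

restaurant = POIResult("Chez Example", 45.5048, -73.5772, last_verified_days=540)
print(f"{restaurant.name}: {existence_confidence(restaurant):.0%} likely still open")
```

Even a crude number like this would tell me whether to phone ahead before driving across town.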

Not to belabor the point, but I thought one more example might help. In the older days before GPS, I drove to Boston using Mapquest directions (yes, Mapquest, not the now-ubiquitous Google Maps). Unfortunately for me, back then Boston always seemed to be under some kind of street/highway renovation or construction program. Never, never did my directions get me to where I was going. I almost always got lost and, being a novice driver, very lost. Wouldn’t it have been nice to have been informed about this possible uncertainty in my directions before I left my computer behind at home? Well, my contention is that most of these companies don’t want to admit fault. We instead have to rely on outside Web sites to point out physical, digital and (if you see their post on Brazil) socially incorrect errors.

I ask you all: Do you know of any programs you use on your computer, phone, GPS, etc. that shows uncertainty/error? How do you think these kinds of companies might be able to work such a feature in? Is it feasible?

–ClimateNYC

Hunter and Goodchild and Visualizing Uncertainty

Thursday, March 8th, 2012

Hunter and Goodchild explore current and future ways of visualizing error in GIS. An important distinction is made in the varied audience of these visualization tools, and how different representations should be used to target GIS novices and experts.

I am a big fan of structural guidelines built into databases to encourage or discourage certain practices. These prompts are fairly unobtrusive and stem errors at the data-collection stage, pre-empting any wasted time on faulty analysis. Acknowledging the variability of data from different sources is increasingly important on an Internet filled with geospatial data.
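As a toy sketch of what such a built-in guideline might look like (my own invention, not from the article), a record type can simply refuse obviously bad input at collection time:

```python
from dataclasses import dataclass

@dataclass
class SurveyPoint:
    lat: float
    lon: float
    source: str  # provenance, so variability across data sources stays visible

    def __post_init__(self):
        # Structural guideline enforced at data entry: impossible coordinates
        # are rejected here rather than flowing into later analysis.
        if not (-90 <= self.lat <= 90 and -180 <= self.lon <= 180):
            raise ValueError(f"coordinates out of range: {self.lat}, {self.lon}")
        if not self.source:
            raise ValueError("every record needs a stated data source")

SurveyPoint(45.5, -73.6, source="field GPS")   # accepted
# SurveyPoint(145.5, -73.6, source="")         # would raise ValueError
```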

The image below shows uncertainty in contour lines through the size of the gaps in a contour; larger gaps represent greater uncertainty. To me, this visualization provokes deeper reflection on the limitations of the vector data model in representing a continuous variable such as elevation. However, this thought process is unlikely to occur for GIS beginners; the eye is still capable of recognizing the general pattern and connecting the dots. Furthermore, it is unclear whether the gaps represent individual spots where it was impossible to obtain elevation data, or whether the entire contour line was measured with low resolution/precision.
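Out of curiosity, I tried to reconstruct the idea in a few lines of Python (a toy imitation of the technique, not Pang’s actual code):

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy contour: a circle whose positional uncertainty varies along its length.
theta = np.linspace(0, 2 * np.pi, 200)
uncertainty = 0.2 + 0.8 * np.sin(theta / 2) ** 2  # 0.2 (sure) .. 1.0 (unsure)

# Draw the contour as short runs of points, leaving longer gaps between
# runs wherever uncertainty is higher.
step = 5
for i in range(0, len(theta) - step, step):
    keep = 1 + int((step - 1) * (1 - uncertainty[i]))  # points drawn this run
    t = theta[i:i + keep + 1]
    plt.plot(np.cos(t), np.sin(t), color="black")

plt.gca().set_aspect("equal")
plt.title("Gap length encodes contour uncertainty (toy sketch)")
plt.show()
```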

The probability surfaces suggested by the authors are more concrete representations of the various possible outcomes. I believe it is important to show that analyses can still be derived, so that users are not discouraged by the increased visibility of uncertainty and do not abandon the tools. A balanced message must be struck that exposes the limitations of GIS in a constructive way.

– Madskiier_JWong

Image Source: Pang, A. (2001) “Visualizing Uncertainty in Geo-spatial Data” http://www.spatial.maine.edu/~worboys/SIE565/papers/pang%20viz%20uncert.pdf

Foody and Uncertainty

Thursday, March 8th, 2012

Foody’s article offers a fairly general overview of uncertainty and its impact on GIS analyses. I did not find it an enjoyable read, as it touches on various implications of uncertainty for decision-making but does not go in depth on ways to handle uncertainty in a GI system. The chief insight brought out by Foody concerns the paradigmatic shift from the absolute accuracy of data to its fitness or appropriateness for the research question.

Sah mentions the importance of making uncertainty explicit at each step. To me, fuzzy sets and fuzzy logic (things can partially belong to different categories, or have a percentage likelihood of being a certain class) seem among the most intuitive tools for computer users to represent spatial uncertainty. “Stratified partitions” have also been used in other cases to track this uncertainty through different scales [1]. Additionally, fuzzy sets are most valuable in those transitional zones (which analyses tend to be most interested in for emerging phenomena) where uncertainty is highest. Despite these kinds of parametric solutions, however, there remain ontological issues in deciding what categories are included in the fuzzy set. Given the increasing amount of data available through data mining, uncertainty needs to be handled in a way that is more robust and more interoperable than fuzzy sets with predefined classes. Finally, to naïve users who aren’t familiar with the bits, bytes, and algorithms behind the scenes that drive classification and visualization in a GIS, is fuzzy logic an intuitive way of representing uncertainty?
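For illustration (my own toy example, not from Foody or from Beaubouef and Petry), here is what partial class membership might look like for a single pixel, using invented NDVI break-points:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 outside [a, d], 1 on [b, c],
    linear ramps in between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Hypothetical NDVI break-points for three land-cover classes; a pixel in
# a transitional zone belongs partially to more than one class.
ndvi = 0.35
memberships = {
    "water":      trapezoid(ndvi, -1.01, -1.0, 0.0, 0.1),
    "bare/urban": trapezoid(ndvi,  0.0,   0.1, 0.3, 0.45),
    "vegetation": trapezoid(ndvi,  0.3,   0.45, 1.0, 1.01),
}
print(memberships)  # {'water': 0.0, 'bare/urban': ~0.67, 'vegetation': ~0.33}
```

Note that the break-points themselves are an ontological commitment: the fuzziness is quantified, but the choice of classes is not.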

– Madskiier_JWong

1. Beaubouef, T., Petry, F. (2010) “Fuzzy and Rough Set Approaches for Uncertainty in Spatial Data” http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA530497&Location=U2&doc=GetTRDoc.pdf

Reviewing Geospatial Database Uncertainty with New Technologies

Wednesday, March 7th, 2012

The paper published by Hunter et al. in 1991 categorized the challenges of uncertainty in Geographic Information Science (GIS) research into three types: definition, communication, and management. The authors began with a presentation of error visualization and a discussion of uncertainty in visualization, then pointed out approaches for error management in geospatial databases, as well as future research directions.

The authors also mentioned that it is very helpful to look at system logs when determining uncertainty. That might have been a good solution at the time the paper was published, but system-log analysis has since become a great challenge for pattern recognition in its own right, owing to the logs’ high dimensionality. So the new question becomes: what is the tolerance for uncertainty in system-log analysis?

Uncertainty reduction and absorption were proposed as two solutions for error management, and the authors used several good examples to demonstrate them. But with the new challenges in GIS research (e.g., data intensity), these two solutions need to be adapted accordingly.

In their paper, the authors also argued that the main cause of poor utilization was a lack of confidence in the system, stemming from the fact that users cannot obtain enough information about the quality of the databases or about which errors are unacceptable. This might have been true in the past, but nowadays Bayesian network techniques are applied to this problem, as Laskey et al. present in (Laskey, K.B., Wright, E.J. & da Costa, P.C.G., 2010. Envisioning uncertainty in geospatial information. International Journal of Approximate Reasoning, 51(2), pp. 209–223).
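A full Bayesian network is beyond a blog post, but the core move – revising belief about a cell’s true class from an error-prone observation – fits in a few lines. A minimal sketch with invented numbers:

```python
# Prior belief about a cell's true land cover (hypothetical figures).
prior = {"forest": 0.60, "wetland": 0.30, "urban": 0.10}

# P(classifier reports "forest" | true class) -- an invented confusion column.
likelihood = {"forest": 0.85, "wetland": 0.30, "urban": 0.05}

# Bayes' rule: posterior is proportional to prior times likelihood.
unnorm = {c: prior[c] * likelihood[c] for c in prior}
total = sum(unnorm.values())
posterior = {c: round(p / total, 3) for c, p in unnorm.items()}

print(posterior)  # {'forest': 0.843, 'wetland': 0.149, 'urban': 0.008}
```

The attraction for users is that the output is an explicit statement of residual uncertainty rather than a single, overconfident label.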

–cyberinfrastructure

Science is Politics: Whatever

Wednesday, March 7th, 2012

From @Jackstilgoe
Interesting editorial in Nature called Political Science, which critiques Paul Nurse’s Dimbleby lecture. Bottom line: The practice of science cannot and must not divorce itself from politics.

My critique of the critique, which asserts that “although political (and religious) ideology has no place in deciding scientific questions, the practice of science is inherently political”: rubbish. Whether or not you want to believe ideology has no place in deciding scientific questions, it does (e.g., denying evidence of homosexuality in the animal kingdom). The point is to appreciate that the theory AND the practice of science carry political assumptions. The scientific community is sufficiently robust to explicate those socio-political assumptions and to conduct peer review of the quality of the underlying research.

Canada handling its own e-waste

Wednesday, March 7th, 2012

Canadian recycling of e-waste. I wouldn’t look to Canada as a model, even though we have fairly good governance infrastructure to support it.

Uncertainty and Decision-Making

Wednesday, March 7th, 2012

I really like the article by Hunter and Goodchild (1995) because it thoroughly addresses the different aspects of uncertainty in GIS, and its various diagrams are effective in conveying and summarizing the authors’ main points. I especially appreciate the discussion of the variation in the impact of spatial databases upon decision-making. The authors argue that in order to assess how much uncertainty will affect the decision-making process, one must consider the type of decision to be made. Hunter and Goodchild conclude that uncertainty should be most critically assessed when decisions “carry political, high-risk, controversial or global implications” (58). Although this seems obvious, it is important to remember that the consequences of an inability to handle uncertainty are often confounded by persuasive discourse around these types of decisions. Because such high stakes are usually attached to them, politicians or advocates are tempted to play up any uncertainty in the product in a way that supports their position. Foody (2003) also recognizes this danger and warns that “emotionally colored language is used in relation to issues associated with a large degree of risk” (114). Furthermore, I would be interested to know more about how, and to what degree, uncertainty impacts decisions. Is uncertainty most influential when presented early in the decision-making process or near the end of it? Finally, does the depiction of uncertainty actually lead actors to make better decisions? Under what circumstances does it lead to worse ones?

There are a few things in the article that I still need clarified. For example, Figure 4 puzzles me: why does knowledge about uncertainty decrease when a person transitions from the Learning Phase into the Knowledgeable Phase? Also, with respect to uncertainty absorption, the authors claim data producers are “preparing data quality statements from which users may make their own evaluation of the suitability of data for their purpose”. Is this just metadata about how the data were collected? Does this shift the burden of uncertainty absorption onto the users? The way in which producers and vendors can absorb uncertainty is still vague to me.

 

Ally_Nash

Learning to Read Uncertainty

Tuesday, March 6th, 2012

Foody separates the concept of uncertainty into two types: ambiguity, defined as “the problem of making a choice between two or more alternatives”, and vagueness, which concerns the difficulty of “making sharp or precise distinctions” (114). Personally, I find this distinction confusing without more concrete examples, or without further explanation of how the other “terms” used for uncertainty relate to these two groups. For instance, is reliability an issue related to ambiguity? Since I have always thought of uncertainty in terms of a balance between accuracy and precision, am I right to assume that accuracy falls under the ambiguity type of uncertainty and precision under the vagueness type? It would also have been helpful to explain the “sorites paradox” (114).
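For what it is worth, the accuracy/precision balance I have in mind is easy to show with numbers (a made-up set of repeated measurements of a known value):

```python
import statistics

true_value = 100.0
measurements = [102.1, 101.8, 102.3, 101.9, 102.0]  # hypothetical readings

bias = statistics.mean(measurements) - true_value   # accuracy problem
spread = statistics.stdev(measurements)             # precision problem
print(f"bias: {bias:+.2f}, spread (sd): {spread:.2f}")  # biased yet precise
```

These readings are precise (they agree with each other) but not accurate (they sit about two units high), which is exactly the kind of distinction I wish the article had pinned its terms to.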

The article did, however, spark my interest in how naïve users perceive uncertainty, especially regarding topics addressed by public policy. As the author mentions, uncertainty is inherent in data. But even if uncertainty were conveyed through GIS, how would people interpret it? Does “reading” uncertainty require a steep learning curve? Does the representation used to convey uncertainty affect the faith readers place in the conclusions of a study? The image above shows the different ways in which dots can be used to depict uncertainty (MacEachren et al., 2005). It would be very interesting to see what kinds of biases we have toward different methods of representation. Perhaps our biases will lead us to believe uncertainty is less “significant” if it is shown by different shades of one color rather than by two different colors. Empirical studies from human cognition and geovisualization would be valuable for answering these questions.
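As a quick experiment of my own (not from MacEachren et al.), the two dot schemes I have in mind could be mocked up like this:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x, y = rng.random(200), rng.random(200)
uncertainty = rng.random(200)  # 0 = certain, 1 = very uncertain

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))

# Scheme A: one hue whose shade (transparency) increases with uncertainty.
rgba = np.zeros((len(x), 4))
rgba[:, 2] = 0.6                       # a single blue hue
rgba[:, 3] = 1.0 - 0.85 * uncertainty  # more uncertain -> fainter dot
ax1.scatter(x, y, c=rgba)
ax1.set_title("one colour, varying shade")

# Scheme B: two colours split by a hard certainty threshold.
two_tone = ["firebrick" if u > 0.5 else "navy" for u in uncertainty]
ax2.scatter(x, y, c=two_tone)
ax2.set_title("two colours, thresholded")

plt.show()
```

My hunch is that readers would judge the thresholded map as more “decisive” even though it throws information away.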

Ally_Nash

MacEachren et al. (2005). Visualizing Geospatial Information Uncertainty: What We Know and What We Need to Know. Cartography and Geographic Information Science, 32(3), 139-160.

More to Uncertainty

Tuesday, March 6th, 2012

In my previous post on uncertainty I mentioned my interest in Foody’s point on zoning.  In his article entitled “Uncertainty, knowledge discovery and data mining in GIS”, he discusses how uncertainty is inherent in geographical data and, furthermore, suggests it is compounded by the way we use our data.  He says, “[c]hanges in administrative boundaries, for example, significantly handicap population studies, with incompatible zoning systems used over time”.  I said in my last post that uncertainty appears to be another challenge that may be mediated by making assumptions explicit.  I would like to examine this point further.

In saying that explicitness can mediate this problem, I don’t assume that merely laying out steps can “fix” error and uncertainty; rather, understanding, accepting and, most importantly, working with these issues at each step is incredibly important.  Similarly to the issue of scale, uncertainty is something that (while not “applied”, as scale is) occurs at almost every stage of working with data, from observation to analysis.  This can be problematic because “layers” of uncertainty can compound at various stages, resulting in an even greater amount of uncertainty by the end of the project, as the sketch below suggests.  So I believe that by recognizing uncertainty at each stage, it becomes much easier to work with.  And as Foody notes, GIS can be thought of as “a means of communicating information”, which can extend to a means of communicating error.  I also believe that understanding and communicating uncertainty and error can lead to better-formulated and deeper questions, which may make use of error and uncertainty in the data and factor them into the question itself, much as errors in zoning with regard to scale can be used to illuminate what lies beyond the surface of the data.
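To make the compounding concrete: under the (strong, simplifying) assumption that each stage’s error is independent, relative errors combine in quadrature rather than by simple addition. The stage values below are invented:

```python
import math

# Hypothetical relative errors (fractions) introduced at each stage.
stages = {
    "observation/measurement": 0.05,
    "georeferencing":          0.03,
    "classification":          0.08,
    "zonal aggregation":       0.04,
}

# Independent errors combine as the root-sum-of-squares.
combined = math.sqrt(sum(e ** 2 for e in stages.values()))
print(f"combined relative error: {combined:.1%}")  # ~10.7%
```

Correlated errors would compound faster, which is precisely why being explicit at every step matters.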

sah

You need one post per article this week too

Monday, March 5th, 2012

There’s only one lecture this week but two articles for it. Make sure you have a post per article.