Thoughts on “Fisher et al. – Approaches to Uncertainty in Spatial Data”

This article by Fisher et al. clearly lays out the components and concepts that make up spatial data uncertainty and explains solutions that have been proposed to counteract its potential consequences for data analysis and interpretation. A better understanding of what uncertainty really is helped me realize that the overwhelming majority of geographical concepts are poorly defined objects, being either vague or ambiguous.
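To make the idea of vagueness concrete, here is a minimal sketch of my own (not code from the article) contrasting a crisp classification of a vague concept such as “mountain” with a fuzzy-style membership function, one commonly proposed response to vagueness. The elevation thresholds are hypothetical and chosen purely for illustration.

```python
# Illustration only: the 300 m and 600 m anchors are hypothetical, not from the article.

def crisp_is_mountain(elevation_m: float) -> bool:
    """Crisp view: a single cut-off decides membership, hiding the vagueness."""
    return elevation_m >= 600

def fuzzy_mountain_membership(elevation_m: float) -> float:
    """Fuzzy view: membership grades smoothly from 0 (clearly not) to 1 (clearly is)."""
    if elevation_m <= 300:
        return 0.0
    if elevation_m >= 600:
        return 1.0
    return (elevation_m - 300) / (600 - 300)  # linear transition between the anchors

if __name__ == "__main__":
    for elev in (250, 400, 550, 700):
        print(f"{elev} m -> crisp: {crisp_is_mountain(elev)}, "
              f"fuzzy membership: {fuzzy_mountain_membership(elev):.2f}")
```

The point of the contrast is that the crisp version forces an arbitrary boundary onto a concept that has none, while the fuzzy version records the degree to which a location fits the concept.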

One solution for reducing the effects of discord, a form of ambiguity, although perhaps unrealistic to implement, would be very practical: create a global lexicon that stipulates how certain statistics must be calculated and defines concepts on a global scale. This would allow for easier comparisons between regions that currently use different approaches and would standardize the process. However, it is important to note that this could not be applied to every statistical measurement, definition, or observation, given that it could introduce biases against certain regions. For example, a road may be conceptualized differently in one part of the world than in another.

On the topic of data quality, the advent of geolocational technologies has propelled geospatial data to the forefront for organizations and businesses aiming to profit from its use. Without being too cynical, wouldn’t private organizations have an incentive to manipulate data quality to the detriment of others in order to benefit themselves? This is where Volunteered Geographic Information (VGI), such as OpenStreetMap, comes into play, levelling the playing field against commercial counterparts such as Google Maps.
