Throughout my reading of Elwood’s article, marginalized communities came to mind, mostly because of a certain rigidity in her review of emerging geoviz technologies. I found the comparison between ‘public’ and ‘expert’ technologies particularly interesting: the status quo of GIS, which comprises the ‘expert’ realm (standardized data), is threatened by the ‘public’ realm (wikis, geo-tagging, Web 2.0, VGI). I agree with Andrew “GIS” Funa’s point on standardization. Why do we feel an inherent need to standardize all of our data? And what happens when standardization cannot be applied? More specifically, how relevant is an expert technology to marginalized communities if no one is willing to apply that technology?
Elwood mentions the ‘excitement’ and high hopes that authors have for new geoviz technologies to represent urban environments; however, the article does not expand on this any further. It does note the term ‘naive geography’ and its “qualitative forms of spatial reasoning” (259). If one can safely assume that representing marginalized populations is a qualitative problem, then ‘expert’ technologies tend not to focus on these issues. According to Elwood, qualitative problems are more difficult than quantitative problems, “where exact measurements or consistent mathematical techniques are more easily handled” (259). So what do we do about unstructured, shifting, context-dependent human thought? Should we avoid digitally representing these data because they may be too difficult to decipher? To draw linkages from and discover patterns in? Will qualitative data always be at a disadvantage because they will not fit an exact algorithm? I think we should take the spark of hope that MacEachren and Kraak gave us and strive beyond some of the limitations outlined by Elwood.
-henry miller
Tags: elwood, Geovisualization, marginalization, tool, VGI, Web 2.0