Goldberg et al. provide an overview of geocoding and touch on several aspects that relate to GISci. For instance, the authors elaborate on how the evolution of the reference dataset has enabled more complex and resourceful queries to geocode postal codes, landmarks, buildings, and so on. Not only have reference datasets become more complex, but increasing computational power has also paved the way for intricate matching algorithms and interpolation processes.

Perhaps the next shift in geocoding will be the growing integration of context-aware technologies. Now that people are increasingly contributing geospatial content, the costs in money and time are significantly reduced. The authors suggest that once GPS technology is accurate (and affordable) enough to be built into all mobile phones, postal codes may become obsolete. But instead of rendering postal addresses obsolete, are we at a moment where LBS and AR can add to the accuracy of reference files, especially as volunteered geographic information is on the rise?

I'm curious to see how the new ways in which data is being provided and created (in large amounts, by many people) intersect with geocoding today. Having seen the evolution in data models and algorithm complexity that underpins geocoding, I can imagine volunteered geographic information from a large number of people becoming a new way to geocode, though one that ultimately brings its own set of complexities around data accuracy and reliability.
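To make the "interpolation processes" mentioned above concrete, here is a minimal sketch of linear address-range interpolation, the classic technique for placing a house number along a street segment in a reference file. The function name, the example street range, and the coordinates are my own illustrative assumptions, not taken from the article.

```python
def interpolate_address(house_number, range_start, range_end,
                        seg_start, seg_end):
    """Estimate coordinates for a house number by linear interpolation.

    range_start/range_end: the address range stored for the street segment.
    seg_start/seg_end: (x, y) coordinates of the segment's endpoints.
    """
    # Fraction of the way along the address range (0.0 at start, 1.0 at end).
    frac = (house_number - range_start) / (range_end - range_start)
    # Apply that fraction along the segment geometry, axis by axis.
    return tuple(s + frac * (e - s) for s, e in zip(seg_start, seg_end))


# Hypothetical segment: addresses 100-200 run from (0, 0) to (10, 10).
print(interpolate_address(150, 100, 200, (0, 0), (10, 10)))  # (5.0, 5.0)
```

Even this toy version hints at the reliability issues raised above: if the reference file's address range or segment geometry is wrong (as crowdsourced data sometimes is), every interpolated point inherits that error.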

