Goldberg et al.’s article “From Text to Geographic Coordinates: The Current State of Geocoding” provides an in-depth view of current geocoding practice, its process, and the errors involved. That said, I wish the authors had discussed more of the errors that arise when database managers “correct” system errors in coding. In essence, when an error is “corrected” by a database manager, future geocoding tries to reconcile the change and often produces more error, as the system attempts to place other data into the “corrected” group or to arrange data so that it makes sense next to the “correction”.
I have experienced this problem first-hand when trying to geocode new points within an existing geocoding database. A previous manager had “corrected” various geocoding errors by manually entering data points under a new category, which conflicted with several existing categories in the database. As a result, when I entered my points, they were all coded into the new category and placed in the wrong areas, because the manual fixes superseded the software’s own placement logic whenever a point did not exactly match a known location. If I had not gone back to cross-reference the points, I would never have found the geocoding discrepancies and “corrected” the points myself (although this may cause future errors that are not yet apparent).
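To make this failure mode concrete, here is a minimal Python sketch of how a manual-override category, consulted before the software’s own matching logic, can silently absorb any point that does not exactly match a known location. All of the names here (REFERENCE_INDEX, MANUAL_OVERRIDES, geocode) are hypothetical, a simplified stand-in for the database I worked with rather than its actual schema.

```python
# Minimal sketch of how manual "corrections" can shadow automated geocoding.
# All names and data here are hypothetical illustrations.

REFERENCE_INDEX = {
    "123 MAIN ST": (45.501, -73.567),   # known, software-verified locations
    "456 OAK AVE": (45.512, -73.554),
}

# A previous manager's manual fixes, stored as a catch-all category that is
# consulted whenever exact matching fails.
MANUAL_OVERRIDES = {
    "category": "MANUAL_FIX",
    "default_coords": (45.000, -73.000),  # the mislocated "corrected" group
}

def geocode(address: str) -> tuple[str, tuple[float, float]]:
    """Return (category, coordinates) for an address.

    Because the override branch fires whenever the address is not an exact
    match to a known location, every new or slightly mismatched point falls
    into the manual category and inherits its (wrong) placement.
    """
    if address in REFERENCE_INDEX:
        return ("VERIFIED", REFERENCE_INDEX[address])
    # Fallback: the manual "correction" supersedes normal interpolation,
    # so unmatched points are silently absorbed into the wrong group.
    return (MANUAL_OVERRIDES["category"], MANUAL_OVERRIDES["default_coords"])

# A new or slightly misspelled address never reaches the software's own
# placement logic -- it is captured by the override instead.
print(geocode("123 MAIN STREET"))  # ('MANUAL_FIX', (45.0, -73.0))
```

The point of the sketch is that nothing in the lookup ever flags the fallback as suspect, which is why only a manual cross-reference revealed the discrepancies.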
In the article, the mention of E911 helping with address accuracy is an important step toward reducing error, but I believe it may become irrelevant, since GPS-equipped sensors are becoming standard on most utility entry points at every address. For example, Hydro-Quebec is installing digital meters with GPS, pinging capability, and wireless communication. These devices are then geocoded into a database and could therefore provide accurate, accessible location referencing that is self-correcting, reducing error for every address. As such, is the proliferation of intelligent technology reducing error, or adding to the complexity of geocoding?
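As a thought experiment, the following Python sketch shows one way such “self-correcting” referencing could work, assuming each meter reports periodic GPS fixes that are averaged to update the stored coordinate. The ping format and update rule are my assumptions for illustration, not Hydro-Quebec’s actual system.

```python
# Hypothetical sketch of "self-correcting" address referencing from
# GPS-equipped meters; the data and update rule are assumptions.
from statistics import mean

address_reference = {"789 PINE RD": (45.520, -73.560)}  # stored geocode

def self_correct(address: str, pings: list[tuple[float, float]]) -> None:
    """Replace the stored coordinate with the mean of recent GPS pings,
    damping per-fix noise while correcting systematic placement error."""
    lat = mean(p[0] for p in pings)
    lon = mean(p[1] for p in pings)
    address_reference[address] = (round(lat, 6), round(lon, 6))

# Meter pings cluster around the true location, not the stored one.
self_correct("789 PINE RD", [(45.5213, -73.5598),
                             (45.5214, -73.5601),
                             (45.5212, -73.5599)])
print(address_reference["789 PINE RD"])  # (45.5213, -73.559933)
```

Whether aggregating millions of such live references simplifies geocoding or merely shifts the error into a harder reconciliation problem is exactly the open question above.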