Archive for March, 2013

geocode all the things

Friday, March 22nd, 2013

Goldberg, Wilson, and Knoblock (2007) note how geocoding match rates are much higher in urban areas than rural ones. The authors describe two routes for alleviating this problem: geocoding to a less precise level, or including additional detail from other sources. However, both routes result in a “cartographically confounded” dataset, where the degree of accuracy varies as a function of location. Matching this idea — that urban areas, and areas previously geocoded with additional information, are more accurate than previously un-geocoded rural areas — with the idea that geocoding improves as technology advances and spreads, we could say that eventually we’ll be able to geocode everything on Earth with good accuracy. I think of it like digital exploration — there will come a time when everything has been geocoded! Nothing left to geocode! (“Oh, you’re in geography? But the world’s been mapped already”).

More interesting to think about, and what AMac has already touched on, are the cultural differences in wayfinding and address structures. How can we geocode “the yellow building past the big tree”? How can we geocode description-laden indigenous landscapes with layers of history? And geocoding historical landscapes: how do we quantify the different levels of error involved when we can’t even quantify positional accuracy? These nuanced definitions of the very entities being geocoded pose a whole different array of problems to be addressed in the future.

-sidewalkballet

To Digitize or Not to Digitize?

Thursday, March 21st, 2013

No process in GIS is perfect. There are always limitations, many of which can be ignored, while others must at least be acknowledged. The application of the results of an analysis can have a drastic impact on whether errors are ignored, acknowledged, or painstakingly resolved. Consider the difference between geocoding addresses at the national level to analyze socioeconomic trends and geocoding them for emergency response. In the first case, the power of numbers will outweigh the error generated during processing, and it is enough to acknowledge the limitation. In the second, the Enhanced 911 system requires that all addresses be geocoded as precisely as possible for times of emergency.

As a way of increasing the accuracy of geocoding, would it be sufficient to input a number of intermediary points to accommodate the uneven distribution of addresses within a given address segment? It would essentially act as a middle ground between leaving the results entirely up to the geocoder and digitizing all addresses manually. After all, why do the corners of blocks have to act as the only reference points? It’s possible that there is an inherent topology that would be lost if this were implemented, but I cannot speak to that.
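
To make the idea concrete, here is a minimal sketch (the house numbers and the normalized 0-to-1 segment positions are all hypothetical) of how a single surveyed intermediary point would shift an interpolated address:

```python
# Sketch: address interpolation along a street segment, with and without
# an intermediary reference point. Values are invented for illustration.
import numpy as np

def interpolate_address(number, ref_numbers, ref_positions):
    """Estimate a position along the segment (0 = one corner, 1 = the other)
    by piecewise-linear interpolation between known reference addresses."""
    return float(np.interp(number, ref_numbers, ref_positions))

# Corners only: addresses assumed evenly spread over the block.
print(interpolate_address(150, [100, 198], [0.0, 1.0]))            # ~0.51

# One intermediary point (number 120 surveyed at 60% of the block)
# captures an uneven address distribution and moves the estimate.
print(interpolate_address(150, [100, 120, 198], [0.0, 0.6, 1.0]))  # ~0.75
```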

While reading Goldberg et al., one geocoding nightmare kept running through my head, and it surprised me that it was not touched on directly: how has Japan addressed the situation? As an OECD country, it likely possesses sufficient GIS infrastructure. If I’m not mistaken, though, house addresses there are based not so much on location as on time. Within a city block, addresses are assigned temporally, whereby an older structure has a lower number than a newer one; even if they are immediately adjacent, two structures can have significantly different addresses. Just a thought.

AMac

Statutory warning: Geocoding may be prone to errors

Thursday, March 21st, 2013

The last few years have seen tremendous growth in the usage of spatial data. Innumerable applications have contributed to the gathering of spatial information from the public. Applications people use every day, like Facebook and Flickr, have introduced features with which one can report one’s location. However, people are generally not interested in raw latitude-longitude coordinates; names of places make more sense in day-to-day life. Hence, these applications report not the spatial coordinates but the named location (at varying scales) where the person is. The tremendous amount of location information generated has not gone unnoticed, and several studies have been conducted to leverage it. But one issue frequently overlooked in studies that use these locations is the accuracy of the geocoding service that produced them. Not only is displacement a problem, but the scale at which a location was geocoded will also have an effect on the study. The comparison of the accuracy of available geocoding services by Roongpiboonsopit et al. serves as a warning to anyone using geocoded results.

-Dipto Sarkar


Is There a Problem with “Authoritative” Geocoding?

Thursday, March 21st, 2013

Roongpiboonsopit and Karimi provide a very interesting study on the quality of five relatively well-known geocoding services. Google Maps is something I use very often, yet I never really thought critically about the possible errors that may exist and their consequences. A study such as this allows us to understand the underlying parameters that go into these geocoding services and how they may differ from provider to provider. One aspect that was really interesting to me was the difference in positional accuracy across land uses. Obviously, there tends to be an “urban bias” of sorts when geocoding addresses; as a result, one is more likely to get an incorrect result when searching for an address in rural or suburban areas. While this makes sense due to spatial issues, I thought it could theoretically extend to other criteria. As LBS become more popular and geocoding increases in importance, will certain companies offer “better” geocoding services to businesses that are willing to pay for it? For example, Starbucks could make a deal with Google to ensure that all of their locations are correctly and precisely geocoded. Taking it to the extreme, Google could even make a deal to deliberately sabotage the addresses of other coffee shops. While I think this specific case may be unlikely, it does raise issues about having completely authoritative geocoding services. As we increasingly rely on these services, the companies offering them have a large influence on the people who use them.

This leads into the idea of possibly relying on participatory services, such as OpenStreetMap. OSM has made leaps and bounds in terms of quantity and quality of spatial data over the past few years, and I am curious how it would match up with the five services in this paper. OSM relies on the ability of users to edit data if they feel it is incorrect; the service is therefore theoretically being updated continually, depending on the number of users editing a certain area. As a result, errors may be less likely to be consistently missed than with a more authoritative geocoding service. It would also be interesting to see which types of buildings are geocoded more or less accurately. As we continue to enter this age of open and crowdsourced spatial data, I believe it has the potential to provide us with even better services.

-Geogman15


Unloading on Geocoding

Thursday, March 21st, 2013

Geocoding, like many of the concepts that we study in GIScience, is very dependent on the purpose of the process. The act of geocoding is often conflated with address matching; that is sometimes correct, but geocoding can also mean georeferencing any geographic object, not just postal addresses. This implies that the perception of geocoding will affect the ways that we go about doing it.


There are many ways to geocode, as described by Goldberg, Wilson, and Knoblock, and no single one of them is universally correct. Each method uses different algorithms to try to match some identifier to a geographic reference. For example, a method might find the length and endpoints of a street segment and then use linear interpolation to locate a given address along it. The geographical context also bears great importance in determining which algorithm to use: the method described above may work well in a city with short, rectangular blocks, but may be less applicable in rural China. These are some of the things one has to consider when choosing a method of geocoding.
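
For concreteness, a minimal sketch of that endpoint-interpolation idea, with an invented address range and corner coordinates (nothing here comes from the Goldberg et al. paper itself):

```python
# Sketch: place house number 742 along a block whose address range and
# corner coordinates are hypothetical, assuming addresses are evenly spaced.

def geocode_by_interpolation(number, range_lo, range_hi, start, end):
    frac = (number - range_lo) / (range_hi - range_lo)  # fraction along block
    return (start[0] + frac * (end[0] - start[0]),
            start[1] + frac * (end[1] - start[1]))

# A block numbered 700-798 between two made-up street corners:
print(geocode_by_interpolation(742, 700, 798,
                               (-73.5790, 45.5040), (-73.5760, 45.5065)))
```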


The future of geocoding is perhaps less certain than that of many other areas of GIScience, because as technology and georeferencing become more ingrained in our society, the algorithms used to match objects with a geographic location will become less important. Things like GPS are becoming more and more commonplace in many appliances, though this brings up questions of privacy. Ultimately, the future of geocoding will be a balancing act of tradeoffs between public acceptance of technology and the development of more powerful, purpose-driven algorithms.


Pointy McPolygon


Initiate the Geocoding Sequence

Thursday, March 21st, 2013

Geocoding is the fascinating process of associating an address or place name with geographic coordinates. Traditionally, geocoding was solely the realm of specialists, requiring a specific set of skills and equipment. With the advent of modern technology, including Web 2.0 applications, however, geocoding is now easier than ever for the everyday user. Despite the multitude of geocoding services, such as Google and MapQuest, each service uses different algorithms, databases, etc. to code its locations. Therefore, users might not be aware of which services offer the best-quality results and which may offer inaccurate ones. The quality of geocoding results may in turn affect subsequent decisions, modeling, analysis, etc.

Overall, one of the biggest problems facing geocoding is the accuracy of the results. In particular, one problem mentioned by the authors was the poor accuracy of addresses located in rural and agricultural areas. On the other hand, most urban locations tended to be geocoded similarly across platforms. In addition, it was interesting to note that several platforms consistently offered more accurate results: Yahoo!, MapPoint, and Google. It would be fascinating to investigate what types of geocoding algorithms, databases, etc. these services use, and whether they are similar or relatively different.

Another fascinating thing to consider is the future direction of geocoding. One possibility could be the standardization of geocoding databases, algorithms, etc. On the other hand, this may lead to redundancy among geocoding services, which might make it an unrealistic outcome. Overall, the future of geocoding as a useful tool depends heavily on how useful and accurate its results can be.

-Victor Manuel

NeoGeocoding?

Thursday, March 21st, 2013

Roongpiboonsopit and Karimi’s article highlights the fact that ubiquitous mapping, and the new practices the technology facilitates, allow everybody to geocode data without really knowing what is happening ‘behind’ the geocoder tools (Geocoder.us, MapQuest, Google, MapPoint, and Yahoo!).

Coding is defined in the article as “applying a rule for converting a piece of information into another”. But who controls the rules? Users have little control over the process, since they don’t interact with the geocoding algorithms or the reference databases.

The authors’ analysis shows the importance of questioning the tool used to produce data, because the errors and uncertainties in the data produced have an impact on further analysis and decision making. These points relate to the neogeography literature and the critiques of GIS discussed in class, more specifically the debates over the ethics and practices of ‘open participation’ as a democratization, or an exploitation, of user-generated data. In VGI, is geographic information being produced by ‘citizen sensors’ or by ‘cognizant individuals’, as mentioned by Andrew Turner (in Wilson and Graham, 2013)? The two terms underline the question of how aware citizens are when they produce data and geocode information. This leads to the question of accuracy, and how much accuracy we need. It is probably not always important to achieve perfect accuracy. However, I think it is crucial to be aware of the lack of accuracy, and to make the uncertainties explicit instead of leaving them ‘behind’ the tools.

Furthermore, the uneven results generated by geocoding processes depending on location can also be linked to the debates discussed earlier in class about the digital divide in GIScience. Data are more accurate in urban and suburban areas than in rural areas due to the quality of the reference databases in urban areas. Again, I think there is a need to make these differences more explicit to users and/or producers in order to bridge the gap between expert and amateur production of knowledge.

Wilson, M. W. and M. Graham (2013). Neogeography and volunteered geographic information: a conversation with Michael Goodchild and Andrew Turner. Environment and Planning A, 45(1), 10-18.

S_Ram

Geocoding

Thursday, March 21st, 2013

Goldberg et al. provide an overview of geocoding and touch on several aspects that relate to GIScience. For instance, the authors elaborate on how the evolution of the reference dataset has allowed for more complex and resourceful queries to geocode postal codes, landmarks, buildings, and so on. Not only have reference datasets become more complex; increasing computational power has also paved the way for intricate matching algorithms and interpolation processes. Perhaps the next shift within the geocoding realm will be the increasing integration of context-aware technologies. Now that people are progressively contributing geospatial content, issues of cost and time are significantly reduced. The authors suggest that once GPS technology is accurate (and affordable) enough to be equipped in all mobile phones, postal addresses may become obsolete. But instead of rendering postal addresses obsolete, are we at the moment where LBS and AR can add to the accuracy of reference files, especially if volunteered geographic information is on the rise? I’m curious to see how the new ways in which data are being provided and created (in large amounts, by many people) intersect with geocoding today. Now that we’ve seen an evolution in the data models and algorithmic complexity that contribute to geocoding, I can imagine that volunteered geographic information from a large number of people could become a new method of geocoding, while ultimately bringing with it a new set of data accuracy and reliability issues.

-tranv

Putting Geography on the Map

Wednesday, March 20th, 2013

Roongpiboonsopit and Karimi’s 2010 comparison study of five free online geocoders is an example of an important process in weeding out poor geocoders and creating one that works accurately and reliably. Considering geocoding is at the heart of many GIS applications, especially those involving the Geoweb, spatial accuracy is key. The authors used empirical analysis to determine the “best” geocoders based on accuracy and a number of other metrics. They concluded that Google, MapPoint (Microsoft) and Yahoo! are all quite effective, while Geocoder.us and MapQuest are less so.

In thinking about the development side of the geocoding process, I realized that geocoding development is much like development in other topics we’ve covered, such as LBS and AR. As all of these progress and become more accurate and lifelike, they approach a level of artificial intelligence that is simultaneously creepy and cool. For instance, if a geocoder like Google’s used all of the information it already has on us, there would not be any need for formal geographic indicators like street addresses, coordinates, or even official place names. If the reference database were to include our own vernacular and preferences along with the official names and spatial reference algorithms, then simply typing or saying “home” would pinpoint our location on a map quickly and easily. This isn’t even about spatial accuracy anymore, but rather “mental” accuracy. Perhaps I’m getting ahead of myself, but I see the possibilities with geocoding not just in terms of developing a usable application for plotting points on a map, but also in terms of expanding how we think of space (ontologies) and how we conceptualize our environment (cognition). Integrating new tools into pre-existing algorithms has changed, and will continue to change, how we live our lives.
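
As a toy sketch of what that might look like (the alias table and the geocode() fallback stub are entirely hypothetical):

```python
# Sketch: resolve a user's vernacular place names before falling back to a
# conventional geocoding service. Coordinates and aliases are made up.

personal_gazetteer = {
    "home": (45.5048, -73.5772),
    "the office": (45.5017, -73.5673),
}

def geocode(query):
    # Stand-in for a call to a real geocoding service.
    raise NotImplementedError

def resolve(query, aliases=personal_gazetteer):
    key = query.strip().lower()
    if key in aliases:           # "mental" accuracy: personal vocabulary wins
        return aliases[key]
    return geocode(query)        # otherwise, ordinary spatial matching

print(resolve("Home"))  # (45.5048, -73.5772)
```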

– JMonterey

Geocoding Errors

Tuesday, March 19th, 2013

Goldberg et al.’s article “From Text to Geographic Coordinates: The Current State of Geocoding” offers an in-depth view of recent geocoding, its processes, and the errors involved. That said, I wish more had been said about the errors in geocoding that result from database managers “correcting” system errors in coding. In essence, when an error is “corrected” by a database manager, future geocoding tries to reconcile the changes, and this often leads to more error as the system tries to place other data into the “corrected” group or arrange the data to make sense next to the “correction”.

I have experienced this problem first hand when trying to geocode new points within an existing geocoding database. What happened is that a previous manager had “corrected” various geocoding errors by manually entering data points as a new category, which conflicted with several other existing categories in the database. Therefore, when I entered my points they were all coded as the new category and located in the wrong areas, since the manual fixes superseded the software’s placement logic whenever a point did not match a known location. If I had not gone back to cross-reference the points, I would never have found the discrepancies and “corrected” the points (although this may cause future errors, which are not yet apparent).

In the article, the mention of E911 helping with address accuracy is an important step toward reducing error, but I believe it may become irrelevant, since GPS sensors are becoming standard on utility entry points at every address. For example, Hydro-Quebec is installing digital meters with GPS, pinging capability, and wireless communications. These devices are then geocoded to a database, and could therefore provide accurate, accessible, self-correcting location referencing for every address. As such, is the proliferation of intelligent technology reducing error, or adding to the complexity of geocoding?
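
A rough sketch of what “self-correcting” could mean in practice; the incremental-mean update below is my own guess at a mechanism, not anything Hydro-Quebec has described:

```python
# Sketch: each GPS ping from a metered address nudges the stored reference
# coordinate toward the running mean of all pings, averaging out GPS noise.

reference_db = {}  # address -> (mean_lat, mean_lon, ping_count)

def ingest_ping(address, lat, lon):
    mlat, mlon, n = reference_db.get(address, (lat, lon, 0))
    n += 1
    mlat += (lat - mlat) / n   # incremental mean update
    mlon += (lon - mlon) / n
    reference_db[address] = (mlat, mlon, n)

for lat, lon in [(45.50101, -73.57701), (45.50099, -73.57699)]:
    ingest_ping("123 Rue Exemple", lat, lon)
print(reference_db["123 Rue Exemple"])  # converges on (45.50100, -73.57700)
```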

Geocoding and public health

Monday, March 18th, 2013

I think geocoding is one of the most central issues in GIS (science and systems) and yet is probably one of the least well understood by lay people and non-experts who use spatial data (along with map projections). As the authors mention, there has been research in public health and epidemiology on the accuracy of street addresses. In fact, a PhD student in my lab led a research project on this very issue here in Montreal (Zinszer et al. 2010, cited below). The team found that address errors were present in about 10% of public health records, the same ones that were used to perform spatial analysis looking for space-time clustering of campylobacteriosis in Montreal. Geocoding has all kinds of repercussions in public health research; while errors are an issue, anyone who performs epidemiological research with administrative databases is prepared for some amount of error. However, when the error becomes differential with respect to some factor of interest, this can result in a huge problem (bias). For example, as mentioned in the discussion, the accuracy of geocoding was differential between urban, suburban, and rural areas. There is a lot of spatial epi research done with the urban-rural health divide in mind, and differential accuracy in geocoded addresses like this could pose a huge problem. I think papers like this one by Roongpiboonsopit and Karimi are very useful for people outside of GIS because they help us understand the scope of the issue. I also think Roongpiboonsopit is a super awesome name.

Zinszer K, Jauvin C, Verma A, Bedard L, Allard R, Schwartzman K, de Montigny L, Charland K, Buckeridge DL (2010). Residential address errors in public health surveillance data: A description and analysis of the impact on geocoding. Spatial and Spatio-temporal Epidemiology. 1(2-3): 163-168.


-Kathryn

Business and LBS

Friday, March 15th, 2013

Before I read this paper, I was not very familiar with the concept of location based services. There are of course lots of commercial uses of geographic information, but this direct discussion of spatial information translating into business is very interesting and not something I had thought about. The subject seems like another interesting area that intersects with topics we have already discussed, such as augmented reality, where location (obtained via, say, GPS) and various software and hardware can be used to augment individuals’ surroundings in ways that could directly affect sales and be used for marketing. As mentioned in the paper, privacy is another theme that has surfaced often in class and is again an issue here, but there is also a lot of potential for creative uses of spatial information in ways that individuals would appreciate. Looking forward to seeing how others perceive the utility of LBS.

-Kathryn

Time geography

Friday, March 15th, 2013

Even though this article is not new, it seems like the field has not advanced tremendously since it was written. Being able to efficiently and effectively store, retrieve, and particularly visualize the temporal aspects of data in GIS continues to be a major limitation of spatial analysis. In spatial statistics, models that account for space-time variation rather than static spatial information are usually just logical extensions of the same spatial models. But spatial stats doesn’t have to deal with the same data model issues that GIScience must address, nor does it have to attempt the visualization that GISystems are expected to handle. I am interested in seeing what the current state of the art is, in case there have been major advances that I somehow missed since my master’s.

-Kathryn

LBS and corporations’ interests

Friday, March 15th, 2013

I feel like LBS development is focused on marketing purposes, and privacy issues might be underestimated in this domain: profit is a major interest, prioritized over individuals’ privacy concerns. Rao and Minakakis’s article shows that the way LBS can be made profitable, and worth the developments in this field, is by gathering information, conducting data mining, and pursuing ‘business intelligence activities’ to extensively understand consumers’ profiles and shopping habits. The authors recognize that there are ethical and privacy issues related to the use and disclosure of information to third parties, but they do not engage with the topic. I’m wondering if developing business models on one side and leaving the privacy issues to privacy advocates on the other is a good approach to the development of these technologies and LBS…

S_Ram

Challenges and temporal GIS

Friday, March 15th, 2013

The article by Langran et al. is dated, but it represents an important development in the field of temporal GIS.

The space-time composite model breaks from the early snapshot models, which were ineffective at representing spatio-temporal complexity. The model proposed by Langran et al. takes into consideration both database time and world time. Nonetheless, some important challenges in spatio-temporal GIS are not fully addressed by this model. For example, it is based on a linear conception of time as progression, and doesn’t allow for a cyclic perspective of time as continuity; representing changes that occur as part of a cyclical evolution would thus not be supported.

Also, spatio-temporal relationships between entities need to be described in order to allow complex queries, for example: what happened to a parcel over time?

The space-time composite model might not be appropriate to represent moving objects and queries related to travel distances, speed, etc.

Furthermore, the spatio-temporal composite data model might not deal properly with tracking and recording continuous change. For example, when a polygon is split, new entities are created, each with its own distinct attribute history, and the new pieces are identified under different names. Difficulties might arise in keeping the link between the old polygon and the new ones.
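
A small sketch of that bookkeeping problem (the class and field names are mine, not Langran’s notation): the lineage link back to the parent has to be stored explicitly, or it is simply lost.

```python
# Sketch: a parcel in a space-time composite, with an attribute history and
# an explicit lineage pointer that survives a split.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Parcel:
    pid: str
    history: list = field(default_factory=list)   # (world_time, attribute)
    parent: Optional[str] = None                  # lineage link, or None

def split(parcel, t, new_ids):
    """Split at world time t; each piece inherits the full history."""
    return [Parcel(new_id, parcel.history + [(t, "split")], parcel.pid)
            for new_id in new_ids]

p = Parcel("lot-7", [(1990, "fallow"), (1995, "cultivated")])
for piece in split(p, 2000, ["lot-7a", "lot-7b"]):
    print(piece.pid, "parent:", piece.parent, piece.history)
```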

S_Ram

Location Based Surveillance

Friday, March 15th, 2013

Steinfield’s ca. 2003 paper reviews the then-current state and future prospects of Location Based Services: electronic applications which make use of the user’s location to communicate relevant information or take contextual actions. The development that got Steinfield discussing LBS was the increasing ubiquity of mobile phones at the turn of the millennium, which seemed to offer myriad possibilities for keeping people connected while away from their desks, just as regulations in western countries were demanding increased locational accuracy for pinpointing the source of emergency calls. Unfortunately, at the time Steinfield was writing, LBS had not been rolled out very successfully, and mobile service providers were grappling with which of many user-locating methods to implement. Today, this question is somewhat settled, since most new smartphones ship with built-in GPS functionality, and there are now fairly reliable ways (though I’m not sure of the specifics) for cellular service providers to triangulate users’ positions using cell towers. For LBS purposes, quickly retrieving user locations has moved from dream to reality in the last 10 years.
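
Whatever the carriers actually do, the geometric core of that triangulation is easy to sketch: given known tower positions and estimated ranges, solve a small least-squares system for the handset’s position (all numbers below are invented).

```python
# Sketch: trilateration from range estimates to three cell towers, via the
# standard linearization (subtract one range equation from the others).
import numpy as np

def trilaterate(towers, dists):
    (x1, y1), d1 = towers[0], dists[0]
    A, b = [], []
    for (xi, yi), di in zip(towers[1:], dists[1:]):
        A.append([2 * (xi - x1), 2 * (yi - y1)])
        b.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol

towers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]   # made-up tower layout
dists = [5.0, 8.06, 6.71]                          # noisy range estimates
print(trilaterate(towers, dists))                  # close to (3.0, 4.0)
```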

All this locational information floating around has become a fertile source for big data applications like transportation planning and monitoring, and has thus opened up many new avenues for geospatial research. At the same time, the new ubiquity of LBS is challenging the privacy of users. Steinfield’s analysis assumed that the primary gatekeeper for LBS applications would continue to be the mobile service provider, but with GPS now a standard feature of cell phones, anybody whose application has been installed on your device theoretically has access to your location. Google even knows where I am when my phone’s GPS is turned off: they’ve mapped the wifi networks I connect to! A lot of this user information is collected and used in ways that are not fully transparent to the user, for purposes ranging from targeted advertising and selling your information (it’s mostly this, actually) to state surveillance. While Steinfield may have had user privacy at the forefront of his evaluation of location-finding methods, this priority has fallen by the wayside in the pursuit of the next killer app. At our peril?

-FischbobGeo

What will it take for LBS to take the next step?

Friday, March 15th, 2013

Location Based Services can provide many benefits to consumers. Generally, they involve answering two questions: how do I get to where I want to go? And once there, what kind of personalized services can be offered? One of the major issues is identifying precisely where someone is, in order to offer services specific to their location. The importance of location is emphasized by the common business mantra ‘location, location, location’.


Rao and Minakakis, in their paper on LBS, identify four major issues in the implementation of this technology. While I agree with their assessment, I feel the biggest issue is that access is not equally distributed among all population groups. While many phones have 3G now, not everyone owns such a phone. Also, different phones may offer different positional precision, enabling different levels of service. Of course, another issue is privacy: if I want to purchase a high-precision phone capable of offering me more services, should this necessarily mean that I have to surrender my location? This comes back to the question of whether you are alright with a machine knowing information about you.


Business models will also become crucial in the large-scale application of LBS, an industry that is sure to see increasingly large revenues. Of course, the models will have to be sustainable and take into account the caveats listed above. LBS has the potential to combine with many other fields of GIScience to provide an augmented life experience in the future, but only if implemented correctly.


Pointy McPolygon


There Should be an App for That

Thursday, March 14th, 2013

First of all, expectations are always going to fall either short or long of reality; rarely, if ever, does anyone get it spot on. Consider the predictions published in 1899 of what the year 2000 would look like (http://gizmodo.com/5939765/what-people-in-1899-thought-the-year-2000-would-look-like). Aside from the fact that everyone is wearing shoes, and heavier-than-air human flight has been developed (in a way), they were dead wrong. The same can be said of Steinfield’s opening statement that “location-based services has fallen somewhat short of expectations.” LBS have come a long way since their infancy, and are continuing to grow. Chances are, development will slow or cease because we run out of time, not because the perfect device has been created.

Location based services and GIS do not share an evenly balanced relationship: one side takes, while the other side makes. In this case, GIS is responsible for “offer[ing] a range of mapping services and geographically oriented content.” Location based services then take that content and distribute it accordingly. That does not mean that GIS will eventually deplete its supply of data, but location based services will become increasingly dependent on higher-quality, more diverse, and more frequently updated data. If a location based service asks the user for information, a GIS is told what the user is interested in, regardless of where the analysis is being performed. Furthermore, GIS users have far more control over the spatial data than location based service users do. That is, until GIS software is embedded with location based service capabilities, allowing it to track the location of its users. Here’s an idea: in the event that GIS platforms become sufficiently portable that the software can be taken mobile, a location based service could suggest shapefiles for analysis given previous use habits and the current location of the user, allowing them to validate their results in real time. There should be an app for that.
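
A hedged sketch of how such an app might rank candidate layers; the catalog, the bounding boxes, and the distance heuristic are all invented:

```python
# Sketch: suggest datasets whose bounding boxes are nearest the user's
# current position (distance 0 means the user is inside the layer's extent).

def bbox_distance(pt, bbox):
    x, y = pt
    xmin, ymin, xmax, ymax = bbox
    dx = max(xmin - x, 0.0, x - xmax)
    dy = max(ymin - y, 0.0, y - ymax)
    return (dx * dx + dy * dy) ** 0.5

catalog = {  # layer name -> (xmin, ymin, xmax, ymax), all hypothetical
    "montreal_landuse.shp": (-73.98, 45.41, -73.47, 45.70),
    "northern_quebec_hydrology.shp": (-79.76, 49.00, -57.10, 62.58),
}

here = (-73.57, 45.50)  # the user's current GPS fix
for name in sorted(catalog, key=lambda n: bbox_distance(here, catalog[n])):
    print(name, round(bbox_distance(here, catalog[name]), 2))
```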

AMac

Temporal Topology

Thursday, March 14th, 2013

Location, size, and proximity are just three of the many characteristics that can be attributed to a feature. As complex as they are, the topology and relationships are absolute. Before reading this article I thought it was just a matter of applying the concept of a temporal relationship in a similar manner, and I still believe that this is possible. For instance, the questions that the authors answer in Figure 5 could be answered using the equivalent of “Clip” or the Raster Calculator. It would be laborious, time consuming, and confined to a rigid framework, but one could still answer the question, “Which areas were fallow land during the last 20 years?”
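
That brute-force version is easy to sketch with rasters; the class code and the tiny random grids below are stand-ins for real classified land-use layers:

```python
# Sketch: answer "which cells were fallow for all of the last 20 years?"
# by a boolean reduction over a stack of yearly land-use rasters.
import numpy as np

FALLOW = 3                                   # hypothetical class code
rng = np.random.default_rng(0)
years = rng.integers(0, 5, size=(20, 5, 5))  # 20 yearly 5x5 class grids
years[:, 2, 2] = FALLOW                      # plant one always-fallow cell

always_fallow = np.all(years == FALLOW, axis=0)
print(always_fallow)  # True only where every year's class is FALLOW
```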

The framework that Marceau et al. develop is much more dynamic, in that all calculations can be completed before asking any questions, as opposed to asking a specific question and then answering it with numerous clips and overlays. Generating a user-friendly spatio-temporal model would be a big step forward in answering questions in the fourth dimension, especially now, considering the ever-increasing rate at which data is collected.

As with many problems in GIS, if the data were water and the processing were the pipe through which the water must pass, there will always be a limiting factor. The authors are of the opinion that spatio-temporal data set availability is lacking, but they make progress in further widening the pipe. In the coming years I believe that the limiting factor will again become predominantly the processing of the data, as spatial data is collected at an ever increasing rate.

In other news, did anyone else have trouble with the document, where all instances of the text “fi” were missing?

AMac

Temporal GIS: do we go back, or only forward?

Thursday, March 14th, 2013

Marceau et al.’s paper on temporal topology in GIS databases outlines the faults of temporal GIS, and seems to echo Marceau’s earlier paper on spatial GIS and its faults within the social and natural sciences. Comparing the two papers, in both temporal and spatial GIS, resolution seems to be one of the main issues affecting the accuracy of topology. To clarify: in spatial GIS, higher resolution reflects more data acquired and more accurate spatial topology, while in temporal GIS, higher resolution reflects a higher rate of sampling relative to change in the area and more accurate temporal topology. To simplify, when dealing with temporal GIS it comes down to the sample rate and what is included within the sample, and thus, as discussed with spatial scale, the “politics of scale”.

I believe that Marceau et al. tried to address the intervals of sampling; however, from everything I have read on scale and scale changes, their approach may be fatally flawed in that it is too simple to transfer to larger areas with greater variability of change. The method, once upscaled, will produce uncertainties as great as, if not greater than, those already present within their study area. In essence, it is the “politics of scale”, where the question of how temporal GIS operates, or can operate, within a software platform is mired in uncertainty by the data collected over time and by Marceau et al.’s modifications when applying the data within a platform setting.

For temporal GIS, it may be impossible to go back in time to map; GIScience may have to start fresh, from the present, working into the future. Therefore, GIScientists will know, now that the platforms exist, that certain data sets need to be created that represent change directly, rather than extrapolating into the past, which is inherently uncertain without wide-scale identifiers present (e.g., land-survey archaeology, digging into the ground to identify past land uses as topology identifiers).

C_N_Cycles