Posts Tagged ‘GEOG 506’

TGIS back then…

Tuesday, December 2nd, 2014

In his article, Yuan criticizes temporal GIS for not properly supporting spatio-temporal modeling. He argues that temporal database systems incorporate time merely by applying time-stamping techniques to tables, attributes, or values, and therefore fall short of a proper treatment of time. I find this article interesting yet inappropriate.
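To make the criticized time-stamping concrete, here is a minimal sketch (in Python, with invented table contents) of how a temporal database tags otherwise static rows with valid-time intervals:

```python
# Minimal sketch of tuple time-stamping: time enters the database only as
# extra columns on otherwise static rows. Table contents are invented.
burned_areas = [
    # (fire_id, burned_area_ha, valid_from, valid_to)
    ("fire_A", 120, "2014-07-01", "2014-07-02"),
    ("fire_A", 340, "2014-07-02", "2014-07-03"),
    ("fire_A", 610, "2014-07-03", "2014-07-04"),
]

def state_on(date):
    """Query by time-stamp: returns a static snapshot, not a process."""
    return [row for row in burned_areas if row[2] <= date < row[3]]

# We can ask "what was burned on July 2nd?" ...
print(state_on("2014-07-02"))
# ... but nothing in this schema represents HOW the fire spread between
# rows, which is exactly the gap Yuan points at.
```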

This article is interesting in the sense that it dates back to the late 90’s (judging from the cited articles, which range from the late 80’s to the mid 90’s), when the technology that could have exposed such problems in temporal GIS was lacking; Yuan was nevertheless able to point them out, and one can only imagine the frustration when the methods were not appropriately applied and the technology was not yet apt to incorporate the temporal attribute.

That leads to the reason why I call this article inappropriate: it could mislead some people into thinking that temporal GIS remains undeveloped even today, whereas it is, at least from the perspective of technology, more than ready to be applied, and it is being applied currently, thanks to advances in technology and crowd-sourcing. As Yuan mentioned, “GIS will be more effective and precise by representing a fire, fire location, and how the fire spreads rather than burned areas at each point in time.” To relate to that example, fire departments and natural scientists nowadays closely monitor and track forest fires using temporal GIS. On the other hand, I find that the burned areas at each point in time eventually lead to evidence of how the fire spread; it is today’s technology, however, that makes such connections feasible, whereas doing so was impossible about 20 years ago. Hence, this article may easily mislead, or even confuse, people who are learning with and accustomed to today’s GIScience technologies, especially without further background knowledge on TGIS.

-ESRI

Significance of Geovisualization

Monday, October 27th, 2014

A visual representation of data has always been known to be very helpful for exhibiting information to viewers and non-expert users in any field of study, particularly in fields requiring data analysis. For instance, a map is an excellent example of a visual representation of the world, or of part of it. Kraak’s article emphasizes the usefulness of geovisualization and argues that alternative graphic representations help stimulate the visual thinking process, illustrating this by applying several geovisualization techniques to Minard’s map of the campagne de Russie 1812-1813.


In this posting I’d like to mention a couple of other examples that underline the importance of visualization for exploring data, as well as the advantages it brings. Graphic visualization can often reveal patterns that are not necessarily seen in tables and charts. For instance, a thematic map is known for its ability to display a connection between specific themes and geographic areas. This type of visualization highlights the spatial variations of one, or a small number, of geographic distributions. An example of a thematic map can be found at the following link:

http://1.bp.blogspot.com/_RpkOLpWs7KA/TFOt-UruAvI/AAAAAAAAAG8/4U1xBqz_d38/s1600/cartogram.jpg

Another visualization tool is an application from ESRI called Story Map Swipe. It is a tool used to make a story map more interactive: one can swipe back and forth to compare one map with another quickly and easily, so the impression one receives may differ or be accentuated. An example of the Swipe can be found at the following link:

http://tmappsevents.esri.com/website/swipe-sandy-custom/


These types of data visualization will certainly stimulate the visual thought process of any viewer or user, compared to representing the same data as a simple table or chart. Furthermore, they may hint at a new hypothesis or even at an innovative solution to a problem.

ESRI

We are where we live

Monday, October 20th, 2014

Up to now, several topics such as public participation GIS, cyberinfrastructure, agents, social network analysis methods, etc. have been discussed. These topics are pretty much all about manipulating geospatial data, since that is what we do in GIScience, I think. And when it comes down to the data, humans are all about providing it. There are many uses for such data, and because of its enormous quantity, one can collect it and retrace, or even create, a target’s profile based on tweets, purchases made, photos, friends, search keywords used, etc. One can therefore use software to estimate a user’s living setting quite accurately from such information available on the web. As for geospatial data specifically, one can collect, manipulate, and sometimes visualize the data as a map, and one may believe that such a digitized representation of a space is in fact identical to the actual space; this is possible because it is treated as mere space.

However, in EPBG, a space is not just a space. In my understanding, EPBG argues that there is a strong relationship between one’s behavior and the way one perceives one’s surroundings, not the surroundings themselves. I found this quite intriguing because, in that sense, two individuals located in completely different places may behave quite similarly if their ways of processing the external information of their different settings are equivalent, and vice versa.

So far in GIScience, humans are all about providing data. In EPBG, by contrast, it is the human, specifically the brain, that collects information from past experiences, memories, etc. of a specific space and re-creates a place; each space therefore represents a unique place for each individual, and that leads to specific behavior.

This reminds me of the quote “You are what you buy”.

And it makes me think that, in fact, we are not only what we buy/eat, but also where we live, since there is a clear distinction between people living in North American settings, like us, and those in Europe, Africa, the Middle East, and Asia, in terms of level of education, daily lifestyle, diet, language, etc. Furthermore, even within North America, the region we live in distinguishes us, and one can observe such distinctions even within a city like Montreal. And there comes the issue of the MAUP.

Long story short, it is not that our behaviors as humans are shaped by the environment per se, nor that our behaviors shape the environment; rather, the relationship is bidirectional: we perceive the environment in the way it has affected us, and therefore the two are intrinsically correlated.

This is such a headache because psychology is not my strongest field. Nevertheless, I find this subject quite absorbing.

ESRI

The survival of the fittest

Monday, October 13th, 2014

I was at first worried about the way I felt when I first read this article, but seeing other people’s postings, I am relieved that I was not the only one who thought this article was too much. I would like to approach it from a very different angle, and watch out: this may sound weird, and very offensive to some people as well.

First of all, when we are using ArcMap Desktop, for instance, we use GIS technology to represent what is in fact a 3D Earth surface as a 2D digitized map; it is a “representation” that can be stored and manipulated for different projects, and it does not necessarily carry any meaning beyond that. Yes, of course, back in the day European people did assimilate aboriginal populations for the resources and the terrain and all, but in the 21st century I believe they have more realistic interests than wanting to take anything away from aboriginal populations, or to spend decades of research and finance assimilating the already-so-minority aboriginal populations and their culture because they cannot stand it. Why so much hate? Chill!

Besides, here is just a thought that came to me a few years ago when I watched a couple of documentary videos about aboriginal populations and their view of how Western people tend to take their culture away and assimilate them into Western cultures: if these aboriginal people are so attuned to the flow of nature, and believe that everything should flow naturally according to nature, how come they exclude themselves from it? Why can’t they consider that Western people assimilating them may be just what nature is directing to happen? Just as some ant species make war against other ant species and take over their territory. Isn’t that how Mother Nature always lets things happen? The survival of the fittest?

ESRI

Geospatial Cyberinfrastructure: Past, Present and Future

Sunday, September 28th, 2014

In Yang et al.’s article, the authors briefly, yet with enough detail, explain the origin of “Geospatial Cyberinfrastructure” (GCI), the various technologies that contributed to its birth, and its current uses.

In this article, GCI refers to an infrastructure that can support the collection, management, and utilization of geospatial data, information, and knowledge for multiple science domains, based on recent advancements in geographic information science, information technology, etc.

As a newbie who has just started to explore the world of GIS, it was a surprise to learn of the existence of a GCI that encompasses even Geographic Information Science, because I found the concept of GIScience itself already quite vast when I first learned about it in this course just a couple of weeks ago.

Putting aside my own amazement, as I read further into the article I found it very informative overall and liked the ‘discussion & future strategies’ section, where the authors even assessed the future studies required for GCI to improve further. On the other hand, at some points in the reading the authors seem to overly emphasize the importance of developing GCI, but I guess that was the whole point of the article anyway.

ESRI

Public Participation related issue

Monday, September 15th, 2014

This article is based on several years of studies and multiple research projects that examined social change, the capacity of the participatory Geoweb to support citizen science, participation in decision-making, etc. The project was conducted by numerous researchers, and yet the paper seems very biased towards the pro-Geoweb side only. Since the very first time I learned about VGI, I have wondered about the issues of accuracy and standards, and when I started to read this paper I kind of hoped that some sort of solution or suggestion concerning those issues would be mentioned, but there was none. In addition, where public participation and/or crowdsourcing are concerned, whether web-based or not, some kind of manipulation issue has always tended to arise as well; nowadays, when cyber security is becoming a more and more serious social problem, simply encouraging public participation through a web application without mentioning such issues doesn’t seem very convincing to me. Or perhaps I am just being way too skeptical about this…

ESRI

GIScience 15 years later

Monday, September 8th, 2014

In this article, where Goodchild reviews his own article 15 years later, he supports the argument he made previously: that GIScience spans both research about GIS, to improve the technology, and research with GIS, to exploit the technology for the advancement of science. Further, he underlines the huge impact of the Internet on the use and evolution of GIS.
It feels like 15 years ago, in the early 90’s, when the Internet and mobile technology were not as widespread or advanced as today, I wouldn’t have been as positive about the idea of GIScience as I am now, since it would have been almost impossible to see its usefulness or the need for it to be considered a ‘science’; not that something necessarily has to be very useful in our daily life to be part of a branch of science. However, one cannot deny the fact that using the Internet with GIS software, such as base maps pre-loaded by other industries, has drastically changed GIS and widened its potential.
Even if I am not fully convinced about GIScience yet, Goodchild does seem to make very good points in his arguments, especially about GIS’s usage growth in diverse domains and the Internet’s involvement, which has caused GIS to evolve much further.

ESRI

GIS: Tool or Science? – past, present and future?

Monday, September 8th, 2014

As a student who has taken a couple of GIS courses, it never hit me that some people would argue about defining GIS as a system or a science. In addition, the fact that the outcome of such a debate may have a huge impact on academia was interesting as well. To be honest, I always thought of GIS as a domain where one uses particular software to store, manipulate, analyze, and visualize spatial or geographical data, as well as the other types of data involved, and not so much as a science.
However, after reading Wright, Goodchild, and Proctor’s article, I started to think that it may not be impossible to consider GIS a science. Then I wondered: what exactly is science? There are several ‘types’ of science: natural science, applied science, fundamental science, etc. It was interesting that the distinction between different branches of science was not covered in detail in Wright’s article. I assume that if GIS is to be considered a science, it would probably be part of applied science, to which computer science also belongs.
In the past, computer science and neuroscience could not exist until advances in technology allowed us to discover and develop them. It seems like GIS is going through a similar process. In the past, GIS was known only to experts in the domain, often geographers. Nowadays, however, the wide use of the Internet and of technology that allows people to use GIS, as well as its application in other domains such as the social and medical fields, shows that GIS has evolved, is evolving, and will continue to evolve.
Personally, GIS is a tool rather than a science for me so far; however, I wouldn’t be surprised to find myself looking at GIS from a different perspective, perhaps in the near future.

ESRI

Visualizing Uncertainty: mis-addressed?

Wednesday, April 3rd, 2013

“Visualizing Geospatial Information Uncertainty…” by MacEachren et al. presents a good overall view of geospatial information uncertainty and how to visualize it. That said, many parts seemed to convey that all uncertainty must be defined in order to make correct decisions. In my realm of study, although it would be nice to eliminate uncertainty or place it in a category, the mere recognition that there is uncertainty is often definition enough to make informed decisions based on the observed trends. Furthermore, the authors seem to separate the different aspects of the environment, or factors, that lead to uncertainty from how it may be visualized. The use of many definitions and descriptions obscures what the factors that result in uncertainty really are and the resulting issues with visualization. The clarity with which visualized uncertainty is presented contrasts greatly with the ambiguity of the definitions behind uncertainty and its representation given by the authors. The studies and the ways uncertainty can be visualized are a great help in decision making and in the recognition of further uncertainties.

One aspect that would have helped in addressing uncertainty and its visualization would have been to integrate ideas and knowledge from the newly emerging field of ecological stoichiometry, which looks at uncertainty, the flow of nutrients and energy, and the balance within ecosystems to answer and depict uncertainty. I believe that ecological stoichiometry would address many of the challenges in the identification, representation, and translation of uncertainty within GIS and help to clarify many problems. This stoichiometric approach falls within the scheme of the multi-disciplinary approach to uncertainty visualization described in the article. However, since the article is limited to more generally understood approaches rather than more complex ones, such as stoichiometry, do some of the proposed challenges in the recognition and visualization of uncertainty not exist? I would argue yes, but then again more challenges may arise in the depiction, understanding, and translation of uncertainty.

C_N_Cycles

Is There a Problem with “Authoritative” Geocoding?

Thursday, March 21st, 2013

Roongpiboonsopit and Karimi provide a very interesting study on the quality of five relatively well-known geocoding services. Google Maps is something I use very often; however, I never really thought critically about the possible errors that may exist and their consequences. A study such as this allows us to understand the underlying parameters that go into these geocoding services and how they may differ from provider to provider. One aspect that was really interesting to me was the difference in positional accuracy across different land uses. Obviously, there tends to be an “urban bias,” of sorts, when geocoding addresses. As a result, one is more likely to get an incorrect result when searching for an address in rural/suburban areas. While this makes sense due to spatial issues, I thought that this could theoretically be extended to other criteria. As LBS become more popular and geocoding increases in importance, will certain companies offer “better” geocoding services to businesses that are willing to pay for it? For example, Starbucks could make a deal with Google to ensure that all of their locations are correctly and precisely geocoded. Taking it to the extreme, Google could even make a deal to deliberately sabotage the addresses of other coffee shops. While I think this specific case may be unlikely, it does raise issues about having completely authoritative geocoding services. As we increasingly rely on these geocoding services, the companies offering them gain a large influence over the people who use them.

This leads into the idea of possibly relying on participatory services, such as OpenStreetMap. OSM has made leaps and bounds in terms of the quantity and quality of its spatial data over the past few years. I am curious to see how it would match up with the five services in this paper. OSM relies on the ability of users to edit data if they feel it is incorrect. Therefore, the service is theoretically being updated consistently, depending on the number of users editing a certain area. As a result, errors may be less likely to be consistently missed than they are with a more authoritative geocoding service. It would also be interesting to see which types of buildings may be geocoded more or less accurately. As we continue to enter this age of open and crowd-sourced spatial data, I believe it has the potential to provide us with even better services.
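As a side note, the kind of positional-accuracy scoring used to compare services like these (and against which OSM could also be measured) can be sketched quite simply. The coordinates below are invented, and the haversine formula merely stands in for whatever error metric the study actually used:

```python
# A minimal sketch of scoring geocoder output: great-circle (haversine)
# distance between each geocoded point and a surveyed ground-truth point.
# All coordinates here are made up.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6371000  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# (ground truth, geocoder result) pairs for one hypothetical service
samples = [
    ((45.5017, -73.5673), (45.5020, -73.5669)),  # urban address
    ((45.8802, -72.4920), (45.8741, -72.5103)),  # rural address
]
errors = [haversine_m(*truth, *coded) for truth, coded in samples]
print([round(e, 1) for e in errors])  # the rural error comes out far larger
```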

-Geogman15


Initiate the Geocoding Sequence

Thursday, March 21st, 2013

Geocoding is the fascinating process of associating an address or place name with geographic coordinates. Traditionally, geocoding was solely the realm of specialists, requiring a specific set of skills and equipment. However, with the advent of modern technology, including Web 2.0 applications, geocoding is now easier than ever for the everyday user. Despite the multitude of geocoding services, such as Google and MapQuest, each service uses different algorithms, databases, etc. to code its locations. Therefore, users might not be aware of which services offer the best-quality results or which, on the contrary, may offer inaccurate results. The quality of geocoding results may in turn affect subsequent decisions, modeling, analysis, etc.
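For readers unfamiliar with what those algorithms and databases actually do, one common technique is linear interpolation along a street segment’s address range. Here is a minimal sketch of that idea; the reference “database” and its contents are hypothetical, and real services use far richer data and matching logic:

```python
# Minimal sketch of address-range interpolation, a classic street-segment
# geocoding technique. Reference records are invented.

# Each record: street name, address range, and segment endpoint coordinates.
REFERENCE_DB = [
    {"street": "MAIN ST", "low": 100, "high": 198,
     "start": (45.5000, -73.5800), "end": (45.5020, -73.5780)},
]

def geocode(number: int, street: str):
    """Interpolate a house number along the matching street segment."""
    for seg in REFERENCE_DB:
        if seg["street"] == street.upper() and seg["low"] <= number <= seg["high"]:
            # Fraction of the way along the segment's address range.
            t = (number - seg["low"]) / (seg["high"] - seg["low"])
            lat = seg["start"][0] + t * (seg["end"][0] - seg["start"][0])
            lon = seg["start"][1] + t * (seg["end"][1] - seg["start"][1])
            return lat, lon
    return None  # no match; a real service would fall back to fuzzy matching

print(geocode(150, "Main St"))  # roughly halfway along the segment
```

Differences in how each provider builds the reference database and handles non-matches are precisely where the quality differences discussed below come from.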

Overall, one of the biggest problems facing geocoding is the accuracy of the results. In particular, one problem mentioned by the authors was the poor accuracy of addresses located in rural and agricultural areas. On the other hand, most urban locations tended to be geocoded similarly across platforms. In addition, it was also interesting to note that several platforms consistently offered more accurate results: Yahoo!, MapPoint, and Google. It would be fascinating to investigate what types of geocoding algorithms, databases, etc. these services use, and whether they are similar or relatively different.

Another fascinating thing to consider is the future direction of geocoding. One possibility could be the standardization of geocoding databases, algorithms, etc. On the other hand, this could in turn make geocoding services redundant with one another, which might not be a realistic outcome. Overall, the future of geocoding as a useful tool depends heavily on how useful and accurate its results can be.

-Victor Manuel

Putting Geography on the Map

Wednesday, March 20th, 2013

Roongpiboonsopit and Karimi’s 2010 comparison study of five free online geocoders is an example of an important process in weeding out poor geocoders and creating one that works accurately and reliably. Considering geocoding is at the heart of many GIS applications, especially those involving the Geoweb, spatial accuracy is key. The authors used empirical analysis to determine the “best” geocoders based on accuracy and a number of other metrics. They concluded that Google, MapPoint (Microsoft) and Yahoo! are all quite effective, while Geocoder.us and MapQuest are less so.

In thinking about geocoding and the development aspects of the geocoding process, I realized that geocoding development is much like development in other topics we’ve covered, such as LBS and AR. As all of these progress and become more accurate and lifelike, they are approaching a level of artificial intelligence that is simultaneously creepy and cool. For instance, if a geocoder like Google’s uses all of the information it already has on us, there will not be any need for formal geographic indicators like street addresses, coordinates, or even official place names. If the reference database were to include our own vernacular and preferences along with the official names and spatial reference algorithms, then simply typing or saying “home” would pinpoint our location on a map quickly and easily. This isn’t even about spatial accuracy anymore, but rather “mental” accuracy. Perhaps I’m getting ahead of myself, but I see the possibilities with geocoding not just in terms of developing a usable application for plotting points on a map, but also in terms of expanding how we think of space (ontologies) and how we conceptualize our environment (cognition). Integrating new tools into pre-existing algorithms has changed and will continue to change how we live our lives.

– JMonterey

Geocoding Errors

Tuesday, March 19th, 2013

Goldberg et al.’s article “From Text to Geographic Coordinates: The Current State of Geocoding” presents an in-depth view of recent geocoding, its process, and the errors involved. That said, I wish more had been said about the errors that occur in geocoding as a result of database managers “correcting” system errors in coding. In essence, when an error is “corrected” by a database manager, future geocoding tries to reconcile the changes, which often leads to more error as the system tries to place other data into the “corrected” group or to arrange the data to make sense next to the “correction”.

I experienced this problem first-hand when trying to geocode new points within an existing geocoding database. What happened is that a previous manager had “corrected” various geocoding errors by manually entering data points as a new category, which conflicted with several other existing categories within the database. Therefore, when I entered my points, they were all coded as the new category and located in the wrong areas, since the manual fixes superseded the software’s operational coding for placement whenever a point was not equal to a known location. If I had not gone back to cross-reference the points, I would never have found the geocoding discrepancies and “corrected” the points (although this may cause future errors, which were not yet apparent).
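A rough sketch of this failure mode (names, categories, and coordinates are all hypothetical): a manual override table is consulted before the algorithmic geocoder, so an over-broad “correction” silently captures later, unrelated records:

```python
# Sketch of manual "corrections" superseding algorithmic geocoding.
# Overrides are checked first, so a too-broad override captures records
# it was never meant for. Everything here is invented.

manual_overrides = {
    # a past manager "fixed" one bad match by mapping a whole street name
    "MAIN ST": (45.50, -73.58),
}

def algorithmic_geocode(address: str):
    return (45.51, -73.57)  # stand-in for the real interpolation logic

def geocode(address: str):
    addr = address.upper()
    for key, coords in manual_overrides.items():
        if key in addr:  # over-broad match: every "... Main St" hits it
            return coords, "manual override"
    return algorithmic_geocode(addr), "algorithm"

# A brand-new point is captured by the old "correction" and lands in the
# wrong place; nothing flags it unless someone cross-references the output.
print(geocode("742 Main St"))
```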

In the article, the mention of E911 helping with address accuracy is an important step towards reducing error, but I believe it is becoming irrelevant, since technology with GPS sensors is becoming standard on most utility entry points at every address. For example, Hydro-Quebec is installing digital meters with GPS, pinging capability, and wireless communications. These devices are then geocoded to a database and could therefore provide accurate, accessible location referencing that is self-correcting, thus reducing error for every address. As such, is the proliferation of intelligent technology reducing error, or adding to the complexity of geocoding?

Temporal GIS do we go back, or only forward?

Thursday, March 14th, 2013

Marceau et al.’s paper on temporal topology in GIS databases outlines the faults of temporal GIS, and it seems to echo Marceau’s earlier paper on spatial GIS and its faults within the social and natural sciences. Comparing the two papers, in both temporal and spatial GIS the resolution seems to be one of the main issues affecting the accuracy of topology. To clarify: in spatial GIS, higher resolution means more data acquired and more accurate spatial topology, while in temporal GIS, higher resolution means a higher rate of sampling relative to the rate of change in the area, and thus more accurate temporal topology. To simplify, when dealing with temporal GIS it comes down to the sampling rate and what is included within each sample, and thus, as discussed for spatial scale, the “politics of scale”.
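The sampling-rate point can be illustrated in a few lines; the land-cover history below is invented, but it shows how sampling slower than the rate of change hides entire events:

```python
# The true land-cover history of one parcel, by year (invented data).
true_states = {2000: "forest", 2001: "burned", 2002: "regrowth",
               2003: "forest", 2004: "cleared", 2005: "built"}

def sample(states, interval):
    """Keep only the observations made every `interval` years."""
    years = sorted(states)
    return {y: states[y] for y in years[::interval]}

print(sample(true_states, 1))  # full history: fire and clearing visible
print(sample(true_states, 3))  # {2000: 'forest', 2003: 'forest'}: the fire,
                               # regrowth, and clearing never happened, as
                               # far as this temporal database can tell
```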

I believe that Marceau et al. tried to address the intervals of sampling; however, from everything I have read on scale and scale changes, their approach may be fatally flawed in that it is too simple to transfer to larger areas with greater variability of change. Once upscaled, the method will produce uncertainties as great as, if not greater than, those already present within their study area. In essence, it is the “politics of scale”, where the question of how temporal GIS operates, or can operate, within a software platform is mired in uncertainty by the data collected over time and by Marceau et al.’s modifications when applying the data within a platform setting.

For temporal GIS, it may be impossible to go back in time to map. GIScience may have to start anew from the present, working into the future. Now that the platforms exist, GIScientists will know that certain data sets need to be created that can represent change, rather than relying on extrapolation into the past, which is inherently uncertain without wide-scale identifiers present (i.e. land-survey archaeology, the process of digging into the ground to identify past land uses as topology identifiers).

C_N_Cycles


Like a Memory Lost in Time

Thursday, March 14th, 2013

Langran and Chrisman (1988) covered various conceptualizations of temporal GIS, stressing temporal topology and the need to visualize temporal structure, trap errors, and minimize data storage requirements. In a topic based on the conceptualization of abstract and multidimensional notions, it was enormously useful for the authors to include simple visualizations of the conceptualizations of time, especially in terms of temporal topology (i.e. how events are connected and related temporally).

One visualization of time that the authors ultimately reject as flawed in several ways is that of showing separate “scenes” or “states” in chronological order. I completely agree that this is a flawed practice primarily because of the hidden topological structure of the “states” (e.g. proof that one state occurred before or after another scene, and the interval between them). However, it is also the simplest for a layperson to understand. The authors drew the analogy of a motion picture, and that is significant because films function much as our own memories do, one scene at a time. We cannot capture events in time as anything other than isolated from one another. When we remember an event, we imagine a single image, followed by another and another if the memory is strong. Likewise, those images are from only one perspective—our own. I mention this as a slight digression only to draw a parallel to GIS. Even if we were to adopt the overlay or multi-polygon methods that the authors recommended, we would be ignoring the fact that mapping (at least traditional, top-down mapping) is drawn from a single perspective and that change, the manifestation of time, happens differently from any given perspective.
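To make the contrast concrete, here is a toy sketch (with invented data) of the rejected snapshot model next to an amendment-style log, which keeps the temporal topology, when and in what order things changed, explicit:

```python
# Toy contrast between the two storage models discussed above. The snapshot
# model stores a full state per date and leaves the topology between states
# implicit; an amendment-style model stores a base state plus dated changes,
# so "what changed, and when" is explicit. Data are invented.

# Snapshot model: a full copy of every parcel at every date.
snapshots = {
    1990: {"parcel_1": "farm", "parcel_2": "forest"},
    2000: {"parcel_1": "suburb", "parcel_2": "forest"},
}

# Amendment model: one base state plus an ordered log of changes.
base_state = {"parcel_1": "farm", "parcel_2": "forest"}
amendments = [(1995, "parcel_1", "suburb")]  # (date, object, new value)

def state_at(year):
    """Replay amendments up to `year` to reconstruct the state."""
    state = dict(base_state)
    for date, obj, value in amendments:
        if date <= year:
            state[obj] = value
    return state

print(state_at(1992))  # farm: the 1995 change has not happened yet
print(state_at(1997))  # suburb: the log tells us exactly when it changed
```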

– JMonterey

Try and Find Me

Thursday, March 14th, 2013

Rao and Minakakis wrote on the current (in 2004) and future state of location-based services (LBS). In their article, they touched on the growing interest in LBS, the requirements for growth, and the types of LBS. While 2004 was not too long ago, nine years is eons for modern technology, and I’m curious how much has changed since. Nonetheless, it is a clear and well-written piece that thoroughly scans over the important aspects of a field that stands at the heart of modern GIS applications.

Until now, my understanding of LBS has been based primarily on how advertisers could market goods and services to targeted individuals who are in an optimal location to be influenced. Thus, my naïveté had resulted in a cynical view of LBS and its exploitative power. However, it is clear that the field encompasses more than marketing; it serves navigation and personalization functions as well.

Beyond the technical issues regarding positional accuracy, I am also leery of the privacy concerns inherent in any LBS. In the case of personalization, many consumers may opt in to various services for convenience or for a novel experience (i.e. because it’s cool). But there are countless times when consent to be (essentially) “followed” is given as part of something completely unrelated. For instance, Facebook will show advertisements based on the location of the user’s IP address. Considering the easy accessibility of ad-block software, I would not be surprised if advertisers aim for more explicit consent in the future, likely stressing the convenience and novelty aspects of personalization in a positive spin on location exploitation.

– JMonterey

LBS: Ideas lost to time and tech advances

Thursday, March 14th, 2013

Charles Steinfield’s article, “The Development of Location Based Services…”, is a great summation of location-based services (LBS) in the first half of the last decade; however, most of the trends and technologies have changed. For instance, the growing “802.11b” WiFi networks it mentions have gone the way of Beta cassettes, MiniDisc players, and HD-DVDs (all of which I remember using). The article mostly seems to have historical value to GIScience today, although it does present early generations of some of the technologies we use in LBS today. Most of what is mentioned as positioning technology for both indoor and outdoor LBS just seems too impractical and too dependent on point infrastructure (i.e. expensive to maintain and maintenance-intensive), which is probably the reason we do not use many of the technologies mentioned.

The application of LBS has matured to the point that any mobile device receives some sort of LBS, and the use of LBS has grown beyond what Steinfield imagined when writing the article. However, LBS, despite its benefits, can now hinder what a person is looking for on their device, because it provides local services first rather than the service a user is looking for that may be just out of range.

The belief that WiFi may help LBS was a good point in the article, but I believe it is becoming less relevant with 4G mobile networks (and 5G on the horizon), where GPS data in mobile devices (GPS is now in all phones) and digital pinging of cell towers communicate at lightning speed, resulting in better prediction of location. That said, the ideas of security, privacy, and terms-of-use consent brought up by Steinfield become relevant. Although I think these are real concerns, the ability of the user to block or restrict location use on most devices (iPhone, most Androids) solves them (it should be noted that GPS location services are always available for emergency uses: police, fire, etc.).

C_N_Cycles


Time Goes By, So Slow

Thursday, March 14th, 2013

Temporal mapping has always been one of the biggest issues within the GIS community. Although the visualization of spatio-temporal data has advanced dramatically since the time the article was written (1988), it is still relatively challenging to map and present this type of data in a clear way. Spatio-temporal modeling is useful for many modern-day applications. For example, in the field of epidemiology, technologies such as GIS and remote sensing have allowed for the development of models that characterize the distribution of infectious diseases. These models are also useful for tracking the modes of diffusion of the diseases, as well as how effective containment measures are. Spatio-temporal mapping also holds many important benefits for disaster management, environmental studies, and transportation planning. Overall, further developments within the field are necessary in order to incorporate not only changes across “snapshots” in time, but also the processes that occur between the steps.
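To illustrate that closing point, here is a crude sketch (with invented numbers) in which linear interpolation stands in for the process models the field actually needs; the snapshots alone say nothing about what happened in between:

```python
# Snapshot data give states at discrete dates; anything between them must
# be modelled. A linear interpolation is used here purely as a stand-in
# for a real diffusion/process model. Case counts and dates are invented.

snapshots = {0: 10, 30: 250}  # day -> reported cases in one district

def cases_at(day, snaps=snapshots):
    """Linearly interpolate between the two bracketing snapshots."""
    t0, t1 = min(snaps), max(snaps)
    frac = (day - t0) / (t1 - t0)
    return snaps[t0] + frac * (snaps[t1] - snaps[t0])

print(cases_at(15))  # 130.0: plausible-looking, but the actual process
                     # (exponential spread, containment) stays invisible
```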

-Victor Manuel


Scooby Doo, Where are You?

Thursday, March 14th, 2013

Location Based Services (LBS) have become a major component of the marketing strategies of service providers. With the rapid development of technologies, especially in the mobile arena, it is now easier than ever for providers to deliver focused services. Although Rao does a great job of highlighting the potential of LBS, in reality he underestimates the huge impact it has had since the writing of his paper. Rao predicted that by 2006, LBS would reach 680 million global customers. However, in our modern age of connectivity, LBS is most likely available to over a billion people worldwide.

LBS has offered, and will continue to offer, ever-improving potential to both service providers and customers. Rao highlights some of the advantages of LBS to consumers, who can access up-to-date information on maps, driving directions, yellow pages, and locator services. As technology has developed, LBS is now integrated into many everyday applications. For example, consumer applications such as “Find My Friends” allow you to track the exact location of other users (with their approval, of course). LBS have also become very important to service providers. With the ability to track a customer’s geographic habits, providers can tailor business concerns, such as advertisements, to be more pertinent to the customer’s interests.

Although LBS hold many benefits, as with many new technologies there are privacy issues. As service providers continue to gather and store huge amounts of location-based data, several questions arise: What should be done with this data? Who has access? Is it anonymous?

-Victor Manuel