Problems of classification

April 4th, 2013

Since Wilkinson's 1996 paper, many satellites have been put into orbit and several million gigabytes of satellite imagery have been collected. More importantly, with the advent of the digital camera there has been an explosion in the number of digital images being captured. People were quick to spot the opportunity in leveraging these data, and a great deal of research has been conducted in the image-processing domain (mainly in biometrics and security). That said, some of the most successful approaches from other domains have not performed as well when applied to satellite images, and the challenges outlined in the paper still hold true today.

As I understand it, this is mainly because of the great diversity in satellite images. Resolution is only one part of the equation; the main problem lies in the diversity of the things being imaged, which makes it very difficult to come up with training samples that are a good fit. Traditional machine learning techniques based on supervised learning therefore have a hard time. The problem is compounded by the fact that when we classify satellite images, we are generally interested in extracting not one but several classes simultaneously, and with great accuracy. The algorithms do perform well when classification is done one image at a time, but significant human involvement is needed to select good training samples for each image, and to the best of my knowledge no technique exists that can classify satellite images in a completely automated way.
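
A rough, hypothetical sketch of that per-image workflow, where everything (band values, hand-picked training pixels, class labels) is synthetic stand-in data rather than anything from the paper, might look like this in Python with scikit-learn:

```python
# Minimal sketch of per-image supervised classification (hypothetical data).
# The band values and training labels are synthetic stand-ins for the
# analyst's hand-selected samples; a real workflow would read them from imagery.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# A fake 100 x 100 scene with 4 spectral bands.
scene = rng.random((100, 100, 4))

# Hand-picked training pixels: spectral values and their class labels
# (e.g. 0 = water, 1 = vegetation, 2 = built-up).
train_pixels = rng.random((300, 4))
train_labels = rng.integers(0, 3, size=300)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(train_pixels, train_labels)

# Classify every pixel of this one scene; a new scene would need new samples.
classified = clf.predict(scene.reshape(-1, 4)).reshape(100, 100)
print(classified.shape, np.unique(classified))
```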

-Dipto Sarkar

GIS&RS

April 4th, 2013

Brivio et al.'s paper presents a case study integrating remote sensing and GIS to produce a flood map. After reviewing the methodology and results of other approaches, the paper finds the integrated method to be 96% accurate.

This speaks to the value of interdisciplinary work. While RS applications on their own proved inadequate, a mixing of disciplines gave a fairly trustworthy result. While I understand the value of highly specialized knowledge, having a baseline of capability outside of one's specific field is useful. I remember in 407 Korbin explaining that knowing even a bit of programming can help you in working with programmers, as understanding the way one builds statements, as well as the general limits of a given programming language, will give you an idea of what you can ask for. The same is true for GIS/RS. Knowing how GIS works and what it might be able to do is useful for RS scholars in seeking help and collaboration, and vice versa. I think McGill's GIS program is good in this respect. I got to dip my toes into a lot of different aspects of GIS (including COMP) and figure out what I like about it. If I end up working with GIS after I graduate, I know that the interdisciplinary nature of the program will prove useful.
Wyatt

Time or Space

April 4th, 2013

Geospatial analysis can be no better than its original inputs, much like a computer is only as smart as its user. In the field of remote sensing, this maxim may be on its way to becoming obsolete. Brivio et al. show, through a case study of a catastrophic inundation in Italy, that they can compensate for the temporal gap between the acquisition of the remotely sensed data and the peak of the flood, which occurred a few days earlier.

The analysis, however, was not completed with synthetic aperture radar images alone. Had it not been for the integration of topographic data, it is unlikely that similarly successful results could have been obtained.

With any data input, temporal and spatial resolution are limiting factors. Brivio et al. highlight this by acknowledging the use of NOAA thermal infrared sensors, which have a finer temporal resolution but lack spatial resolution. Conversely, the SAR images used in the case study have a relatively higher spatial resolution but come at longer temporal intervals.

Given Brivio et al.'s successful estimation of the flooding extent, it may be advantageous, when need be, to choose an input with a finer spatial resolution in exchange for a coarser temporal resolution, and to compensate for the temporal delay with additional inputs.

Break remote sensing down into its two main functions: collection and output. One will inevitably lag behind the other, but eventually the leader will be surpassed by the follower, only for it to happen again some time down the road, much like two racers attached by a rubber band.

What all of this means for GIS is that eventually the output from remote sensing applications will surpass the computing power of geographic information systems. At that point, the third racer, processing, will become relevant, if it isn't already.

GIS and RS: how do we account for variability?

April 3rd, 2013

Brivio et al.’s article “Integration of remote sensing data and GIS… for mapping of flooded areas” presents the very common process of using RS data and GIS to map flooding and flood plains. Although the article shows how the integration of RS and GIS can accurately map a flood, with a concluded method accuracy of 96%, it only looks at a single event and study site. From my experience, this is not always the case: integration methods, even when identical, often vary in accuracy from one location to another. Furthermore, event duration, intensity, and geologic substrates often interfere with flood-area prediction from RS data and GIS, as variations can shift the location of water within minutes to hours. To clarify, one area may be flooded at certain points during the flood period and dry during others (i.e. it may transition from wet to dry to wet), which interferes with the accuracy of the RS data and the GIS prediction. Fundamentally, water changes how the surrounding environment reacts, modifying where floods occur. As floods react to the environment, areas often become flooded for only minutes and, as such, are never recognized as flooded in GIS predictions, RS data, or human reports (although they were flooded, if only for minutes).

To better predict flood area, TWIs (topographic wetness indices) and DEMs (digital elevation models), when compared to flow paths (cost-distance matrices), may in fact better predict flooded areas when used in conjunction with RS data than the integration of RS data with cost-distance matrices alone. In addition, more data sets and studies would help to create a more general integration protocol and better predictive area estimates for floods. To elaborate, the techniques in the article work well on the study area but may not work on other floods; by adding more data from more types of floods, the technique could be adapted to other situations. Multiple integrations with multiple data sets would also reduce error and produce greater accuracy. The big question that still remains unanswered after this article is: how can we account for ecosystem and flood variability within GIS and RS data sets?
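
For readers less familiar with the index, here is a minimal sketch of how a TWI grid could be computed; the slope and flow-accumulation grids below are hypothetical stand-ins, since in practice both would be derived from the DEM with a flow-routing algorithm.

```python
# Minimal sketch of a Topographic Wetness Index grid: TWI = ln(a / tan(slope)).
# The slope and flow-accumulation grids are synthetic stand-ins; in practice
# both would be derived from the DEM (e.g. with a D8 flow-routing algorithm).
import numpy as np

rng = np.random.default_rng(1)
cell_size = 30.0                                   # assumed DEM resolution (m)

flow_accum = rng.integers(1, 5000, size=(200, 200)).astype(float)  # cells draining in
slope_deg = rng.uniform(0.1, 35.0, size=(200, 200))                # local slope (deg)

specific_catchment = flow_accum * cell_size        # upslope area per unit contour width
tan_slope = np.tan(np.radians(slope_deg))

twi = np.log(specific_catchment / np.maximum(tan_slope, 1e-6))

# Higher TWI values flag cells where water is likely to accumulate,
# i.e. candidate flood-prone areas to compare against the RS-derived flood map.
print(twi.min(), twi.max())
```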

C_N_Cycles

Visualizing Uncertainty: mis-addressed?

April 3rd, 2013

“Visualizing Geospatial Information Uncertainty…” by MacEachren et al. presents a good overall view of geospatial information uncertainty and how to visualize it. That said, many parts seem to suggest that all uncertainty must be defined in order to make correct decisions. In my realm of study, although it would be nice to eliminate uncertainty or place it in a category, simply recognizing that there is uncertainty is often definition enough to make informed decisions based on the observed trends. Furthermore, the authors seem to separate the different aspects of the environment, or factors, that lead to uncertainty from how it may be visualized. The use of many definitions and descriptions obscures what the factors that produce uncertainty really are and the resulting issues with visualization. The way visualized uncertainty is presented contrasts sharply with the ambiguity of the definitions behind uncertainty and its representation given by the authors. Still, the studies and the ways uncertainty can be visualized are a great help in decision making and in the recognition of further uncertainties.

One aspect that would have helped in addressing uncertainty and its visualization would have been to integrate ideas and knowledge from the emerging field of ecological stoichiometry, which looks at uncertainty, the flow of nutrients and energy, and the balance within ecosystems to answer and depict uncertainty. I believe that ecological stoichiometry would address many of the challenges in the identification, representation, and translation of uncertainty within GIS and help to clarify many problems. This stoichiometric approach falls within the multi-disciplinary approach to uncertainty visualization described in the article. However, since the article limits itself to more generally understood approaches rather than more complex ones such as stoichiometry, do some of the proposed challenges in the recognition and visualization of uncertainty not exist? I would argue yes, but then again more challenges may arise in the depiction, understanding, and translation of uncertainty.

C_N_Cycles

Error prone GIS

April 1st, 2013

In any data-related field, great efforts are put into ensuring the quality and integrity of the data being used. It has long been recognized that the quality of results can only be as good as the data itself; moreover, the quality of a data set is no better than the worst apple in the lot. Hence, data-intensive fields invest heavily in pre-processing to understand and improve data quality. GIS is no exception when it comes to being cautious about the data.

The various kinds of data handled in GIS make the problem of errors more profound. Not only does GIS work with vector and raster data, it also needs to handle data in the form of tables. Moreover, the way the data are procured and converted is also a concern. Often, data are obtained from external sources as tables of incidents that have some field(s) containing the location of the event. Usually these data were not collected with the specific purpose of being analysed for spatial patterns, so the locational accuracy of the events varies greatly. When such files are converted into shapefiles, they inherit the inaccuracies built into the data set.

One thing to remember, however, is that the aim of GIS is to abstract reality into a form that can be understood and analysed efficiently. It is therefore important not to lay too much emphasis on how accurately the data fit the real world. The emphasis should instead be on finding the level of abstraction that is ideal for the application scenario and then understanding the errors that can be accepted at that level of abstraction.

-Dipto Sarkar

geocode all the things

March 22nd, 2013

Goldberg, Wilson, and Knoblock (2007) note how geocoding match rates are much higher in urban areas than rural ones. The authors describe two routes for alleviating this problem: geocoding to a less precise level or including additional detail from other sources. However, both routes result in a “cartographically confounded” dataset where the degree of accuracy is a function of location. Matching this idea — where urban areas, and areas that have previously been geocoded with additional information, are more accurate than previously un-geocoded rural areas — with the idea that geocoding advances as technology advances and is adopted, we could say that eventually we’ll be able to geocode everything on Earth with good accuracy. I think of it like digital exploration — there will come a time when everything has been geocoded! Nothing left to geocode! (“Oh, you’re in geography? But the world’s been mapped already”).

More interesting to think about, and what AMac has already touched on, is the cultural differences in wayfinding and address structures. How can we geocode the yellow building past the big tree? How can we geocode description-laden indigenous landscapes with layers of history? Geocoding historical landscapes: how do we quantify the different levels of error involved when we can’t even quantify positional accuracy? These nuanced definitions of the very entities that are being geocoded pose a whole different array of problems to be addressed in the future.

-sidewalkballet

To Digitize or Not to Digitize?

March 21st, 2013

No process in GIS is perfect. There are always limitations, many of which can be ignored, while others must at least be acknowledged. The application of the results of an analysis can have a drastic impact on whether errors are ignored, acknowledged, or painstakingly resolved. Consider geocoding addresses at the national level to analyze socioeconomic trends: the power of numbers will outweigh the error generated during processing, and it is enough to acknowledge the limitation. On the other hand, the Enhanced 911 system requires that all addresses be geocoded as precisely as possible for times of emergency.

As a way of increasing the accuracy of geocoding, would it be sufficient to input a number of intermediary points to account for the uneven distribution of addresses within a given address segment? This would essentially act as a middle ground between leaving the results entirely up to geocoding and digitizing all addresses manually. After all, why do the corners of blocks have to act as the only reference points? It is possible that there is an inherent topology that would be lost if this were implemented, but I cannot speak to that.
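
A minimal sketch of that middle ground, using entirely hypothetical house numbers and coordinates: the usual corner-to-corner interpolation is kept, but any surveyed intermediate reference point splits the segment into shorter pieces that are interpolated separately.

```python
# Sketch of address interpolation along one side of a block with optional
# intermediate reference points (all numbers and coordinates are hypothetical).
from bisect import bisect_right

# Known reference points: (house_number, (x, y)).
# With only the two block corners this reduces to ordinary linear interpolation;
# each extra surveyed point corrects the uneven spacing between the corners.
refs = [
    (100, (0.0, 0.0)),     # block corner
    (128, (180.0, 5.0)),   # surveyed intermediate point (e.g. a known driveway)
    (198, (300.0, 10.0)),  # block corner
]

def interpolate_address(house_number, refs):
    """Linearly interpolate between the two reference points bracketing the address."""
    numbers = [n for n, _ in refs]
    i = bisect_right(numbers, house_number)
    i = min(max(i, 1), len(refs) - 1)
    (n0, (x0, y0)), (n1, (x1, y1)) = refs[i - 1], refs[i]
    t = (house_number - n0) / (n1 - n0)
    return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))

print(interpolate_address(150, refs))
```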

While reading Goldberg et al., one geocoding nightmare kept running through my head, and it surprised me that it was not touched on directly: how has Japan addressed the situation? As an OECD country, it likely possesses sufficient GIS infrastructure. If I’m not mistaken, though, house addresses there are based not so much on location as on time. Within city blocks, addresses are assigned temporally, whereby an older structure has a lower number than a newer one; even if they are immediately adjacent, two structures can have significantly different addresses. Just a thought.

AMac

Statutory warning: Geocoding may be prone to errors

March 21st, 2013

The last few years have seen tremendous growth in the use of spatial data. Innumerable applications have contributed to the gathering of spatial information from the public. Applications people use every day, like Facebook and Flickr, have also introduced features with which one can report one’s location. However, people are generally not interested in latitude and longitude; names of places make more sense in day-to-day life. Hence, these applications report not the spatial coordinates but the named location (at different scales) where the person is. The tremendous amount of location information generated has not gone unnoticed, and several studies have been conducted to leverage it. But one issue that is frequently overlooked in research that uses these locations is the accuracy of the geocoding service used to get the named locations. Not only is displacement a problem; the scale at which the location was geocoded will also have an effect on the study. The comparison of the accuracy of the available geocoding services by Roongpiboonsopit et al. serves as a warning to anyone using geocoded results.

-Dipto Sarkar


Is There a Problem with “Authoritative” Geocoding?

March 21st, 2013

Roongpiboonsopit and Karimi provide a very interesting study on the quality of five relatively well-known geocoding services. Google Maps is something I use very often, yet I never really thought critically about the possible errors that may exist and their consequences. A study such as this allows us to understand the underlying parameters that go into these geocoding services and how they may differ from provider to provider. One aspect that was really interesting to me was the difference in positional accuracy across land uses. There tends to be an “urban bias,” of sorts, when geocoding addresses; as a result, one is more likely to get an incorrect result when searching for an address in rural or suburban areas. While this makes sense given spatial issues, I thought the idea could theoretically be extended to other criteria. As LBS becomes more popular and geocoding increases in importance, will certain companies offer “better” geocoding services to businesses that are willing to pay for it? For example, Starbucks could make a deal with Google to ensure that all of their locations are correctly and precisely geocoded. Taking it to the extreme, Google could even make a deal to deliberately sabotage the addresses of other coffee shops. While I think this specific case may be unlikely, it does raise issues about having completely authoritative geocoding services. As we increasingly rely on these geocoding services, the companies offering them have a large influence on the people who use them.

This leads into the idea of possibly relying on participatory services such as OpenStreetMap. OSM has made leaps and bounds in terms of the quantity and quality of its spatial data over the past few years, and I am curious to see how it would match up with the five services in this paper. OSM relies on the ability of users to edit data if they feel it is incorrect; the service is therefore constantly being updated, depending on the number of users editing a certain area. As a result, errors may be less likely to be consistently missed than with a more authoritative geocoding service. It would also be interesting to see which types of buildings are geocoded more or less accurately. As we continue to enter this age of open and crowdsourced spatial data, I believe it has the potential to provide us with even better services.

-Geogman15


Unloading on Geocoding

March 21st, 2013

Geocoding, like many of the concepts we study in GIScience, is very dependent on the purpose of the process. The act of geocoding is often equated with address matching, which is sometimes accurate; however, geocoding can also mean georeferencing any geographic object, not just postal codes. This implies that the way geocoding is perceived will affect the ways we go about doing it.


There are many ways to geocode, as described by Goldberg, Wilson, and Knoblock, and no single one of them is universally correct. Each method uses a different algorithm to try to match some identifier to a geographic reference. For example, a geocoder might find the length and endpoints of a street segment and then use linear interpolation to estimate the location of a given postal code. The geographic context is also very important in determining which algorithm to use: the method described above may work well in a city with short, rectangular blocks, but be far less applicable in rural China. These are some of the things one has to consider when choosing a method of geocoding.
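
As a rough illustration of that interpolation step, under assumed inputs (a short street digitised as a polyline with a hypothetical address range; real street geometries and address distributions are messier):

```python
# Sketch of address-range geocoding by linear interpolation along a street
# polyline (all coordinates and address ranges here are hypothetical).
import math

def geocode_by_interpolation(house_number, low, high, vertices):
    """Place an address at the proportional distance along the street polyline."""
    fraction = (house_number - low) / (high - low)
    seg_lengths = [math.dist(a, b) for a, b in zip(vertices, vertices[1:])]
    target = fraction * sum(seg_lengths)
    for (x0, y0), (x1, y1), length in zip(vertices, vertices[1:], seg_lengths):
        if target <= length:
            t = target / length
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        target -= length
    return vertices[-1]

# A short curved street digitised as three vertices, addressed 1-99.
street = [(0.0, 0.0), (120.0, 40.0), (250.0, 40.0)]
print(geocode_by_interpolation(50, 1, 99, street))
```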


The future of geocoding is perhaps less certain than that of many other areas of GIScience, because as technology and georeferencing become more ingrained in our society, the algorithms used to match objects to geographic locations will become less important. Devices like GPS are becoming more and more commonplace in everyday appliances, but this raises questions of privacy. Ultimately, the future of geocoding will be a balancing act of tradeoffs between public acceptance of technology and the development of more powerful and purpose-driven algorithms.


Pointy McPolygon


Initiate the Geocoding Sequence

March 21st, 2013

Geocoding is the fascinating process of associating an address or place name with geographic coordinates. Traditionally, geocoding was solely the realm of specialists, requiring a specific set of skills and equipment. With the advent of modern technology, including Web 2.0 applications, however, geocoding is now easier than ever for the everyday user. Yet despite the multitude of geocoding services, such as Google and MapQuest, each service uses different algorithms, databases, and so on to code locations. Users may therefore not be aware of which services offer the best quality results, or, on the contrary, which may offer inaccurate results. The quality of geocoding results may in turn affect subsequent decisions, modeling, analysis, and so forth.

Overall, one of the biggest problems facing geocoding is the accuracy of the results. In particular, one problem mentioned by the authors was the poor accuracy of addresses located in rural and agricultural areas, whereas most urban locations tended to be geocoded similarly well across platforms. It was also interesting to note that several platforms consistently offered more accurate results: Yahoo!, MapPoint, and Google. It would be fascinating to investigate what types of geocoding algorithms, databases, and so on these services use, and whether they are similar or relatively different.

Another fascinating question to consider is the future direction of geocoding. One possibility could be the standardization of geocoding databases, algorithms, and so on. On the other hand, this may lead to redundancies among geocoding services, which might not be a realistic outcome. Overall, the future of geocoding as a useful tool depends heavily on how useful and accurate its results can be.

-Victor Manuel

NeoGeocoding?

March 21st, 2013

Roongpiboonsopit and Karimi's article highlights the fact that ubiquitous mapping, and the new practices facilitated by the technology, allow everybody to geocode data without really knowing what is happening ‘behind’ the geocoder tools (Geocoder.us, MapQuest, Google, MapPoint, and Yahoo!).

Coding is defined in the article as “applying a rule for converting a piece of information into another”. But who controls the rules? Users have little control over the process, since they do not interact with the geocoding algorithms or the reference databases.

The authors’ analysis shows the importance of questioning the tool used to produce data, because the errors and uncertainties in the data produced have an impact on further analysis and decision making. These points relate to the neogeography literature and the critiques of GIS discussed in class, more specifically the debates over the ethics and practices of ‘open participation’ as democratization or as exploitation of user-generated data. In VGI, is geographic information being produced by ‘citizen sensors’ or by ‘cognizant individuals’, as mentioned by Andrew Turner (in Wilson and Graham, 2013)? The two terms underline the question of how aware citizens are when they produce data and geocode information. This leads to the question of accuracy and how much accuracy we need. It is probably not always important to achieve perfect accuracy; however, I think it is crucial to be aware of the lack of accuracy and to make the uncertainties explicit, rather than leaving them ‘behind’ the tools.

Furthermore, the uneven results generated by geocoding processes depending on location can also be linked to the debates discussed earlier in class about the digital divide in GIScience. Data are more accurate in urban and suburban areas than in rural areas, owing to the quality of the reference databases in urban areas. Again, I think there is a need to make these differences more explicit to users and/or producers in order to bridge the gap between expert and amateur production of knowledge.

Wilson, M. W. and M. Graham (2013). Neogeography and volunteered geographic information: a conversation with Michael Goodchild and Andrew Turner. Environment and Planning A, 45(1), 10-18.

S_Ram

Geocoding

March 21st, 2013

Goldberg et al. provide an overview of geocoding and touch on several aspects that relate to GIScience. For instance, the authors elaborate on how the evolution of the reference dataset has allowed for more complex and resourceful queries to geocode postal codes, landmarks, buildings, and so on. The underlying cause of the shift has been attributed to the reference dataset: not only have datasets become more complex, but increasing computational power has paved the way for intricate matching algorithms and interpolation processes. Perhaps the next shift within the geocoding realm will be the increasing integration of context-aware technologies. Now that people are progressively contributing geospatial content, issues of cost and time are significantly reduced. The authors suggest that once GPS technology is accurate (and affordable) enough to be equipped in all mobile phones, postal addresses may become obsolete. But instead of rendering postal addresses obsolete, are we at the moment where LBS and AR can add to the accuracy of reference files, especially if volunteered geographic information is on the rise? I am curious to see how the new ways in which data are being provided and created (in large amounts, by many people) intersect with geocoding today. Now that we have seen an evolution in the data models and algorithm complexities that contribute to geocoding, I can imagine that volunteered geographic information from a large number of people could become a new method of geocoding, though it would bring with it a new set of complexities around data accuracy and reliability.

-tranv

Putting Geography on the Map

March 20th, 2013

Roongpiboonsopit and Karimi’s 2010 comparison study of five free online geocoders is an example of an important process in weeding out poor geocoders and creating one that works accurately and reliably. Considering geocoding is at the heart of many GIS applications, especially those involving the Geoweb, spatial accuracy is key. The authors used empirical analysis to determine the “best” geocoders based on accuracy and a number of other metrics. They concluded that Google, MapPoint (Microsoft) and Yahoo! are all quite effective, while Geocoder.us and MapQuest are less so.

In thinking about geocoding and the development aspects of the geocoding process, I realized that geocoding development is much like development in other topics we’ve covered, such as LBS and AR. As all of these progress and become more accurate and lifelike, they are approaching a level of artificial intelligence that is simultaneously creepy and cool. For instance, if a geocoder like Google’s uses all of the information it already has on us, there will not be any need for formal geographic indicators like street address, coordinates, or even official place names. If the reference database were to include our own vernacular and preferences along with the official names and spatial reference algorithms, then simply typing or saying “home” would pinpoint our location on a map quickly and easily. This isn’t even about spatial accuracy anymore, but rather “mental” accuracy. Perhaps I’m getting ahead of myself, but I see the possibilities with geocoding not just in terms of developing a usable application for plotting points on a map, but also in terms of expanding how we think of space (ontologies) and how we conceptualize our environment (cognition). Integrating new tools into the pre-existing algorithms has and will continue to change how we live our lives.

– JMonterey

Geocoding Errors

March 19th, 2013

Goldberg et al.’s article “From Text to Geographic Coordinates: The Current State of Geocoding” offers an in-depth view of current geocoding, its processes, and the errors involved. That said, I wish more had been said about the errors that arise when database managers “correct” system errors in the coding. In essence, when an error is “corrected” by a database manager, future geocoding tries to reconcile the changes and often produces more error, as the system tries to place other data into the “corrected” group or arrange the data to make sense next to the “correction”.

I have experienced this problem first hand when trying to geocode new points within an existing geocoding database. What happened is that a previous manager had “corrected” various geocoding errors by manually entering data points as a new category, which conflicted with several other existing categories in the database. Therefore, when I entered my points, they were all coded as the new category and located in the wrong areas, since the manual fixes superseded the software’s operational coding for placement whenever the points did not match a known location. If I had not gone back and cross-referenced the points, I would never have found the geocoding discrepancies and “corrected” the points (although this may cause future errors that are not yet apparent).

In the article, the mention of E911 helping with address accuracy is an important step toward reducing error, but I believe it may become irrelevant, since GPS sensors are becoming standard on most utility entry points at every address. For example, Hydro-Quebec is installing digital meters with GPS, pinging capability, and wireless communications. These devices are then geocoded to a database and could therefore provide accurate, accessible location referencing that is self-correcting, thus reducing error for every address. As such, is the proliferation of intelligent technology reducing the error or adding to the complexity of geocoding?

Geocoding and public health

March 18th, 2013

I think geocoding is one of the most central issues in GIS (science and systems) and yet is probably one of the least well understood by lay people and non-experts who use spatial data (along with map projections). As the authors mention, there has been research in public health and epidemiology on the accuracy of street addresses. In fact, a PhD student in my lab led a research project on this very issue here in Montreal (Zinszer et al. 2010, cited below). The team found that address errors were present in about 10% of public health records, the same records that were used to perform spatial analysis looking for space-time clustering of campylobacteriosis in Montreal. Geocoding has all kinds of repercussions in public health research; while errors are an issue, anyone who performs epidemiological research with administrative databases is prepared to accept some amount of error. However, when the error becomes differential with respect to some factor of interest, this can result in a huge problem (bias). For example, as mentioned in the discussion, the accuracy of geocoding was differential between urban, suburban, and rural areas. A lot of spatial epi research is done with the urban-rural health divide in mind, and differential accuracy in geocoded addresses like this could pose a huge problem. I think papers like this one by Roongpiboonsopit and Karimi are very useful for people outside of GIS because they help us understand the scope of the issue. I also think Roongpiboonsopit is a super awesome name.

Zinszer K, Jauvin C, Verma A, Bedard L, Allard R, Schwartzman K, de Montigny L, Charland K, Buckeridge DL (2010). Residential address errors in public health surveillance data: A description and analysis of the impact on geocoding. Spatial and Spatio-temporal Epidemiology. 1(2-3): 163-168.


-Kathryn

Business and LBS

March 15th, 2013

Before I read this paper, I was not very familiar with the concept of location-based services. There are of course many commercial uses of geographic information, but this discussion of spatial information translating directly into business is very interesting and not something I had thought about. The subject intersects with topics we have already discussed, such as augmented reality, where location can be used, via GPS and various software and hardware, to augment individuals' surroundings in ways that could directly affect sales and be used for marketing. As mentioned in the paper, privacy is another theme that has surfaced often in class and is again an issue here, but there is also a lot of potential for creative uses of spatial information in ways that individuals would appreciate. I look forward to seeing how others perceive the utility of LBS.

-Kathryn

Time geography

March 15th, 2013

Even though this article is not new, it seems the field has not advanced dramatically since it was written. Efficiently and effectively storing, retrieving, and particularly visualizing the temporal aspects of data in GIS continues to be a major limitation of spatial analysis. In spatial statistics, models that account for space-time variation rather than static spatial information are usually just logical extensions of the corresponding spatial models. But spatial stats does not have to deal with the same data-model issues that GIScience has to address, nor does it have to attempt the visualization that GISystems are expected to handle. I am interested in seeing what the current state of the art is, in case there have been major advances that I somehow missed since my master's.

-Kathryn

LBS and corporations’ interests

March 15th, 2013

I feel that LBS development is focused on marketing purposes and that privacy issues may be underestimated in this domain. Profit is a major interest and is prioritized over individuals' privacy concerns. Rao and Minakakis's article shows that the way LBS can be profitable, and worth the development effort, is by gathering information and conducting data mining and ‘business intelligence activities’ to understand consumers' profiles and shopping habits in depth. The authors recognize that there are ethical and privacy issues related to the use and disclosure of information to third parties, but they do not engage with the topic. I wonder whether developing business models on one side and leaving the privacy issues to privacy advocates on the other is a good approach to the development of these technologies and LBS…

S_Ram