Archive for November, 2017

Thoughts on “Empirical Models of Privacy in Location Sharing”

Thursday, November 30th, 2017

I am really interested in ubiquitous computing and location-based technologies, so I was looking forward to this paper. In describing their methodology, and specifically the concept of “location entropy”, I would have liked a more operational definition of the “diversity” of people visiting a space: whether the authors took into consideration economic, social, ethnic, or gender differences, and how they operationalized those variables. There is an interesting link to spatio-temporal GIS in the observation that more complex privacy preferences are usually linked to a specific time window at a given premises (e.g. 9-5 on weekdays on company premises) (p. 130).
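As an aside, the core of a location-entropy measure is simple enough to sketch. Below is a minimal toy illustration of the general idea (my own, not the authors' exact formulation): the Shannon entropy of the distribution of visits across unique users, so that places drawing a diverse crowd score higher than private homes.

```python
from collections import Counter
from math import log2

def location_entropy(visits):
    """Shannon entropy (in bits) of the distribution of visits across users.

    `visits` is a list of user IDs observed at a single location; a higher
    value means the location draws a more diverse set of visitors.
    """
    counts = Counter(visits)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A cafe visited evenly by four users is more "public" than a home
# dominated by one resident:
cafe = ["ann", "bob", "cho", "dev"]
home = ["ann", "ann", "ann", "ann", "bob"]
```

On this toy data, `cafe` scores 2.0 bits while `home` scores about 0.72, matching the intuition that high-entropy places are the ones where people are more comfortable sharing their location.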

I thought it was a novel approach to focus on the attributes of the locations at which people were sharing their whereabouts, rather than on the personal characteristics of the individuals which might influence their decision to share at one point or another. This inversion lends itself to generalization across subjects and to the formation of universal principles about which kinds of places most inspire location-sharing.

There is an emphasis in the paper on “requests” and the explicit invitation to share one’s location in a social network, but the majority of users supply their location unwittingly or without a formal request. Although this is an important difference, it stands to reason that the authors’ observations about the nature of the request (i.e. which app is using the information) or the context (who the information is broadcast to, whether a network of acquaintances or anonymous gamers) influence an individual’s decision to share their location even in the absence of a formal request.

The Locaccino interface (brilliant branding there) looks very much like Find Friends, an app that I know some of my friends use regularly. It’s great in some ways that we are able to empirically test hypotheses about the kinds of environments and behavioural conditions which promote or discourage location sharing using these real-world datasets.

-FutureSpock

How fast is a cow? (Laube and Purves, 2011)

Thursday, November 30th, 2017

This paper by Laube and Purves (2011) truly brings together many of the concepts that we have explored in class over the last couple of weeks: scale (spatial and temporal), temporal GIS, and error and uncertainty. In fact, this article addresses Noe’s concerns from last week about temporal scale, following his summer research work. What a way to tie up the course!

This article presents a thoughtful (and alliteration-heavy) list of problems in movement analysis, but focuses on “granularity grief” (2), the problem of temporal scale in movement analysis. My one criticism of the article is the repetition: I am not sure if the authors were short on words, but it felt like the article’s main points (the gap in research, the lack of scale research, movement parameters, etc.) were stated too often. That being said, I found the discussion very thoughtful, especially the observation that when it comes to temporal scales, “the common assumption ‘the finer, the better’ does not hold” (15). The authors explain that for the moving cows, this is because of uncertainty in the movement measurements at very fine granularities. This is interesting, as it is drilled into scientists that ‘more data is better data’; obviously, this is not always the case, and it is important to take into account the other limitations of the study. This is an important concept to retain in GIScience, where so much data is “big data”. Maybe somebody should share this notion with the cell phone makers/service providers/app developers so that they collect less of our personal/private data!
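The granularity effect is easy to demonstrate on toy data. In this sketch (my own, not the paper's), a zig-zagging trajectory is sampled at two different intervals; the coarser sampling cuts the corners, so the estimated mean speed drops:

```python
def mean_speed(track, step=1):
    """Mean speed over a time-sorted (t, x, y) track, keeping every
    `step`-th fix. Coarser sampling straightens out zig-zags, so the
    travelled distance -- and hence the speed estimate -- shrinks."""
    pts = track[::step]
    dist = sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (_, x1, y1), (_, x2, y2) in zip(pts, pts[1:]))
    return dist / (pts[-1][0] - pts[0][0])

# A "cow" drifting in x while zig-zagging in y, one fix per time unit:
track = [(t, float(t), float(t % 2)) for t in range(9)]
fine = mean_speed(track, step=1)    # every fix captures every zig
coarse = mean_speed(track, step=4)  # every 4th fix takes straight-line shortcuts
```

Of course, at very fine intervals the opposite problem appears: positional noise adds spurious zig-zags that inflate the speed, which is exactly the uncertainty trade-off the authors describe.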

Thoughts on privacy: Duckham and Kulik (2006)

Thursday, November 30th, 2017

I feel like the issue of privacy has come up in most classes this semester. It is such an important issue, and one that is on most of our minds, most of the time, as GIScientists. I find it interesting that the article by Duckham and Kulik was written in 2006, a couple of years before everyone (and their young children) had smartphones. In fact, this article is even more relevant today than it was then, and the issue is on our collective minds. An interesting note on the issue of privacy is the fact that both authors are from Melbourne-based universities, where CCTV is omnipresent. This is also especially relevant given the case currently being heard at the United States Supreme Court (Carpenter v. United States), which turns on the fact that police do not need a warrant to access locational information from cell phone service providers, and centres on the bigger issue of people not being aware that they are being tracked by their cell phones at all times. Indeed, this issue is increasingly coming to the forefront of social discussions.

What I found most compelling is the discussion of the five criteria for regulation, and how they might or might not be enforced today. The five criteria are 1) notice and transparency; 2) consent and use limitation; 3) access and participation; 4) integrity and security; and 5) enforcement and accountability. These five points could be the basis for Matt’s entire discussion on Monday, so I’ll focus on just one: consent and use limitation. Certainly, we sign contracts with our cell phone providers, accept the terms and conditions offered to us by Apple or Samsung (or whichever smartphone maker), and give away our right to privacy (which is a basic human right?!). That being said, we don’t really have a choice. If we want to participate in today’s economy/society, it is difficult to do so without a smartphone. For example, the ‘gig economy’ (as Lesley discussed in his presentation) often requires access to a mobile app, be it Foodora, Uber, or TaskRabbit/Airtasker; even trading groups like BUNZ have location-based apps. Do we really have a choice, or is giving away our right to privacy a necessity in our present society? I struggle with this a lot because, on the one hand, I value my privacy and believe we all have a right to it, but on the other hand, I am not a Luddite and want to be able to have a smartphone. There is so much to discuss; I look forward to Matt’s presentation to learn more about the topic and how it relates to GIScience.

Thoughts on the nature of location privacy (Duckham and Kulik, 2006)

Thursday, November 30th, 2017

Duckham and Kulik’s (2006) article on location privacy and location-aware computing provides a comprehensive overview of issues regarding location privacy and of the variety of strategies that can be used to protect it. I found the authors’ delineation and description of privacy protection strategies particularly fascinating.

The first interesting question that this article raises for me is whether or not we have a right to location privacy, and how closely this privacy should be protected. As a subset of informational privacy, location privacy is a relatively recent concern that has become relevant as our lives are increasingly dominated by location-based services and location-aware computing. There are undeniable benefits to being open about sharing our location (e.g. personalized directions, awareness of our friends’ locations). While on the surface our location may not seem like much to share, an incredible number of inferences can be made about us based on our current and past locations. As the authors mention, location is a unique type of personal information in that it can be used to infer identity. Anonymity and pseudonymity are thus much more difficult to maintain. Furthermore, our location patterns can also be used to infer personal characteristics as specific as our political views and the state of our health. Information such as this is incredibly personal and, I believe, should be very closely protected.

However, I think the tension between privacy and openness is interesting to explore. We often think of both as desirable, but the two concepts are largely in opposition to each other. In the case of government, for example, we want our leaders to be informed about the populations they are making decisions for. We also want our governments to be transparent about the information that informs those decisions. How can governments practice open and informed decision-making while also maintaining the informational privacy of citizens?

A FRAMEWORK FOR TEMPORAL GEOGRAPHIC INFORMATION (Langran & Chrisman, 1988)

Monday, November 27th, 2017

In this paper, the authors discuss the components of cartographic time and describe three methods of conceptualizing geographic temporality. The discussion is heavily based on the traditional relational database perspective (i.e., databases lacking temporal considerations). Although this facilitates studying geographic temporality, the limitation of referring to traditional methods is obvious. For example, the consistency of temporally changeable data is hard to guarantee even in a database with a space-time composite. Moreover, the paper feels dated for current research precisely because that traditional database perspective is dated.

 

According to the authors, the three important components of cartographic time are the difference between world time and database time, the relationship between version and state, and the interrelationships between object versions. However, world time and database time are not very different nowadays in real-time data projects. The high rate of data collection also blurs the difference between versions; versions are, in effect, no longer distinguishable from states captured in real time. The interrelationships between object versions have become more implicit: huge amounts of sequential information are collected for each object, and the interrelationships are not obvious until we start mining them. Besides, the collected data are not always stored in a database (i.e., the form of the datasets may not satisfy relational paradigms). Therefore, applying traditional methods to investigate geographic temporality is not the best choice in most situations; new algorithms and models now play important roles in temporal geography.

Langran & Chrisman: Temporal Geographic information

Monday, November 27th, 2017

Temporal GIS introduces the concept of temporal topology, coupled with the more common spatial topology, to allow us to better understand the relevance of time in cartography. Effectively, while time is a constant/infinite progression, maps and cartography can only portray certain glimpses of space along a timeline; whether dynamic or static, maps provide a snapshot/window of time and space. If we consider a map that displays location-based services applications, we would see that this information only exists for a relatively short period on the geologic timescale. Even prior to modern cellphone use, we would see sharp contrasts in the abundance of these services. With this temporal information, it is easy to situate individuals not only in space through LBS but also in time, which may be seen as even more invasive to their privacy.

The article does a great job of explaining the core methods by which we apply temporal analysis in cartography. It does not, however, go into much detail about the limitations of these methods. I’m curious about what problems or biases could potentially arise when adding time to cartography. I believe that the pros would most likely outweigh the cons in this case, but that’s my opinion after only having read this one article on temporal GIS. One issue for temporal GIS could be the increased volume of data resulting from the desire and/or need to use temporal data.

 

Marceau – 1999 – Scale Issue

Monday, November 27th, 2017

Marceau’s article provides a look at how geographers and other social scientists use and understand relative and absolute spatial scale. For geographers, scale may be a better-understood concept than for others, but that does not necessarily mean that scale is more important to a geographer’s work than to, for example, an engineer’s. Scale is crucial to understanding how processes and effects occur differently at different scales; a project needs to take scale into consideration to address its questions effectively, tailoring the work to a certain extent. In the case of a study on geosurveillance and privacy, scale is important for understanding the area you are evaluating, to ensure that all areas within the extent are relevant.

The article highlights well the issues that should be considered in terms of spatial scale; however, it frames them all as ‘issues’ or ‘problems’. I would’ve liked to see the author comment on why these are inherently bad things. As far as I’m concerned, these concepts are good things: they allow us to better understand the space we are working in while providing a level of focus to projects. They are hurdles to deal with, but if dealt with properly, they may ensure that the results minimize the uncertainties and redundancies that would otherwise occur.

The article is 18 years old, but the concepts of scale are just as important to consider today, even with the Web 2.0 platform. If anything, it has become more important to deal with these issues, since the variety of data has increased through big data and may give rise to MAUP issues.

Scale in a Digital Geographic World (Goodchild & Proctor, 1997)

Monday, November 27th, 2017

This paper discusses the problem of characterizing the level of geographic detail in digital form. The traditional representative fraction seems useful but has many problems; among them, I think assessing the fitness of data sets for a particular use is the most critical in practice. The authors argue that it is necessary to identify a metric of the level of geographic detail, but no single metric can handle all the issues raised by the “legacy problem”. For example, for analyzing big data, traditional methods may be replaced by scale-free methods for segmentation. Technologies now evolve much faster than they did 20 years ago, so the “legacy problem” will become more severe and more frequent. Therefore, another requirement for the metric is sustainability: the metric itself should be readily updatable to adapt to new geographic environments.

In moving away from paper maps, having metaphors that correspond to the proposed metric is necessary, but that is harder than constructing the metric itself. To be understood by users lacking knowledge of cartographic conventions, the metaphors should be strictly straightforward. However, there is no rule guiding the design of new metaphors. Following tradition is usually more efficient in practice, although it inherits the limits of paper maps. Metaphors for the digital geographic world cannot be separated from its metric, but completely novel metaphors are not acceptable at this moment. In the transition from paper maps to digital maps, we always need to make trade-offs. Perhaps, by the time this transition is complete, there will be new technologies we need to adapt to; we will always be in transition.

A Framework for Temporal Geographic Information, Langran and Chrisman (1988)

Monday, November 27th, 2017

Langran and Chrisman (1988) discuss the antecedents of temporal GIS, its core concepts, and a number of ways in which temporal geographic information is conceptualized. The map/state analogy was helpful for my understanding of the spatial and temporal parallels. I suppose the stage concept of time is fairly intuitive, but I appreciated having its connection to maps explained explicitly. The authors seem comfortable with the convention of representing spatial boundaries as distinct lines, but I can imagine how similar concerns about vagueness and ambiguity might arise in temporal data as well.

The authors did a good job of presenting the advantages and limitations of geographic temporality concepts. At the beginning they mentioned how the “strong allegiance of digital maps to their analog roots” was inadequate for spatiotemporal analysis, but I’ll admit that I didn’t think the two concepts they presented really subverted this allegiance very much. Still, maybe I’m spoiled by the ways people are re-imagining maps on the geoweb: an unfair comparison for a 1988 paper.

It was interesting to get a glimpse of historical temporal GIS research. It’s clear that one of the biggest concerns in the implementation of a temporal GIS framework is temporal resolution. If I could hazard a guess, I would think that such concerns might evolve from interpolating between temporally distant observations into the question of handling large amounts of data collected in rapid succession. With the advent of big data, namely by way of social media, I can imagine how the application of temporal GIS has proliferated, and will continue to, since the time the article was published.

Thoughts on Geovisualization of Human Activity… (Kwan 2004)

Sunday, November 26th, 2017

The opening discussion of the historical antecedents of temporal GIS among Swedish geographers uses the 24-hour day as a “sequence of temporal events”, but I wonder why this unit of measurement was chosen as opposed to 48 hours or a week, which might better illustrate the periodicity of temporal events not captured at the daily scale. It is interesting to note the gendered differences that are made visible by studies of women’s and men’s spatio-temporal activities. As the authors note, “This perspective has been particularly fruitful for understanding women’s everyday lives because it helps to identify the restrictive effect of space-time constraints on their activity choice….” I am curious about how much additional data researchers must collect to formulate hypotheses about why women follow certain paths to work or are typically present at certain locations at certain times. I am also curious about how this process differs when trying to explain the spatiotemporal patterns observed in men’s travel behaviour.

One of the primary challenges identified by the authors is the lack of fine-grained individual data on people’s mobility in urban environments, such as in transportation systems or daily commutes. This paper was written in 2004; now, with the rapid increase in streaming data, GPS from mobile devices, and open big data sets for most large cities, this is less of a concern. The big challenge these days is probably parsing the sheer quantity of data with appropriate tools and hypotheses to identify key trends and gain usable insights about residents’ travel behaviour.

The methodology used by the researchers for their study of Portland relied on self-reported behaviour in the form of a two-day travel study. There are many reasons why the reported data might be unreliable or unusable, especially given the fallibility of time estimation and the tendency to under- or over-report travel times based on mode of transport, mood, memory of the event, etc. That being said, this is probably the most ethical mode of data collection, as it asks for explicit consent. I would be interested to know how the researchers cross-referenced the survey data with their information about the Portland metropolitan region, as well as the structure of the survey.

-FutureSpock

Goodchild and Proctor (1997) – Scale in digital geography

Sunday, November 26th, 2017

As might be expected, Goodchild and Proctor provide an insightful and lucid evaluation of how conceptions of scale should translate from paper to digital maps, and their analysis remains pertinent in the face of two decades of rapid digital cartographic development. They argue that the representative fraction, as traditionally used by cartographers to represent scale, is outdated for use in digital platforms.

Firstly, I think the representative fraction struggles on a simpler level. In absolute terms, we’d probably find it hard to distinguish 1:250,000 from 1:2,500,000, so the large numbers involved in representative fractions may make them less preferable than alternatives such as graphical scale bars, which visually show the relationship between distances on the map and in the real world (as used in Google Maps).

It is interesting to revisit the problems outlined in the paper that have been faced by web map makers. A significant advance in the navigation of scale in digital environments has been in the development of tiled web maps. By replacing a single map image with a set of constituent raster or vector ‘tiles’ loaded by zooming and panning through a user interface, this method facilitates levels of detail that vary with zoom level and position in the map. The appearance and disappearance of certain features (e.g. country names vs town names) has formed another metaphor for scale recognition.
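In fact, each zoom level in such a tiled map still corresponds to an approximate representative fraction. A back-of-envelope sketch (my own, based on the standard 256-pixel Web Mercator tile pyramid and an assumed screen resolution, not anything from the paper):

```python
EARTH_CIRCUMFERENCE_M = 40_075_016.686  # WGS84 equatorial circumference
TILE_SIZE_PX = 256                      # standard web-map tile width

def metres_per_pixel(zoom):
    """Ground resolution at the equator for a given tile-pyramid zoom level."""
    return EARTH_CIRCUMFERENCE_M / (TILE_SIZE_PX * 2 ** zoom)

def scale_denominator(zoom, dpi=96):
    """Approximate 'n' in a 1:n representative fraction, assuming a
    96-dpi screen (an assumption; real displays vary)."""
    return metres_per_pixel(zoom) * dpi / 0.0254  # 0.0254 m per inch
```

Each extra zoom level halves the denominator, so a whole-world view at zoom 0 works out to very roughly 1:590 million on a 96-dpi screen, while a street-level zoom 15 is roughly 1:18,000. The user never sees these numbers, which is rather the point.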

I’m still finding it hard to reconcile the idea of scale as used in everyday language (to represent the range of spatial extents within which phenomena operate) with its scientific/GIScience definition (as a broader metric for the level of geographic detail, as well as extent). Positional accuracy, resolution, granularity, etc. are fundamentally important across disciplines, but do they correlate with what people think of when they talk about scale? (sorry Jin)
-slumley

Kwan & Lee : Geoviz of Human Activity Patterns using 3D GIS

Sunday, November 26th, 2017

 

Having given my talk on VGI and the implications of real-time tracking of individuals in space-time, I found Kwan & Lee’s (2004) use of temporal GIS quite refreshing and a very unique and insightful study. By overtly using temporal GIS with such a large study group (7,090 households), the data collected go from quantitative x, y, and timestamp records to very nuanced qualitative data when paired with contextual information and compared across different study groups. I found the comparison between the everyday paths of men/women and minority/Caucasian participants fascinating, and I can see how it could be used through a critical GIS lens to further analyse why these trends occur, and to empower these underrepresented groups in the realm of GIS.

I also found the use of 3D visualization very interesting (though to be expected), as you move from a traditionally planar form of GIS (x and y coordinates) to adding a third, temporal attribute on the z-axis. The paper then delves into the intricacies of displaying what is essentially a new form of GIS in an effective visualization, which poses a whole new range of issues vis-à-vis our geovisualization talk by Sam. However, this extra z-attribute of time can be used for many new analyses, using kernel functions to generate density maps that standardize comparisons of movement between individuals. I find this collection of movement data and the analysis behind it amazing, though also very scary when paired with the knowledge that such analysis could be (and probably is) performed on a daily basis for not-so-critical or academic reasons: targeted advertising and defense, in a form of coerced VGI.
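For concreteness, the space-time path underlying those visualizations can be sketched very simply (my own minimal illustration, not Kwan & Lee's implementation): a time-sorted list of (t, x, y) vertices, drawn with t on the vertical axis, with positions between fixes recovered by linear interpolation.

```python
def position_at(path, t):
    """Linearly interpolate an (x, y) position along a space-time path.

    `path` is a time-sorted list of (t, x, y) vertices, i.e. the polyline
    that 3D geovisualizations draw with time on the vertical axis.
    """
    for (t0, x0, y0), (t1, x1, y1) in zip(path, path[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
    raise ValueError("t lies outside the path's time window")

# A toy commute: east for 30 minutes, then north for 30 minutes.
commute = [(0, 0.0, 0.0), (30, 2.0, 0.0), (60, 2.0, 3.0)]
```

Midway through the first leg, `position_at(commute, 15)` gives (1.0, 0.0); stacked over thousands of households, it is positions like these that the kernel density surfaces then summarize.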

All in all, however, I find that temporal GIS could be its own field: the highly detailed datasets it produces can reveal much more than just location, could aid in the creation of many new tools, and make for very rich data.

-MercatorGator

Marceau (1999)-The scale issue in social and natural sciences

Saturday, November 25th, 2017

This article is very interesting and addresses what I think is a major issue within GIScience: the scale issue. Marceau (1999) lays out the “scale problem” (2) and provides a thorough review of solutions (and their limitations) from the literature. I also enjoyed the penultimate paragraph of the paper, which suggests that the “methodological developments are certainly contributing to the emergence of a new paradigm: a science of scale” (12). While reading the paper, I wondered how this fits into the tool/science debate; though I would tend to think of scale as an important component within GIScience, I might not have considered “the science of scale” a science of its own, so it’s nice to see where the author clearly stands.

This issue seems omnipresent throughout geography (human and physical), and I know that I’ve had to deal with it within my own work. For example, my data collection will consist of flying a UAV at a specific height (in order to achieve maximum photo resolution), thereby taking photos at specific scales. I will then create a model to make maps at specific scales. Beyond this, the maps I make will hopefully tell me things about the morphology of the landscape: will these be true only of Eureka Sound, or will they be generalizable to all of Ellesmere Island, or even the entire Canadian or international High Arctic? I do not find that any of the methods described in this paper provide a clear way to give a definitive answer on cross-scale inferences, which is to be expected. I think that as researchers we must do our best to limit our inferences to the analyzed scales and resist the temptation to overgeneralize our results for increased importance. I am curious how things have changed in the nearly 20 years since this article was published, what strides have been made, and what remains to be done.

Scale (Goodchild & Proctor 1997)

Saturday, November 25th, 2017

Prior to reading this paper, I went in knowing scale was a key concept of geography, and one of much debate. After reading Goodchild & Proctor (1997), however, I feel this was an understatement. The authors provide a much-needed recap of traditional cartography, the initial concreteness of scale, and the common metrics used (e.g. buildings aren’t typically shown at a 1:25,000 scale). I found this part especially interesting, as it’s something I never encountered in my GIS/geography classes, even though these are key concepts in cartography. It becomes especially interesting when paired with their allusion to current-day GIS acting as a visual representation of a large database (like OSM); I thought of how OSM must have studied these concepts in creating its online mapping platform, so as to only incorporate points of interest at a certain zoom level versus streets. The paper then goes on to explain how such concepts are needed in modern digital maps in the form of minimum mapping units (MMUs), and how issues like raster resolution begin to define scale as the smallest denomination of measurement.

Another key point of the paper was the use of metaphors to describe how scale comes into play in traditional versus modern maps, and how it is often redefined (such as in fields like geostatistics). I feel that the term scale should be kept as simple as possible to avoid running into issues like the modifiable areal unit problem and questions of the appropriateness of scale. Scale will always be an important part of GIScience, as it is inherently associated with distance and with visualizing geographic space, and I feel that extensive research into issues of scale, like this paper, will be needed as mapping moves further and further from its traditional cartographic roots into the new realms of GIS like VGI, location-based services, and augmented reality.

-MercatorGator

Kwan and Lee (2004) – Time geography in 3D GIS

Saturday, November 25th, 2017

In this article, Kwan and Lee (2004) explore 3D visualisation methods for human movement data. In the language of time-geography, which borrows from early-20th-century physics, space-time paths describe movements as sets of space-time coordinates which (if only two spatial dimensions are considered) can be represented along three display dimensions. These concepts have become a fundamental part of recent developments in navigation GIS and other GIScience fields. For instance, Google Maps considers the time at which a journey is planned to more accurately estimate its duration.

While their figures represent a neat set of 3D geovisualisation examples, it might have been worthwhile to have discussed some of the associated challenges and limitations (e.g. obstructed view of certain parts of the data, the potential for misinterpretation when represented on a 2D page, user information overload, the necessity for interactivity etc.). Further, how does 3D visualisation compare with other representations of spacetime paths, such as animation?

More broadly, I didn’t fully understand the claim that time-geography (as conceived in the 1970s) was new in describing an individual’s activities as a sequence occurring in geographic space (i.e. a space-time trajectory). Time hasn’t been entirely ignored in geographic contexts in the past (e.g. Minard’s map), nor has it been ignored in other disciplines. So does time-geography purely emphasise the importance of the time dimension in GIS research/software, or does it provide a set of methods and tools that enables its integration into the geographic discipline? Is time-geography done implicitly whenever researchers include a time dimension in their analyses, or does it represent a distinct approach?
-slumley

Thoughts on Langran and Chrisman

Friday, November 24th, 2017

I found this conversation about temporal GIS to be a particularly interesting introduction to the topic. It adds an extra dimension to my shifting idea of GIS: from something primarily based on the representation of maps to a tool for retaining and displaying data. Sinton (1978) notes that geographic data are based on theme, location, and time, so it is interesting that all of these notions can be reduced to digital quantification.
The authors do offer a hint of philosophical musing, but don’t delve deep into it. The simple notion of linear time was enough to spark the discussion of three separate ways of displaying temporal change. Their idea of time is not marked exclusively by linearity; they fold the concept into a topological understanding, based on the temporal relationships events may have to one another. The three methods discussed seem to meld representations of temporality and spatiality more closely with each new method. This melding of the two dimensions could lead to an interesting discussion, as they are inseparable in GIS.
The authors choose not to delve into the topic of visualization, leaving it as ‘a problem to leave to future discussions’. I am doing my project on movement, so this notion of graphic representation felt like a clear framework for dealing with the kind of data I’m handling.

Thoughts on Goodchild and Proctor – Scale in a Digital Geographic World

Friday, November 24th, 2017

Goodchild begins with a notion that initially shocked me: the representative fraction is supposedly unsuitable for digital cartography. I had never considered this before, but he nonetheless presents a convincing explanation of his reasoning. Confusion over what is meant by scale is widespread: it can signify either the spatial extent of the map or the level of granularity that the data represent. There have also been issues over what kind of information is appropriate to represent along this axis of scalability. Goodchild proposes new dimensions of scale that are more appropriate for digital computing.
He offers two different bases of scale. The first is the object model scale, which is based on the choice of objects that the GIScientist wishes to study; typically, the smallest object studied would still figure clearly on the map. The second is the field model scale, which is simply the size of the field’s cells (pixels).
While I read the description of these two models, I felt as if I’d reached the climax of a detective novel: obviously, the object model sounds very much like vector, and the field model sounds very much like raster. I had never considered these two modes of data representation as types of scale.
These two scales are most useful when handling data, but they are easily misunderstood by non-experts. The traditional representative fraction is still important for visualization, where it often appears purely as spatial extent when finalizing a map and making it legible for a general public.
It is very interesting to see a paper creating a tangible link between traditional and digital cartographic models.

Langran and Chrisman (1988)- Temporal GIS

Friday, November 24th, 2017

I found the discussion on temporal geographic information by Langran and Chrisman (1988) interesting, as it teetered between being relevant and being dated. On the one hand, time is still a difficult thing to express and visualize, both attached and unattached to spatial data, but it is still extremely important to convey. On the other hand, I wonder how much has changed with advancements in interactive and online maps, which can very easily show different temporal layers one after the other, or move through time on command. Moreover, technological advances in surveillance, like UAVs used in police surveillance or traffic control, will create a wealth of spatio-temporal data greatly surpassing the kind described in the paper, which would require much more sophisticated processing and computing. I wonder how much of this discussion is embedded in geovisualization, and whether “temporal GIS” is a standalone subject, or rather an important component of Geovis/Critical GIS/VGI/PPGIS/data mining/UAVs/etc…
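The “temporal layers one after the other” idea can be sketched very simply: an interactive map stepping through time is, at bottom, a time-window filter over spatio-temporal records. This is my own minimal sketch, with invented data and names, not anything from Langran and Chrisman.

```python
from datetime import datetime

# Hypothetical spatio-temporal records: (id, lon, lat, timestamp)
records = [
    ("taxi-1", -73.57, 45.50, datetime(2017, 11, 24, 8, 15)),
    ("taxi-2", -73.58, 45.51, datetime(2017, 11, 24, 12, 40)),
    ("taxi-1", -73.55, 45.49, datetime(2017, 11, 24, 17, 5)),
]

def temporal_layer(recs, start, end):
    """Return only the records whose timestamp falls in [start, end)."""
    return [r for r in recs if start <= r[3] < end]

# A "morning" layer, as an interactive map might render on command:
morning = temporal_layer(records,
                         datetime(2017, 11, 24, 6, 0),
                         datetime(2017, 11, 24, 12, 0))
print([r[0] for r in morning])  # only the 08:15 fix falls in this window
```

What’s easy here is the slicing; what remains hard, as the paper suggests, is representing change and duration rather than a stack of snapshots.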

I enjoyed the very beginning of the article, where the ‘nature’ of time is discussed (time is infinite and linear) and cartographers (GISers?) “can sidestep debates on what time is, and instead focus on how best to represent its effects” (2). I would argue that the way in which it/its effects are represented can, in fact, inform and serve as an interpretation of time. If a spatial map attempts to represent a “ground truth”, can’t a temporal map represent a “time truth”?

This is one of my favourite memes, with a quote on time from HBO’s True Detective (with Matthew McConaughey and Woody Harrelson)- very meta.

[image: time-is-a-flat-circle meme]

Marceau (1999) and Scale

Friday, November 24th, 2017

Marceau’s (1999) article does an excellent job of highlighting the significance of scale across both human and physical geography. This article made me think more deeply about the impacts of scale than I ever have had to before, which points to a significant gap in my education in geography. While I have been taught about the MAUP and various other impacts of scale, I feel that these issues have been addressed in isolation or as a mere sidebar to other concepts. As indicated by this article, scale is a fundamental spatial consideration (and often a problem) that should be thoroughly addressed for any research project that considers space. The fact that all entities, patterns, and processes in space are associated with a particular scale (or the “scale dependent effect”) means that it cannot be ignored.
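The scale effect of the MAUP can be shown with a toy example (my own sketch, not from Marceau): the same values, aggregated into larger zones, yield very different summary statistics. The grid below is invented so that most of the variation occurs within each 2×2 neighbourhood.

```python
import statistics

# Hypothetical values for a 4x4 grid of small enumeration units.
blocks = [
    [0, 100, 0, 100],
    [100, 0, 100, 0],
    [0, 100, 0, 100],
    [100, 0, 100, 0],
]

fine = [v for row in blocks for v in row]

def aggregate_2x2(grid):
    """Merge each 2x2 neighbourhood into one larger zone (mean of its cells)."""
    zones = []
    for i in range(0, len(grid), 2):
        for j in range(0, len(grid[0]), 2):
            cells = [grid[i][j], grid[i][j + 1],
                     grid[i + 1][j], grid[i + 1][j + 1]]
            zones.append(statistics.mean(cells))
    return zones

coarse = aggregate_2x2(blocks)

# Same data, same overall mean, but the fine-scale variance (2500)
# collapses to 0 once the zones swallow the within-zone variation.
print(statistics.pvariance(fine))
print(statistics.pvariance(coarse))
```

An analyst handed only the coarse zones would conclude the area is perfectly homogeneous, which is exactly the kind of scale-dependent effect the article warns about.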

I was particularly interested by Marceau’s discussion of how scale differs between absolute and relative space. The operational and clearly defined idea of scale in absolute space is addressed much more often than the more ill-defined concept of scale in relative space. I’ll admit that, even after rereading the paragraph several times, I’m still not sure what Marceau means in defining relative scale as “the window through which the investigator chooses to view the world” (p4). If this definition were not explicitly linked to scale, I would consider it to be referring to something more like investigator bias or an investigative lens. How is this “window” connected to space? I would have appreciated an example to further clarify this.

Scale Issues in Social and Natural Sciences, Marceau (1999)

Thursday, November 23rd, 2017

Marceau (1999) describes the significance of and solutions to the issue of scale as it relates to social and natural sciences. The articulation of fundamental principles was helpful in demonstrating the importance of scale as a central question in GIS. It’s clear that the question is particularly important now as we continue to develop a more nuanced appreciation for how observed trends might vary across different scales of analysis.

The discussion of domains of scale and scale thresholds stood out to me. I can imagine how differences in the patterns observed between scales would be helpful for organization and analysis. I’m curious about how these observed thresholds would manifest in reality. Are they distinct? Vagueness in our conceptualization of geographic features and phenomena seems so prevalent throughout the built and natural environment that I would expect it to shape our analysis of scale, favouring vagueness along the spatial scale continuum. Still, it’s conceivable that sharp transitions could be revealed through the process of scaling, unrelated to any vague spatial concepts. An example might’ve made the existence of scale thresholds more obvious to me.

It was an interesting point that an understanding of the implications of the Modifiable Areal Unit Problem took notably longer to develop in the natural science community–perhaps because GIScience as we know it now was only in its infancy? In any case, it’s another reminder of how significantly spatial concepts can differ between geographies.