Archive for the ‘506’ Category

Thoughts on Location privacy and location-aware computing (Duckham & Kulik, 2015)

Sunday, November 24th, 2019

This book chapter by Duckham and Kulik introduces location privacy, covering both the problem and possible solutions. They highlight the privacy concern of revealing too much personal information through real-time location sharing with service providers and information collectors, as well as the abusive use of location information, which can lead to real harm to individuals. The three strategies they identify for protecting location privacy are: 1. regulatory strategies; 2. privacy policies; 3. anonymity.

The most intriguing part of this article for me is the protection of location privacy. I agree with the regulatory and privacy-policy approaches. I doubt, however, whether true anonymity exists on the internet today. Our devices are designed to gather information about us, and with enough information gathered from location-aware devices, even with masking technology (a minimal example of which is sketched below), it is still not hard to identify an individual with current techniques. Thus, instead of focusing on anonymity, which to me seems like a step backward in the evolution of information, it is more important to control access to the information fed by location-aware devices. Regulatory and privacy policies are certainly necessary and effective; however, an effective and authoritative oversight system must also be established to ensure that those regulations and policies actually function.
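
To make "masking technology" concrete, here is a minimal sketch of grid-based spatial cloaking, one common obfuscation technique; the cell size and the sample coordinates are my own illustrative assumptions, not from the chapter:

```python
# Minimal sketch of grid-based spatial cloaking: a precise position is
# "snapped" to the centre of a coarse grid cell, so only the cell is shared.
# Cell size (~1 km in latitude) and the coordinates are illustrative.

def cloak(lat: float, lon: float, cell_deg: float = 0.01) -> tuple[float, float]:
    """Return the centre of the grid cell containing (lat, lon)."""
    cell_lat = (lat // cell_deg) * cell_deg + cell_deg / 2
    cell_lon = (lon // cell_deg) * cell_deg + cell_deg / 2
    return (round(cell_lat, 6), round(cell_lon, 6))

# A user in downtown Montreal is reported only as a grid-cell centre.
print(cloak(45.5048, -73.5772))  # -> (45.505, -73.575)
```

The catch, and the reason I remain skeptical of anonymity, is that many cloaked reports over time still form a pattern: the cell a device sleeps in every night can identify a household on its own.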

Online policing, surveillance of the information giants, hacking control, information access control, and so on need to operate under a publicly trusted, relatively transparent, and fair system. A system that only serves a political agenda of national or regional interest (such as the US government's accusations about the information security of Huawei products) does not fit these criteria. Thus, there is an urgent need to establish information-control agencies that work smartly and fairly and serve international and planetary interests, to reinforce our location privacy.

A geoprivacy manifesto

Sunday, November 24th, 2019

The article reviews individual location-privacy issues by telling a little story of one day in a couple's life, which is interesting and makes it easy for readers to understand how location-privacy issues arise in everyone's daily life. It is a good paper explaining the basic concepts of geoprivacy, how location information is produced and utilized, and the benefits and threats that individual location data brings. From my perspective, it is apparent that individual location information brings great benefits for politics, business, academia, and so on, because it provides data sources for research, analysis, and prediction of human behaviour, supporting further profit-making applications. However, the debate about how sharing location information harms individual privacy keeps emerging.

As the article notes, one big difference between geoprivacy and other privacy issues is the individual's claim to determine whether their location information is used or shared. Usually, when people open an app or another service requesting location permission, they decide for themselves whether to use the location services, even though there may be risks in how their location information is used. Weighing the advantages it brings, people usually agree to share their location. Thus, what I think matters most is to tell users what kinds of risks and potential uses their location information carries, to regulate who has access to that information, and to restrict the purposes for which public individual location data can be used. A remaining question is how GIScience could assist in those regulations and whether it is possible to conduct better location-based human studies without revealing individuals' private information.

Theorizing with GIS: a tool for critical geographies?

Sunday, November 24th, 2019

This paper, written by Marianna Pavlovskaya in 2006, presents a detailed review of qualitative and quantitative research in GIScience, proposing that placing GIS in the quantitative camp is misleading and that raising attention to qualitative research in GIS would help ground social theory (processes and relationships) in spatial complexity. The author begins by discussing how GISystem technologies are applied in quantitative research, where GIS acts as a powerful tool for discovering knowledge and patterns, often for profit-making purposes. However, quantitative research in geography has never been independent of qualitative research, and there are ongoing debates about the roles of the two. Furthermore, qualitative and quantitative research are now more often associated and mixed when dealing with critical geographic issues.

What really interests me is the argument that the qualitative appears as a continuum with the quantitative. Personally, I think qualitative research may be more fundamental than traditional quantitative research, because without an understanding of the qualitative nature of the objects on which further analysis is based, quantitative results cannot be relied upon. The paper also discusses the view that qualitative methods are simple and easy but produce vague information that does not contribute much to theory development. How, then, can qualitative research be widely accepted and gain more attention? And if the qualitative is indeed fundamental, how should critical qualitative research be conducted, and is there any universal standard for it?

The paper explores the associations between qualitative and quantitative research in GIS in depth, but I am still a little confused about what "critical geography" means in this paper.

Lastly, as stated in the paper, the gap between quantitative and qualitative research is narrowing in human geography, allowing better interpretation of social processes and relationships. But how can the mixture of these two kinds of research be applied in physical geography? Or does it matter much in physical geography studies at all?

Thoughts on An Introduction to Critical Cartography (Crampton & Krygier, 2006)

Sunday, November 24th, 2019

This paper thoroughly reviews the foundation, evolution, current state, and possible future of critical cartography. Crampton & Krygier start with the essence of critique, emphasizing the scientific, political, and resistant nature of critiques. They then examine how cartographic critique evolved and where it met resistance. Finally, after summarizing the current state of critical cartography, they focus on five areas in which further cartographic critique may thrive: the art of mapping, everyday mapping, maps as resistance, map hacking, and theoretical critique.

My interest in critical cartography leans toward the theoretical critique. I agree that all critiques help the development of a discipline, but theoretical critique might help the most. As critical cartographers critique and re-think Robinson's modern cartographic theory, they help promote a paradigm shift, a new way of thinking in cartography that changes the discipline from its very foundation. Also, as the theories in cartography and GIScience become more diverse and seek to explain complex phenomena, critiques of theory help reveal and explain the complexity behind previously opaque topics, which I perceive as the most fundamental purpose of geographic research.

Their interpretation of map artists is also intriguing, since art and science traditionally have different yet interacting epistemologies. It is interesting to see how artists will contribute their unique perspectives and ideologies to cartography, a discipline that arguably started as an art and evolved into a science, or in some cases is still an art. In the end, cartography seeks to present knowledge of the world, and there are good opportunities to draw inspiration from the world through critique.

Thoughts on “Theorizing with GIS: a tool for critical geographies?”

Thursday, November 21st, 2019

This piece is a good introduction to issues regarding GIS and its role in quantitative and qualitative research. It resonates with a theme we discussed at the beginning of the course: whether GIS is a science or a tool. The article advocates for GIS as GIScience, stating that GIS is a method, not a distinctly quantitative or qualitative tool. Pavlovskaya, the author, further argues that quantitative and qualitative methods are not as simple, or as opposed, as some believe.

Pavlovskaya's piece could have benefited from a solid introduction to and definition of the term critical GIS, so that readers unfamiliar with the topic could better understand her arguments regarding the quantitative-qualitative divide. I agree with her concluding point that GIS should be better utilized in qualitative research in order to improve representation.

This reminds me of a piece I read while researching my own topic, VGI, called "Crossing the qualitative-quantitative chasm 1: Hybrid geographies, the spatial turn, and volunteered geographic information (VGI)" by Daniel Sui and Dydia DeLyser. In this article, Sui and DeLyser argue that VGI can be a means to cross this chasm, and I agree. Increasing qualitative research through public participation is not only a means to address top-down research tendencies, but also a way to gain data that was previously impossible to obtain. VGI and the neogeoweb give non-experts the opportunity to contribute geographic information, which is also shifting the historically dominant role of professional geographers; I believe this shift falls right in line with critical GIS principles and its challenges to the status quo.

It is also interesting to place Pavlovskaya's paper in temporal context: it was published in 2006, and I believe that in the past decade and a half there has been much more research combining quantitative and qualitative approaches within GIS. However, this is just my perception of the field, and I do not have any hard facts to back up this belief.

Thoughts on Web Mapping 2.0: The Neogeography of the GeoWeb (Haklay et al. 2008)

Sunday, November 17th, 2019

This paper overviews the development of geography in the Web 2.0 era, in which neogeography is founded on the blurring boundaries between geographers, consumers, technicians, and contributors. The case studies of OSM and the London Profiler show how technology allows geography and cartography to embrace new forms of data sources, new interactions with the general public, and new means of access to geographical knowledge and information.

The most intriguing part of this paper for me is the debate over whether public participation in producing geographic information under Web 2.0 amounts to a "cult of the amateur" or to "mass collaboration". From my point of view, this is exactly where professionals from the realm of geography are needed. Data are neutral, but their providers, contributors, and manufacturers are not. Still, no one is going to complain about an abundance of available data, especially detailed user-generated data that complements what was missing before. It is up to the experts to decide what to do with the data and how to interpret it, instead of blaming its source.

My point is that there is nothing inherently wrong with the information, or with its provider; what matters is the people interpreting it, profiting from it, and discovering knowledge in it. Thus, whether Web 2.0 facilitates a "cult of the amateur" or "mass collaboration" depends solely on the professionals who use the data. Even data that seems useless in one study can be extremely important in another field of research. For geography, an open-minded, multi-disciplinary science, drawing lines and rejecting what is novel, trendy, and probably the future is not supported by the ideology of the subject. Adaptation and evolution should be the key for any scientific discipline.

The GeoWeb

Sunday, November 17th, 2019

The advent of the Internet, followed by the arrival of Web 2.0, has no doubt changed the way geographic information is obtained and shared, a fact well described in this article by Haklay et al. Without declaring the age of paper maps behind us, the internet has propelled us into an era where the internet and geography are combined like never before.

Although the Internet has allowed many to navigate a digital Earth for the first time, several issues have appeared from the combination of the internet and the discipline of geography. The field has been democratized by increased online access to geographic tools and mashups, which raises the visibility of geography but could also be seen as reductionist, reducing geography to non-experts having fun geotagging pictures. Speaking of geotagging, the social media boom has led to people travelling en masse to previously unvisited areas, a phenomenon that has caused an explosion of visitors to Horseshoe Bend in Arizona, for example, drastically affecting the local environment.

Researching Volunteered Geographic Information: Spatial Data, Geographic Research, and New Social Practice

Sunday, November 17th, 2019

Sarah Elwood et al. wrote a foundational paper about Volunteered Geographic Information (VGI), covering definitions of VGI, its research domains, frameworks within and beyond VGI, its impact on spatial data infrastructures and geographic research, quality issues in VGI data, and the associated methodologies. VGI contributes to today's data explosion and expands research into new fields, but plenty of issues still confront the creation and application of VGI data. I am most interested in the quality of VGI data, which I think is the central issue when conducting research with it, even though there are many situations in which VGI is beneficial despite its quality being hard to assess. The authors point out four ways in which VGI quality differs from traditional notions of data quality; however, I am still curious how to deal with those differences and with the uncertainty and inaccuracy of VGI data. How could newly developed AI technologies help establish basic quality-assessment rules for filtering useful information? I also think education in geography would be very helpful for building a higher-quality VGI system.

Moreover, it is argued that VGI represents a paradigm shift in data collection, creation, and sharing in GIS. Does that mean traditional data types such as raster, vector, and object-based data will be forced to change by the development of VGI? The Web also plays an important role in VGI: does that mean VGI cannot develop independently of WebGIS, and what, then, is the relationship between WebGIS and VGI (both will be presented next week)?

Thoughts on Citizens as sensors: the world of volunteered geography (Goodchild, 2007)

Sunday, November 17th, 2019

This Goodchild piece serves as a brief introduction to the topic of VGI. Although it was written in 2007, when computational power and artificial intelligence were just taking off, we can already see how VGI serves as a main data source in cartography and related geographic fields. While highlighting the contributions of VGI, he also points out the limitations of relying on VGI as a source of geographic data: the validity, accessibility, and authority of the data.

Nowadays, OSM and Google Maps are used as major sources for much spatial-analytical research, especially at larger extents where primary data collection is time- and labour-intensive. Just as Goodchild argues, from the researcher's perspective the availability of spatial data extractable from VGI sources looks promising, but questions must be asked about synthesizing and validating VGI data to improve its accuracy.

Who contributes the data? That question remains unsolved even 12 years after he wrote this paper. It asks which populations VGI data might represent, which areas it covers, and in what scope it is used. Why do people contribute? That is another question, relating to the biases and incentives behind VGI data, which can influence the results of research built on it. Also, with various VGI sources available, how we can combine them so that they cross-validate and reference each other, generating better accuracy for our purposes, is a question I would like to answer (a toy version of the idea is sketched below). How to cross-reference other, non-VGI sources against VGI data to increase its validity, and thereby lend it some authority, is another topic I am eager to learn about.
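
As a toy illustration of what cross-validation between sources might look like, here is a minimal sketch that matches points from a VGI dataset to the nearest point in a reference dataset and flags positional disagreements; the coordinates and the 50 m threshold are hypothetical, not from Goodchild's paper:

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371000 * 2 * math.asin(math.sqrt(a))

def cross_validate(vgi_pts, ref_pts, max_m=50):
    """Flag VGI points whose nearest reference point is farther than max_m."""
    report = []
    for p in vgi_pts:
        d = min(haversine_m(p, q) for q in ref_pts)
        report.append((p, d, d <= max_m))
    return report

# Hypothetical points: two agree with the reference source, one does not.
vgi = [(45.5017, -73.5673), (45.5080, -73.5540), (45.5200, -73.5800)]
ref = [(45.5019, -73.5671), (45.5082, -73.5543)]
for p, d, ok in cross_validate(vgi, ref):
    print(p, f"{d:.0f} m", "agree" if ok else "flag for review")
```

A real pipeline would of course need attribute matching and a principled threshold, but even this naive distance check shows how a second source can lend a first one some authority.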

Thoughts on “Goodchild – Citizens as sensors”

Sunday, November 17th, 2019

This article by Goodchild lays out the foundation of Volunteered Geographic Information (VGI) by explaining technological advances that helped it develop as well as how it is done.

The widespread availability of 5G cellular networks in the coming years will drastically improve our ability as humans to act as sensors with our internet-connected devices, thanks to improved upload/download speeds and lower latency. These two factors will greatly help the transfer of information, for example by allowing more frequent location pings, and 5G will also let more devices connect to the internet, since it supports more simultaneous connections than 4G.

Although VGI provides the means to obtain information that might otherwise be impossible to gather, the reliability of the data can be questioned. An example is OpenStreetMap, where anyone is free to add, change, move, or remove buildings, roads, or features as they please. Although most contributors act with good intentions, inaccuracies and errors can slip in, affecting the product. As other websites and mobile applications use OSM data to provide their services, it becomes important for users and providers to have valid information. As pointed out in the article, the open nature of VGI allows malevolent users to undermine others' experience. An example of such an event would be people recently taking advantage of the VGI nature of OSM, changing the land coverage of certain areas in order to gain an advantage in the mobile game Pokémon GO.

Finally, there is also the issue of who owns the data. Is it the platform or the user who provided it? Who would be responsible if an inaccurate data entry led to an accident or a disaster? As with any other growing field closely linked to technological advancement, governments will need to legislate further on VGI to allow for easier regulation.

Thoughts on Simplifying complexity: a review of complexity theory (Manson 2001)

Sunday, November 10th, 2019

This paper thoroughly reviews and examines the field of complexity, dividing it into three different kinds of complexity theory: 1) algorithmic complexity; 2) deterministic complexity; 3) aggregate complexity. The author systematically explains each kind, with its implications and future research opportunities, opening a new door for me as an urban researcher.

I agree with the author that complexity needs more attention from geographers and planners. Since my first class in urban geography, I have been taught, and have agreed, that cities are open systems that the public and academia have yet to find a way to fully understand. Thus, to better simplify cities and urban research areas, understanding their complexity is the first step. The majority of urban researchers seek to simplify urban environments in order to reach an empirical theory, statement, or piece of knowledge; however, simplification should come only after fully understanding the complexity of the objects under study. In urban geography and planning, I doubt anyone has ever thoroughly comprehended all the underlying components that make a city work. Thus, urban researchers and GIScientists need to study algorithmic, deterministic, and aggregate complexity before proposing simplified models. In urban studies, the need for complexity research is urgent, lest the field become a palace built on clouds.

In addition, for GIScientists especially, understanding and studying algorithmic complexity may be a future trend of study regardless of the field their study objects land in. The discipline's technological foundation makes GIScientists more likely to be aware of such issues, and the capability to address algorithmic complexity is an advantage compared with researchers from other spatially oriented disciplines.

Thoughts on Class Places and Place Classes – Geodemographics and the spatialization of class (Parker et al. 2007)

Sunday, November 10th, 2019

First of all, after reading the whole article, I am very confused by its structure. It is not well structured, from my personal perspective, which led to some confusion on my part about the topic of geodemographics. Also, throughout the article it was hard for me to find anything related to GIScience: aside from some hints about using information technology to classify populations, the whole article approaches geodemographics from a sociological perspective rather than a GIScientist's view of the topic.

Part of the reason this article is not that related to GIScience is probably the time at which it was written. 2007 falls within the period when GIS was transforming into GIScience, and GIS itself was not as strongly bonded to other disciplines as it is now in 2019. Moreover, although the article focuses on classification methods, it is less concerned with the existing debates around computational classification, instead discussing supervised, heavily human-intensive classification work.

However, the article definitely gives a brief introduction to what geodemographics is and what the major debates in the field are, from a sociological perspective. I do wonder how GIScientists view their fellow sociologists' classification methods, as well as the ontologies for geodemographics derived from those sociological classifications.

Furthermore, I wonder how, in the age of big data analysis, the combination of big data and GeoAI contributes to geodemographics from a GIScientist's perspective, since the authors state that sociological classification still relies heavily on census and commercial data specifically designed for studying demographics. I would also like to learn more about the reactions of the populations included in, or excluded from, geodemographic classifications, as well as the ethical discussions around the process of classifying people.

Thoughts on Parker et al., (2007) “Class Places and Place Classes: Geodemographics and the Spatialization of Class”

Sunday, November 10th, 2019

The article "Class Places and Place Classes: Geodemographics and the Spatialization of Class" by Parker et al. (2007) introduced the concept of geodemographics through a small research study focused on geodemographic classification, the relationship between "class places" and "place classes", and their co-construction.

I found this article quite interesting, particularly the parts that touched on merging this type of data with web technologies, which would allow the data to be interacted with by a larger portion of the population with greater ease.

After reading this article I was left with a few questions, one being what the effects of this type of data are. Taking the case of the research study in this article, this sort of generalization of residents of urban neighbourhoods into a classification seems problematic: as the information gets used, people's perceptions of places come to be based on census data. I feel this highlights socioeconomic differences in urban settings and further divides populations based on those differences.

Thoughts on “Turcotte – Modeling geocomplexity: ‘A new kind of science’”

Saturday, November 9th, 2019

This article by Turcotte emphasized the importance of fractals in the understanding of geological processes as opposed to statistical equations, which cannot always explain geological patterns.

Although this reading provided insight into how various situations are modeled and how statistical modelling plays an important role in understanding the geophysics of our planet, geocomplexity as a whole still remains a rather abstract concept to me. The article provided some illustrations that greatly helped my comprehension, but more would be necessary to fully grasp some of the concepts. Illustrating complexity may be complex in itself, but it would have made the article considerably more approachable.

Will we find new statistical formulas to model problems we couldn't model in the past? How we understand and conceptualize the Earth plays a vital role in how GIScientists are able to push for further knowledge. Recent technological advances in quantum computing, artificial intelligence, and supercomputing capabilities open the door for further innovation in the field; geological instability, for example, could be better understood. In such scenarios, could weather or earthquakes become more predictable? Further advances in related fields such as geophysics and geology will also greatly contribute to GIScience.

The concept of chaos theory is also very intriguing to me, a theory I had never heard of before. A quote from Lorenz greatly helped me understand the concept: "When the present determines the future, but the approximate present does not approximately determine the future", meaning that small changes in the initial state can dramatically change the final state of a particular event, as the small experiment below illustrates.
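
To see that sensitivity in action, here is a minimal sketch (my own toy illustration, not from Turcotte's article) iterating the logistic map, a classic chaotic system, from two starting values that differ by one part in a billion:

```python
# Sensitive dependence on initial conditions in the logistic map
# x -> r * x * (1 - x), with r = 4 (the chaotic regime).

def logistic_orbit(x0: float, steps: int, r: float = 4.0) -> list[float]:
    """Iterate the logistic map from x0 for `steps` steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.400000000, 50)
b = logistic_orbit(0.400000001, 50)  # the "approximate present"
for t in (0, 10, 20, 30, 40, 50):
    print(f"t={t:2d}  x_a={a[t]:.6f}  x_b={b[t]:.6f}  gap={abs(a[t] - b[t]):.6f}")
```

By roughly t = 30 the two trajectories bear no resemblance to each other, which is exactly why long-range prediction of chaotic systems such as weather is so difficult.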

Reflection on “The Impact of Social Factors….On Carbon Dioxide Emissions” (Baiocchi et al., 2010)

Saturday, November 9th, 2019

In Baiocchi et al.’s piece they analyze geodemographic data to better understand the direct and indirect CO2 emissions associated with different lifestyles in the UK. They open the piece by listing criticisms in the field of environmental input-output models, namely that there is too much literature dependent on top-down classification, too much emphasis on consumer responsibility, too much literature with entirely descriptive analyses, and that the term ‘lifestyle’ is defined by expenditures, which ignores human activity patterns. Using geodemographic data as a basis for their study mitigates the potential harm from these criticisms.

One thing I noticed about this paper was how it used geodemographic data to create a bottom-up procedure for the research. Historically, the fields of geography and cartography have been very top-down in nature, with little, if any, input from "non-experts". One of the ways GIS has been so revolutionary and popular is that it is redefining how and what people can contribute, and today there is ample opportunity for "non-experts" to be involved. As geodemographic data was around long before GIS existed, I did not initially realize how it could contribute to more bottom-up approaches. Now I know that, among other reasons, open data is available almost everywhere, and GIS technology in general is easier to access and understand than ever before.

I’ll end my reflection with a few general questions about geodemographics. Specifically, what is the difference between demographics and geodemographics? Doesn’t all demographic data have some sort of location/geographical component? 


The Impact of Social Factors and Consumer Behavior on CO2 Emissions in the UK

Saturday, November 9th, 2019

This is an interesting case study using geodemographic data to analyze the impact of socioeconomic factors on carbon dioxide emissions. Regardless of how CO2 emissions are affected by different socioeconomic determinants, I am curious about the original geodemographic data used for the analysis. The study uses geodemographic data from the ACORN database and conducts its research based on a lifestyle classification; my question is what "lifestyle" exactly means and what rules the ACORN classification is defined by. Also, the socioeconomic variables used in the regression analysis come from the broader categories of housing, family, education, work, finance, and so on. Are these the typical variables that geodemographic theory usually deals with, and are these the research contents that demographic researchers focus on? Moreover, the geodemographic data are coded at the postal-code level, which can be read as the scale the data are built on. Is it possible that the regression results for what drives CO2 emissions would change if the scale changed (a toy version of this scale effect is sketched below)? Do policy districts or the postal-code allocation rules act as noise in the analysis?
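
As a small illustration of why the answer may well be yes, here is a minimal sketch of the modifiable areal unit problem using entirely synthetic data; nothing here comes from the paper, and it requires Python 3.10+ for statistics.correlation:

```python
# Toy demonstration of the modifiable areal unit problem (MAUP): the same
# synthetic household-level data, aggregated into larger "postal zones",
# yields a systematically different correlation coefficient.
import random
import statistics

random.seed(1)
n = 400
# Households along a transect: income follows a spatial trend plus local
# noise; emissions depend weakly on income plus their own noise.
income = [30 + 0.1 * i + random.gauss(0, 10) for i in range(n)]
emissions = [0.3 * inc + random.gauss(0, 15) for inc in income]

def zone_means(values, zone_size):
    """Average consecutive households into zones of `zone_size`."""
    return [statistics.mean(values[i:i + zone_size])
            for i in range(0, len(values), zone_size)]

for size in (1, 10, 50, 100):  # 1 = household level
    r = statistics.correlation(zone_means(income, size),
                               zone_means(emissions, size))
    print(f"zone size {size:3d}: r = {r:+.2f}")
```

The underlying relationship never changes; only the aggregation does, yet the apparent strength of the income-emissions link grows with zone size.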

Another thing I want to point out is that we learned about human mobility in last week's seminar. Could movement theory be applied to studying how geodemographics change over time, or does that not matter for developing geodemographic theory?

Thoughts on “Parker et al. – Class Places and Place Classes: Geodemographics and the spatialization of class”

Friday, November 8th, 2019

As with a wide variety of other research fields within the confines of GIScience, it will be interesting to see how geodemographics may change with technological advances in machine learning. One example is the delineation of boundaries between clusters, which could be fractured or combined based on reasoning that may be quite difficult for humans to understand (a toy version of the underlying clustering step is sketched below). These geodemographic generalizations of space could also be continuously computed in the not-so-distant future, leading to an ever-changing assessment of neighbourhoods on a very short temporal scale. Micro-level analysis could also allow for a better representation of a neighbourhood based on recent population inflow and outflow data, data that becomes increasingly accessible in the era of the Internet of Things (IoT).
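
For readers unfamiliar with how such classes get drawn in the first place, here is a minimal sketch of the unsupervised clustering step that geodemographic systems broadly rely on; the three variables and all values are synthetic stand-ins, not real ACORN or Mosaic inputs:

```python
# Minimal sketch of geodemographic classification via k-means clustering.
# Variables (median income, household size, share of renters) are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# 300 synthetic neighbourhoods drawn around 3 loose demographic profiles.
profiles = np.array([[35, 1.8, 0.7], [65, 2.9, 0.3], [48, 2.2, 0.5]])
X = np.vstack([p + rng.normal(0, [6.0, 0.3, 0.08], (100, 3)) for p in profiles])

# Standardize so no single variable dominates, then partition into k classes.
X_std = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_std)

for k in range(3):
    centre = X[labels == k].mean(axis=0)
    print(f"class {k}: income={centre[0]:.1f}, hh size={centre[1]:.2f}, "
          f"renters={centre[2]:.2f}")
```

Everything contentious about geodemographics, the choice of k, the variables, and the labels attached to each class, sits outside this mechanical step.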

The thresholds used to assess whether a neighbourhood is more closely related to one class rather than another need to be defined quantitatively, which forces a certain cutoff and brings in some subjectivity. An example could be the occurrence of a natural disaster in a hypothetical neighbourhood, leading to enough devaluation of houses to warrant changing how the neighbourhood is characterized. In that case, a population once seen as energetic and lively (or, as defined by Parker et al., a live/tame zone) could be completely reclassified as a dead/wild zone from one day to the next. Although these classifications would eventually be reassessed by corporations or the government, technological advancements grant the ability to reassess neighbourhoods much more rapidly.

As someone not well versed in the conceptualization of geodemographics, it becomes apparent that a balance must be struck between the number of classes and the level of representativeness desired; after all, every household could be considered unique enough to warrant a class of its own. Future advances in the field might incorporate a three-dimensional analysis of neighbourhoods in densely populated urban centres, as residential skyscrapers present vertical spatial clustering.

Simplifying complexity: A review of complexity theory

Friday, November 8th, 2019

This is a good paper: it reviews the principal research fields in complexity theory with a clear structure and simplified explanations, uncovering the nature of complexity. Generally, complexity theory describes systems of changing entities with nonlinear relationships, examining their qualitative characteristics and how their interactions change over time. The author breaks complexity theory into three major parts: algorithmic complexity, deterministic complexity, and aggregate complexity. However, I am still a little confused about why it should be divided into those three parts. Could we instead think about and explain complexity theory in terms of time complexity and spatial complexity?

Should most research questions take complexity theory into account, given that most objects in the natural environment and in human society have the general characteristics complexity theory deals with? It is really interesting to read about self-organized systems that strike a balance between randomness and stasis, like peatland ecosystems. How complexity theory, which helps explore self-organization in physical geoscience, could be applied to socioeconomic studies is especially appealing. There is still unexplored territory in complexity research. How could newly developing techniques such as GeoAI and spatial data mining, which extract more hidden knowledge, help complexity research take a step further? These are all interesting and exciting questions to be answered in the future.

Thoughts on “Simplifying Complexity” (Manson, 2001)

Thursday, November 7th, 2019

As someone with very little knowledge on complexity theory before reading this article, I think Manson’s piece offers a solid introduction to the concept. I can see how complexity theory directly relates to geographic and GIScience problems. It all comes back to Tobler’s First Law of Geography, as geography creates complexity not replicated in other disciplines. 

“The past is not the present” and “complicated does not equal complex” are two concepts we have discussed at length in class. Regarding the first statement, complexity theory treats entities as being in a constant state of flux, and could thus reduce the problems associated with the “past = present” assumption; a common issue here, for instance, is assuming that the locations of past actions will be the same as the present ones. Regarding the second statement, this article was written in 2001, before big, complex data was around as it is today, especially concerning its variety, veracity, value, volume, and velocity. Big data is complex, but not complicated: there are methods and technologies to analyze it with relative ease. However, technology and complexity theory must keep pace for researchers to continue to analyze it adequately.

The unified theory of movement is here

Sunday, November 3rd, 2019

This is the only blog post I’ve actually wanted to write.

All things are dynamic. In our last class, Corey showed that even though we were equipped to portray a river as a dynamic feature, we did so statically. I bet we did this because of the numerous ways we are told to think of our world as static. Relationships are inherently dynamic, but we use static statuses to represent sometimes extensive periods. We take repeated single-point observations to measure natural phenomena, then interpolate to fill in the blanks. But what are these blanks? Evidence of dynamism. All phenomena are actually dynamic, sliding along some temporal gradient, not dissimilar to Miller's space-time cube concept.

Miller brings up scale issues in movement. Traditional movement scientists like kinesiologists, physiotherapists, and occupational therapists think of movement on completely different scales than movement ecologists do. In fact, they have a different semantic representation of movement as well, often tied to the individual irrespective of the environment. Geographers and human-mobility researchers have their own ideas about the drivers and detractors of movement that run contrary to ecologists' conceptualizations. So how do we move toward an integrated science of movement? The best option is to start thinking about movement as fractal patterns (a toy version of the idea is sketched below). There's a primatologist at Kyoto studying just that in penguins (which are not primates…) to understand the interactions of complexity, scale, movement, and penguin deep-diving behaviour. Think about this: that researcher is interested in how movement is invariant across scales and can explain behaviour as a complex phenomenon. There's already a unified theory of movement: it's called fractal analysis of movement.
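
To make "fractal analysis of movement" less abstract, here is a minimal sketch (my own toy example, not from Miller's paper) that estimates the box-counting dimension of a trajectory; a straight path should come out near 1, while a convoluted random walk scores noticeably higher:

```python
# Toy box-counting estimate of a trajectory's fractal dimension.
import math
import random

def box_count(points, eps):
    """Count the eps-sized grid cells touched by the trajectory."""
    return len({(int(x // eps), int(y // eps)) for x, y in points})

def box_dimension(points, scales=(1, 2, 4, 8, 16)):
    """Slope of log(count) versus log(1/eps) across several grid scales."""
    pts = [(math.log(1 / eps), math.log(box_count(points, eps)))
           for eps in scales]
    mx = sum(x for x, _ in pts) / len(pts)
    my = sum(y for _, y in pts) / len(pts)
    return (sum((x - mx) * (y - my) for x, y in pts)
            / sum((x - mx) ** 2 for x, _ in pts))

random.seed(7)
straight = [(i * 0.5, i * 0.5) for i in range(4000)]   # ballistic mover
walk, x, y = [], 0.0, 0.0
for _ in range(4000):                                  # tortuous mover
    x += random.gauss(0, 1)
    y += random.gauss(0, 1)
    walk.append((x, y))

print(f"straight path: D = {box_dimension(straight):.2f}")  # ~1
print(f"random walk:   D = {box_dimension(walk):.2f}")      # clearly higher
```

The appeal is that D is the same kind of number whether the mover is a penguin, an ant, or a cyclist, which is exactly the scale-invariance this post is excited about.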

I am optimistic about the potential of merging scale-invariant disciplines: if physicists could accept Newton's law of universal gravitation even though it yields no exact solution for systems of more than two bodies, why can we not accept that movement unifies us even if it cannot predict every time-step for every species taking whatever method of transport? It's a narrow-minded perspective to say that we can't have a unified movement theory because some people take bicycles while others prefer the Metro. Algorithms cooked up in Silicon Valley are already capable of differentiating this movement; doesn't that mean these movements are already unified in a neural network's internal representation? Train a neural network to detect the directionality of some moving object. Assuming you did the requisite pre-processing, chances are that algorithm will tell you the direction of any moving object. That's unified movement theory. Not convinced? Take the first network and perform transfer learning for another object: the transferred network will outperform a network that never 'saw' the first object's movement and directionality. This is unified movement theory (a toy version of the generalization claim is sketched below). There's a team of researchers studying locomotion in ants who strapped stilts onto the ants' legs and found that the ants on stilts walked past their intended destinations. Doesn't this indicate that, regardless of the interaction between ant and environment (the ecology), movement can be characterized using common conceptualizations, be they step length, velocity, or the ant's internal step count?
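
Here is a minimal sketch of that generalization claim; note it shows zero-shot generalization rather than literal transfer learning with fine-tuning, and all "species" and parameters are synthetic inventions of mine:

```python
# A classifier trained to read heading (eastward vs westward) from one kind
# of mover generalizes to a mover with different step lengths and noise.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

def make_tracks(n, step_scale, noise):
    """n tracks of 20 (dx, dy) steps; label 1 if the mover heads east."""
    heading = rng.choice([-1, 1], size=n)
    dx = heading[:, None] * step_scale + rng.normal(0, noise, (n, 20))
    dy = rng.normal(0, noise, (n, 20))
    return np.hstack([dx, dy]), (heading > 0).astype(int)

X_a, y_a = make_tracks(2000, step_scale=1.0, noise=1.0)  # "species A"
X_b, y_b = make_tracks(500, step_scale=5.0, noise=3.0)   # "species B"

clf = LogisticRegression(max_iter=1000).fit(X_a, y_a)
print("accuracy on unseen species B:", clf.score(X_b, y_b))
```

The point is not the toy model itself, but that a single learned representation of directionality carries across movers, which is the kernel of the unified-movement argument.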

This paper came out of discussions Miller had with various mobility and movement researchers, and what's clear is that people don't have answers. It's not as simple as ecologists neglecting scale or geographers neglecting behaviour: our siloed approach to science is undermining our ability to comprehend unifying phenomena. And I bet movement is that unifying phenomenon. Can you think of anything that's truly static?