Archive for the ‘506’ Category

Tipping the scale toward “science”

Thursday, February 14th, 2013

Marceau’s article sums up issues pertaining to variability in scale, including scale dependence, scale domains and scale thresholds. At the crux of the article is an illustration of “a shift in paradigm where entities, patterns and processes are considered as intrinsically linked to the particular scale at which they can be distinguished and defined” (Marceau 1999). The need in any science to be wary of the scale at which the given work is conducted or phenomenon observed is absolutely (and relatively) critical. Different phenomena occur at different scales, and significant inaccuracies creep into the data if this is not accounted for.

I have no qualms with most of Marceau’s article. However, I would like to address another small assertion the author makes in her conclusion: the shift in paradigm, once more, toward a “science of scale.” After our discussion a few weeks ago about rethinking GIS as a science in addition to a tool, this struck me as particularly interesting. In its broadest sense, science is a body of rationally explained and testable knowledge. Understanding scale as a scientific field in this regard is difficult. I have no problem comprehending and accepting scale as a basic property of science, but can we really separate out scale as an entity of its own?

That said, the work involved in understanding thresholds and dependence, and the role that varying scale plays in how we observe the world, is not trivial. I simply feel that whereas there are laws of physics, for instance, there is, as far as I know, no singular body of accepted knowledge surrounding scale, beyond the recognition that scale is a property of a phenomenon that must be noted and kept consistent as much as possible.

– JMonterey

Scaling Issues

Wednesday, February 13th, 2013

Scale is an important issue in most academic investigations, particularly in the investigation of natural phenomena. The biggest problem when dealing with these phenomena, especially environmental problems, is that they occur at various levels of scale. In addition, a single phenomenon might have a particular effect at a local scale but a completely different effect at a regional or global scale. As a result, questions arise as to the best way to conduct an investigation: What scale should I use? Is the phenomenon I am studying multi-scalar? How do I aggregate my results?

Marceau does a great job of identifying some of the key concepts behind scale, as well as the issues that have arisen as the field has evolved, most specifically within the natural and social sciences. One of the most interesting concepts identified in the paper is the modifiable areal unit problem (MAUP), which encompasses both the scale problem and the aggregation problem. Marceau concludes that the effects of the MAUP are starting to become better understood, and that this process in turn is contributing to the emergence of “scale as a science”.
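To make the scale and aggregation effects concrete, here is a small sketch with entirely simulated data (my own toy example, not drawn from Marceau or any study discussed here): the same points are averaged over grid cells of increasing size, and the correlation between two variables changes with the zoning.

```python
# Toy MAUP illustration: simulated points only, nothing from the readings.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 20_000
xy = rng.uniform(0, 100, size=(n, 2))      # point locations in a 100 x 100 region
trend = 0.02 * xy[:, 0]                    # a shared, smooth spatial trend
a = trend + rng.normal(size=n)             # variable 1: trend + independent noise
b = trend + rng.normal(size=n)             # variable 2: same trend, different noise

def aggregated_corr(cell_size: float) -> float:
    """Average both variables within square grid cells, then correlate the cell means."""
    df = pd.DataFrame({
        "cx": (xy[:, 0] // cell_size).astype(int),
        "cy": (xy[:, 1] // cell_size).astype(int),
        "a": a,
        "b": b,
    })
    means = df.groupby(["cx", "cy"])[["a", "b"]].mean()
    return means["a"].corr(means["b"])

print(f"point level  : r = {np.corrcoef(a, b)[0, 1]:.2f}")
for size in (5, 10, 25):
    print(f"cell size {size:>2} : r = {aggregated_corr(size):.2f}")
# The correlation strengthens as the cells grow: the point-level noise averages
# out while the shared trend remains -- the scale effect of the MAUP.
```

The point-level relationship is weak, but the aggregated one looks strong, which is exactly why conclusions drawn at one scale cannot be casually transferred to another.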

The issues surrounding the MAUP bring to mind a case study I recently reviewed, which investigated the effects of climate change on Norway. An analysis of various effects (economy, biodiversity, health, etc.) was performed at multiple scales: national, regional and local. In their conclusion, the authors noted that at the national scale the country appeared well positioned to adapt to climate change; however, as the analysis moved down to the regional and local scales, localized threats were discovered. This investigation highlights the main issues of the MAUP, and how further development is needed within the “science of scale” in order to manage data more effectively across scales.

-Victor Manuel

Spatial data mining: a discovery or a re-classification of knowledge

Tuesday, February 12th, 2013

Guo and Mennis describe how the growing availability of data has made it difficult to extract the useful parts; however, I believe this is not just a present-day problem. To clarify, although data in many fields may once have been hard to access, some fields have had an overabundance of data for decades. Earth-related sciences, for example, have had a variety of readily available data sets, from maps and cross sections to aerial photos and digital models, since the 1960s. As such, earth sciences and other spatial fields of study have been data-rich for decades, with vast, high-resolution spatial data sets. This volume of data made finding the right data a problem even before the use of digital databases and indexing. In light of this, the authors may not have considered earth science data sets when writing, or may never have really looked at the amount available for the earth sciences, but this is just speculation.

I do have to agree, though, that data mining techniques have made data easier to use and more accessible for non-experts and experts alike in many fields, even if high-quality spatial data has existed for years. On the topic of fields with pre-existing data: are we truly discovering new information, or are we just re-classifying and reformatting data to make it more accessible, which only underlines the problem of data retrieval today? The suggestion of a framework for how data should be manipulated, stored and retrieved would solve many issues with pairing old and new data, and with retrieving the data one is seeking.

C_N_Cycles

How to handle scale?

Tuesday, February 12th, 2013

Any discussion in the initial stages of a GIS project includes an episode where people argue about the exact scale at which to carry out the analysis. The paper by Danielle J. Marceau gives a great overview of the various ways in which space and scale are conceived and how scale affects the results of analysis. However, many things in nature do repeat themselves very regularly across scales. An entire field of mathematics, fractal geometry, deals with objects that are self-similar at different scales. A small set of formulas can define such an object very precisely, and those formulas are all that is needed to reproduce it at any scale.

So, is it accurate to say that many things in geography appear entirely different at different scales? Or do they change gradually with scale? If so, perhaps we can view these things as continuous functions of scale. Then it is possible that we will come up with equations that explain this gradual change. All we would need then is an equation to describe the process at a particular scale, and another equation to describe how the process changes with scale, and we would be able to reconstruct how the object or phenomenon looks at any required scale.
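As a small illustration of that idea (my own sketch, not from Marceau), the box-counting dimension is exactly such an equation for how a measurement changes with scale: the number of boxes of size s needed to cover a self-similar object grows like s^(-D). The snippet below estimates D for the Sierpinski triangle; every value in it is illustrative.

```python
# Box-counting sketch on the Sierpinski triangle (theoretical dimension log3/log2 ~ 1.585).
import numpy as np

rng = np.random.default_rng(0)
vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])

# Chaos game: repeatedly jump halfway toward a randomly chosen vertex.
pts = np.empty((200_000, 2))
p = np.array([0.1, 0.1])
for i in range(len(pts)):
    p = (p + vertices[rng.integers(3)]) / 2
    pts[i] = p

box_sizes = np.array([1 / 4, 1 / 8, 1 / 16, 1 / 32, 1 / 64])
counts = [len(np.unique(np.floor(pts / s), axis=0)) for s in box_sizes]

# N(s) ~ s**(-D), so the slope of log N against log(1/s) estimates the dimension D.
dimension = np.polyfit(np.log(1 / box_sizes), np.log(counts), 1)[0]
print(f"estimated dimension: {dimension:.2f}  (theory: {np.log(3) / np.log(2):.2f})")
```

A handful of lines fully specifies the object at every scale, which is the sense in which a formula, rather than a particular map, captures a scale-dependent phenomenon.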

– Dipto Sarkar

Whither Spatial Statistics?

Friday, February 8th, 2013

After a good half-century of quantitative developments in geography and geographic developments in statistics, Nelson takes stock of the field of spatial statistics by asking eminent spatial statisticians (statistical geographers?) for their take on how the field has developed, its current and future challenges, and the seminal works that should be on any quantitative geographer’s bookshelf.  She synthesizes the researchers’ responses to get at the broader trends characterizing spatial statistics.

A key shift discussed by Nelson is the state of advances in methodology vis-à-vis data availability and size. As spatial statistics grew in tandem with the Quantitative Revolution in the mid-20th century, geostatistical methods were in many ways ahead of the available data and technology: computers and automated data management were still nascent, limiting the quantity of data that could be analyzed to what could be managed by hand or with punchcard systems.  Meanwhile, data collection and organization were onerous and typically manual.  Today, we have the opposite problem: we have TONS of processing power to perform complex calculations, and programming languages to implement new methods easily, so much so that our technology is now ahead of most conventional methods of spatial analysis.  Of particular importance is the new problem of how to work with big data, which may provide more comprehensive samples (even data for the entire population!) at a finer temporal resolution and in richer detail than ever before.

Rising to the challenges presented by big data and stagnant methods will be paramount for the continued relevance of spatial statistics.  However, today’s cohort of geography students may be falling behind the curve in their technical ability to respond to these challenges.  Mathematics and computer science, absolutely crucial to working in advanced spatial statistics, are receiving less and less focus in our Geography departments (though this is starting to change with computer science, specifically in relation to GIS curricula).  Indeed, the very core of quantitative methodology in geography has been shaken by the Cultural Turn.  While qualitative approaches are doubtlessly important to a rich understanding of geographical processes, there is a risk that geographers will lose their quantitative toolkit in a policy context where, increasingly, ‘numbers talk’.  Spatial statistics itself may also suffer if geographers are unable to bring their nuanced views of spatial considerations to the table.

When thinking about these issues, I come back to Nelson’s Figure 1, the Haggett view of progress in geography and helical time. It is still an open question whether we are on the cusp of a second Quantitative Revolution in geography, or whether spatial statistics and geographic thought will continue to drift away from each other, with potentially dire consequences.

-FischbobGeo

Why should we think about ontology?

Friday, February 8th, 2013

“It is ironic that ontology is proposed as a mechanism for resolving common semantic frameworks, but a complete understanding and a shared meaning for ontology itself are yet to be achieved”
–Agarwal

Agarwal, a computer scientist, takes on the challenging task of applying the concept of ontology to Geographic Information Science.  After introducing the concept, the author discusses some applications of ontologies in GIScience, stressing that ontologies underpin much of the ‘scientific’ potential of the field.  She then outlines and discusses the considerations and challenges associated with developing and implementing a common ontology for GIScience.

It is clear from the article that the concept of ontology is a complex one, grounded in classical philosophical thought.  It is also apparent that ontologies are of the utmost importance to applications such as artificial intelligence, but why exactly should GIScientists drop what they’re doing and consider ontology?  Agarwal gives some reasons.  First, ontologies require us to come to a consensus on terms in the discipline that may currently be fuzzily or ambiguously defined.  Additionally, ontologies make the underlying assumptions and relationships of a GIS data and/or analysis model more explicit and transparent.  These advantages lend themselves well to the overall goal of data interoperability, the achievement of which will make everyone’s lives easier in the long run.

More fundamentally, Agarwal asserts that “the lack of an adequate underlying theoretical paradigm means that GIScience fails to qualify as a complete scientific discipline.”  Fighting words!  Is the legitimacy of GIScience, or even geography in general, rendered null and void by its lack of a unifying ontology?  Agarwal contends that the conceptual fuzziness and ambiguity of many terms and processes in GIScience and geography, as well as the difficulties associated with scale and spatial cognition, not only make ontologies very difficult to develop but should be considered problematic for the entire geographic discipline.  Problems of scale and other disciplinary ambiguities are typically handled by geographers case by case, based on the individual research question: do we need a more strictly defined framework?  I would argue that it is not possible to get rid of geography’s messiness; the complex subjectivities of geography are part of what distinguishes it as a discipline, and it is not really possible to separate these subjectivities from GIScience.  If anything, the challenge of applying ontology to GIScience should be a humble reminder that sweeping, universal conclusions are problematic, but it shouldn’t stop us in our tracks.

-FischbobGeo

No, Anselin and Getis (1992) are not out-of-date!

Friday, February 8th, 2013

I disagree with the statement that Anselin and Getis’s article is completely irrelevant today. I think the point they make about being careful with the toolbox is even more crucial now. Yes, tremendous steps have been made in terms of tools and methods in spatial statistics, but the questions behind the computerized processes still need to be asked carefully. We cannot let ourselves be led by the GIS tools. The authors give great examples of wrong maps being made because of misinterpreted data. We can do statistics blindly with any data, and we will have results at the end no matter what; however, there is no guarantee that the results will mean anything. The analyses have to be well supported to give meaning and interpretation to the results. I will end by quoting the authors: “the technology should be led by theoretical and methodological developments in the field itself.”
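As a tiny illustration of that point (my own toy example, not from the paper), a standard test run on pure noise still returns tidy-looking numbers; nothing in the output tells you whether the question made sense.

```python
# Pure noise in, tidy numbers out: the output alone says nothing about meaning.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(size=50)
y = rng.normal(size=50)                    # completely unrelated to x

r, p = stats.pearsonr(x, y)                # a correlation and a p-value, regardless
print(f"r = {r:.3f}, p = {p:.3f}")
```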

S_Ram

The mountain doesn’t just get in the way

Friday, February 8th, 2013

In a largely philosophical discussion of ontology and perceptions of existence, Smith and Mark drive at some of the underlying and fundamental assumptions of cognition and geography. With the framing question “Do mountains exist?” (also the article’s title), the authors tear apart understandings of existence—boundedness, independence, universal acceptance—and conclude that how we approach that simple question lies at the base of how we perceive, and therefore visualize our environment.

This article is a fairly fascinating discussion that lends a psychology, as well as a philosophy, to GIS, a field that is largely empirical and filled with concepts we take for granted. For instance, the authors write, “Maps…rarely if ever show the boundaries of mountains at all…[capturing] an important feature of mountains…namely that they are objects whose boundaries are marked by gradedness of vagueness” (Smith and Mark 2002). For something to exist, does it have to be independent, bounded, and universally accepted as such? We know that there is a mountain in a given place, but can we easily demarcate its boundaries? If not, can we truly say that the mountain exists, or only that it is a feature of the surrounding landscape?

The truth is that in an applied analysis, for policy makers for example, these notions matter immensely, but from a geographic and informal perspective we can understand the mountain simply as an object in a larger system. Thus, the mountain can exist, but its exact location does not matter and perhaps should not be of primary concern in a visualization of the landscape.

– JMonterey

Spatial statistics analysis integration with GIS

Friday, February 8th, 2013

Anselin and Getis authored a 1992 paper on spatial statistics and the problems that persisted in the critical analysis portion of GIS. They divide GIS into four stages – input, storage, analysis, and output – and discuss the interfaces between storage and analysis and between analysis and output. This is where, according to the authors, many of the pitfalls in the process of transforming reality into visualization occur. One issue of concern is the integration of a database with the tools to analyze the data. Anselin and Getis offer three possibilities: full integration in the GIS software, tools that analyze the data from outside the GIS, and a common format to switch easily between the spatial analysis package and the GIS.

One of my primary concerns in reading an article on the shortcomings of technology is that technology changes so rapidly as to make the article nearly obsolete not long after its publication. I mention this because in the 21 years since the article was published, GIS has advanced by leaps and bounds in its toolbox and user interface. Regarding the database-spatial analysis concern noted above, for instance, ArcGIS now includes at least two of the three offered possibilities. Arc is able to read several spreadsheet formats that allow for the integration of Excel if desired. Otherwise, Arc comes with a multitude of analysis tools, ranging from simple geometry calculations to more complex map algebra and interpolation methods. While databases are, for the most part, easier to organize within Excel, there is often little need to work outside of the GIS at all. Arc (and other software) ships with sophisticated and useful spatial algorithms that render many of the issues noted by Anselin and Getis largely antiquated.

– JMonterey

Integrating Spatial Statistical Analysis into GIS

Thursday, February 7th, 2013

Anselin & Getis give a good overview of the issues that pertain to the integration of Spatial analysis into GIS. Although it is quite a dated paper(1992), it does a good job of highlighting exactly why better integration wold be beneficial for the entire field of GIS. As noted by the authors, one of the key functions of a GIS is the analysis portion, which in turn encompasses spatial statistical analysis. They correctly identify this function as vital for more complex and in depth case studies in the future.

Technology has evolved a great deal since Anselin & Getis wrote their review. Modern GIS now include many spatial statistics tools built right into the system. For example, in several courses I have used the “spatial analyst” toolbox in ArcGIS to perform statistical analysis of raster datasets. This toolbox holds a wide array of functions, ranging from calculating the statistics of objects in a raster (zonal statistics) to combining different rasters based on a measure of central tendency (cell statistics). There are now also excellent standalone programs made specifically for statistical analysis: some, such as R, allow the user to perform complex statistical analysis of datasets, and others, such as Stata, include spatial components that allow the user to perform spatial statistical analysis.
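For readers who have not used these tools, here is a rough numpy sketch of what the two operations compute; this is my own illustration with toy rasters, not the ArcGIS implementation.

```python
# Rough numpy equivalents of the two operations; not the ArcGIS implementation.
import numpy as np

value = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0],
                  [7.0, 8.0, 9.0]])
zones = np.array([[1, 1, 2],
                  [1, 2, 2],
                  [3, 3, 3]])

# Zonal statistics: one summary value (here, the mean) per zone id.
zonal_means = {int(z): float(value[zones == z].mean()) for z in np.unique(zones)}
print(zonal_means)                         # {1: 2.33..., 2: 4.67..., 3: 8.0}

# Cell statistics: a per-cell summary across a stack of rasters.
stack = np.stack([value, value + 1, value * 2])
print(stack.mean(axis=0))                  # cell-by-cell mean of the three rasters
```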

Overall, the authors do a good job of outlining the problems, as well as the benefits, of better integration between spatial statistical analysis and GIS. However, many of the issues raised in their paper have been solved over the years with the evolution and growing sophistication of GIS. Spatial statistical analysis is an important component of any GIS. Today we have the ability to perform complex spatial analysis within GIS programs such as ArcGIS, something Anselin & Getis could only dream about at the time they wrote their paper.

-Victor Manuel

Do Mountains Exist? Do I exist? And What is Love?

Thursday, February 7th, 2013

In their article “Do Mountains Exist? Towards an Ontology of Landforms”, Smith & Mark question the existence of mountains as everyday objects. While this question seems ridiculous at first, they point out that everyday objects tend to be either organisms or artifacts, and mountains are neither. Mountains do not have a distinct boundary separating them from their surroundings, nor do they have any characteristics that clearly differentiate them from similar landforms like hills. Suddenly, the question seems like an interesting one.


Philosophical debates aside, mountains do exist as geographic landforms. Answering questions about “being” and “existing” for these landforms seems somewhat irrelevant when we know that, as the authors point out, beetles will congregate on mountain tops regardless; the beetles are surely not debating the meaning of their home, they just know that they want to be there. This in itself gives the mountain meaning as an object; however, it is perhaps not an everyday object. While one can easily tell someone to sit in a specific chair, it is very difficult to point someone to a specific landform unless a comprehensive ontology is developed. A complete ontology would incorporate both geography and philosophy.


These geographic features are not specifically defined in the geographic ontology.  For that to be done, the exact scientific nature and history of the planet would need to be assessed. While a mountain is easily visible when you stand before it, it is not so obvious once you account for the irregular shape of the earth and the many slight changes in elevation along its surface.


Landforms are also neglected in philosophical ontology, since they are not distinct entities. For these reasons, ‘mountains’ do not appear in geographic databases or in scientific models. Yet while philosophers and geographers dispute what makes a mountain a mountain, it remains the case that if you ask a five-year-old to point to the mountain, they will, no questions asked.


Pointy McPolygon


Ontologies & Information Systems

Thursday, February 7th, 2013

Ontology has often been a topic of heated discussion. In particular, questions about whether certain objects, theories, and concepts exist often raise complex ethical and philosophical issues. However, ontology with regard to Information Systems differs in that it focuses on the study and clarification of concepts, with the objective of formulating them into frameworks that are both logical and well understood.

Ontology plays an important, and often unrecognized, part within Information Systems. As researchers work within their fields, the way they gather, record, and organize their data is shaped by their ontologies. More specifically, their beliefs, values, and ideas influence what they perceive as important, resulting in datasets that are often unique and idiosyncratic. These idiosyncrasies make it difficult to standardize data across fields, which in turn hampers cross-field research and analysis. Therefore, an important step going forward will be to develop a sort of “Master Ontology”: a standardized and universally accepted framework. This framework would synthesize the various conceptualizations of different communities of data users in order to make data organized, standardized, and transferable.
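A toy sketch of what such a framework might do in practice (the vocabulary and mappings below are entirely invented for illustration): each community keeps its local terms, but they translate into one agreed set of shared concepts so that datasets line up.

```python
# Invented vocabulary: local terms from three communities mapped to shared concepts.
LOCAL_TO_SHARED = {
    "hydrology":   {"stream": "watercourse", "creek": "watercourse"},
    "cartography": {"river": "watercourse", "mountain": "elevated_landform"},
    "planning":    {"town": "settlement", "village": "settlement"},
}

def harmonize(community: str, local_term: str) -> str:
    """Translate a community-specific term into the shared ontology, if agreed."""
    shared = LOCAL_TO_SHARED.get(community, {}).get(local_term)
    if shared is None:
        raise KeyError(f"no shared concept agreed for {community!r}: {local_term!r}")
    return shared

print(harmonize("hydrology", "creek"))       # -> watercourse
print(harmonize("cartography", "mountain"))  # -> elevated_landform
```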

In their conclusion, Smith and Mark remark that a complete ontology of the spatial world must comprehend both primary theory (common sense) and the field-based ontologies used to model natural phenomena such as runoff and erosion. I believe this is extremely important, as people from diverse fields must collaborate to build comprehensive and, most importantly, standardized databases in the field.

-Victor Manuel

Trends in Spatial Statistics

Thursday, February 7th, 2013

This article outlines the progression of geography and spatial statistics since the quantitative revolution. Moving from net importers of techniques to net exporters of spatial analysis methods indicates how geographers have made their mark in academia. The book list offered by the author also demonstrates how far the field has grown in multiple directions: few books were cited by more than one interviewee. Does this indicate that the breadth of knowledge in the field is so massive that a commonality such as a seminal book or article does not exist? It was noted that interviewees expressed concern about the blurred line between GIS and spatial analysis. I’m not convinced that a clear distinction can be made: while seeing a “cluster” may be obvious, those who seek to understand the reasoning behind, and causes of, the spatial distribution of a phenomenon will require specialized knowledge in theory and methodology to make those distinctions.

How society understands spatial data has also changed. As the proliferation of user-friendly geospatial tools continues to pique the public’s interest in the discipline, the field has responded positively by providing freeware GIS tools that put more advanced spatial analysis techniques in people’s hands and further inform their understanding. As the discipline of geography continues to be enriched by new perspectives from other disciplines, spatial data will continue to evolve and grow spatially, temporally and in volume. Handling these datasets requires a reassessment of the tools we have, to determine whether they can be adjusted to address the new challenges in spatial statistics or whether a new formulation is needed. But if we can address these issues and capitalize on integrating spatial statistics with GIS to confirm hypotheses, moving beyond GIS as a purely exploratory tool, then the S moves from “system” to “science”.

-tranv

Do Mountains Exist?

Thursday, February 7th, 2013

The deep question with which the paper starts delves into the definitions of existence and our comprehension of the geographic features around us. The advent of predicate logic was the first attempt to consolidate questions about existence in a scientific framework, binding existence to a variable. However, to answer questions about categories and objects, predicate logic faces a challenge, as these definitions are by nature recursive. As rightly pointed out by Barry Smith and David M. Mark, the question then becomes twofold: “do token entities of a given kind or category K exist?” and “does the kind or category K itself exist?”. Predicate logic is good at explaining logical entailment but fails to take into account how humans perceive things. Thus, it may be right to say that mountains exist because they are part of the perceived environment.

Information systems, on the other hand, adopted a different definition of ontology: a set of syntax and semantics used to unambiguously describe the concepts in a domain. Objects are classified into categories, and the categories are in turn arranged into a hierarchical structure. However, such an arrangement proved inadequate for describing things like mountains and soils, or phenomena such as gravity. One central goal of ontological regimentation is the resolution of the incompatibilities that result in such circumstances; hence the concept of fields was developed to categorize these “things” more effectively.

However, there are still doubts about the naming of such “things” as mountains. Obviously, Mt. Everest exists because all the particles making up Mt. Everest exist, but exactly which particles count as Mt. Everest is unclear. This is the inherent problem in dealing with fields, which are by nature continuous and lack discrete boundaries.

Ideally, the field of ontology should be able to explain, without ambiguity, the entire set of things that are conceptualized and perceived. This requires tremendous insight and reflection about why things exist in the first place.

– Dipto Sarkar


Ontologies and GIScience

Thursday, February 7th, 2013

“Ontological considerations in GIScience” by Agarwal argues that the variety of ontologies, and the communication of the data contained within the various ontology types, is a problem for shared understanding. One way I have worked in the past to relate different ontologies is to set an accessible reference grid over the different ontologies of the same area and to create a cross-referencing database that links both the human and the scientific data. For example, if I selected a grid cell, a sidebar would open displaying human-based data (impressions, oral histories, etc.) and any other data for that same location. In addition, neighbouring grid cells can be linked to see if they represent similar data, and a referencing system based on the type of ontology characteristic can be created. This last part is similar to what I believe Agarwal may be alluding to in the last part of her article.

Although grouping data in this way may be useful, the variety of ontologies, together with the evolution of how humans see the world and use GIS, makes this difficult and leaves a problem for GIScience to resolve. I believe the problem is not how we group data but that databases are mostly static in form and unable to expand to new information. That is one reason why creating grids for an area to set a standard, with the cells tied to dynamic ontology data sets, may be the best solution. To simplify: ontologies are dynamic, with many ontology layer types, set to a geo-referenced grid.
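A very small sketch of what this grid-based cross-referencing could look like (all class and field names are invented for illustration): each cell of the reference grid holds records from any number of ontology layers, retrievable in a single lookup.

```python
# Invented classes and fields; a minimal cross-referencing structure, not a full GIS.
from dataclasses import dataclass, field

@dataclass
class GridCell:
    row: int
    col: int
    layers: dict = field(default_factory=dict)   # ontology name -> list of records

    def add(self, ontology: str, record: dict) -> None:
        self.layers.setdefault(ontology, []).append(record)

def cell_key(lon: float, lat: float, cell_size: float = 0.1) -> tuple:
    """Map a coordinate onto the (row, col) key of its reference-grid cell."""
    return (int(lat // cell_size), int(lon // cell_size))

grid = {}                                        # (row, col) -> GridCell

key = cell_key(-73.58, 45.50)
cell = grid.setdefault(key, GridCell(*key))
cell.add("oral_history", {"source": "interview 12", "note": "former orchard"})
cell.add("geology", {"unit": "glacial till", "thickness_m": 3.2})

# One lookup returns everything recorded for that location, across ontologies.
print(grid[key].layers)
```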

C_N_Cycles


Spatial Statistics Skills Being Lost?

Thursday, February 7th, 2013

Nelson’s article, “Trends in Spatial Statistics”, although providing a good summary of past trends, seems disconnected from what is really happening, in other words from the needs that drive education within GIS. Geographers are described as users rather than producers of GIS, but this is largely because industry looks at short-term needs, and education therefore teaches to those needs rather than going into more depth. Because geographers follow industry needs, the more advanced courses required for spatial analysis are often considered “too much training”, and students do not request or take these advanced courses in large enough numbers for them to be offered as geography-specific.

I believe that more open-source online courses that are not time-limited are needed to address the current gaps in spatial analysis skills. These courses would give professionals and the public access to the tools needed in the globalization of GIS and the application of spatial analysis techniques. In addition to courses, much of the technical background of the statistical functions in GIS programs is hidden. It may be beneficial to create a function window that can display each statistical function’s equations and code.

Finally, geography may once have been the only home of spatial analysis, but in today’s global environment other fields, with their own specializations, may be better suited to some analyses than geography. I believe that no one can be a master of all disciplines; the skills that geographers once thought were important may no longer be theirs alone. In the global context of knowledge and specialization, learning is becoming cooperative, and goals can be achieved more successfully through an interdisciplinary approach than through a single-discipline one.

C_N_Cycles


“O”ntology, “o”ntology, ontologies, ontoloGIeS

Wednesday, February 6th, 2013

The way the interpretation of ontology varies across disciplines reminds me of how the interpretation of spatial phenomena varies depending on context and on what the researcher is looking to uncover. A recurring theme in this course has been recognizing that the ways we think about, discover, and analyze phenomena are context-specific. How we decide to apply SDSS, visualize space, or determine whether something falls into the category of tool, toolmaking or science depends on the researcher’s intention and what they wish to convey or uncover. As expected, the choice of ontological approaches and methods is also context-specific. If the overarching goal of ontological research is to create “a shared understanding of a domain that is agreed between a number of agents” (Agarwal, 2004), then is it possible to have several ontologies within the boundary the researcher demarcates, since shared understandings exist at several levels between humans (ethnicity, culture, gender, age)? As individual positionality is unique to every person, whose shared understanding is being agreed upon?

From what I understand from this article, GIScience is still at the stage of defining a common set of notions or concepts, such that geo-ontologies can only be high-level abstractions. Agarwal suggests that the issue is the interdisciplinary nature of GIScience: the same terminology is conceptualized differently and brings excess disciplinary “baggage” to the table. I don’t necessarily see this as a negative. If a shared common understanding can be reached across several disciplines (or at least a shared understanding of what something is not), then doesn’t that allow for a more robust definition that can be accepted by more people? And if such a consensus can be achieved, then GIScience can move to the next step of creating this common vocabulary to allow interoperability.

-tranv

Statistics and GIS- a lot has changed

Tuesday, February 5th, 2013

A lot has changed in the two decades since “Spatial Statistical Analysis and Geographic Information Systems” was published by Anselin and Getis. Today, the central focus of GIS is on spatial analysis and on the rich set of statistical tools available to perform it. The GIS database and the analysis tools are no longer seen as separate software: spatial analysis is fully integrated into GIS packages like ArcGIS and QGIS. For very specialized applications, the modular or loosely coupled approach is often employed; software like CrimeStat reads data in established GIS formats, performs the analysis, and produces results for use back in the GIS.
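As a hedged sketch of the loosely coupled approach, assuming the geopandas, libpysal and esda packages are available (the file and column names below are made up), the data can stay in a standard GIS format while the statistics run in an external package.

```python
# Hedged sketch only: file name and attribute column ("income") are hypothetical.
import geopandas as gpd
from libpysal.weights import Queen
from esda.moran import Moran

gdf = gpd.read_file("census_tracts.shp")     # polygons kept in a standard GIS format
w = Queen.from_dataframe(gdf)                # contiguity-based spatial weights
w.transform = "r"                            # row-standardize the weights

mi = Moran(gdf["income"].values, w)          # global spatial autocorrelation
print(f"Moran's I = {mi.I:.3f}, pseudo p-value = {mi.p_sim:.3f}")

# Results go back into the GIS format for mapping in ArcGIS/QGIS.
gdf["income_lag"] = w.sparse @ gdf["income"].values
gdf.to_file("census_tracts_with_lag.shp")
```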

When it comes to the nature of spatial data, two data models have been widely accepted, namely the object-based model and the field/raster-based model. Extensive sets of analysis tools have been developed for each. Data heterogeneity and the relations between objects are also taken into account through incremental improvements to these two models.

Exploratory data analysis and model-driven analysis have progressed hand in hand and complement each other: new and innovative visualization and exploration tools help us understand the data and the problem better, while software has evolved to perform the complex non-linear estimations required for model-based analysis.

However, statistics and GIS form an ever-evolving field, and newer methodologies and techniques developed every day push the boundary of cutting-edge research further and further. Newer challenges in statistical analysis include handling big data and community-generated spatial information. How these challenges are met will be very interesting to observe.

-Dipto Sarkar

accurate maps?

Saturday, February 2nd, 2013

Here’s a link to an index of maps of novels: http://io9.com/5980739/an-index-of-dozens-of-maps-from-epic-fantasy-novels?tag=this-is-awesome

I think maybe some would say ‘art’… But I say maps. It’s about representing the geography of the story told in a book. What do you guys think?

S_Ram

The progress of geovisualization; still lots of work to be done

Friday, February 1st, 2013

In MacEachren and Kraak’s article titled “Research Challenges in Geovisualization”, they argue that the main issues in geovisualization lie in representation, integration with knowledge construction and geocomputation, interface design, and cognitive-usability issues. The role of geovisualization has changed drastically within the last 10 years, along with developments in other software and technologies such as Web 2.0.


Representing this dynamic data is a far cry from paper maps, where the database and the map were static and bound together. Another challenge that has surfaced in representation is the geovisualization of 3D and 4D (time) space; this challenge is still being addressed as the technology grows more powerful. Great strides have been made with interfaces: ten years ago many tools were usable only by experts, and now everyone and their grandma are using them.


Many of these problems have not yet been solved, but some have. Storing the data is no longer as much of a problem thanks to cloud storage, and Web 2.0 tools have become much more user-friendly, usable even by people with limited expertise.


Pointy McPolygon