Geographic Information Science- Goodchild (1992)

September 14th, 2015

Goodchild’s (1992) article is centered on the fear that unless GIS makes the transition from being considered a system to a science, it will soon fade away as another technological fad.  One of Goodchild’s main concerns at the time was that GIS, though inherently useful, was restrained by its problematic integration into other fields (e.g., spatial analysis).  He attributed this to a focus on data management rather than analysis—a result of the lack of motivation to develop the necessary technology due to the “lucrative” yet “unsophisticated” needs of GIS in the commercial world (1992: 38)—and also to the sheer obscurity of spatial analysis as a technique.  Today, spatial analysis is available as an extension of ArcMap and is fundamentally a part of GIS as I know it.

Though Goodchild’s article was riddled with questions that were unanswered at the time, I think he played an integral role in developing the case for GIScience by highlighting how multidisciplinary the field really is.  Spatial analytics is a good example: with the wisdom of hindsight, I believe this specifically was an important point to broach.  Bearing in mind that the system-vs.-science debate is still ongoing, I think establishing a concrete tie to analytics was beneficial for those making the case for GIS as a science.  Goodchild, an avid member of the academic community, recognizes academia’s requirement of a certain level of ‘rigor’ for a field to be considered a science, and spatial analytics, now at the heart of GIS, absolutely brings this edge.

 

-BannerGrey

 

GIS: Tool or Science? – Wright

September 14th, 2015

Over the past twenty years, opinions on what “doing GIS” entails have been debated. In “GIS: Tool or Science?” (1997), Wright categorized “64 postings from 40 individuals” on the GIS-L forum into three groupings of what “doing GIS” means (346-347). Regardless of whether GIS should be defined as a tool, a toolmaker, or a science, Wright presents all three positions in an unbiased way, allowing readers to form their own opinions about the debate.

Wright mentions that there is not necessarily one ultimately correct or authoritative position; rather, the entire “continuum” between tool and science should be equally acknowledged because each viewpoint uses GIS for different purposes (358). Hence, maybe it does not even matter whether “doing GIS” is correctly defined; maybe there should instead be a consensus that GIS can be applied in different ways and accepted for different reasons.

Although the debate is still occurring, I believe the article is dated because technological advancements within the past twenty years have been tremendous. With smartphones, apps, and improvements in software, new GIS applications have been introduced, drawing a much larger group of people into GIS. It would be interesting to compare how Wright’s subsets define GIS with how people conceptualize GIS today.

-MTM

 

Twenty Years of Progress: GIScience in 2010

September 14th, 2015

Goodchild (2010) provides an overview of geographic information science (GIScience) and its development as a discipline over the past twenty years (1990-2010). He then opens up discourse on how GIScience can be improved and applied in the future. What interested me most was not necessarily GIScience’s accomplishments as a discipline, but rather how its theorizing elements can be applied and improved for the future, especially with technology constantly changing. For instance, Goodchild questions how GIScience will handle the large quantities of data being produced by new devices. He additionally questions how security will be managed. I admire Goodchild’s ability to raise unanswered questions because it shows that there are still many issues in GIS that need to be addressed. His discourse on the “role of the citizen” also piqued my interest because I would like to research how local people in underdeveloped countries and cities can contribute data (13-14).

What I think is unique about modern GIScience is its ability to mix with other disciplines (i.e. “geography, computer science, or information science”); however, contrary to what Goodchild states, I believe it is for this very reason that GIScience is not “well-defined” (16). Since it crosses so many disciplinary boundaries, it is hard for it to establish itself as its own entity. I do not doubt the importance of GIScience, but it is still not that well known because, in my opinion, spatial awareness is not a distinct enough feature to set it apart from other disciplines. It is especially hard to maintain a solid definition of GIScience given how much it is evolving with technological advancements. For these reasons, I see GIScience as complementary to larger disciplinary fields (just as GIS tool applications can be used to help support research).

-MTM

 

GIS: Tool or Science?

September 14th, 2015

Wright et al. (1997) take up the debate within the GIS-L community as to whether GIS should be regarded as a tool or as a science.  In doing so, Wright et al. also delve into deeper topics, such as defining what “doing science” means.  They identify three positions on a sliding spectrum between tool and science: GIS as a tool, GIS as toolmaking, and GIS as a science.

My experience with GIS thus far has been to use it as a tool for answering geography-related questions; nevertheless, I fully understand and accept the view of GIS as a science when looking at research within the field.  Encountering this dichotomy in my everyday work suggests the importance of how GIS is used on a case-by-case basis.  While Wright et al. do address this, I believe it was underemphasized in the article (and similarly in the debate), as it is essentially the basis of the argument for a necessary shift from “black-and-white” descriptions to “fuzzier continua” for conceptualizing GIS (1997: 358).

Secondly, this article brought to my attention the practice of regarding something as a science simply as a means of maintaining “academic legitimacy” (1997: 354).  Not all usages of GIS will be regarded as science; however, this should not devalue it as a discipline.  I think the dilemma Wright et al. present is extremely important to challenge, and not only in the fields of geography and GIS.  The idea that something is automatically more reputable because we somewhat arbitrarily categorize it as “doing science” means we are likely limiting our advancement as a society, and I applaud Wright et al. for raising this issue.

-BannerGrey

 

TGIS back then…

December 2nd, 2014

In his article, Yuan criticizes temporal GIS for not properly supporting spatio-temporal modeling. He argues that temporal database systems incorporate time merely by applying time-stamping techniques to tables, attributes, or values, and therefore fall short of a proper treatment of time. I find this article interesting yet inappropriate.

The article is interesting in the sense that it dates from the late 90s (judging from its citations, which range from the late 80s to the mid 90s), when the technology to address such problems in temporal GIS did not yet exist; Yuan was nevertheless able to point them out, and one can only imagine the frustration when the methods could not be appropriately applied and the technology was not yet apt to incorporate the temporal attribute. This leads to the reason I call the article inappropriate: it could mislead some readers into thinking that temporal GIS remains undeveloped even today, whereas, at least from a technological perspective, it is more than ready to be applied and is being applied currently, thanks to advances in technology and crowd-sourcing. As Yuan mentions, “GIS will be more effective and precise by representing a fire, fire location, and how the fire spreads rather than burned areas at each point in time.” To relate to this example: nowadays, fire departments and natural scientists closely monitor and track forest fires using temporal GIS. On the other hand, I find that the burned areas at each point in time eventually provide the evidence of how a fire spreads; it is simply that today’s technologies facilitate making such connections, whereas this was impossible about twenty years ago. Hence, the article may easily mislead or even confuse people who are learning and accustomed to today’s GIScience technologies, especially those without further background knowledge of TGIS.

-ESRI

Temporal GIScience

December 2nd, 2014

 

It is often said that space, place, and time are three fundamental foci of geographical inquiry. GIScience shares the spatial outlook, the significance of place, and the temporal way of thinking; however, the latter is relegated more often than not. Yuan (****) notes that many geographic information systems lack the tools needed to conduct spatio-temporal analyses. This presents a significant research gap in the realm of GIScience.

Yuan highlights that temporal datasets are subject to a unique set of problems intrinsic to the factor of time. Modeling and visualization are particularly challenging. Yuan points out that time is not a singular structure: there are linear, cyclic, parallel, and branching structures of time. Additionally, time occupies multiple dimensions, corresponding to validity, transaction, user-definition, and institutions. The latter set of examples was an entirely new set of concepts for me, and I would have appreciated an unpacking of them. Yuan’s acknowledgement of temporal complexity was certainly required; however, the diverse features of time come across as an afterthought in this paper.
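To unpack one of those dimensions for myself: the difference between valid time (“when was this true in the world?”) and transaction time (“when did the database learn it?”) can be sketched as a bitemporal record. This is my own minimal illustration, with invented field names, not code from Yuan’s paper:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class BitemporalFact:
    """A land-parcel attribute tracked along two time dimensions."""
    parcel_id: str
    land_use: str
    valid_from: date    # when the fact became true in the real world
    valid_to: date      # when it stopped being true (far-future = still true)
    recorded_on: date   # when the database learned of it (transaction time)

# The parcel became residential in 2010, but the registry
# only recorded the change in 2012:
fact = BitemporalFact("P-42", "residential",
                      date(2010, 1, 1), date(9999, 12, 31),
                      date(2012, 6, 15))

# "What did we believe on 2011-01-01?" uses transaction time;
# "what was true on 2011-01-01?" uses valid time. Here the two
# questions give different answers, since the change was true
# in 2011 but not yet recorded.
believed_in_2011 = fact.recorded_on <= date(2011, 1, 1)   # False
true_in_2011 = fact.valid_from <= date(2011, 1, 1) <= fact.valid_to  # True
```

A single timestamp per record cannot distinguish these two questions, which is one reason the time-stamping techniques Yuan criticizes fall short.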

Temporal GIS is certainly an intriguing subfield of GIScience; however, there seems to be a steep theoretical learning curve. Unfortunately, this article did not make my introduction to temporal GIS any easier.

-BCBD

Temporal GIS

December 1st, 2014

In “Temporal GIS and Spatio-Temporal Modeling”, Yuan describes the different issues of temporal GIS.  The main problem is that geographic information systems were built off of the tenets of cartography, and still follow the logic of static maps. Many different data models are presented in this paper, and whilst all of them have benefits, none of them can be applied universally. Researchers have been building models for their specific needs, and therefore they don’t fit most datasets. The article, as well as the field of TGIS, is convoluted and there is no clear answer. Yuan shows that none of the models answer the needs of spatiotemporal analysis completely.

Time seems like such an obvious factor to consider in any kind of analysis, yet it is still hard to incorporate into classic data models for GIS and computer science. Creating a working temporal model for geographic information systems should be a top priority for everyone involved in the field.

Surely the best method for spatiotemporal analysis cannot be comparing different layers manually. Temporal GIS will revolutionize the study of processes, changes, trends, flows, and more.

-IMC

 

Temporal G.I.S.

December 1st, 2014

The article, Temporal G.I.S. and Spatio-Temporal Modeling by May Yuan, was a challenging introduction to the field of temporal G.I.S. It was one of the readings I have enjoyed the least so far, since I felt that the author did not do a great job of clearly explaining the concepts or the diagrams provided. Perhaps it will become clearer after tomorrow’s presentation. Through reading the article, it became clear that it is a major challenge to successfully implement temporal aspects into a G.I.S. This was interesting because it reminded me of the article on geovisualization, which discussed Minard’s map of Napoleon’s campaign into Russia. There was a strong temporal characteristic to that map, as well as to subsequent versions. Clearly, geographers have been occupied with combining temporal and spatial data for a long time. It is also clear that the problems and issues associated with doing so have not been solved, which is fascinating (although the technical challenges associated with G.I.S. are presumably more complex). Time is a very important variable to take into account for any geographic process, and being able to incorporate it successfully into a G.I.S. would provide a boost to all the fields of G.I.Science which rely on temporal analyses (which is a lot of them).

-Benny

(Not-so-big) BIG DATA

December 1st, 2014

Crampton et al. (2013) Beyond the geotag: situating ‘big data’ and leveraging the potential of the geoweb.

 

Crampton et al. (2013) present an analytical study of the intersection of big data and GIScience via the medium of geolocated tweets. The authors recognize that atheoretical analyses of the geoweb are limiting the expansion of GIScience research and identify five ways to remedy this shortcoming.

Using the tweets accumulated through the #lexingtonpolicescanner (also #LPS), Crampton et al. assessed the spatio-temporal phenomenon of data production. They observed spatial and temporal spikes corresponding to how the event evolved over time and through the channels of social media.

Although the bridge to geography and GIScience is established, the connection to big data is much more tenuous. Crampton et al. remark that their analysis “emphasize[d] that the informational richness of these data is often lacking” in typical big data analyses—those which focus on issues of volume, velocity, and statistical complexity (2013: 137). Perhaps this is my own naïveté when it comes to the subject of big data; however, from my understanding, the lack of informational richness (due to noise), unfathomable data quantities, and overwhelming speed of dataset creation are the definitive features of big data. Big data is supposed to challenge the conventional methods of statistical analysis due to these unprecedented conditions. I doubt 12 590 tweets would constitute a ‘big’ dataset. Indeed, Crampton et al. repeatedly state that their analysis may not qualify as ‘big data’. I suspect the authors were caught up in the hype of big data. If true, this very article would exemplify the phenomenon of ‘trend spam’, through which unrelated—or loosely related—subjects ride along in the wake of a popular label.

One of the more intriguing findings of this article is that certain agents, particularly popular websites, may act as catalysts in the (re)production of data. Crampton et al. note that publicity on Mashable.com created a noticeable spike in the use of the #LPS hashtag. It would seem that Tobler’s first law of geography is applicable to both physical and virtual space.

 

-BCBD

 

Big Data

December 1st, 2014

“Beyond the geotag: situating ‘big data’ and leveraging the potential of the geoweb” describes the possibilities of advancement in the geoweb with big data analysis, by using the example of the #LexingtonPoliceScanner trending topic on Twitter.

One of the most important things I found in this article is how inaccurate the spatial data can be. As the authors point out, the location of the user is usually self-reported and can be a fictional place. Moreover, even if users correctly identify their home location, it does not necessarily reflect their location at the time the information was sent. However, we know that most apps record the user’s location whether they choose to post it or not; it should therefore be possible to obtain more precise information.

This paper reflects the need for GIScience to move away from static points on maps. The three maps representing tweets at different times in the night could be helped by a functioning temporal GIS and geovisualization.

It is very relevant that this is one of the last papers of the semester, as it ties together many of the fields studied in GEOG 506.

-IMC

 

#BIGDATA

December 1st, 2014

Using a case study that examines a stream of tweets related to late-night celebration events following the University of Kentucky Wildcats’ NCAA championship victory, Crampton et al. challenge us to look beyond the obvious when dealing with geotagged big data. Although the #LexingtonPoliceScanner hashtag identified the tweets related to the incidents that took place that night, a simple map showing the hotspots of Twitter traffic over space and time merely scratches the surface of what is possible, leaving many unanswered questions and multiple unexplored avenues.

It is important to note that this example featured user-generated content, introducing the risk of false information and of information repeated through retweets: in essence, noise. The location of each tweet is unclear, as less than 1% of tweets were posted with GPS coordinates, exposing a gaping hole of potentially misleading information; location could, however, be derived from the user-defined location field. While the advent of big data is a sea of opportunity, I would echo the authors’ caution with regard to the confidence with which the information can be used. Before any data are used, patterns and trends need to be extracted through data mining, a process that the authors argue can be enhanced by adding ancillary data to inform the data drawn from the big data source.

While the article clearly demonstrates the ways in which big data requires a broader look in order to tackle the research questions it can answer, I question the choice of the case study presented as an illustration. I find that other examples would be far more appropriate in representing what big data really entails.

The example did, however, provide the opportunity to present the concept of social networks and the idea of measuring distance not only over physical space but also through a social network structure. Social networks such as Twitter offer easy access to UGC data that moves through the network structure. I do wonder what potential would lie in another example dataset removed from the category of geotagging.

– Othello

 

Time After Time

December 1st, 2014

Upon reading Yuan’s work on temporal GIS and spatio-temporal modelling, I have come to the realization that our current GIS systems are very much ill-equipped to handle the storage and processing of many kinds of spatio-temporal models. Spatial is special, yes, but all spaces are placed in time. We cannot ignore the fact that temporal is special too: everything happens in space and time. Most GIS data represent spatial information that is fixed in time, a slice of the full reality. Name me one process or object that hasn’t changed over time and I’ll bake you a cake of your choice.

The complexity of the six major types of spatio-temporal changes in geographic information spoke to my unfamiliarity with such concepts. After six years of experience with GIS, the subject of temporal GIS has not been addressed adequately enough to reflect the need for deeper development in this field. The need for further research is all too clear. Current GIS systems are incapable of modeling the temporal changes that most accurately describe phenomena such as forest fires or the development of storm systems, in which objects change and transform over space and time. The lack of support for such basic concepts echoes the incompatibility of GIS with other ways of knowing, and with the epistemology of the indigenous populations of northern Canada in particular.

The question remains: how will traditional GIS tools as we know them adapt and change to better address the potential that resides in these spheres, and, more importantly still, how will implementation take place in the field? The introduction of a new data model that allows for the efficient and effective temporal analysis of phenomena would cause an overhaul in data structures as we know them, but such changes would be well worth the effort. To continue as we currently do would be unfortunate.

With the advances in computational power that have led to rapid advances in geovisualization, the same tools should be directed towards the development of capability to handle all forms of temporal GIS.

– Othello

Big Data & Spatial Relationships

December 1st, 2014

I found this article, ‘Beyond the geotag: situating ‘big data’ and leveraging the potential of the geoweb’ by Crampton et al., to be an interesting introduction to the topic of Big Data. I thought it was fascinating how they attempted to progress past the simpler research methods previously used to analyse big data and build more robust and detailed frameworks. They made it clear that, while studying big data is useful, there is a need for more nuanced avenues of research which could help advance the field of big data and the geoweb. One idea I found interesting was the authors’ claim that there is often a paradoxical lack of data, or at least of value in the data, being analyzed. For example, only 34% of the tweets studied were geocodable by user-defined location information (pg. 134). They also note that people can use different location data or tweet about events that are not happening in their stated location. Examining how much of an impact this has on the accuracy of big data studies would be an interesting avenue of research.

On the topic of locational relationships, the maps provided on page 135 got me thinking about the potential for people to tweet/create content on the Internet about issues that happen a large physical distance away from them. Does this data still follow Tobler’s First Law of Geography? If a larger number of people in New York tweet about events in Ferguson than people in Ferguson, what sorts of implications does this have in terms of analyzing big data’s spatial patterns? With the power to consume information about distant events, does proximity even matter anymore? When studying big data, I would imagine that this would be an important question to answer.

 

-Benny

 

Temporal GIS

November 30th, 2014

May Yuan’s article regarding temporal GIS was an excellent introduction to the topic.  Displaying temporal data in a GIS is difficult to do effectively because of the platform’s focus on space and representation.  The two main methods of incorporating temporal data use either time-stamped layers or time-stamped attributes.  Both have their advantages and disadvantages.  Personally, I have used time-stamped raster layers to assess land use change, and it seemed effective, though upon reflection I would hardly say it is ideal.
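For my own reference, the difference between the two methods can be sketched in a few lines of Python (a simplified illustration with invented parcel names and land-use values, not code from any GIS package):

```python
from datetime import date

# Method 1: time-stamped layers -- a full snapshot of every
# feature is stored for each observation date.
layers = {
    date(2000, 1, 1): {"parcel_A": "forest", "parcel_B": "forest"},
    date(2010, 1, 1): {"parcel_A": "farmland", "parcel_B": "forest"},
}

# Method 2: time-stamped attributes -- each feature stores the
# history of its own attribute values, avoiding the duplication
# of unchanged features across snapshots.
attributes = {
    "parcel_A": [(date(2000, 1, 1), "forest"), (date(2010, 1, 1), "farmland")],
    "parcel_B": [(date(2000, 1, 1), "forest")],
}

def land_use(parcel, when):
    """Look up a parcel's land use at a given date (attribute method)."""
    history = [value for (stamp, value) in attributes[parcel] if stamp <= when]
    return history[-1] if history else None

print(land_use("parcel_A", date(2005, 6, 1)))  # forest
```

The layer method makes it easy to map the whole study area at one date, while the attribute method makes it easy to trace one feature's history, which is roughly the trade-off I encountered in my land use change work.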

Change over time is a complex concept, more so than change over space.  Does GIS’s emphasis on representing geographic data make incorporating temporal data impossible?  One thing is clear: improved temporal data capabilities are necessary if GIS is to be used to confront difficult issues.  The GIS Wiki identifies epidemiology, transportation planning, and disaster management as key areas for temporal GIS, but it could be used in many other fields: climate change, gentrification, urban decay, migration, better analysis of Big Data.  I would be interested in hearing in class how temporal GIS relates to geovisualization, as the two seem like they could be related.  I’m also sure that there are specific metadata issues associated with using temporal datasets.

GoOutside

Big Data

November 30th, 2014

Crampton’s article on Big Data, its limitations, and possible methods of improving the usability of the geoweb is certainly one of my favourite articles so far.  I found the article to be informative, entertaining, and well written.

In a class discussion earlier this semester we discussed whether Twitter and other social media platforms were creating usable, interesting data or simply creating ‘noise’.  This article discusses some of these very issues.  How does one find usable data when ‘bots’ are hijacking trending topics to sell iPhones? Is any of the data usable anyway? The fact that Twitter is largely comprised of retweets led the authors to conclude that their dataset was fairly thin.

I think the authors made a good point in noting that observations cannot be generalized to society at large because of the very small usage of Twitter, and the small proportion of Twitter users who actually tweet.  The lack of explanatory power within some of these very large datasets was also identified as a major restriction of using ‘Big Data’, specifically Twitter.

The suggestion of using the geoweb as ancillary data seemed reasonable given the limitations of many Big Data datasets.  To me, the most interesting representation of the LPS data was the tweets-over-time graph.  I would be really interested in seeing how other trending topics peak and fall over time.

GoOutside

Big Data

November 27th, 2014

I really enjoyed “Beyond the geotag: situating ‘big data’ and leveraging the potential of the geoweb” by Crampton et al. (2013). It explores the possibilities and limitations associated with the analysis of geoweb-generated Big Data. The authors advocate for a more comprehensive ontology of space, one that goes beyond xy coordinates to include temporal and relational dimensions, as well as a more complete understanding of geoweb-generated data that includes ancillary non-human-produced data and non-user-generated data.
This article debunks the promise of the explanatory potential of Big Data analytics and stresses the importance of the choice of data and analytical techniques in providing meaningful findings. Moreover, the authors emphasize that Big Data created from the geoweb represents information from a small subset of the population, undermining the external validity of findings and potentially marginalizing ‘unplugged’ individuals.
The implications for GIScience are significant: How do we realize the potential of geospatial Big Data? What are the societal implications of Big Data analytics? How do we create metadata for Big Data efficiently and comprehensively? How do we effectively geovisualize Big Data to both explore and represent it?
Geospatial Big Data gives rise to questions that build upon those generated by ‘small data’ analytics. Issues of quality are still relevant and are not supplanted by large volumes of data. GIScience will evolve rapidly and will likely intersect with other disciplines to fully realize the potential of Big Data.

Fan_G

Temporal GIS

November 27th, 2014

“Temporal GIS and Spatio-Temporal Modeling” by Yuan explores the development of temporal GIS and its applicability to support spatio-temporal modeling, highlighting the challenges associated with temporal modeling of spatial information and pointing to avenues for future research.
Before reading this article, I had not fully appreciated the complexity and difficulty of creating a temporal GIS. The importance of creating a database structure that effectively anticipates the types of spatio-temporal questions to be investigated was well emphasized. While this paper focused on exploring spatio-temporal physical phenomena, similar issues are relevant in the investigation of human or behavioral phenomena, such as spatial social network analyses, migration patterns over time, changes in urbanscapes, etc.
The relationships of temporal GIS with other topics in GIScience are numerous. For example, ontologies of time and space need to be properly established to construct a good temporal GIS that facilitates data analysis and enables interoperability between datasets. Moreover, concerns over geospatial metadata are compounded when you consider geospatial temporal data. Is temporal information captured in the metadata, or is it incorporated in the data structure (and is such a distinction between data and metadata even meaningful in this context)? Issues of temporal scale also become important: to what level of precision is temporal data recorded, and what are the implications of temporal aggregation in accurately conveying the temporal dimension of a phenomenon?
The questions that arise when exploring the temporal component of spatial phenomena emphasize the scientific dimension of GIS and highlight the importance of studying GIS as a science instead of as a tool.

Fan_G

GIS Implementation

November 25th, 2014

When institutions adopt a new technology, there are bound to be issues associated with that technology’s implementation.  This paper described two instances in which different institutions began using GIS as a decision-making aid.  Despite using an identical technology, the two government organizations had very different approaches to integrating their new tool.

The North County’s organization-wide adoption and promotion of the new tool sought to ensure all employees were fluent in the use of ArcInfo, which contributed to the software’s success in that county.  Furthermore, the North County’s employees had a history of working with geospatial data, making the new software an easy transition.  In the South County, however, the software was used only by a very small group of computer scientists and analysts who were unfamiliar with geospatial data.

I think that when adopting any new process or technology, significant attention has to be paid to how well an organization is structured to handle change.  In the North, analysts were used to changes in how geospatial data would be handled.  In the South, a rigid system overly reliant on one group of computer specialists resulted in a missed opportunity to have a staff team better trained in the handling of geographic data.

As this paper was written almost twenty years ago, I would be interested in knowing how other organizations have implemented GIS in more recent years.  I assume, though I could be incorrect, that as younger, more computer-savvy people enter the workforce, the barriers to the implementation of certain technologies will decrease.

GoOutside

G.I.S. Implementation and Society

November 25th, 2014

I found this article, Organizational context, social interpretation, and the implementation and consequences of geographic information systems, by Sahay and Robey, to be a fairly interesting read. I was first intrigued during the introductory sections of the paper, when they discussed the theoretical foundations of interpretation and implementation (pg. 258). I was imagining a discussion pertaining to larger social and cultural groups, for example between Western and other cultures. Instead, the study focused on the implementation differences between counties, a much smaller-scale difference. While it now seems logical, I wasn’t aware that there would be such a noticeable difference between two entities which I would have thought were homogeneous, especially in terms of implementing a new technology.

It would be interesting if the issues that the authors noticed at this scale (e.g., funding disparities between departments, lack of technical expertise and training) could also be ones that societies as a whole would face if examined at a societal level. For example, if only a few educated individuals in a society possess the knowledge required to operate G.I.S., would that create conflict or have negative impacts on the progress of the society in general? If one has to pay for G.I.S. services, would that influence who uses them, and would certain groups get the proverbial short end of the stick? I think this would be an interesting avenue of research. I imagine it would be very difficult to study, however.

-Benny

GIS implementation

November 25th, 2014

In “Organizational Context, Social Interpretation, and the Implementation and Consequences of Geographic Information Systems”, Sahay & Robey (1996) analyse the implementation of Geographic Information Systems (GIS) technologies in two unidentified North American counties. The result of this implementation was closely tied to each county’s context, specifically its “structure” and “capability”.

In the North County, the “unified structure facilitated the implementation of GIS as a distributed system because the departments shared a common mission” (266). In the South County, because of its decentralized agency structure, the Office of Computer Services controlled the GIS and worked on a contractual basis for the users in other departments. Moreover, the users in the North County included many geographers who were used to working with spatial data. This was not the case for the Computer Services personnel in the South County. The structure and capability of the North County allowed its GIS implementation to flourish, while their absence caused the South County’s to falter.

Because people’s backgrounds affect how they use GIS — “People with data processing backgrounds emphasized technical standards, controls, formats, and product appearance in terms of menus and screen layouts. […] Users were more interested in the capabilities of the system to support their needs” (271) — and GIS implementations have different results depending on the context, it seems that standards are especially important. Is it possible that the different counties could yield such different data that they would be incompatible?

Given that this article was written 18 years ago, I wonder how GIS implementation has changed. With the average person becoming more tech-savvy and commercial GIS packages becoming more user-friendly, are GIS implementations easier to carry out? Do the same factors still affect implementation?

-IMC