Kuhn reading and developing a new Ontology

Kuhn’s article is critical of existing ontologies for focusing too heavily on objects and their attributes. He details the development of a new ontology that embraces processes and actions in addition to objects, and he advocates formalizing such a system to better represent the geo-spatial component of phenomena. As an example, he uses a German traffic code as a case study to demonstrate innovations in ontology structure.

A contentious point in this article was Kuhn’s assertion that teaching programming, engineering, and ontology-generating tools to domain experts is useless because these tools are incompatible with the theories. I believe that familiarizing domain scientists with these tools and engaging them in critical review (much like what this seminar class is doing) exposes the built-in assumptions and limitations of an ontology that originates in computer science and information systems. Furthermore, contemporary neo-geographers and the geospatial web are blurring the distinction between geographers and programmers: intuitive tools such as search engines, and even programming languages that increasingly approximate natural language, are emerging.

There is merit in Kuhn’s call for a shift in how data are structured so that the geographic component of geographical data is better represented. However, I believe that creating a new system from scratch is entirely too abstract. Much as our understandings of things change over time, is it not possible to let our ontologies change organically and incrementally? Such a transition would allow more recycling of collected data and more exchange of dialogue, and it is probably more feasible.

– Madskiier_JWong

One Response to “Kuhn reading and developing a new Ontology”

  1. sieber says:

    You hit on a particular challenge to ontologists. Theirs is a very top-down, labour-intensive process of interviewing numerous domain experts. Neogeographers and others come in and say that it can all be aggregated from the hundreds of millions of contributions on the web and from the way that non-experts tag content. These neogeographers, and Google for that matter, contend that this is a completely bottom-up procedure. However, what happens when dominant, non-scientific views are abundant on the web? What meaning is extracted? That could be a problem.