I found this article, “Metadata and data catalogues” by Guptill, to be an interesting, if somewhat tedious, read. Section 2.6 in particular was a little dry and technical, but overall the author kept the text engaging. One idea that caught my attention was the author's discussion of the practical limits facing metadata collection. At the time of writing, this limit was “often reached by the effort required to collect the information” (pg. 680). With technological advancements, perhaps in the field of spatial cyberinfrastructure, this limit could be surpassed and metadata collection might improve. On the other hand, if metadata could be associated with every individual piece of data or attribute, the end user might be inundated with unnecessary noise, hindering the process rather than helping it. I would imagine a balance needs to be struck somewhere along the way.

Another theme I thought was important in this paper was standardization and interoperability. Since there seemed to be many different structures for metadata, as well as efforts to consolidate these disparate activities, adopting similar ontological approaches would probably be an important step toward successful interoperability. If one metadata format defined its dataset information differently than another format did, it would be difficult to “create crosswalks”, as the author puts it (pg. 684), between metadata standards. Perhaps there have been efforts in subsequent years to standardize these ontologies.
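The "crosswalk" idea can be made concrete with a small sketch in Python. The two schemas and all field names below are invented for illustration, not drawn from any actual metadata standard; the point is simply that when two schemas define a concept differently, or one lacks it entirely, the translation is lossy.

```python
# A minimal sketch of a metadata "crosswalk": translating records from one
# (hypothetical) metadata schema to another. All field names are invented.

# Mapping from source-schema fields to target-schema fields.
CROSSWALK = {
    "dataset_title": "title",    # both schemas agree on this concept
    "originator": "creator",
    "abstract": "description",
    # "spatial_extent" has no counterpart in the target schema,
    # so that information is simply lost in translation.
}

def crosswalk(record: dict) -> dict:
    """Translate a record into the target schema, dropping any fields
    the crosswalk table cannot map."""
    return {CROSSWALK[k]: v for k, v in record.items() if k in CROSSWALK}

source_record = {
    "dataset_title": "County Roads 1998",
    "originator": "State GIS Office",
    "spatial_extent": "-104.05,41.0,-95.3,43.0",
}

print(crosswalk(source_record))
# {'title': 'County Roads 1998', 'creator': 'State GIS Office'}
```

Note how the spatial extent silently disappears: if the two formats do not share an ontology for a concept, no mapping table can preserve it, which is presumably why the author stresses similar ontological approaches as a precondition for crosswalks.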

As an aside, reading this article reminded me of this xkcd comic:


