This article by S.C. Guptill is a well-grounded introduction for readers with little or no background in metadata. It provides basic definitions, describes the contexts in which metadata is required, and explains its content and importance thoroughly. It also explains how metadata can be collected, and smoothly raises the issues surrounding, and the importance of, metadata standardization. At this point, readers who are completely new to computer science may struggle with a couple of terms, but the article remains understandable as long as Google and Wikipedia are at hand, just as they were for me.

When the article turns to data clearinghouses, it underlines once more the importance of standardized, high-quality metadata, using the ‘ice and Colorado’ example and the polished interface of the Master Environmental Library (MEL) of the US Department of Defense. This part of the article shows how a user can specify the desired content in detail to minimize the noise in query results. However, even the prettiest interface is useless if the metadata behind it is poorly structured or lacking in content.
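To make that last point concrete, here is a minimal sketch in Python, my own illustration rather than anything from the article or MEL's actual system, of why structured metadata fields reduce query noise. The record fields and catalog entries are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class MetadataRecord:
    title: str
    theme: str   # thematic keywords, e.g. "snow and ice cover"
    place: str   # geographic name, e.g. "Colorado"

# Hypothetical clearinghouse catalog entries.
CATALOG = [
    MetadataRecord("Rocky Mountain snowpack", "snow and ice cover", "Colorado"),
    MetadataRecord("Gulf Coast rice paddies", "agriculture rice", "Louisiana"),
    MetadataRecord("Iceberg drift tracks", "iceberg drift", "North Atlantic"),
]

def search(catalog, theme=None, place=None):
    """Match queries against structured fields only, so a search for
    'ice' and 'Colorado' hits the snowpack record without dragging in
    'rice' or 'iceberg' the way a free-text substring search might."""
    hits = []
    for rec in catalog:
        # Keyword-level match on the theme field, not a substring match.
        if theme and theme not in rec.theme.split():
            continue
        if place and place != rec.place:
            continue
        hits.append(rec)
    return hits

print([r.title for r in search(CATALOG, theme="ice", place="Colorado")])
# -> ['Rocky Mountain snowpack']
```

The keyword-level match against well-defined fields is exactly what a well-structured record buys; without it, a clearinghouse has nothing better to offer than noisy free-text search, no matter how pretty its interface.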
Metadata, and the issue of its standardization, grows ever more prominent as the volume of data, especially geospatial data, expands daily. As technology advances, processing times are expected to keep shrinking while efficiency rises; for that to happen, the quality of metadata and the ability to find the appropriate data are likely to take on increasing importance. However, no matter how obvious or “logical” a course of action may seem, little progress will be made unless someone finds a way to produce capital out of it.
ESRI