Metadatabase
Metadatabase is a database model for (1) metadata management, (2) global query of independent databases, and (3) distributed data processing.[1][2][3][4][5] The word metadatabase itself is a new coinage. Originally, metadata was simply a common term for "data about data", such as tags, keywords, and markup headers. In this technology, however, the concept of metadata is extended to include such forms of data and knowledge representation as information models (e.g., relations, entity-relationship models, and objects), application logic (e.g., production rules), and analytic models (e.g., simulation, optimization, and mathematical algorithms). When the metadata consists of analytic models, the metadatabase is also referred to as a modelbase.[6]
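For illustration only, the following Python sketch shows how each of the three classes of metadata named above might be captured as structured records; the field names and structures are hypothetical and do not reflect any published meta-relation schema.

```python
# Hypothetical records illustrating the three classes of metadata.

# Information model: an entity and its attributes described as data.
entity_meta = {"entity": "Order",
               "attributes": ["order_id", "date", "customer_id"]}

# Application logic: a production rule captured as a condition/action pair.
rule_meta = {"rule": "r_backorder",
             "condition": "Order.quantity > Inventory.on_hand",
             "action": "set Order.status = 'backordered'"}

# Analytic model: a reference to an optimization model and its parameters.
model_meta = {"model": "lot_sizing", "type": "optimization",
              "parameters": ["demand", "setup_cost", "holding_cost"]}
```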
These classes of metadata are integrated through a modeling ontology[7] that gives rise to a stable set of meta-relations (tables of metadata). Individual models are interpreted as metadata and entered into these tables; as such, models are inserted, retrieved, updated, and deleted in the same manner as ordinary data in an ordinary (relational) database, as sketched below. Users also formulate global queries and requests for processing of local databases through the Metadatabase, using the globally integrated metadata. The Metadatabase structure can be implemented in any open technology for relational databases.
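As a minimal sketch, assuming a relational implementation and two hypothetical meta-relations (entity and attribute) standing in for the ontology-derived tables, the following Python/SQLite example shows an information model being stored and retrieved as ordinary rows:

```python
import sqlite3

# Hypothetical meta-relations; the actual tables are derived from the
# modeling ontology cited above, not from this example.
con = sqlite3.connect(":memory:")
cur = con.cursor()

# Meta-relations are ordinary relational tables whose rows describe models.
cur.execute("CREATE TABLE entity (ent_name TEXT PRIMARY KEY, application TEXT)")
cur.execute("CREATE TABLE attribute (attr_name TEXT, ent_name TEXT, "
            "FOREIGN KEY (ent_name) REFERENCES entity(ent_name))")

# Insert a local application's information model as metadata rows.
cur.execute("INSERT INTO entity VALUES ('Order', 'order_processing')")
cur.executemany("INSERT INTO attribute VALUES (?, ?)",
                [("order_id", "Order"), ("date", "Order"), ("customer_id", "Order")])

# The model is retrieved (or updated and deleted) with ordinary SQL, just like data.
for (attr_name,) in cur.execute(
        "SELECT attr_name FROM attribute WHERE ent_name = 'Order'"):
    print(attr_name)

con.close()
```

A global query facility would consult such meta-relations to determine which local databases hold the requested items before dispatching sub-queries to them.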
Significance
The Metadatabase technology was developed at Rensselaer Polytechnic Institute in Troy, New York, by a group of faculty and students (see the references at the end of the article), starting in the late 1980s. Its main contributions include the extension of the concept of metadata and metadata management, and an original approach to designing a database for metadata applications. These conceptual results continue to motivate new research and new applications. At the level of the particular design, its openness and scalability are tied to those of the particular ontology proposed: it requires reverse-representation of application models in order to store them in the meta-relations. In theory the ontology is neutral, and it has been proven in some industrial applications.[8] However, more development is needed to establish it as an open technology for the field. The requirement of reverse-representation is common to any global information integration technology; one way the Metadatabase approach facilitates it is to distribute a core portion of the Metadatabase at each local site, allowing for peer-to-peer translation on the fly.
References
- Hsu, C., Bouziane, M., Rattner, L. and Yee, L. "Information Resources Management in Heterogeneous, Distributed Environments: A Metadatabase Approach", IEEE Transactions on Software Engineering, Vol. SE-17, No. 6, June 1991, pp. 604-624.
- Babin, G. and Hsu, C. "Decomposition of Knowledge for Concurrent Processing," IEEE Transactions on Knowledge and Data Engineering, Vol. 8, No. 5, 1996, pp. 758-772.
- Cheung, W. and Hsu, C. "The Model-Assisted Global Query System for Multiple Databases in Distributed Enterprises," ACM Transactions on Information Systems, Vol. 14, No. 4, October 1996, pp. 421-470.
- Boonjing, V. and Hsu, C., "A New Feasible Natural Language Database Query Method," International Journal on Artificial Intelligence Tools, Vol. 20, No. 10, 2006, pp. 1-8.
- Levermore, D., Babin, G., and Hsu, C. "A New Design for Open and Scalable Collaboration of Independent Databases in Digitally Connected Enterprises," Journal of the Association for Information Systems, 2009.
- Hsu, C., Service Science: Design for Service Scaling and Transformation, World Scientific and Imperial College Press, 2009. ISBN 978-981-283-676-2, ISBN 981-283-676-4.
- Hsu, C., Tao, Y.-C., Bouziane, M. and Babin, G. "Paradigm Translations in Integrating Manufacturing Information Using a Meta-model: The TSER Approach," Information Systems Engineering, France, Vol. 1, No. 3, 1993, pp. 325-352.
- Cho, J. and Hsu, C. "A Tool for Minimizing Update Errors for Workflow Applications: the CARD Model," Computers and Industrial Engineering, Vol. 49, 2005, pp. 199-220.