Interoperability
Interoperability is a characteristic of a product or system whose interfaces are completely understood, allowing it to work with other products or systems, at present or in the future, in either implementation or access, without any restrictions.[1]
While the term was initially defined for information technology or systems engineering services to allow for information exchange,[2] a broader definition takes into account social, political, and organizational factors that impact system-to-system performance.[3] Interoperability therefore involves the task of building coherent services for users when the individual components are technically different and managed by different organizations.[4]
Types
If two or more systems use common data formats and communication protocols and are capable of communicating with each other, they exhibit syntactic interoperability. XML and SQL are examples of common data formats and protocols. Lower-level data formats also contribute to syntactic interoperability, ensuring that alphabetical characters are stored in the same format, such as ASCII or Unicode, in all the communicating systems.
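As a minimal illustration of syntactic interoperability, the sketch below shows two hypothetical systems exchanging a record through a shared XML structure; the class name, element names and values are illustrative assumptions rather than any real product's schema.

```java
// A minimal sketch of syntactic interoperability: two hypothetical systems
// agree on a common data format (XML) and element names, so a record written
// by one can be read by the other. All names here are illustrative.
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class SyntacticInteropDemo {

    // "System A" serializes its data using the agreed-upon XML structure.
    static String exportRecord(String id, String name) {
        return "<record><id>" + id + "</id><name>" + name + "</name></record>";
    }

    // "System B", written independently, can parse the record because both
    // sides share the same format and element names.
    static String importRecordName(String xml) throws Exception {
        DocumentBuilder builder =
                DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(
                new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        return doc.getElementsByTagName("name").item(0).getTextContent();
    }

    public static void main(String[] args) throws Exception {
        String wire = exportRecord("42", "Ada Lovelace");
        System.out.println(importRecordName(wire)); // prints "Ada Lovelace"
    }
}
```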
Beyond the ability of two or more computer systems to exchange information, semantic interoperability is the ability to automatically interpret the information exchanged meaningfully and accurately in order to produce useful results as defined by the end users of both systems. To achieve semantic interoperability, both sides must refer to a common information exchange reference model. The content of the information exchange requests is unambiguously defined: what is sent is the same as what is understood. The possibility of promoting this result by user-driven convergence of disparate interpretations of the same information has been the object of study by research prototypes such as S3DB.
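The following sketch is a simplified, hypothetical illustration of semantic interoperability: each system maps its local field names onto an assumed common reference vocabulary before exchange, so that what is sent is also what is understood. The vocabularies and mappings are invented for the example.

```java
// A minimal sketch of semantic interoperability via a shared reference model.
// The reference terms and both local vocabularies are hypothetical.
import java.util.HashMap;
import java.util.Map;

public class SemanticInteropDemo {

    // Hypothetical mapping from System A's local terms to the reference model.
    static final Map<String, String> SYSTEM_A_TO_REFERENCE =
            Map.of("dob", "birthDate", "surname", "familyName");

    // Hypothetical mapping from the reference model to System B's local terms.
    static final Map<String, String> REFERENCE_TO_SYSTEM_B =
            Map.of("birthDate", "date_of_birth", "familyName", "last_name");

    // Translate a System A record into System B's terms via the reference model.
    static Map<String, String> translate(Map<String, String> recordA) {
        Map<String, String> recordB = new HashMap<>();
        recordA.forEach((localKey, value) -> {
            String referenceKey = SYSTEM_A_TO_REFERENCE.get(localKey);
            if (referenceKey != null) {
                recordB.put(REFERENCE_TO_SYSTEM_B.get(referenceKey), value);
            }
        });
        return recordB;
    }

    public static void main(String[] args) {
        System.out.println(translate(Map.of("dob", "1815-12-10", "surname", "Lovelace")));
        // prints {date_of_birth=1815-12-10, last_name=Lovelace} (order may vary)
    }
}
```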
Cross-domain interoperability involves multiple social, organizational, political, and legal entities working together for a common interest and/or information exchange.[5]
Interoperability and open standards
Interoperability implies open standards ab initio, i.e. by definition. Interoperability implies exchanges between a range of products, or similar products from several different vendors, or even between past and future revisions of the same product. Interoperability may be developed post facto, as a special measure between two products, while excluding the rest, by using open standards. When a vendor is forced to adapt its system to a dominant system that is not based on open standards, the result is not interoperability but only compatibility.
Open standards
Open standards rely on a broadly consultative and inclusive group, including representatives from vendors, academics and others holding a stake in the development, that discusses and debates the technical and economic merits, demerits and feasibility of a proposed common protocol. After the doubts and reservations of all members are addressed, the resulting common document is endorsed as a common standard. This document is subsequently released to the public and henceforth becomes an open standard. It is usually published and available freely or at a nominal cost to any and all comers, with no further encumbrances. Various vendors and individuals (even those who were not part of the original group) can use the standards document to make products that implement the common protocol defined in the standard and are thus interoperable by design, with no specific liability or advantage for any customer for choosing one product over another on the basis of standardized features. The vendors' products compete on the quality of their implementation, user interface, ease of use, performance, price, and a host of other factors, while keeping the customer's data intact and transferable even if the customer chooses to switch to a competing product for business reasons.
Post facto interoperability
Post facto interoperability may be the result of the absolute market dominance of a particular product in contravention of any applicable standards, or of the absence of any effective standards at the time of that product's introduction. The vendor behind that product can then choose to ignore any forthcoming standards and not co-operate in any standardization process at all, using its near-monopoly to insist that its product sets the de facto standard by its very market dominance. This is not a problem if the product's implementation is open and minimally encumbered, but it may just as well be both closed and heavily encumbered (e.g. by patent claims). Because of the network effect, achieving interoperability with such a product is both critical for any other vendor that wishes to remain relevant in the market and difficult to accomplish, because of the lack of co-operation on equal terms with the original vendor, who may well see the new vendor as a potential competitor and threat. The newer implementations often rely on clean-room reverse engineering, in the absence of technical data, to achieve interoperability. The original vendor can provide such technical data to others, often in the name of 'encouraging competition', but such data is invariably encumbered and may be of limited use. Availability of such data is not equivalent to an open standard, because:
- The data is provided on a discretionary basis by the original vendor, who has every interest in blocking the effective implementation of competing solutions and may subtly alter or change its product, often in newer revisions, so that competitors' implementations are almost, but not quite, completely interoperable, leading customers to consider them unreliable or of lower quality. These changes can either not be passed on to other vendors at all, or passed on after a strategic delay, maintaining the market dominance of the original vendor.
- The data itself may be encumbered, e.g. by patents or pricing, leading to a dependence of all competing solutions on the original vendor and possibly directing a revenue stream from the competitors' customers back to the original vendor. This revenue stream is only a result of the original product's market dominance and not a result of any innate superiority.
- Even when the original vendor is genuinely interested in promoting healthy competition (so that it may also benefit from the resulting innovative market), post facto interoperability may often be undesirable, as many defects or quirks can be directly traced back to the original implementation's technical limitations. Although in an open process anyone may identify and correct such limitations, and the resulting cleaner specification may be used by all vendors, this is more difficult post facto, as customers already have valuable information and processes encoded in the faulty but dominant product, and other vendors are forced to replicate those faults and quirks even if they could design better solutions, for the sake of preserving interoperability. Alternatively, it can be argued that even open processes are subject to the weight of past implementations and imperfect past designs, and that the power of the dominant vendor to unilaterally correct or improve the system and impose the changes on all users facilitates innovation.
- Lack of an open standard can also become problematic for customers, as in the case of the original vendor's inability to fix a certain problem that is an artifact of technical limitations in the original product. The customer wants that fault fixed, but the vendor has to maintain that faulty state, even across newer revisions of the same product, because that behavior is a de facto standard and many more customers would have to pay the price of any break in interoperability caused by fixing the original problem and introducing new behavior.
Government
eGovernment
From an eGovernment perspective, interoperability refers to the ability of cross-border services for citizens, businesses and public administrations to work together. Exchanging data can be a challenge due to language barriers, different specifications of formats and varieties of categorizations, among many other hindrances.
If data is interpreted differently, collaboration is limited, takes longer and is inefficient. For instance, if a citizen of country A wants to purchase land in country B, the person will be asked to submit the proper address data. Address data in both countries includes full name details, street name and number, as well as a post code, but the order of the address details might vary. Within the same language, arranging the provided address data correctly is not an obstacle; across language barriers, however, it becomes increasingly difficult, and if the language requires other characters it is almost impossible unless translation tools are available.
Hence eGovernment applications need to exchange data in a semantically interoperable manner. This saves time and money and reduces sources of error. Practical applications are found in every policy area, whether justice, trade or participation. Clear concepts of interpretation patterns are required.
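The following sketch illustrates the address example above under assumed, simplified country formats: because the exchanged fields are defined semantically, each administration can render them in its own local order without losing meaning.

```java
// A minimal sketch of semantically defined address exchange. The shared field
// set and both country-specific layouts are hypothetical simplifications.
public class AddressInteropDemo {

    // Shared, semantically defined address fields (an assumed exchange model).
    record Address(String fullName, String street, String houseNumber,
                   String postCode, String city) {}

    // Country A renders: name, number + street, post code + city.
    static String formatCountryA(Address a) {
        return a.fullName() + "\n" + a.houseNumber() + " " + a.street() + "\n"
                + a.postCode() + " " + a.city();
    }

    // Country B renders: name, street + number, city + post code.
    static String formatCountryB(Address a) {
        return a.fullName() + "\n" + a.street() + " " + a.houseNumber() + "\n"
                + a.city() + " " + a.postCode();
    }

    public static void main(String[] args) {
        Address a = new Address("A. Citizen", "Main Street", "12", "1000", "Capital City");
        // The same structured data renders correctly in either country's layout.
        System.out.println(formatCountryA(a));
        System.out.println();
        System.out.println(formatCountryB(a));
    }
}
```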
Flood risk management
Interoperability is used by researchers in the context of urban flood risk management.[6] Cities and urban areas worldwide are expanding, which creates complex spaces with many interactions between the environment, infrastructure and people. To address this complexity and manage water in urban areas appropriately, a system of systems approach to water and flood management is necessary. In this context, interoperability is important to facilitate system of systems thinking in flood management, and is defined as: “the ability of any water management system to redirect water and make use of other system(s) to maintain or enhance its performance function during water exceedance events”. By assessing the complex properties of urban infrastructure systems, particularly the interoperability between the drainage systems and other urban systems (e.g. infrastructure such as transport), it could be possible to expand the capacity of the overall system to manage flood water towards achieving improved urban flood resilience.[7]
Military forces
Force interoperability is defined in NATO as the ability of the forces of two or more nations to train, exercise and operate effectively together in the execution of assigned missions and tasks. Additionally, NATO defines interoperability more generally as the ability to act together coherently, effectively and efficiently to achieve Allied tactical, operational and strategic objectives.[8]
At the strategic level, interoperability is an enabler for coalition building. It facilitates meaningful contributions by coalition partners. At this level, interoperability issues center on harmonizing the world views, strategies, doctrines, and force structures. Interoperability is an element of coalition willingness to work together over the long term to achieve and maintain shared interests against common threats. Interoperability at the operational and tactical levels is where strategic/political interoperability and technological interoperability come together to help allies shape the environment, manage crises, and win wars. The benefits of interoperability at the operational and tactical levels generally derive from the fungibility or interchangeability of force elements and units. "Technological interoperability" reflects the interfaces between organizations and systems. It focuses on communications and computers but also involves the technical capabilities of systems and the resulting mission compatibility or incompatibility between the systems and data of coalition partners. At the technological level, the benefits of interoperability come primarily from their impacts at the operational and tactical levels in terms of enhancing fungibility and flexibility.[9]
Public safety
Interoperability is an important issue for law enforcement, fire fighting, EMS, and other public health and safety departments, because first responders need to be able to communicate during wide-scale emergencies. It has been a major area of investment and research over the last 12 years.[10][11] Traditionally, agencies could not exchange information because they operated widely disparate, incompatible hardware.[12] Agencies' information systems, such as computer-aided dispatch (CAD) systems and records management systems (RMS), functioned largely in isolation, in so-called "information islands". Agencies tried to bridge this isolation with inefficient, stop-gap methods while large agencies began implementing limited interoperable systems. These approaches were inadequate and, in the US, the lack of interoperability in the public safety realm became evident during the 9/11 attacks[13] on the Pentagon and World Trade Center structures. Further evidence of a lack of interoperability surfaced when agencies tackled the aftermath of the Hurricane Katrina disaster.
In contrast to the overall national picture, some states, including Utah, have already made great strides forward. The Utah Highway Patrol and other departments in Utah have created a statewide data sharing network using technology from a company based in Bountiful, Utah, FATPOT Technologies.
The Commonwealth of Virginia is one of the leading states in the United States in improving interoperability and is continually recognized as a National Best Practice by the Department of Homeland Security (DHS). Virginia's proven practitioner-driven governance structure ensures that all the right players are involved in decision making, training and exercises, and planning efforts. The Interoperability Coordinator leverages a regional structure to better allocate grant funding around the Commonwealth so that all areas have an opportunity to improve communications interoperability. Virginia's strategic plan for communications is updated yearly to include new initiatives for the Commonwealth – all projects and efforts are tied to this plan, which is aligned with the National Emergency Communications Plan, authored by the Department of Homeland Security's Office of Emergency Communications (OEC).
The State of Washington[14] seeks to enhance interoperability statewide. The State Interoperability Executive Committee[15] (SIEC), established by the legislature in 2003, works to assist emergency responder agencies (police, fire, sheriff, medical, hazmat, etc.) at all levels of government (city, county, state, tribal, federal) to define interoperability for their local region.
Washington recognizes that collaborating on system design and development for wireless radio systems enables emergency responder agencies to efficiently provide additional services, increase interoperability, and reduce long-term costs.
This work saves the lives of emergency personnel and the citizens they serve.
The U.S. government is making an effort to overcome the nation's lack of public safety interoperability. The Department of Homeland Security's Office for Interoperability and Compatibility (OIC) is pursuing the SAFECOM,[16] CADIP, and Project 25 programs, which are designed to help agencies as they integrate their CAD and other IT systems.
The OIC launched CADIP in August 2007. This project will partner the OIC with agencies in several locations, including Silicon Valley. The program will use case studies to identify the best practices and challenges associated with linking CAD systems across jurisdictional boundaries. These lessons will inform the tools and resources that public safety agencies can use to build interoperable CAD systems and communicate across local, state, and federal boundaries.
Commerce and Industries
Information technology and computers
Desktop interoperability
Desktop interoperability (also known as interop) is a subset of software interoperability. In the early days, the focus of interop was to integrate web applications with other web applications. Over time, open-system 'containers' were developed to create a virtual desktop environment in which these applications could be registered and then communicate with each other using simple publish/subscribe (pub/sub) patterns. Rudimentary UI capabilities were also supported, allowing windows to be grouped with other windows. Today, desktop interoperability has evolved into full-service interop platforms that include container support, basic exchange between web applications, native support for other application types, and advanced window management. The latest interop platforms also include application services such as universal search, notifications, user permissions and preferences, third-party application connectors, and language adapters for in-house applications.
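A minimal sketch of the pub/sub pattern described above follows; the message bus, topic name, and participating applications are hypothetical rather than the API of any particular interop platform.

```java
// A minimal, hypothetical in-process message bus illustrating the pub/sub
// pattern used by desktop interop containers: applications subscribe to named
// topics and are notified when another application publishes to them.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

public class DesktopInteropBus {

    private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

    // An application registers interest in a topic.
    public void subscribe(String topic, Consumer<String> handler) {
        subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
    }

    // Another application publishes to the topic; every subscriber is notified.
    public void publish(String topic, String payload) {
        subscribers.getOrDefault(topic, List.of()).forEach(h -> h.accept(payload));
    }

    public static void main(String[] args) {
        DesktopInteropBus bus = new DesktopInteropBus();
        // A hypothetical charting application listens for instrument selections.
        bus.subscribe("instrument-selected",
                symbol -> System.out.println("Chart now showing " + symbol));
        // A hypothetical blotter application broadcasts the user's selection.
        bus.publish("instrument-selected", "ACME");
    }
}
```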
Information search
Search interoperability refers to the ability of two or more information collections to be searched by a single query.
Specifically related to web-based search, the challenge of interoperability stems from the fact that designers of web resources typically have little or no need to concern themselves with exchanging information with other web resources. Federated search technology, which does not place format requirements on the data owner, has emerged as one solution to search interoperability challenges. In addition, standards such as OAI-PMH, RDF, and SPARQL have emerged that also help address the issue of search interoperability related to web resources. Such standards also address broader topics of interoperability, such as allowing data mining.
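As a rough illustration, the sketch below issues a SPARQL query over HTTP; because the query language and protocol are standardized, the same client code could in principle be pointed at any conforming endpoint regardless of who built the underlying collection. The endpoint URL is a placeholder, not a real service.

```java
// A minimal sketch of search interoperability through the SPARQL standard.
// The endpoint URL is a placeholder; any standards-conforming endpoint would
// accept the same request.
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class SparqlSearchDemo {
    public static void main(String[] args) throws Exception {
        // A standard SPARQL query for up to five titles, using the Dublin Core vocabulary.
        String query = "SELECT ?title WHERE { ?doc <http://purl.org/dc/terms/title> ?title } LIMIT 5";
        String url = "https://example.org/sparql?query="
                + URLEncoder.encode(query, StandardCharsets.UTF_8);

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Accept", "application/sparql-results+json")
                .GET()
                .build();

        // The same request works against any endpoint implementing the SPARQL protocol.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```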
Software
With respect to software, the term interoperability is used to describe the capability of different programs to exchange data via a common set of exchange formats, to read and write the same file formats, and to use the same protocols. (The ability to execute the same binary code on different processor platforms is 'not' contemplated by the definition of interoperability.) The lack of interoperability can be a consequence of a lack of attention to standardization during the design of a program. Indeed, interoperability is not taken for granted in the non-standards-based portion of the computing world.[17]
According to ISO/IEC 2382-01, Information Technology Vocabulary, Fundamental Terms, interoperability is defined as follows: "The capability to communicate, execute programs, or transfer data among various functional units in a manner that requires the user to have little or no knowledge of the unique characteristics of those units".[18]
Note that the definition is somewhat ambiguous, because the user of a program can be another program and, if the latter is a portion of the set of programs that is required to be interoperable, it might well need to have knowledge of the characteristics of other units.
This definition focuses on the technical side of interoperability, while it has also been pointed out that interoperability is often more of an organizational issue: often interoperability has a significant impact on the organizations concerned, raising issues of ownership (do people want to share their data? or are they dealing with information silos?), labor relations (are people prepared to undergo training?) and usability. In this context, a more apt definition is captured in the term business process interoperability.
Interoperability can have important economic consequences; for example, research has estimated the cost of inadequate interoperability in the U.S. capital facilities industry to be $15.8 billion a year.[19] If competitors' products are not interoperable (due to causes such as patents, trade secrets or coordination failures), the result may well be monopoly or market failure. For this reason, it may be prudent for user communities or governments to take steps to encourage interoperability in various situations. At least 30 international bodies and countries have implemented eGovernment-based interoperability framework initiatives called e-GIF, while in the United States there is the NIEM initiative.[20] Standards-defining organizations (SDOs) provide open public software specifications to facilitate interoperability; examples include the OASIS Open organization and buildingSMART (formerly the International Alliance for Interoperability). Among user communities, Neutral Third Party is creating standards for business process interoperability. Another example of a neutral party is the RFC documents from the Internet Engineering Task Force (IETF).
The OSLC[21] (Open Services for Lifecycle Collaboration) community is working on a common standard so that software tools can share and exchange data such as bugs, tasks, and requirements. The final goal is to agree on an open standard for interoperability of open source ALM tools.[22]
Java is a notable example of an interoperable programming language: programs can be written once and run anywhere there is a Java Virtual Machine.[23] A program written in Java, so long as it does not use system-specific functionality, maintains interoperability with all machines that have a Java Virtual Machine. There are many implementations of the Java Virtual Machine, such as those from Oracle, IBM, and the Android platform. If a Java Virtual Machine is built to specification, applications remain compatible because, while the implementations differ, the underlying language interfaces are the same.[24]
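As a minimal illustration of this portability, the program below uses no system-specific functionality, so the same compiled class file runs unchanged on any conforming Java Virtual Machine.

```java
// A minimal sketch of "write once, run anywhere": the bytecode produced by
// compiling this class targets the JVM specification rather than a particular
// operating system or CPU.
public class PortableHello {
    public static void main(String[] args) {
        // The class file is identical everywhere; only the reported platform differs.
        System.out.println("Hello from " + System.getProperty("os.name")
                + " / " + System.getProperty("java.vm.name"));
    }
}
```

Compiled once with javac, the resulting class file can be run with the java launcher on any platform that provides a conforming JVM.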
Achieving software interoperability
Software interoperability is achieved through five interrelated approaches:
- Product testing
- Products produced to a common standard, or to a sub-profile thereof, depend on the clarity of the standard, but there may be discrepancies in their implementations that system or unit testing may not uncover. This requires that systems be formally tested in a production scenario – as they will be finally implemented – to ensure they actually intercommunicate as advertised, i.e. that they are interoperable. Interoperable product testing differs from conformance-based product testing, as conformance to a standard does not necessarily engender interoperability with another product that is also tested for conformance (see the sketch following this list).
- Product engineering
- Implements the common standard, or a sub-profile thereof, as defined by the industry/community partnerships with the specific intention of achieving interoperability with other software implementations also following the same standard or sub-profile thereof.
- Industry/community partnership
- Industry/community partnerships, either domestic or international, sponsor standards workgroups with the purpose of defining a common standard that may be used to allow software systems to intercommunicate for a defined purpose. At times an industry/community will sub-profile an existing standard produced by another organization to reduce options, thus making interoperability more achievable for implementations.
- Common technology and IP
- The use of a common technology or IP may speed up and reduce the complexity of interoperability by reducing variability between components from different sets of separately developed software products, thus allowing them to intercommunicate more readily. This technique has some of the same technical results as using a common vendor product to produce interoperability. The common technology can come through third-party libraries or open-source developments.
- Standard implementation
- Software interoperability requires a common agreement that is normally arrived at via an industrial, national or international standard.
Each of these has an important role in reducing variability in intercommunication software and enhancing a common understanding of the end goal to be achieved.
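As a rough illustration of interoperability testing (as distinct from conformance testing, noted under "Product testing" above), the sketch below exercises one hypothetical vendor's writer against another hypothetical vendor's reader of the same simple shared format and asserts a round trip; both implementations are invented for the example.

```java
// A minimal sketch of a cross-implementation interoperability test. The shared
// "key=value" format and both vendor implementations are hypothetical.
public class InteropRoundTripTest {

    // Vendor A's (assumed) implementation of the shared format's writer.
    static String vendorAWrite(String key, String value) {
        return key + "=" + value;
    }

    // Vendor B's (assumed) independently developed reader for the same format.
    static String vendorBReadValue(String message) {
        return message.substring(message.indexOf('=') + 1);
    }

    public static void main(String[] args) {
        String wire = vendorAWrite("status", "ready");
        String received = vendorBReadValue(wire);
        // Each product might pass a conformance suite on its own; only a test
        // across implementations shows that they actually interoperate.
        if (!"ready".equals(received)) {
            throw new AssertionError("Implementations are not interoperable: " + received);
        }
        System.out.println("Round trip succeeded: " + received);
    }
}
```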
Market dominance and power
Interoperability tends to be regarded as an issue for experts, and its implications for daily living are sometimes underrated. The European Union Microsoft competition case shows how interoperability concerns important questions of power relationships. In 2004, the European Commission found that Microsoft had abused its market power by deliberately restricting interoperability between Windows work group servers and non-Microsoft work group servers. By doing so, Microsoft was able to protect its dominant market position for work group server operating systems, the heart of corporate IT networks. Microsoft was ordered to disclose complete and accurate interface documentation, which would enable rival vendors to compete on an equal footing (“the interoperability remedy”). As of June 2005, the Commission was market-testing a new proposal by Microsoft to do this, having rejected previous proposals as insufficient.
Interoperability has also surfaced in the software patent debate in the European Parliament (June–July 2005). Critics claim that because patents on techniques required for interoperability are kept under RAND (reasonable and non-discriminatory) licensing conditions, customers will have to pay license fees twice: once for the product and, in the appropriate case, once for the patent-protected program the product uses.
Medical industry
New technology is being introduced in hospitals and labs at an ever-increasing rate. The need for “plug-and-play” interoperability – the ability to take a medical device out of its box and easily make it work with one's other devices – has attracted great attention from both healthcare providers and industry.
Increasingly, medical devices such as incubators and imaging systems (MRI, CT, ultrasound, and others) are driven by sophisticated software that must integrate at the point of care and with electronic systems such as electronic medical records. At the 2016 Regulatory Affairs Professionals Society (RAPS) meeting, experts in the field such as Angela N. Johnson of GE Healthcare and representatives of the United States Food and Drug Administration provided practical seminars on how companies developing new medical devices, and hospitals installing them, can work more effectively to align interoperable software systems.[25]
Railways
Railways have greater or lesser interoperability depending on conforming to standards of gauge, couplings, brakes, signalling, communications, loading gauge, structure gauge, and operating rules, to mention a few parameters. For passenger rail service, different railway platform height and width clearance standards may also cause interoperability problems.
North American freight and intercity passenger railroads are highly interoperable, but systems in Europe, Asia, Africa, Central and South America, and Australia are much less so. The parameter most difficult to overcome (at reasonable cost) is incompatibility of gauge, though variable gauge axle systems are increasingly used.
Telecommunications
In telecommunication, the term can be defined as:
- The ability to provide services to and accept services from other systems, and to use the services exchanged to enable them to operate effectively together. ITU-T provides standards for international telecommunications.
- The condition achieved among communications-electronics systems or items of communications-electronics equipment when information or services can be exchanged directly and satisfactorily between them and/or their users. The degree of interoperability should be defined when referring to specific cases.[26][27]
In two-way radio, interoperability is composed of three dimensions:
- compatible communications paths (compatible frequencies, equipment and signaling);
- radio system coverage or adequate signal strength; and
- scalable capacity.
Organizations dedicated to interoperability
Many organizations are dedicated to interoperability. All have in common that they want to push the development of the World Wide Web towards the semantic web. Some concentrate on eGovernment, eBusiness or data exchange in general. Internationally, the Network Centric Operations Industry Consortium facilitates global interoperability across borders, language and technical barriers. In Europe, for instance, the European Commission and its IDABC program issue the European Interoperability Framework. IDABC was succeeded by the ISA program. They also initiated the Semantic Interoperability Centre Europe (SEMIC.EU). A European Land Information Service (EULIS) was established in 2006 as a consortium of European national land registers. The aim of the service is to establish a single portal through which customers are provided with access to information about individual properties, about land and property registration services, and about the associated legal environment.[28] In the United States, the government's CORE.gov service provides a collaboration environment for component development, sharing, registration, and reuse; related to this is the National Information Exchange Model (NIEM) work and component repository. The National Institute of Standards and Technology serves as an agency for measurement standards.
See also
- Computer and information technology
- Architecture of Interoperable Information Systems
- List of computer standards
- Model Driven Interoperability, framework
- Semantic Web, standard for making internet data machine readable
- Business
- Business interoperability interface, between organisation systems and processes
- Enterprise interoperability, ability to link activities in efficient and competitive way
- Other
- Collaboration, general concept
- Polytely, problem solving
- Universal Data Element Framework, information indexing
References
- "Definition of Interoperability". dedicated website for a Definition of Interoperability at interoperability-definition.info. Copyright AFUL under CC BY-SA.CS1 maint: others (link)
- Institute of Electrical and Electronics Engineers. IEEE Standard Computer Dictionary: A Compilation of IEEE Standard Computer Glossaries. New York, NY: 1990.
- Slater, T. "What is Interoperability?", Network Centric Operations Industry Consortium - NCOIC, 2012
- Arms, William Y. (2000).
- Slater, T. "Cross-Domain Interoperability", Network Centric Operations Industry Consortium - NCOIC, 2013
- Vercruysse, Kim; Dawson, David A.; Wright, Nigel (2019). "Interoperability: A conceptual framework to bridge the gap between multifunctional and multisystem urban flood management". Journal of Flood Risk Management. 0: e12535. doi:10.1111/jfr3.12535. ISSN 1753-318X.
- "Urban Flood Resilience". www.urbanfloodresilience.ac.uk. Retrieved 2019-05-15.
- NATO Glossary of Terms and Definitions, NATO AAP-06
- Hura, Myron; McLeod, Gary; Schneider, James; et al. (2000). Interoperability: A Continuing Challenge in Coalition Air Operations, Chapter 2, "A Broad Definition of Interoperability". RAND Monograph Report.
- Allen, D. K., Karanasios, S., & Norman, A. (2013). Information sharing and interoperability: the case of major incident management. European Journal of Information Systems, 10.1057/ejis.2013.8.
- Baldini, G. (2010). Report of the workshop on “Interoperable communications for Safety and Security”. Ispra: European Commission, Joint Research Centre (JRC), Institute for the Protection and Security of the Citizen.
- "Interoperability system bridges communications gap". FireRescue1. Retrieved 2017-01-25.
- Grier, Robin. "Interoperability Solutions". Interoperability. Catalyst Communications. Retrieved 28 May 2011.
- "Governor Jay Inslee - Washington State". Retrieved 12 August 2016.
- "SIEC". Retrieved 12 August 2016.
- "SAFECOM - Homeland Security". Archived from the original on 2014-12-21. Retrieved 12 August 2016.
- Gordon and Hernandez (2016-05-16). The Official Guide to the SSCP Book. SYBEX. ISBN 978-1119278634.
- SC36 Secretariat (2003-11-13). "Proposed Draft Technical Report for: ISO/IEC xxxxx, Information technology -- Learning, education, and training -- Management and delivery -- Specification and use of extensions and profiles" (PDF). ISO/IEC JTC1 SC36. Archived from the original (PDF) on 2007-11-29. Retrieved 12 August 2016.
- MP Gallaher; AC O’Connor; JL Dettbarn, Jr.; LT Gilday (August 2004). Cost Analysis of Inadequate Interoperability in the U.S. Capital Facilities Industry (PDF) (Report). National Institute of Standards and Technology. p. iv. Archived from the original (PDF) on 2016-02-04. Retrieved 2012-04-19.
- "e-Government Interoperability A comparative analysis of 30 countries" (PDF). CS Transform. 2010. Retrieved 21 January 2016.
- "Open Services for Lifecycle Collaboration". Retrieved 12 August 2016.
- "OSLC (Open Services for Lifecycle Collaboration): open standard for i…". 30 November 2011. Retrieved 12 August 2016.
- Write once, run anywhere
- 9. Java and JVM Interoperability [Book].
- "RAPS Preview: FDA CDRH Director Shuren Talks Priorities". September 19, 2016. Retrieved April 8, 2017.
- This article incorporates public domain material from the General Services Administration document: "Federal Standard 1037C" (in support of MIL-STD-188).
- This article incorporates public domain material from the United States Department of Defense document: "Dictionary of Military and Associated Terms".
- Design, Erskine. "Welcome - EULIS". Archived from the original on 17 September 2016. Retrieved 12 August 2016.
External links
- "When and How Interoperability Drives Innovation," by Urs Gasser and John Palfrey
- CIMIT - Center for Integration of Medicine and Innovative Technology - the MD PnP Program on Medical Device Interoperability
- GIC - The Greek Interoperability Centre: A Research Infrastructure for Interoperability in eGovernment and eBusiness, in SE Europe and the Mediterranean
- Simulation Interoperability Standards Organization (SISO)
- Catalyst Communications
- Interoperability: What is it and why should I want it? Ariadne 24 (2000)
- Interoperability Constitution - DOE's GridWise Architecture Council
- Interoperability Context-Setting Framework - DOE's GridWise Architecture Council
- Decision Maker's Interoperability Checklist - DOE's GridWise Architecture Council
- OA Journal on Interoperability in Business Information Systems
- University of New Hampshire Interoperability Laboratory - premier research facility on interoperability of computer networking technologies
- Interoperability vs. intraoperability: your open choice on Bob Sutor blog, 6 December 2006
- La France v. Apple: who’s the dadvsi in DRMs?, Nicolas Jondet (University of Edinburgh), SCRIPT-ed, December 2006
- ECIS European Committee for Interoperable Systems
- Gradmann, Stefan. INTEROPERABILITY. A key concept for large scale, persistent digital libraries.
- DL.org Digital Library Interoperability, Best Practices and Modelling Foundations