From the same people that brought you the Open Archives Initiative’s Protocol for Metadata Harvesting (OAI-PMH) now comes Object Reuse and Exchange (ORE). It’s a Mellon-funded project concerned with the exchange of digital objects between repositories, with a particular focus on the services that allow access to and ingest of those objects.
The project is intended to produce an “interoperability fabric” that will facilitate interoperability between research-oriented digital repositories. Since the project is only just starting, the scope is not entirely clear, but it would at least encompass a data model for the objects to be exchanged, a means of binding those objects (i.e. sticking them in an agreed file format) and some interfaces for moving them about.
The obvious question in a case like this is: why do another set of specs?
Why design another such fabric when there are already plenty of digital object aggregation formats, such as the Metadata Encoding and Transmission Standard (METS), and no shortage of digital repository query protocols and other interfaces? Indeed, the OAI’s own PMH is a widely used means of providing basic interoperability between repository catalogs, and the authors themselves cite the MPEG-21 Digital Item Declaration standard as a good example of a digital object data model (Interoperability-finalreport.pdf).
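To make the contrast concrete, the kind of interoperability OAI-PMH provides amounts to little more than parameterised HTTP GETs and XML parsing. A minimal sketch in Python — the repository base URL is made up, and a canned response stands in for a live repository:

```python
# Build an OAI-PMH ListRecords request and pull record identifiers out of
# the XML response. The base URL is hypothetical; the XML sample is a
# stripped-down stand-in for a real repository response.
from urllib.parse import urlencode
from xml.etree import ElementTree

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"

def list_records_url(base_url, metadata_prefix="oai_dc"):
    """Construct a ListRecords request URL (simple Dublin Core metadata)."""
    return base_url + "?" + urlencode(
        {"verb": "ListRecords", "metadataPrefix": metadata_prefix})

def record_identifiers(response_xml):
    """Extract the <identifier> of each harvested record."""
    root = ElementTree.fromstring(response_xml)
    return [el.text for el in root.iter(OAI_NS + "identifier")]

sample = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record><header>
      <identifier>oai:example.org:item-1</identifier>
    </header></record>
  </ListRecords>
</OAI-PMH>"""

print(list_records_url("http://repo.example.org/oai"))
# -> http://repo.example.org/oai?verb=ListRecords&metadataPrefix=oai_dc
print(record_identifiers(sample))
# -> ['oai:example.org:item-1']
```

Harvesting at this level moves only catalog metadata, not the digital objects themselves — which is precisely the gap ORE is aimed at.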
The short answer: unmet use cases in the specific domain of inter-repository interoperability.
There are between-repository operations one would want beyond the harvesting of metadata; obtaining and depositing digital objects spring to mind, for example. To be sure, there are protocols for those as well, but possibly none tailored specifically to the inter-repository setting, with its formal workflows and multi-layered understanding of sets of related artefacts. Still, it’d be nice to see those use cases spelled out, along with an indication of the extent to which existing specs support them.
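Purely by way of illustration, here is roughly what “obtain” and “put” of a packaged digital object might look like over plain HTTP. Everything here is invented for the sketch — the endpoint layout, the object identifiers, and the choice of a zip package as the binding — since ORE had not specified any such interface at the time of writing:

```python
# Hypothetical request-building helpers for moving packaged digital
# objects between repositories. Returns (method, url, headers) tuples
# rather than issuing live requests, so the shape is easy to inspect.
def get_object_request(repo_base, object_id):
    """Fetch a packaged digital object from a repository (sketch)."""
    return ("GET", f"{repo_base}/objects/{object_id}",
            {"Accept": "application/zip"})

def put_object_request(repo_base, object_id, package_bytes):
    """Deposit a packaged digital object into a repository (sketch)."""
    return ("PUT", f"{repo_base}/objects/{object_id}",
            {"Content-Type": "application/zip",
             "Content-Length": str(len(package_bytes))})

method, url, headers = get_object_request("http://repo.example.org", "item-1")
print(method, url)
# -> GET http://repo.example.org/objects/item-1
```

What such an interface would actually have to agree on — identifiers, package format, and how related artefacts hang together — is exactly the data model and binding work in ORE’s proposed scope.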
Much the same goes for the digital object data model, though existing specs in this field could well be more directly relevant to the domain. The documentation (Interoperability-finalreport.pdf, MellonProposalwithoutbudget.pdf) suggests that the ORE work is likely to build directly on a data model developed under the NSF Pathways project, which at least looks fairly simple and appears to have some implementation experience behind it.
More broadly, the fact that people propose work on a new spec in a well-known area could suggest that the area simply has not reached full maturity yet. Only when the last counter-proposal to a dominant specification has failed to gain any traction can anyone be sure that the technology it codifies is truly well understood.