eXchanging course related information – XCRI timeline

One of the advantages of having been involved with JISC for a number of years (as a project and a service) is the opportunity to reflect on activities that we’ve been engaged in for some time. We thought it would be interesting to take the long view of some of our involvement with OER, XCRI and Learning Environments and reflect on what has worked and why, and where we think these activities are going next.

In this second story Lou McGill traces the history of the eXchanging Course Related Information (XCRI) specification, which is currently becoming a European and British standard (see Adam Cooper’s recent post). XCRI is a fine example of how the right people working together can develop interoperability standards that are truly fit for purpose.

A CETIS perspective of the XCRI story

Scott Wilson, Assistant Director at CETIS, describes XCRI as ‘a community that formed to develop specifications related to course information’ (Wilson, 2010 [1]). This captures the central aspect of the XCRI story: the community that came together, with the support of CETIS and JISC funding, to address a significant problem for institutions in managing their course information. Universities and colleges produce and use course information for several different purposes, and duplication is a costly problem. In the early days the XCRI community drove forward the development of a shared vocabulary and domain map, which ultimately led to an internationally recognised specification. Their focus was on developing specifications that make the generic aspects of a course description publicly available so that they can be transferred easily between information systems. The formal outcome of this work is XCRI-CAP (Course Advertising Profile) [2]. Lisa Corley from CETIS has written a blog post [3] which charts the development of XCRI, recognises the work of these early pioneers and provides a very useful description of what it is and what benefits it offers to institutions. There is also extensive information available in the very well maintained XCRI Knowledge Base [4]. This community is still very active and now has fresh impetus from recent national HEFCE initiatives that require improved data exchange and transparency of institutional systems [5], [6].

The XCRI and JISC CETIS timeline [7] has been developed to highlight the various activities that make up the XCRI landscape, and includes JISC and CETIS activities since 2003. It also highlights some wider national and international initiatives which illustrate trends and changes over the last decade in this part of the Enterprise domain.


How we got here

The XCRI community emerged from the Enterprise SIG [8], a CETIS Special Interest Group established in 2003 that focussed on a range of standards, technologies and activities to facilitate the business processes of educational institutions. The Enterprise SIG was, essentially, a community of practice for people interested in, or involved with:

• the IMS Enterprise Specification and Enterprise Services Specification

• exchanging data about students and courses between educational systems (VLEs, Student Records, etc)

• joining up college systems to create Managed Learning Environments

• e-learning frameworks, architecture and web services

At the 2004 CETIS Conference the SIG identified a need both for a cohesive approach to standardising course descriptions and for an agreed vocabulary, and in March 2005 the Enterprise SIG XCRI sub-group was formed. This group, led by Professor Mark Stubbs and Alan Paull with the support of Scott Wilson from CETIS, became the XCRI community and has driven developments in this area in the UK ever since. In 2005 JISC funded XCRI as one of its Reference Model projects [9] to define a vocabulary and appropriate (XML) technology bindings for describing course-related information. During this period XCRI produced an R1.0 schema and a repository demonstrator, and surveyed 161 prospectus websites. This work happened alongside another JISC Reference Model project, COVARM (Course Validation Reference Model) [10]. Both projects substantially mapped their respective domains, and their outputs fed into the eLearning Framework [11].

XCRI originally intended to ‘bridge the worlds of course marketing and course validation/quality assurance’ but, as Mark Stubbs describes [12], this became unwieldy:

“producing a course definition for validation/modification involves assembling fragments of information into a whole, whereas marketing a course involves communicating a serialized version of a subset of that assembly”

Testing of the R1.0 schema in different contexts, and feedback from the community, led to a focus on a limited set of elements supporting course advertising, and by 2006 XCRI-CAP was released as an XML specification. This was a very pragmatic outcome: it responded to the community’s needs and gave JISC a usable schema to take forward with the wider HE and FE communities.
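To give a flavour of what such a record involves, here is a minimal sketch, in Python, of building a course advertisement along XCRI-CAP lines. The element and namespace names are illustrative assumptions based on the public XCRI documentation, not a definitive rendering; check them against the current schema before relying on them.

    # Minimal sketch of an XCRI-CAP-style course advertisement.
    # Element and namespace names are illustrative assumptions; verify
    # against the published XCRI-CAP schema before use.
    import xml.etree.ElementTree as ET

    XCRI_NS = "http://xcri.org/profiles/catalog"  # assumed namespace URI
    ET.register_namespace("", XCRI_NS)

    def course_record(provider_name, course_title, course_url):
        """Build a minimal catalog/provider/course tree."""
        def el(parent, tag, text=None):
            node = ET.SubElement(parent, f"{{{XCRI_NS}}}{tag}")
            node.text = text
            return node
        catalog = ET.Element(f"{{{XCRI_NS}}}catalog")
        provider = el(catalog, "provider")
        el(provider, "title", provider_name)
        course = el(provider, "course")
        el(course, "title", course_title)
        el(course, "url", course_url)
        return catalog

    print(ET.tostring(course_record("Example University",
                                    "BSc (Hons) Computing",
                                    "http://example.ac.uk/courses/computing"),
                      encoding="unicode"))

The point of the profile is exactly this kind of simplicity: a small, generic set of elements that any prospectus or aggregator can consume.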

In 2007 JISC funded a range of projects [13] around course management as part of the eLearning Capital programme, to build on the work of the XCRI and COVARM reference model projects. Within the course description and advertising strand, a number of institutions specifically aimed to trial and refine XCRI-CAP. The resulting case studies from these projects [14] offer really valuable insight into the challenges and approaches that different types of institution might encounter when implementing the specification, and offer perspectives from a range of stakeholders such as policy makers, managers, administrators and technical staff.

JISC also funded a support project made up of members from CETIS, Manchester Metropolitan University and Kainao Ltd who had been involved in the original XCRI sub-group. This project had a remit to further develop the specification, provide technical support to projects implementing XCRI-CAP, provide a prototype online aggregation service, and promote the specification towards submission to an appropriate open standards process. The support project effectively continued the work of the XCRI reference project and proved so effective that its funding was extended until March 2011. Mark Power from CETIS describes [15] the various activities and achievements of this team, and notes that this work, together with that of the JISC funded projects, demonstrated the value of XCRI-CAP so successfully that it was placed on the strategic agenda of national agencies.

In 2008 the XCRI Support Project team engaged with other European initiatives in course information through the European Committee for Standardization (CEN) Workshop on Learning Technologies. Following this, CEN endorsed a Workshop Agreement for Metadata for Learning Opportunities (MLO), which defines a model for expressing information about learning opportunities, including course information. MLO incorporates XCRI-CAP from the UK as well as other European specifications, and attempts to unify these by offering a common subset whilst still enabling local extensions and implementation architectures. It was ratified as a European Norm (EN 15982) in 2009 and published in 2011.

Scott Wilson wrote in 2010 [16]:

The formal standard defines the majority of the core concepts and information model used by XCRI. The engagement of XCRI in CEN standards development has provided an opportunity and an impetus for the XCRI community to progress to formal standardization. The current roadmap of XCRI is to develop a British Standard for its course syndication format as a conforming binding and application profile of CEN Metadata For Learning Opportunities: Advertising.

In 2009 XCRI-CAP 1.1 [17] was approved by the Information Standards Board for Education, Skills and Children’s Services (ISB) as the UK eProspectus standard, and on 1st March 2012 BS 8581 XCRI-CAP was released for public comment; this would create a British Standard consistent with the European MLO-Advertising standard (EN 15982).

So far XCRI-CAP has enabled several institutions to transform practice around producing course information, especially for prospectuses, with reports of huge reductions in data duplication [18]. In 2009 the ISB estimated that a new standard for course information (XCRI-CAP) could save the sector in the region of £6 million per annum by removing the need to re-enter data into course handbooks and websites.

Where we are now…

Scott Wilson from CETIS wrote a blog post in June 2011 entitled XCRI – the end of the beginning [19]. In it Scott notes a shift in the timeline of XCRI, taking us from a period of designing and beta-testing the specification into one of ‘adoption, use and adaption’. This is a significant achievement for those involved in mapping and defining the terrain and testing the specification across institutional systems. The community now has working exemplars which not only deliver proof of economies of scale, through reduced duplication of data, but also articulate the value of rethinking the range of business processes affected by course information. It has long been recognised that the barriers to institutions adopting XCRI-CAP lie not in the technical complexities of the schema but in the challenges of managing the data and processes that support course information, many of which involve several incompatible systems.

A range of current national drivers require educational institutions to consider how they manage and surface some of their information sets, and institutions that have engaged with XCRI-CAP are likely to find it easier to respond to some of these requirements. In early 2011 a report to HEFCE from the Online Learning Task Force [20] highlighted the challenges students face due to insufficient, and hard to find, information about courses not dealt with by UCAS. As part of the Government transparency agenda, educational institutions are required to provide KIS (Key Information Sets) [21] for the majority of undergraduate courses from September 2012 and to feed into the HEAR (Higher Education Achievement Report) [22] recording student achievement. Each of these initiatives provides significant impetus for institutions to consider how their course information is managed and how it links to other processes, and implementing XCRI-CAP can be a valuable way to do this [23]. Towards the end of 2011 JISC launched a new programme called Course Data: making the most of Course Information [24]. The programme has two stages: the first, which ran from September to November 2011, gave 93 institutions £10k to prepare an implementation plan to improve their course data flows and produce feeds for external agencies; 63 institutions have been selected for stage 2 to implement these plans, which began in January 2012 and will end in 2013. Outcomes and outputs of this programme are being synthesised on the XCRI Knowledge Base [25].

Where we are going…

Whilst the 2011 JISC programme will result in larger numbers of courses being advertised in XCRI-CAP format, Scott argues that we also need to see it taken up by major course aggregation and brokerage services. This was one of the themes discussed at the XCRI eXchange [26] in 2011, a national showcase for XCRI organised by the SAMSON [27] and MUSKET [28] projects, funded under the JISC Institutional Innovation programme. Scott concludes his blog post by suggesting that establishing an alliance could be the key to encouraging high-profile ownership and promotion of XCRI.

I think it would have to have the major aggregators on board (UCAS, Hotcourses), plus curriculum management solution providers (Unit4, Akari) and larger learning providers (e.g. the Open University, University of Manchester, MMU, Nottingham) as well as some of the smaller tools and services companies that are already working with XCRI (APS, IGSL, Smartways). It would develop the brand identity under which XCRI-CAP adoption would be recognised (not necessarily retaining anything of the original brand) and promote it beyond the reach of funded programmes into large-scale use.

Is this the future for XCRI?

I believe an Alliance like this would be a fitting development in the story of XCRI – a community driven specification having ongoing support and recognition from key stakeholders. It would be a fitting testament to the XCRI community and their achievements over the last decade.


About Lou

Lou McGill is currently working independently and has recently been involved in synthesis and evaluation activities for the HE Academy/JISC UKOER programme and the JISC Transforming Curriculum Delivery Through Technology programme. She led the team that produced the Good Intentions report on business cases for sharing and worked on the LLiDA study (Learning Literacies in a Digital Age). She has experience of working in a range of HE institutions as a librarian, learning technologist and project manager, and used to be a JISC Programme Manager on the eLearning team. In the distant past she worked for NIACE (the adult learning organisation) and in health education for the NHS. Her interests and experience include digital literacy, information literacy, open education, distance learning, managing organisational change, and effective use of technologies to support learning. Further information on Lou’s work can be found at: http://loumcgill.co.uk

Curriculum Delivery: Dynamic Learning Maps

In this second post Lou McGill focusses on the DLM project which produced a dynamic map of the entire Medical Curriculum at Newcastle University.


The Dynamic Learning Maps (DLM) project at Newcastle University was funded by JISC as part of the Transforming Curriculum Delivery Through Technology programme. This programme saw a very diverse set of projects use technology to address a number of institutional challenges and ultimately transform their approaches to enhancing the student experience.

The DLM project aimed to make visible the complex medical ‘spiral curriculum’, in which topics are revisited with increasing depth over a five-year programme, and to reveal connections across curricula to support modular learning. High-level goals included a desire to promote lifelong learning, enhance employability and enable personalisation for students. The diverse nature of stakeholders, each with unique views of the curricula, increased the complexity of the project and led to some interesting paths and choices as the team developed the maps. Simply getting agreement on what constitutes a curriculum map proved challenging and required a highly flexible approach and agile development. Like many technical development projects, DLM had to balance the need to develop a common understanding with the need to reflect different stakeholder requirements. Agreeing on what elements to include and the level of detail was important, as were the fundamental issues around how the maps might be used by both staff and students.

The project stands out in this programme for a few reasons, not least that it has much in common with the ongoing Institutional Approaches to Curriculum Design programme, which has several projects engaged in mapping the curriculum. DLM developed an interesting mix of formal curriculum maps, personal learning records and community-driven maps, which have proved as useful for curriculum teams as for the medical, psychology and speech and language students they support. Another distinguishing feature was the project’s more technical focus: engaging with neural networks, a range of institutional systems, data feeds and mashups, e-portfolios, and a range of web 2.0 approaches.

A curriculum map from the DLM project

Curriculum maps
Curriculum maps are diagrammatic representations of the curriculum and can be seen as operational tools for staff to support course management and administration, or as aids to curriculum design and delivery. They offer a window onto the curriculum and can include learning outcomes, content, assessment and information about people. They can be complex and are usually very labour-intensive to develop and embed in institutional processes. Making curriculum maps open to students gives them navigational aids through their courses, providing clarity around different modules, identifying connections between them and supporting module choices. Adding the capacity for students to personalise their pathways through the maps, link to individual portfolio records and add their own resources can turn curriculum maps into powerful learning aids. This is what makes the DLM project stand out for me.

It is challenging enough to visually represent or describe a curriculum, but using established standards and technologies to link data in a meaningful way takes things to another level, as connections can be made with a range of institutional systems.

Institutional Systems linked in the DLM project

DLM pulls in data from a variety of institutional systems including repositories, library and curriculum databases, and student information systems, as well as having a two-way transfer with e-portfolio systems. This means individuals can navigate the formal curriculum map and see links between elements such as learning outcomes, timetables, etc. They can also add to the map in the form of personal reflections or notes and external resources, which can be rated and discussed. It is important to note that the project highlighted that existing curriculum data will not necessarily plug in easily to other systems. The team identified a range of issues that had to be addressed, including governance, QA processes, and data migration and translation. This is one example described in the final report:
A separate tool was developed to manage programme-level learning outcomes and link these to units/modules, this provides a feed used in Learning Maps. This was necessary because existing data from an MBBS study guides database was in non-standardised text; this tool enables curriculum managers to map specific programme-level outcomes, but to still use context-specific language of the module.

Web 2.0 mash-ups
Technically DLM is very interesting: the team built on previous work around personalisation and Web 2.0 approaches. The DLM concept was inspired by neural networks, in which any node can be connected to any other node in the network and connections vary in strength. The team used open standards and existing corporate and programme data to support sustainability and transferability, and to reduce dependencies on specific systems.
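To make the neural-network metaphor concrete, here is a minimal sketch, in Python with invented names, of the kind of structure described: nodes of different kinds joined by links of varying strength. It illustrates the idea only and is not the DLM codebase.

    # Illustrative sketch of a DLM-style map: nodes (topics, outcomes,
    # resources) joined by weighted, symmetric links, echoing the
    # neural-network inspiration. All names are invented.
    from collections import defaultdict

    class LearningMap:
        def __init__(self):
            self.nodes = {}                 # node_id -> metadata
            self.links = defaultdict(dict)  # node_id -> {node_id: strength}

        def add_node(self, node_id, kind, title):
            self.nodes[node_id] = {"kind": kind, "title": title}

        def connect(self, a, b, strength=1.0):
            # Any node may link to any other; the strength of the
            # connection varies, as in a neural network.
            self.links[a][b] = strength
            self.links[b][a] = strength

        def neighbours(self, node_id, min_strength=0.0):
            return [(n, s) for n, s in self.links[node_id].items()
                    if s >= min_strength]

    m = LearningMap()
    m.add_node("cardio", "topic", "Cardiovascular system")
    m.add_node("lo-12", "outcome", "Interpret a 12-lead ECG")
    m.connect("cardio", "lo-12", strength=0.8)
    print(m.neighbours("cardio"))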

The project team took good advantage of work being done within the institution as part of other initiatives. The e-portfolio element of DLM drew on work done with the Leap2A specification for e-portfolio portability and interoperability, as Newcastle University had been involved in developing the specification and in the JISC EPICS-2 project (e-Portfolios and PDP in the North East). Taking a web service approach, when learners add reflections and notes to curriculum topics these generate an XML file which is stored remotely in their individual portfolio using Leap2A and becomes part of their portable portfolio record. This approach means that even if departments have different e-portfolio systems, the standardised data can be stored in any of them. For more information on Leap2A see the recent article by Christina Smart from JISC CETIS.
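As a rough illustration of that web service approach, the sketch below serialises a learner’s note as an Atom entry in Python. Leap2A is Atom-based, but the Leap2A-specific namespaces and elements are deliberately not reproduced here; anyone implementing this should work from the specification itself.

    # Sketch: serialise a learner's reflection as an Atom entry, in the
    # spirit of the Leap2A approach described above. Leap2A's own
    # namespaces and elements are not reproduced; consult the spec.
    import xml.etree.ElementTree as ET

    ATOM_NS = "http://www.w3.org/2005/Atom"
    ET.register_namespace("", ATOM_NS)

    def reflection_entry(entry_id, title, note):
        entry = ET.Element(f"{{{ATOM_NS}}}entry")
        ET.SubElement(entry, f"{{{ATOM_NS}}}id").text = entry_id
        ET.SubElement(entry, f"{{{ATOM_NS}}}title").text = title
        content = ET.SubElement(entry, f"{{{ATOM_NS}}}content")
        content.set("type", "text")
        content.text = note
        return ET.tostring(entry, encoding="unicode")

    # Because the format is system-neutral, the resulting XML could be
    # stored in whichever e-portfolio system the learner's department uses.
    print(reflection_entry("urn:example:note-1",
                           "Reflection on cardiology placement",
                           "Today I observed..."))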

DLM also benefited from the experience of using XCRI-CAP in the North East XCRI testbed (NEXT) mini project, which reported in April 2010 and, as an example of using XCRI feeds within a learning application, added XCRI support to DLM. By embedding XCRI feeds inside learning maps, course information relating to specific parts of the curriculum is surfaced via RSS/Atom feeds: an HTTP request is sent to the course database and the HTTP response is rendered in DLM. This experience was of significant value to both projects. Other initiatives built on by DLM were the use of CAS authentication, an outcome of the JISC Iamsect single-sign-on project, and the use of corporate data flows from the JISC ID-Maps project.
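The request/response pattern is simple enough to sketch. The feed URL below is hypothetical and the parsing assumes an Atom-style feed; a real XCRI-CAP feed would use its own element names, so treat this purely as an outline of the plumbing.

    # Sketch of the fetch-and-render plumbing: request a course feed over
    # HTTP and pull out entry titles for display in a learning map.
    # The URL is hypothetical; tag names assume a plain Atom feed.
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "http://courses.example.ac.uk/feed.xml"  # hypothetical

    def fetch_course_titles(url=FEED_URL):
        with urllib.request.urlopen(url) as response:
            tree = ET.parse(response)
        ns = {"atom": "http://www.w3.org/2005/Atom"}
        return [entry.findtext("atom:title", namespaces=ns)
                for entry in tree.getroot().findall("atom:entry", ns)]

    # Each title could then be attached to the relevant node of the map.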

The project approach is described in the project final report.

The project maintained a blog which covers a whole range of interesting aspects, but my favourite post was one by Tony McDonald called DLMs as a substrate for academic content, in which he provides a real glimpse into the possibilities of taking existing data (eg a study guide) and re-presenting it using DLMs, and the kinds of detailed considerations that affect this, such as metadata and context. Here is a brief snippet of his conversation with himself…

Well, we would need to deconstruct the study guide into something which is ‘node’-sized for the DLMs machinery. This could be at the paragraph level or smaller. That isn’t so bad to do, we have a lot of contextual information on the guide itself (where it sits in the curriculum, who is the module leader etc) which would contribute to over-arching metadata on the document. We would then need to add DLM-specific metadata on each node. The metadata is quite varied, from simple one word descriptions (eg simple tags) through to multiple-selections for licence usage of the material itself (we very much believe in OER!). The metadata also helps us to decide how the content should be rendered – eg as simple HTML, as something which is only released in a specific time frame, something that is only seen by particular categories of user, etc. This deconstruction is certainly doable, and the DLMs team has already done this for small sections of study guide material. (Tony McDonald)

Impact so far
Evaluation occurred throughout the process, and early feedback shaped the visual representation and the elements included in the maps. Students revealed an almost 50/50 split in preference between visual (concept-map style) representation and hierarchical (text-based) lists, so DLM has both styles of display, as well as tag-cloud views.

Ongoing challenges have emerged that are relevant to any curriculum mapping process, such as changing curricula (sometimes the restructuring of whole courses) and the fact that the student journey through the curriculum changes over time. One particular issue is that each cohort has a different experience of the curriculum, and the team faced a decision about whether to map these differences; they chose to present only the current map, as this links to up-to-date developments and guidelines, which are crucial in the healthcare field. Other challenges include managing the stepped/timed availability of resources, and the fact that not all data is available in a usable or consistent form. A major challenge lies in balancing automated data with data requiring moderation or contextual information, which affects choices around granularity of content and specificity.

DLM offers different things to a range of stakeholders. For learners it offers an overview of their learning: a reminder of prior learning, a view of current learning, and opportunities to consider future learning choices. The maps offer interactive opportunities for sharing, rating and reviewing resources, as well as facilities to add personal reflective notes and portfolio records and to evidence learning outcomes. Different student groups expressed different ways of using DLM, e.g. for revision, contextualisation, planning curriculum choices or career choices.

For staff, DLM offers mechanisms to review curricula and identify connections across modules. The maps also highlight gaps in provision, duplication, and issues around consistency of terminology. Staff are able to see how a student may perceive and engage with their curriculum, monitor access and equality of learning opportunities, and consider the alignment of teaching, learning and assessment. They will also be able to identify which resources emerge as popular, offering a glimpse into learning that may happen outside the formal curriculum.

At Newcastle, thanks to interest at a strategic level, the team are planning to extend DLM to geography and dentistry. It will be very interesting to see how well it transfers to other subject areas. There are also quite a few challenges in transferring the model to other institutions, although the team have received expressions of interest: the extent of customisation required takes commitment and considerable technical support. A public demonstrator and a software download are available via the project website and, thanks to the use of open standards, other institutions could potentially take this forward if they too are prepared to develop a shared understanding of curriculum mapping and take the time to share data across their systems.

This excerpt from the project final report nicely sums up the potential of DLM. It is definitely a report worth reading, and I hope other institutions consider a similar model or take some of the steps towards making it work within their own context.

DLM is a flexible and interactive tool, which can be readily aligned with an institution’s Teaching and Learning Strategy, whilst at the same time support a diverse range of specific programme requirements. It can be used to increase transparency in the curriculum in an interactive and participative way that more closely matches the changing experience and expectation of many modern learners. It also has potential to help address sector-wide drivers for PDP, employability, greater personalisation and student involvement in the curriculum. DLM final report


Range of outputs

https://learning-maps.ncl.ac.uk/docs/
Demonstration version of Dynamic Learning Maps (needs registration)
Project final report

XCRI-CAP – now is the time

In her third post on Curriculum Design, Lou McGill reflects on the challenges and opportunities surrounding the effective use of course data in institutions.


JISC have recently released a call entitled ‘Course Data: Making the most of Course Information’. This is a different style of call, offering funding for a review and planning stage during which institutions will develop an implementation plan to improve course data flows within the institution as well as producing feeds for external agencies. In the second phase some of those institutions will be funded to take their implementation plans forward. JISC are hoping to fund a range of examples covering different kinds of courses – online, postgraduate, distance and CPD – so we should learn a lot from programme activities. A national XCRI showcase was held in June 2011 and highlighted some really useful exemplars. These are detailed on the JISC Benefits Realisation blog post, which also documents some interesting discussions.

The call nicely reflects an increased interest in the role of course information across institutional processes and systems, as the post-16 education sector prepares for increasing demands on course data from students and from government agencies requiring greater transparency from publicly funded bodies. As I mentioned in my last post, HEFCE requirements for institutions to provide KIS (Key Information Sets) for all courses from September 2012, and to feed into the HEAR (Higher Education Achievement Report) recording student achievement, mean that institutions need to collate, manage and provide consistent and complete data. These drivers provide the impetus for institutions finally to embrace and take forward the XCRI specification (eXchanging Course Related Information), which, up to now, has not been taken up widely in institution-wide contexts. This new impetus, and the results of groundwork done by pioneering individuals and institutions, mean that there is now an excellent infrastructure of supporting information and knowledge to move forward.

Lisa Corley from CETIS has written an informative overview blog post which charts the development of XCRI, recognises the work of these pioneers and provides a very useful description of what it is and what benefits it offers to institutions. This, coupled with the excellent XCRI Knowledge Base should provide anyone interested in the call with the basic information to take this forward. Scott Wilson from CETIS has also written a more technically focussed blog post entitled XCRI – the end of the beginning.

One of the most useful things for those about to embark on this process is learning from people and institutions that have already been through it, as they can highlight challenges, pitfalls and good practice, and also illustrate benefits. The latter is particularly useful with reluctant stakeholders who may need convincing. This post focuses on the work of projects involved in the Institutional Approaches to Curriculum Design programme. Two earlier posts describe the business process approaches adopted by projects and looked in detail at course information.

Sheila MacNeill, from CETIS, has been working closely with these projects and produced a blog post in April 2011 which provided some excellent visual representations of the technologies being used. This wordle, reproduced from that post, illustrates just how significant XCRI is to these projects.

Wordle of techs & standards used in Curriculum Design Prog, April 11

However, as Sheila points out, ‘we are still some way off all 12 projects actually implementing the specification. From our discussions with the projects, there isn’t really a specific reason for them not implementing XCRI, it’s more that it isn’t a priority for them at the moment.’

This reflects what I was saying above, although notable exceptions are the Supporting Responsive Curricula (SRC), Predict, and Co-educate projects, which have engaged significantly with XCRI implementation and development. Early conversations among projects highlighted some shortcomings in the specification, reflecting a wider community concern that XCRI-CAP (the Course Advertising Profile) concentrated on marketing elements and did not support pedagogical information. The recognition of the CAP profile in the European Metadata for Learning Opportunities (MLO) standard in 2011 is a major step towards consolidating XCRI’s place in the wider course information landscape. Publishing course information in the standard format means that it can be found and aggregated by services such as UCAS, and offers potential for collation in a range of ways.

Although it appears to focus on a fairly narrow aspect of course information (advertising and marketing), the elements that make up XCRI-CAP are central to a much wider range of institutional processes and systems that require accurate and up-to-date course data. This links to wider course information, inputs into institutional systems such as VLEs, and can be connected to student data. The notion of having one accurate, definitive source of data should appeal to many stakeholders within an institution: it is fundamental for administrators and marketing staff, supports decision-making for senior managers, eases the burden on teaching staff and better informs students. It also matters to people outside the institution, offering clarity for prospective students, employers and other interested agencies, as well as fulfilling requirements from funders. The implementation process should highlight the different elements of course information and how they connect. It should also help institutions articulate which information is relevant for which stakeholder.

Implementing XCRI-CAP
We learned from the JISC XCRI mini projects (2007–2009) that there are no major technical difficulties in implementing the specification, but as Sheila says in her blog post, ‘As with many education specific standards/specifications, unless there is a very big carrot (or stick) widespread adoption and uptake is sporadic however logical the argument for using the spec/standard is.’

So if the KIS and HEAR requirements represent the stick, then I think the outcomes and outputs from the Institutional Approaches to Curriculum Design programme illustrate the carrot: the rewards for taking on this challenge. I describe it as a challenge not for technical reasons but because it relates back to issues discussed in my first two posts: the challenge, and the benefits, that come from having institution-wide conversations. It is time-consuming and demanding for institutions to take a ‘big picture’ view of the many processes that link together, to rethink some of these processes, and to articulate where they all connect and which data is central. However, the benefit of this approach has been strongly emphasised by all of the project staff I have spoken to. In the early stages projects typically found a lack of articulation between course review, approval, advertising and enrolment/reporting, and between quality assurance, marketing and student records.

Whilst these projects focus on curriculum design processes, all have had to take a broad view of whole-institution processes involving course information and student data. Many worked in parallel with other institution-wide initiatives (such as the University of Bolton Co-Educate project, which linked to the development of a module database), reflecting the breadth and scale of their activities. It is hard to tease out the benefits of implementing XCRI-CAP from the benefits of those wider activities, because they naturally augment each other. Benefits include:

  • Increased understanding across the institution of how processes connect and how the data and systems facilitate or hinder these processes.
  • Improved efficiencies – such as less duplication of data, time savings, one accurate source of data that feeds into several systems, less paperwork.
  • Transparency of information for registered students, prospective students, and external agencies (e.g. government bodies and employers) has the potential to increase student intake and enhance the experience of students once they register with the course/s.
  • Automatic feeds to comply with funder requirements.


There is a consensus that implementing XCRI-CAP is fairly straightforward (once the data is in a database it is uncomplicated to maintain), but when institutions try to broaden this to develop a definitive set of course information, linked to key processes such as quality control or curriculum design, it becomes much more challenging. The Institutional Approaches to Curriculum Design projects have been documenting their experiences and producing some really useful outputs that should be of interest to the wider community. There is a particularly well-written report from the University of Bolton module database project, which describes how they took experience from a JISC mini XCRI project and the Co-Educate curriculum design project to redesign and implement their module database.

‘The resulting system supports the capture of information from course inception, the development of modules, through the validation process, to approved and published status. A database application has been implemented with the functionality to support collaborative development of courses and manage the version control for module specifications subjected to minor modification. The system provides the capability to assemble pre-validated modules into new courses using the University’s pre-validated IDIBL course framework. The use of XCRI-CAP 1.1 to render module specifications in standards-based XML enables module details to be accessed and reused without having to use the database application. This opens up new possibilities for the reuse of module information. The University’s JISC funded Co-educate curriculum design project will be developing further tools for collaborative curriculum specification and design that will now use the XCRI capability.’

The report is well worth reading: it describes their approach and highlights the lessons learned.
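The ‘straightforward once the data is in a database’ point made earlier is easy to picture: producing the feed itself is essentially a serialisation step over a definitive course database. Here is a minimal sketch under that assumption, with an invented table layout and the same caveat as before about checking element names against the XCRI-CAP 1.1 schema.

    # Sketch: serialise rows from a (hypothetical) course database as a
    # simple catalog feed. Table, column and element names are invented;
    # check element names against the XCRI-CAP 1.1 schema.
    import sqlite3
    import xml.etree.ElementTree as ET

    def export_courses(db_path):
        conn = sqlite3.connect(db_path)
        rows = conn.execute("SELECT title, url FROM courses")
        catalog = ET.Element("catalog")
        provider = ET.SubElement(catalog, "provider")
        for title, url in rows:
            course = ET.SubElement(provider, "course")
            ET.SubElement(course, "title").text = title
            ET.SubElement(course, "url").text = url
        conn.close()
        return ET.tostring(catalog, encoding="unicode")

    # The hard part, as the projects report, is getting course data into
    # that single definitive database in the first place.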

The SRC project at Manchester Metropolitan University ran alongside an institutional initiative to Enhance the Quality of Assessment for Learning (EQAL), which is introducing a new curriculum framework, new administrative systems and processes, revised quality assurance processes and new learning systems to transform the student experience. The SRC project has been led by Professor Mark Stubbs, Managed Learning Environment Project Director, who has been affectionately described as ‘The Godfather of XCRI’. Mark talks eloquently about the origins of XCRI in a recent presentation. In the video he reiterates that the technology behind the standard is not complex, and describes how the Curriculum Delivery and Design programmes have highlighted the business process challenges that need to be worked through to make implementation possible on an institution-wide scale.

The project has produced some excellent resources which map and describe their journey, some of which have recently been added to the JISC Design Studio. One of these is a game called Accreditation!, a training resource for those trying to encourage stakeholder engagement when embarking on a major change process involving programme design and approval.

Screen shot of Accreditation board game

They have also produced a case study outlining academic database stakeholder requirements which includes some useful visual representations of their processes.

So the consensus is that ‘now is the time to embrace XCRI’. The JISC call presents a really great opportunity to get started. The first phase simply requires a Letter of Commitment from eligible institutions, providing evidence of support from the senior managers responsible for Teaching and Learning, Marketing, Management Information Systems/IT and the institutional course websites, by 12:00 noon UK time on Wednesday 7 September 2011. There is an Elluminate recording of the live briefing session in case you missed it, and lots of information described here to convince these various stakeholders of the benefits.