Relationship Management infoKit now live!

On Valentine's Day Andy Stewart from JISC Infonet launched the new Relationship Management infoKit. The infoKits are excellent practical resources which highlight the lessons learned and resources from many large-scale Jisc programmes, and make these both accessible and digestible. The Infonet team are great at developing and managing these really valuable assets.

This new infoKit describes some of the challenges faced by institutions when seeking to improve and maintain relationships with a range of different stakeholders, all of whom have different needs and expectations.

It highlights the different approaches that institutions can adopt and the kinds of infrastructure and cultural change that are needed to facilitate and support sustainable relationships. This infoKit provides an insight into some of the emerging technologies and professional practices explored as part of Jisc’s Relationship Management programme.

I contributed to the writing of the Relationship Management infoKit but the main credit for this great resource really belongs with other people. Sharon Perry, as a member of the CETIS team at Bolton, provided ongoing support and synthesis for the programme – her blog really offered a flavour of key lessons as they emerged. I worked with Sharon as a consultant on the final synthesis, with a focus on the alumni engagement strand. As with any synthesis work, the main challenge was taking the extensive work and evaluation outcomes from several project teams and bringing them together into a cohesive format. We worked closely with two Jisc Programme Managers, Simon Whittemore and Myles Danson, who brought the overarching vision and passion to the programme.

I really would recommend that you take a look at the work done by the programme. In these changing and challenging times successful collaboration is key to ensuring that educational institutions develop sustainable and mutually beneficial relationships. As Sir Tim Wilson says in his foreword to the infoKit:

This resource recognises the importance of drivers and motivations for institutions in this area, and illustrates how partnership management can be supported through institutional infrastructure and technologies, and managed through communication and networking approaches. The resource addresses common barriers and constraints and proposes approaches to avoid these pitfalls, based on institutional experiences. It provides approaches through which to derive the key benefits of enhanced student employability, engaged and supportive alumni, and professional service design and monitoring. However, for sustainable business value to be derived from these partnerships for the institution, agreed strategic priorities and supporting policies which bring together information/data, departments and stakeholders are necessary.

I learned a lot from doing this work and really hope that alumni engagement becomes much more than a series of fundraising activities. The potential goes much further than that – involving alumni in supporting existing students has far-reaching positive consequences for both of these groups, for the wider community and for the institutions. Several other Jisc programmes are illustrating some great student partnerships (UKOER and DIGILIT) and this is one way to capitalise on these and take it forward outside the institution.

If you don’t believe me then go and check out the infoKit.

Lou McGill

http://loumcgill.co.uk

The Role of Libraries and Information Professionals in OER Initiatives

Gema Bueno de la Fuente of the University Carlos III of Madrid reports on the outputs of a survey, undertaken with R. John Robertson (formerly of CETIS, University of Strathclyde) and Stuart Boon, (formerly of CAPLE, University of Strathclyde), on the role of libraries and librarians in OER initiatives.

The main objectives of the study were to gain insight into the level of involvement and commitment of the library as an organizational unit, and of individual librarians and other information science specialists in OER projects. More specifically, it sought to identify their primary roles and responsibilities, whether they already had relevant expertise or, on the contrary, whether they needed further training to meet project requirements. The study also looked at the level of integration of OER initiatives, particularly their content, into library resources and services. The ultimate aim was to recognize and highlight the opportunities for libraries and librarians to be involved in and contribute to OER initiatives, and the advantages that their involvement offered to these projects.

The study, which is partially based on preliminary work done by CETIS Research Fellow John Robertson in 2010, comprised 15 questions, making use of scaled, multiple-choice, structured, and open questions. Responses were gathered during October and November 2011.

Disregarding partial, empty, duplicated, and problematic responses, the total number of usable responses was 57. Their geographical distribution was quite heterogeneous, with contributions coming from all continents, the majority from HE institutions (81.3%), and with a significant number of UKOER and OpenCourseWare projects.

Regarding the involvement and roles of the library and librarians in OER initiatives, the survey results indicate that their presence and engagement is quite considerable: three out of four project teams include at least one librarian, most of them based at the institutional library; while in half of the projects surveyed, the library is leading or partnering the initiative.

The main areas of library involvement are: description and classification, management, preservation, dissemination and promotion of OER. In order to support these activities, librarians provided expertise in information science areas, such as: metadata standards, vocabularies, indexing and classification, information retrieval, information literacy, and repository technology and management. It was also found, however, that librarians needed to develop expertise in other areas, including SEO, IPR and licensing, but particularly in e-learning and OER approaches, technologies and standards.

The final conclusions of this study indicate that where the library and/or librarians are already engaged in OER projects, their contribution is considered to be indispensable and is valued highly. However, the participation of libraries in such projects is still not widespread, and a significant lack of awareness exists both from OER initiatives with regards to library activities, and from the libraries about the resources released by OER initiatives. As most of the objectives of content-focused OER initiatives are strongly related to library and information science activities and skills, we consider that their involvement would be of great benefit to those projects not yet engaged with them.

There is a clear need to promote and build awareness among stakeholders about libraries’ and librarians’ potential contribution to the OER movement, but also among libraries and librarians about their key role as OER advocates within and beyond their institutions. There is an opportunity for libraries and librarians to further engage in the OER movement as creators and users of OER content for their own professional development, particularly in common areas such as information literacy. Some initiatives in this area include the UKOER Project DELILA (Developing Educators Learning and Information Literacies for Accreditation), and the new project of the CILIP CSG-Information Literacy group in partnership with UNESCO.

Libraries, library associations, and LIS education institutions should support the development of the skills that librarians need to better support OER initiatives by designing and offering training programmes and improving syllabi. In this regard we can highlight the International Association of Universities (IAU) OER Project, which aims to establish an international partnership for the development of a “Training Programme for Academic Librarians on OER Use, Reuse and Production”, specifically targeted at librarians in developing countries. This kind of initiative reinforces the relevance of the library’s role in the OER movement and the need for further analysis and development in this area.

The executive summary of the survey report can be accessed here, and the complete survey report here. Both documents are available from the CETIS publications page.


Gema Bueno de la Fuente was a visiting scholar at CAPLE/CETIS during fall 2011, where she worked with CETIS research fellow R. John Robertson and CAPLE lecturer Stuart Boon on the relationship between OER initiatives and libraries, and on institutional practice in managing learning materials. She is based at the Library and Information Science Department, University Carlos III of Madrid, where she works as an assistant professor teaching in several undergraduate and graduate programs. She received her PhD in Library Science in 2010 with the dissertation “An Institutional Repository of Educational Content (IREC) Model: management of digital teaching and learning resources in the university library”. Her main research interests are digital teaching and learning materials, open content, digital repositories and e-learning systems, with a special focus on the library’s role in these areas, mainly in relation to metadata, vocabularies and standards.

Improving the student experience with an improved tutorial selection process

As part of the JISC Institutional Approaches to Curriculum Design programme, the UG-Flex project at the University of Greenwich set out to “reveal and enhance the University’s curriculum development processes in order to support a more agile and diverse curriculum underpinned by integrated systems.”

As part of an ongoing dialogue about the technical aspects of the project, the team shared with us some of their plans for developing more sophisticated, real-time timetabling processes. Although this work is not directly related to the UG-Flex project, this example of the choice of systems and their integration demonstrates the positive contribution to be made to the day-to-day delivery of the University’s curriculum. Clifton Kandler, Web Services Manager, explains more in this guest post.

The Problem
Like many universities, at the start of courses (programmes) our course leaders face the need to separate students into groups for tutorials, lab sessions and, on larger courses, lectures. For our Business School, which has courses with up to 500 students registered, this has been a particularly important issue for some time. Having moved on from collating students’ tutorial selections from pieces of paper placed on notice boards, prior to our migration to Moodle the Business School used the group functionality within WebCT to either allocate students to tutorials or enable them to self-select a tutorial slot.

The lack of integration between WebCT and our timetable system, Syllabus Plus from Scientia, however, meant that setting up these groups within WebCT was a manual process. Once students had been allocated to tutorials or had made a tutorial selection, manual intervention was again required to pass this information to our timetable system to enable the construction of a personalised timetable for each student, accessed via our portal (Luminis from Ellucian).

These points of manual intervention resulted in errors and delays in providing students with accurate timetable information at the start of courses, frustration on the part of course leaders who could not be sure who should be in their tutorials and, consequently, delays in organising students into groups for group assignments, for example.

The Opportunities
A clear opportunity existed to improve the experience of students, academics, School administrators and timetabling staff by integrating the systems involved and removing the points of manual intervention.

The decision to migrate to Moodle as our institutional VLE for the start of the 2011/12 academic year also provided an opportunity to develop the environment to meet our specific challenges; this was one of the decisive factors in choosing Moodle. The timetable selection block was the first area of development chosen.

The final opportunity came in the form of SunGard’s Infinity Process Platform (IPP). This business process management tool enabled us, with SunGard’s help, to model, analyse and execute the workflows and integrations required. The tool is used extensively in the financial services industry, and Greenwich is the first to use it in a higher education context.

The Process
A series of workshops was held with representatives from Schools to further understand the problem to be addressed and to draw up a list of requirements and a timetable for development. As well as meeting these objectives, a major outcome of the workshops was participants’ acknowledgement of the complexity, from a systems integration perspective, of producing a solution to the issues raised. The following requirements were identified:

The solution should enable:
*Allocation of students on a Moodle course to tutorials.
*Self-selection of tutorials by students.
*Reduction of the size of a tutorial on the fly – allowing staff to hide the full tutorial capacity in case they need to move students.

An eight week timetable was identified for the delivery of the project.

Systems Integration Achieved

Diagram of system integration

Challenges
The major challenge for this development has been managing the co-ordination of the four parties involved: Greenwich; the University of London Computing Centre, who host our Moodle environment; SunGard, for IPP; and Scientia, who provide the timetable system. A steep learning curve was involved in delivering this project within a tight eight-week time frame and on budget.

Enabling users to articulate their requirements was an additional challenge: users tend largely to ask for what they already have, and only fully understand their requirements after actual use (see below). The ability to develop quickly in Moodle and IPP has meant that we have been able to respond to new requirements as they have emerged.

Implementation and Subsequent Development
The selection block was used on all 561 Business School courses at the start of the 2012/13 academic year and has been very popular with students, who report valuing the additional control they now have over their timetables. We are clearly providing a better service to students.

Following the initial implementation the timetable block has been further developed to provide the following additional functionality:
● Allow staff to time-release the block on a course-by-course basis.
● Allow staff to make changes to all activities in one go.
● Allow staff to download the list of students allocated to a tutorial.
● Allow staff to hide individual tutorials.
● Allow staff to change the size of a tutorial in Moodle – this change is not written back to Syllabus Plus.

The timetable block will be used by our Engineering, Humanities and Social Sciences, and Computing and Maths Schools as well as Business at the start of the 2013/14 academic year, which means that over 70% of courses will be using the development.

Conclusion
The development of the timetable selection block has not only enabled us to improve the student experience through process improvement, but has also given us our first serious experience of working with a business process modelling tool, supporting Greenwich’s ambition to be a more agile and service-oriented institution.

About Clifton
Clifton Kandler is the Web Services Manager at the University of Greenwich, leading the team responsible for the development, implementation and support of the University’s VLE, Portal, Library management system and e-portfolio.

Learning Environments Timeline: The JISC CETIS view

One of the advantages of having been involved with JISC for a number of years (as a project and a service) is the opportunity to reflect on some activities that we’ve been involved in for some time. We thought it would be interesting to take the long view of some of our involvement with OER, XCRI and Learning Environments and reflect on what has worked and why, and where we think these activities are going next.

In this final article, we asked Lou McGill and Sarah Currier to talk to CETIS staff and devise
the timeline and this story to illustrate CETIS involvement in the area of Learning Environments over the last decade.

Learning Environments

In the mid-1990s educational institutions began to see the potential that computer networks could offer for managing their learning activities and content. Alongside this, developments on the web were transforming what could be achieved using technology to support learning, often described as e-learning. These two elements essentially emerged from two different motivations: one to manage learning, and its outcomes, and one to directly support learning. It could be argued that this difference led to a dichotomy of provision for learners through the early 2000s that is now being addressed by more holistic approaches to support an increasingly diverse range of learners.

As institutions invested in Virtual Learning Environments (VLEs) they realised the benefits of linking these to other institutional data systems such as library and student records and the notion of Managed Learning Environments (MLEs) emerged. In parallel, early adopters of web based technologies such as wikis and blogs were focussing on social interactive aspects of learning akin to social constructivist approaches [1]. CETIS highlighted the value of Personal Learning Environments (PLEs) as early as 1999 [2] and have contributed much to this field as educators have now moved towards a range of models to create Distributed Learning Environments (DLEs) [3] that fulfil a range of management and pedagogic requirements. This timeline [4] charts the story and provides links to further information.


How we got here

Investigating and advising on requirements for VLEs was an early activity for CETIS. As institutions started building or procuring these new tools for online education, a number of challenges became evident.

One challenge related to constraints VLE design placed on pedagogical flexibility, directly affecting the learning experience. Professor Oleg Liber and Dr. Sandy Britain penned the seminal 1999 report ‘A Framework for the Pedagogical Evaluation of Virtual Learning Environments’ [5]. This report became an international jumping-off point for exploring the relationship between technology and teaching and learning; a key premise for JISC CETIS’s work since.

The other key challenge concerned how VLEs interacted with other tools, such as student record, management information and library systems. Getting content into and out of VLEs, sharing it across departmental, institutional and regional borders, and being able to deliver the content in a flexible, useful way for learners, were all issues that needed to be addressed. Interoperability requirements entered the equation, and the concept of the managed learning environment (MLE) was born.

Virtual Learning Environments to Managed Learning Environments

CETIS supported the JISC MLE Programme, and became JISC’s representative on the IMS Global Learning Consortium [6], which was producing the relevant interoperability specifications. To fulfil this role, CETIS nurtured local communities of practice (known as special interest groups) to investigate requirements on the ground. These communities coalesced around educational metadata and repositories, assessment, learning design, content packaging, and enterprise systems requirements. CETIS’s experience with the standards development cycle enabled it to support this work, from establishing user requirements, to developing standards based on these, advocating for and supporting their uptake and implementation back in the communities, and using the communities’ experiences to further refine requirements. CETIS staff edited the MLE guide which later became the JISC Infonet toolkit ‘Creating a Managed Learning Environment’ [7] which offered practical advice on the kinds of questions that institutions needed to address when implementing an MLE.

Managed Learning Environments to Personal Learning Environments

The most successful interoperability initiatives around MLEs were enterprise level projects linking student record systems with VLEs [8],[9] and CETIS were instrumental in feeding back experiences from projects to the final version of the IMS Enterprise Services specification. Other, more experimental specifications looking at assessment, content packaging, and learning activities engendered a growing acceptance that a university-wide one-size-fits-all approach was unsustainable [10]. At this time the concept of the e-framework [11] emerged which attempted to map MIS systems in universities and offer an alternative service oriented approach to interoperability. Meanwhile, the Web 2.0 revolution was under way, and new, free tools were emerging on the open web. Students’ expectations of everything from email to online discussion to creating and sharing their own content changed. A session on PLEs at the 2004 CETIS Conference [12] is noted in Wikipedia [13] as the first recorded use of the term. Through the JISC-funded PLE reference model project in 2005/2006 [14], JISC CETIS further developed the idea of the Personal Learning Environment.

futurevle
Image by Scott Wilson, CETIS. The future of the VLE, 2005 [15]

This vision saw learners using a combination of tools of their choice in parallel to the institutional VLE / MLE. It offered universities options to pick and choose which tools they wanted to support, and make their own enterprise systems interact with these tools. CETIS also supported JISC and the international education sector in looking at Service Oriented Architecture (SOA) [16],[17] at the enterprise planning level, and at widgets at the individual level, all of which dovetailed with this new vision. The key goal however was to move from two parallel approaches to a more holistic model with the learner in control.

Where we are now

Personal Learning Environments to Distributed Virtual Learning Environments

Important features of Personal Learning Environments included learners setting their own goals and managing both the content and processes. However, not all learners want the added responsibility of creating their own learning environment, and most universities want some control over what they offer to students, not least because of the increasing complexity of system support requirements. Alongside this there were some exciting mash-ups of tools and services which offered some innovative approaches to bringing together institutional and web service tools. The 2007 CETIS conference included a hands-on session with some demonstrations of education-related mash-ups [18]. During 2008 the interest in widgets continued and CETIS started to engage with the W3C widget specification group, who produced a widget landscape study [19]. Following this, an interview with Scott Wilson highlighted ways in which widgets could be used to extend VLE functionality [20]. Around the same time IMS began work on the LTI (Learning Tools Interoperability) specification [21], which enables quick integration of widgets and externally hosted services into systems like VLEs.
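To give a flavour of what such an integration involves, the sketch below builds the form parameters a VLE would POST to an external tool in an IMS LTI 1.0 “basic launch”, signed with OAuth 1.0 HMAC-SHA1. It is a simplified sketch of the protocol only: the URL, key, secret and ids are all invented for the example, and a real consumer would send many more context parameters.

```python
import base64
import hashlib
import hmac
import time
import uuid
from urllib.parse import quote

# Illustrative IMS LTI 1.0 "basic launch" form. Parameter names follow
# the LTI 1.0 specification; the URL, key, secret and ids are invented.
launch_url = "https://tool.example.org/launch"
params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "tutorial-picker-42",  # hypothetical placement id
    "user_id": "student-001",
    "roles": "Learner",
    "oauth_consumer_key": "vle-key",
    "oauth_nonce": uuid.uuid4().hex,
    "oauth_timestamp": str(int(time.time())),
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
}

def oauth_sign(url: str, form: dict, secret: str) -> str:
    """Compute the OAuth 1.0 HMAC-SHA1 signature over the launch form."""
    # Normalise the parameters: sort, percent-encode, join as key=value pairs.
    pairs = "&".join(
        f"{quote(k, safe='')}={quote(v, safe='')}" for k, v in sorted(form.items())
    )
    base_string = "&".join(["POST", quote(url, safe=""), quote(pairs, safe="")])
    # Signing key is consumer_secret + "&" + token_secret (empty for LTI).
    digest = hmac.new((secret + "&").encode(), base_string.encode(), hashlib.sha1)
    return base64.b64encode(digest.digest()).decode()

params["oauth_signature"] = oauth_sign(launch_url, params, "vle-secret")
```

In practice the receiving tool recomputes and verifies the signature, and libraries are normally used for the OAuth handling rather than hand-rolled code like this.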

The 2008 CETIS conference included a session dedicated to designing widgets for education [22] which highlighted an interest within the community for a working group [23] around sharing practice in developing and deploying widgets. This group aimed to build a widget infrastructure, to determine models of widget use in teaching practice and to identify what widgets were needed in the HE/FE community.
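For readers unfamiliar with the format the group was working with, a W3C widget is a zip package whose behaviour is described by a `config.xml` manifest along the following lines (the widget name, id and file names here are invented for illustration):

```xml
<widget xmlns="http://www.w3.org/ns/widgets"
        id="http://example.org/widgets/tutorial-picker"
        version="1.0">
  <name>Tutorial Picker</name>
  <description>A hypothetical widget for choosing a tutorial slot.</description>
  <content src="index.html"/>
  <icon src="icon.png"/>
</widget>
```

The packaged widget can then be deployed into any container that implements the specification, which is what makes sharing widgets across institutions practical.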

The 2009 CETIS conference broadened widget development work to look at the various ways in which individuals and institutions were integrating widgets to expand their learning environments [24]. CETIS presented four main approaches and later developed and described these in the Distributed Learning Environments (DLE) Briefing paper [25] which was written by Sheila MacNeill and Wilbert Kraan. CETIS also worked closely with JISC programme managers to shape a new programme to explore the viability of these models.

dvle

The goal of the JISC Distributed Virtual Learning Environments (DVLE) [26] Programme was to take the best ideas from VLEs, MLEs, and PLEs to allow more flexibility and less silo thinking. Working with students, teachers and system administrators, the projects have been exploring the development and integration of widgets, apps and gadgets into a variety of commonly used learning environments. The programme has had two strands, one focussing on rapid development and deployment of widgets, and the other exploring institutional approaches to integrating flexible tools and services into teaching and learning environments. As this programme draws to a close we have some excellent exemplars for the wider community. For example, Manchester Metropolitan University’s (MMU) impressive W2C project [27] has added value to their core Moodle VLE and, through strong partnerships and institution-wide activities, has proven that a mega-mashup service-oriented approach can work at institutional scale. The Open University’s DOULS (Distributed Open University Learning Systems) project [28] has utilised a range of Google gadgets and developed a useful set of open guidelines on usability and accessibility from within a VLE. Sheila MacNeill from CETIS produced a useful summer round-up blog post [29] describing each of the projects and their activities. This programme has drawn together over a decade of exploration; with it JISC and CETIS have supported the sector’s journey towards flexible and customisable learning environments – what Sheila describes as “pick and mix” learning environments. As the final outcomes are synthesised, the wider community should have a range of models and supporting information to learn from.

Where are we going?

We asked Sheila, one of the authors of the CETIS DLE briefing paper, to consider if any of the distributed models presented in 2010 have emerged as stronger, been adopted more fully, or if any had been less effective. Naturally we ended up speaking about the financial constraints currently impacting on educational institutions and their readiness to invest in either institutional technologies or staff to support web services. Sheila felt that this more ‘risk averse’ climate had impacted on institutional readiness to engage with the cloud computing model but that this may still emerge in the future as an attractive model.

Two of the models have emerged strongly as current solutions – and interestingly these represent the two sides of the picture discussed throughout this article: one institutional model and one more individually focussed model. Institutions are, at present, tending to adopt the ‘plug into existing VLEs’ model, which reflects their desire to make good use of existing investment and skillsets whilst recognising the potential value that stable plug-ins can offer. It is a relatively safe model, supported by developments in the IMS LTI and widget specifications. The model that offers a more individual approach to mash-ups – ‘many widgets into one widget container’, such as Netvibes [30] or iGoogle [31] – tends not to be integrated into institutional systems, but presents both educators and students with a way to pull together different elements of their learning environments. It is difficult to estimate how widespread this model currently is, but it offers an approach which supports individual control and self-regulation.

One of the models identified by Sheila and Wilbert presented the idea of client and provider, using Sakai [32] as the key example. Sakai offers learning management, research collaboration and ePortfolio solutions in one system and is used by 350 educational institutions. Some of the projects in the DVLE programme have used external third-party hosting for some elements of their systems, including UCL [33], and Google for email or apps. This kind of model offers possibilities for cost efficiencies and may appear less risky. In the short term we may see more institutions adopt this model, making effective use of expertise and services available elsewhere. In the longer term it will be fascinating to watch this story continue to unfold as institutions seek to balance their institutional needs with the growing demands of highly diverse learners.


About Lou

Lou McGill is currently working independently and has recently been involved in synthesis and evaluation activities for the HE Academy/JISC UKOER programme and the JISC Transforming Curriculum Delivery Through Technology programme. She led the team that produced the Good Intentions report on business cases for sharing and worked on the LLiDA (Learning Literacies in a Digital Age) study. She has experience of working in a range of HE institutions as a librarian, learning technologist and project manager and used to be a JISC Programme Manager on the eLearning team. In the distant past she worked for NIACE (the adult learning organisation) and in health education for the NHS. Her interests and experience include digital literacy, information literacy, open education, distance learning, managing organisational change, and effective use of technologies to support learning. Further information on Lou’s work can be found at: http://loumcgill.co.uk

About Sarah

Sarah Currier is an educational content management specialist, with a background in librarianship. After emigrating to Scotland from New Zealand in 1997, she worked with educational repositories, metadata and content, including OERs, from 1999 until 2009, including stints with JISC CETIS, IRISS Learning Exchange, and Intrallect Ltd. She then spent three years running her own consultancy, with a varied portfolio including projects on open science, agile metadata management, data skills development, and the use of social media in supporting educational communities of practice. Sarah now commutes between Glasgow and Manchester, coordinating Jorum R&D projects and leading the JLeRN Experiment at Mimas (sarah.currier@manchester.ac.uk).

eXchanging course related information – XCRI timeline

One of the advantages of having been involved with JISC for a number of years (as a project and a service) is the opportunity to reflect on some activities that we’ve been involved in for some time. We thought it would be interesting to take the long view of some of our involvement with OER, XCRI and Learning Environments and reflect on what has worked and why, and where we think these activities are going next.

In this second story Lou McGill traces the history of the eXchanging Course Related Information (XCRI) specification, which is currently becoming a European and British standard (See Adam Cooper’s recent post). XCRI is a fine example of how the right people working together can develop interoperability standards that are truly fit for purpose.

A CETIS perspective of the XCRI story

Scott Wilson, Assistant Director at CETIS, describes XCRI as being ‘a community that formed to develop specifications related to course information’ (Wilson, 2010 [1]). This really captures the central aspect of the XCRI story as being about the community that came together, with the support of CETIS and JISC funding, to address a significant problem for institutions in managing their course information. Universities and colleges produce and utilise course information for several different purposes, and duplication is a costly problem. The XCRI community drove forward the development of a shared vocabulary and domain map in the early days, which ultimately led to the development of an internationally recognised specification. Their focus was on developing specifications to make generic aspects of a course description publicly available so they can be transferred easily between information systems. The formal outcome of this work is XCRI-CAP (Course Advertising Profile) [2]. Lisa Corley from CETIS has written a blog post [3] which charts the development of XCRI, recognises the work of these early pioneers and provides a very useful description of what it is and what benefits it offers to institutions. There is also extensive information available in the very well maintained XCRI Knowledge Base [4]. This community is still very active and now has fresh impetus from recent national HEFCE initiatives that require improved data exchange and transparency of institutional systems [5],[6].
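To illustrate the kind of data being exchanged, an XCRI-CAP feed is an XML document in which a provider advertises its courses and their presentations (runs). The fragment below is indicative of the profile’s general shape only – the institution, course and element details are invented, and it is not a validated instance of the schema:

```xml
<catalog xmlns="http://xcri.org/profiles/1.2/catalog"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <provider>
    <dc:title>Hypothetical University</dc:title>
    <dc:identifier>http://www.example.ac.uk</dc:identifier>
    <course>
      <dc:title>Introduction to Course Data Management</dc:title>
      <dc:description>An invented course used only to show the profile's shape.</dc:description>
      <presentation>
        <start>September 2012</start>
        <duration>12 weeks</duration>
      </presentation>
    </course>
  </provider>
</catalog>
```

Because the same structured record can be aggregated by external services and reused internally, the prospectus no longer needs to be re-keyed for each system that consumes it.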

The XCRI and JISC CETIS timeline [7] has been developed to highlight the various activities that make up the XCRI landscape and includes JISC and CETIS activities since 2003. It also highlights some wider national and international initiatives which illustrate trends and changes in the last decade in this part of the Enterprise domain.


How we got here

The XCRI community emerged from the Enterprise SIG [8], a CETIS Special Interest Group established in 2003 that focussed on a range of standards, technologies and activities to facilitate the business processes of educational institutions. The Enterprise SIG was, essentially, a community of practice for people interested in, or involved with:

• the IMS Enterprise Specification and Enterprise Services Specification

• exchanging data about students and courses between educational systems (VLEs, Student Records, etc)

• joining up college systems to create Managed Learning Environments

• e-learning frameworks, architecture and web services

At the 2004 CETIS Conference the SIG identified a need for both a cohesive approach to standardising course descriptions and an agreed vocabulary, and in March 2005 the Enterprise SIG XCRI sub-group was formed. This group, led by Professor Mark Stubbs and Alan Paull, with the support of Scott Wilson from CETIS, became the XCRI community and has driven developments in this area in the UK since that date. In 2005 JISC funded XCRI as one of their Reference Model projects [9] to define a vocabulary and appropriate (XML) technology bindings for describing course-related information. During this period XCRI produced an R1.0 schema, a repository demonstrator and a survey of 161 prospectus websites. This work happened alongside another JISC Reference Model project – COVARM (Course Validation Reference Model) [10]. Both of these reference model projects substantially mapped their respective domains, and their outputs fed into the eLearning Framework [11].

XCRI originally intended to ‘bridge the worlds of course marketing and course validation/quality assurance’ but, as Mark Stubbs describes [12], this became unwieldy:

“producing a course definition for validation/modification involves assembling fragments of information into a whole, whereas marketing a course involves communicating a serialized version of a subset of that assembly”

Feedback from the community, after testing the R1.0 schema in different contexts, led to a narrower focus on the limited set of elements that supported course advertising, and by 2006 XCRI-CAP was released as an XML specification. This was a very pragmatic outcome: it gave the community something that responded to its needs and offered a usable schema for JISC to take forward with the wider HE and FE communities.
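To give a flavour of the kind of markup involved, the sketch below builds and parses a minimal course-advertising record in Python. The element names follow the general catalog/provider/course/presentation shape of XCRI-CAP, but the fragment is illustrative only, not a conformant instance of the specification (namespaces, Dublin Core elements and required fields are omitted).

```python
import xml.etree.ElementTree as ET

# Illustrative course-advertising record. The element names follow the
# general catalog/provider/course/presentation shape of XCRI-CAP, but
# this is a simplified sketch, not a conformant instance of the schema.
SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<catalog>
  <provider>
    <title>Example University</title>
    <course>
      <title>Introduction to Data Management</title>
      <description>A one-semester introductory course.</description>
      <presentation>
        <start>2012-09-24</start>
        <duration>12 weeks</duration>
      </presentation>
    </course>
  </provider>
</catalog>"""

def list_courses(xml_text):
    """Return (course title, start date) pairs from a catalog document."""
    root = ET.fromstring(xml_text)
    results = []
    for course in root.iter("course"):
        title = course.findtext("title")
        start = course.findtext("presentation/start")
        results.append((title, start))
    return results

print(list_courses(SAMPLE))
```

The point of a shared schema like this is that any consuming system (a prospectus site, an aggregator, UCAS) can extract the same generic fields without bespoke integration work for each provider.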

In 2007 JISC funded a range of projects [13] as part of the eLearning Capital programme around Course Management to build on the work of XCRI and COVARM reference projects. Within the course description and advertising strand a number of institutions specifically aimed to trial and refine XCRI-CAP. The resulting case studies from these projects [14] offer really valuable insight into the challenges and approaches that different types of institutions might encounter when implementing the specification and offer perspectives from a range of stakeholders such as policy makers, managers, administrators and technical staff.

JISC also funded a support project made up of members from CETIS, Manchester Metropolitan University and Kainao Ltd, who had been involved in the original XCRI sub-group. This project had a remit to further develop the specification, provide technical support to projects implementing XCRI-CAP, provide a prototype online aggregation service, and promote the specification towards submission to an appropriate open standards process. The support project continued the work of the XCRI reference project and proved so effective that its funding was extended until March 2011. Mark Power from CETIS describes [15] the various activities and achievements of this team and notes that this, and the work of the JISC funded projects, demonstrated the value of XCRI-CAP so successfully that it was placed on the strategic agenda of national agencies.

In 2008 the XCRI Support Project team engaged with other European initiatives in course information through the European Committee for Standardization (CEN) Workshop on Learning Technologies. Following this the CEN endorsed a Workshop Agreement for Metadata for Learning Opportunities (MLO), which defines a model for expressing information about learning opportunities, including course information. MLO includes XCRI-CAP from the UK as well as other European specifications and is an attempt to unify these by offering a common subset, whilst still enabling local extensions and implementation architectures. This was ratified as a European Norm (EN 15982) in 2009 and was published in 2011.

Scott Wilson wrote in 2010 [16]:

The formal standard defines the majority of the core concepts and information model used by XCRI. The engagement of XCRI in CEN standards development has provided an opportunity and an impetus for the XCRI community to progress to formal standardization. The current roadmap of XCRI is to develop a British Standard for its course syndication format as a conforming binding and application profile of CEN Metadata For Learning Opportunities: Advertising.

In 2009 XCRI-CAP 1.1 [17] was approved by the Information Standards Board for Education, Skills and Children’s Services (ISB) as the UK eProspectus standard, and on 1st March 2012 BS 8581 XCRI-CAP was released for public comment, which would create a British Standard consistent with the European MLO-Advertising standard (EN 15982).

So far XCRI-CAP has enabled several institutions to transform practice around producing course information, especially for prospectuses, with reports of huge reductions in data duplication [18]. In 2009 the ISB estimated that a new standard for course information (XCRI-CAP) could save the sector in the region of £6 million per annum by removing the need to re-enter data into course handbooks and websites.

Where we are now…

Scott Wilson from CETIS wrote a blog post in June 2011 entitled XCRI – the end of the beginning [19]. In this Scott notes a shift in the timeline of XCRI – taking us from a period of designing the specification and beta testing into ‘adoption, use and adaption’. This is a significant achievement for those involved in mapping and defining the terrain and testing out the specification across institutional systems. The community now has working exemplars which not only deliver proof of economies of scale, through reduced duplication of data, but also articulate the value of re-thinking a range of business processes affected by course information. It has long been recognised that the barriers for institutions in adopting XCRI-CAP lie not in the technical complexities of the schema but in the challenges of managing their data and processes to support course information, many of which involve several incompatible systems.

A range of current national drivers require educational institutions to consider how they manage and surface some of their information sets, and those institutions that have engaged with XCRI-CAP are likely to find it easier to respond to some of these requirements. In early 2011 a report to HEFCE from the Online Learning Task Force [20] highlighted the challenges that students face due to insufficient, and hard to find, information about courses not dealt with by UCAS. As part of the Government Transparency agenda, educational institutions are being required to provide KIS (Key Information Sets) [21] for the majority of undergraduate courses from September 2012 and to feed into the HEAR (Higher Education Achievement Report) [22] recording student achievement. Each of these initiatives provides significant impetus for institutions to consider how their course information is managed and how it links to other processes. Implementing XCRI-CAP can be a valuable way to consider this [23]. Towards the end of 2011 JISC launched a new programme called Course Data: making the most of Course Information [24]. The programme has two stages: the first, which took place from September to November 2011, gave 93 institutions £10k to prepare an implementation plan to improve their course data flows and produce feeds for external agencies. 63 institutions have been selected for stage 2 to implement these plans; this stage began in January 2012 and will end in 2013. Outcomes and outputs of this programme are being synthesised on the XCRI Knowledge Base [25].

Where we are going…

Whilst the 2011 JISC programme will result in larger numbers of courses being advertised in XCRI-CAP format, Scott argues that we need to see it taken up by major course aggregation and brokerage services. This was one of the themes discussed at the XCRI eXchange [26] in 2011, a National Showcase for XCRI organised by the SAMSON [27] and MUSKET [28] projects, funded under the JISC Institutional Innovation Programme. Scott concludes his blog post with a suggestion that establishing an alliance could be the key to encouraging high-profile ownership and promotion of XCRI.

I think it would have to have the major aggregators on board (UCAS, Hotcourses), plus curriculum management solution providers (Unit4, Akari) and larger learning providers (e.g. the Open University, University of Manchester, MMU, Nottingham) as well as some of the smaller tools and services companies that are already working with XCRI (APS, IGSL, Smartways). It would develop the brand identity under which XCRI-CAP adoption would be recognised (not necessarily retaining anything of the original brand) and promote it beyond the reach of funded programmes into large-scale use.

Is this the future for XCRI?

I believe an Alliance like this would be a fitting development in the story of XCRI – a community driven specification having ongoing support and recognition from key stakeholders. It would be a fitting testament to the XCRI community and their achievements over the last decade.


About Lou

Lou McGill is currently working independently and has recently been involved in synthesis and evaluation activities for the HE Academy/JISC UKOER programme and the JISC Transforming Curriculum Delivery Through Technology programme. She led the team that produced the Good Intentions report on business cases for sharing and worked on the LLiDA study (Learning Literacies in a Digital Age). She has experience of working in a range of HE institutions as a librarian, learning technologist and project manager and used to be a JISC Programme Manager on the eLearning team. In the distant past she worked for NIACE (the adult learning organisation) and in health education for the NHS. Her interests and experience include digital literacy, information literacy, open education, distance learning, managing organisational change, and effective use of technologies to support learning. Further information on Lou’s work can be found at: http://loumcgill.co.uk

Open Educational Resources timeline

A CETIS perspective of the OER story

One of the advantages of having been involved with JISC for a number of years (as a project and a service) is the opportunity to reflect on some activities that we’ve been involved in for some time. We thought it would be interesting to take the long view of some of our involvement with OER, XCRI and Learning Environments and reflect on what has worked and why, and where we think these activities are going next.

This first story looks at the development of the Open Educational Resources area. Lou McGill talked to Phil Barker and Lorna Campbell about how the OER field has evolved in the last ten years.

Open Educational Resources

CETIS has been engaged with technical aspects of educational resource management and use since its early days. This includes contributing to the development and implementation of standards and technologies for creating and making learning resources discoverable [1], managing and sharing learning resources [2] and technologies for learning design [3]. Each of these areas has a rich history of activities and technological development. In 2008, following on from involvement in the areas of open access to research [4] and open technologies [5], CETIS highlighted the potential of open content for learning and teaching and how this might challenge and transform approaches to educational practice [6]. The term Open Educational Resources (OERs) emerged in the early 2000s and can be understood broadly to mean “digitised materials offered freely and openly for educators, students and self-learners to use and reuse for teaching, learning and research” (OECD, 2007). Since then the UK has seen considerable activity relating to OERs and more recently in open educational practices (OEP). The CETIS wiki [7] has a section dedicated to the technical aspects of OERs providing background information and links to further resources.

How we got here…

JISC funded OER activities in the UK have been shaped by the history of learning resources in national Further and Higher education contexts and by what we have learned from a number of development programmes. The OER and JISC CETIS timeline[8] has been developed to illustrate the various activities that make up the learning resources and OER landscape and includes JISC and CETIS activities since 2002. It also highlights some wider national and international initiatives which illustrate trends and changes in the last decade.


JISC CETIS and OER timeline in Tiki-Toki

Historical perspective in the UK

Cultural aspects and practices around sharing learning resources have been a particular focus for several UK studies [9], [10], [11] and continue to be a focus for UKOER activities. JISC has funded a number of programmes since 2002 to investigate issues around developing, managing and accessing learning resources, and these have surfaced issues relating to institutional policies and practices, business models, teaching practices, legal issues and technical aspects. In addition to providing technical and strategic support and guidance to the programmes, CETIS contributes to scoping the technical requirements and reports on wider trends across the decade.

JISC funded a large-scale programme in the early 2000s called eXchange for Learning (X4L) (2002-2006) [12] to support the development, repurposing and sharing of learning resources. This ran in parallel to the establishment of a national learning resources repository, Jorum [13]. Following on from X4L, the RePRODUCE Programme (2008-2009) [14] focussed on developing courses using repurposed and re-usable learning resources. In the mid to late 2000s JISC funded two other programmes focussed on establishing technical infrastructure within institutions and across the sector – Digital Repositories (2005-2007) [15] and Repositories and Preservation (2006-2009) [16].

These programmes were informed by a strategic and technical vision which was expressed through initiatives including the e-Learning Framework [17], the e-Framework [18], the Information Environment Technical Architecture [19] and the Digital Repositories Roadmap [20].

Projects in the X4L programme were required to explore the process of integrating interoperable learning objects with VLEs. A small number of tools projects were funded to facilitate this task: an assessment management system (TOIA), a content packaging tool (RELOAD) and a learning object repository (Jorum). Projects were given a strong steer to use interoperability standards such as IMS QTI, IMS Content Packaging and IEEE LOM. A mandatory application profile of the IEEE LOM was developed for the programme, and formal subject classification vocabularies were identified, including JACS and Dewey. Projects were strongly recommended to deposit their content in the Jorum repository, and institutions were required to sign formal licence agreements before doing so. Access to content deposited in Jorum was restricted to UK F/HE institutions only.
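As a rough illustration of what a mandatory application profile implies in practice, the sketch below checks a metadata record against a small set of required fields. The field names and the profile itself are hypothetical, not the actual X4L profile of the IEEE LOM, which defined its own element set and controlled vocabularies.

```python
# Hypothetical check of a metadata record against a mandatory application
# profile. The required fields below are illustrative only -- the real X4L
# profile of the IEEE LOM defined its own element set and vocabularies.
REQUIRED_FIELDS = {"title", "description", "subject", "rights"}

def profile_errors(record):
    """Return the set of required fields missing or empty in a record."""
    return {f for f in REQUIRED_FIELDS if not record.get(f)}

record = {
    "title": "Cell Biology Learning Object",
    "description": "Interactive tutorial on cell structure.",
    "subject": "C100",  # e.g. a JACS subject code
    "rights": "",       # empty: fails the profile check
}
print(sorted(profile_errors(record)))  # -> ['rights']
```

A check of this kind is what repository deposit workflows typically automated; the significant cost for projects was not the check itself but sourcing and agreeing the values for each field.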

These conditions meant that projects required significant support to engage with and implement the various standards and invested considerable time on these elements. Depositing learning and teaching materials in formal repositories raised very different issues from those of depositing research outputs, as the materials included a wide range of formats and levels of granularity and sometimes incorporated accompanying pedagogical guidance. A particular focus for projects and the wider community at this time was the debate about how granularity of learning resources impacted on aggregation/disaggregation and how this affected flexibility and reuse. The X4L review report highlighted the fact that repurposing and reuse are affected by much more than granularity:

“Effectively ease of use, improving the learning experience, and improving design are all interrelated, and all will be underpinned by an understanding of who will actually engage in repurposing (or reuse) and why.” [21]

Several X4L projects encountered problems with resources that incorporated items with various original licences, and highlighted the fact that teachers had often not previously acknowledged ownership of content they had used or understood the need to do so. The JISC programmes in the mid 2000s did much to challenge the perception within the community that licensing and copyright were overly complex, but did little to generate positive attitudes towards this or remove barriers.

“the licences used, and hence the access authorization policy for the JORUM repository, focussed more on restricting access and use than on permitting it.” Phil Barker

CETIS and another JISC Innovation Support Centre, UKOLN [22], were funded to run the Repositories Research Team, as part of the JISC Digital Repositories and Repositories and Preservation Programmes. The remit of this team included: helping projects find and exploit synergies across the programme and beyond, gathering scenarios and use cases from projects, liaising with other national and international repositories activities, including liaison with the e-Framework, synthesizing project and programme outcomes, and engaging with interoperability standards activity and repository architectures. This team were able to draw together key messages from the programmes [23].

Whilst there was an increase in institutional repositories providing access to scholarly works at this time, there was less success supporting and facilitating access to teaching and learning materials. One of the final conclusions of the Repositories and Preservation Advisory Group, which advised the JISC repositories programmes, was that teaching and learning resources had not been served well by the debate about institutional repositories seeking to cover both open access to research outputs and management of teaching and learning materials, as the issues relating to their use and management are fundamentally different [24].

The late Rachel Heery also commented that greater value may be derived from programmes that focus more on achieving strategic objectives (e.g. improving access to resources) and less on a specific technology to meet these objectives (e.g. repositories). An example of this kind of approach is the International JISC/NSF Digital Libraries in the Classroom Programme (2003-2008) [25] which investigated institutional, technical and social aspects to developing, sharing and managing content to support learning activities. Projects in this programme were led by academic departments and focussed on the strategic objectives of using the content with learners. Although specific technologies and standards were not mandated for this programme, projects brought together formal repositories, workflows, copyright, metadata issues and learning design with web 2.0 approaches, tagging, digital literacies and student content, all within real learning and teaching contexts.

Rather than a radical shift in policy, these conclusions should be regarded as reflecting a gradual development in policy, licensing and technology right across the web.

Wider context

The emergence of a highly networked ‘social web’ has impacted on how people find, create, manage, share and use content, for personal, professional and learning activities. This includes the advent of web 2.0, the appearance of media-specific dissemination platforms such as SlideShare, YouTube, Flickr and iTunes U, interaction through RESTful APIs, OpenID, OAuth and other web-wide technologies, and increasing acceptance of Creative Commons licences. These services and changing practices are not always open and, it could be argued, openness is not always appropriate. So whilst the open web and the social web are not co-dependent, there is a move towards open social web approaches to learning and teaching, of which MOOCs (massive open online courses) are one example [26]. This is transforming how learners interact with educational content and is challenging traditional models of educational provision and scholarly activities. This affects institutional policies and strategies, particularly around technologies to support learning and teaching.

“As a result there has been a movement away from developing centralised education specific tools services and towards the integration of institutional systems with applications and services scattered across the web. Furthermore there has been growing awareness of the importance of the web itself as a technical architecture as opposed to a simple interface or delivery platform.” Lorna Campbell, Phil Barker and R. John Robertson

This has been reflected in recent JISC funded programmes where specific technologies and standards are not mandated and projects are encouraged to adopt technologies that suit their purpose and context.

Internationally, various models have emerged to release open content. These models are often shaped by how they have been funded and by the various, and sometimes quite different, motivations to release content as OERs. Community-based models offer sustainable approaches based on practice and resource sharing, whilst some educational institutions recognise the potential of OERs as marketing opportunities. Recent initiatives such as University of the People [27] and OER University [28] reflect both the fundamental aspiration of providing access to learning for students around the world and a need for educational institutions to find ways to respond to the changing needs of learners. Issues around accreditation and assessment in an open context are emerging as an important focus for the community. Activities have predominantly concentrated on releasing OERs, with less focus on how these are being used, or who is using them, although the increasing focus on open learning experiences, and the fact that there is now a significant corpus of OERs, is starting to change this. Technologies are increasingly being utilised to track use and feed relevant content to individuals, and this has been a focus of recent activities in UKOER projects.

Moving towards open in the UK

The launch of the Open University’s OpenLearn [29] and the University of Nottingham’s u-Now Open Educational Repository [30] in the mid 2000s marked the UK’s first formal steps with OERs, although individual academics and teachers were already experimenting with open technologies to make some of their content openly available [31]. Despite these, and other international initiatives, the RePRODUCE Programme concluded that projects had significantly underestimated the difficulty of finding high-quality teaching and learning materials that were suitable for copyright clearance and reuse. In 2008 JISC funded a research study [32] into the sharing of learning materials which provided a history of sharing and managing learning resources in the UK, described business models and benefits, and focussed on open and community sharing. This report concluded that open approaches to producing and making learning materials accessible were likely to have significant impact on the sharing and exchange of resources in both national and global contexts. Also in 2008 CETIS produced a briefing paper on OERs and held a scoping session at their annual conference. These reports fed into the development and scoping of the jointly funded HE Academy/JISC UKOER programme [33].

The UKOER Pilot Programme (2009-2010) involved a range of OER providers including individual educators, discipline-based consortia and institutions.

Given this diversity it was recognised from the outset that no single technical solution would fit all projects, and therefore no specific tools, descriptive standards, exchange or dissemination mechanisms were mandated (apart from a requirement that the resources produced be represented in a national repository of learning materials). [34]

CETIS has supported this diversity by encouraging discussions at meetings or through blog posts to identify which technical choices have been made by individual projects, recording these openly [35] and responding to issues as they arose.

Where we are now…

The UKOER programme has progressed through phase two into phase three, and Jorum has continued to be developed as an open national repository with CETIS providing input on issues around bulk upload of resources and syndication. Projects have highlighted a range of technical issues relating to building collections, and providing rich descriptions, of learning resources. CETIS has been involved in supporting this by exploring issues around packaging, describing, tracking and aggregating resources. Project technical approaches have included RSS aggregation and techniques similar to podcasting; presentation of resources through novel interfaces such as timelines and maps using geolocation data; representation of relationships between resources; and cross-search, upload and metadata harvesting through the use of third-party host APIs. As well as supporting projects, CETIS provided opportunities to discuss these issues with the wider community at its annual conferences [36] and other events.
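The RSS-aggregation approach mentioned above can be sketched quite simply: parse a feed exposed by one host and collect item titles and links so resources held elsewhere can be listed locally. The feed content below is invented for illustration; real projects aggregated feeds from platforms such as institutional repositories and media-sharing sites.

```python
import xml.etree.ElementTree as ET

# Invented RSS 2.0 feed standing in for a feed exposed by an OER host.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example OER Collection</title>
    <item>
      <title>Lecture slides: Thermodynamics</title>
      <link>http://example.org/oer/thermo-slides</link>
    </item>
    <item>
      <title>Worksheet: Heat engines</title>
      <link>http://example.org/oer/heat-engines</link>
    </item>
  </channel>
</rss>"""

def aggregate(feed_text):
    """Collect (title, link) pairs from an RSS 2.0 feed document."""
    root = ET.fromstring(feed_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in aggregate(FEED):
    print(title, "->", link)
```

The appeal of this pattern is that the aggregator needs no knowledge of the hosting platform: any system that can emit a standard feed can participate, which is what made multi-platform hosting of OERs practical.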

An important aspect of CETIS’s work is that of providing a unique space for technically focussed staff to have conversations across institutional boundaries and also offering opportunities for innovation and experimentation. A joint CETIS/UKOLN DevCSI OER Hack Day event proved to be highly productive and stimulating as it brought together software developers, project managers, academics, learning technologists, researchers and users to work in multi-disciplinary teams on ideas for developing tools and solutions to OER problems. Towards the end of 2011 CETIS commissioned two technical OER Mini-projects which adopted the rapid innovation funding model and aimed to encourage openness and innovation. This provides a contrast to longer term large scale programmes [37].

The CETIS team blogs continue to provide an ongoing dialogue around technical issues, identifying emerging trends as well as providing programme-level synthesis. There has been value in taking a team approach to programme support, as CETIS staff brought a wealth of experience from their involvement with the earlier JISC funded work around learning resources and repositories. CETIS staff who have been involved in this area include Lorna Campbell, Phil Barker, R. John Robertson, Li Yuan, Sheila MacNeill and Sarah Currier.

“Although we have seen a significant shift in focus from formal repository standards, protocols & procedures, learning objects and controlled access to repositories to lightweight web-wide specifications and social sharing platforms, there is still plenty to discuss regarding resource description, levels of openness, resource discovery, student content and quality” [38]. Phil Barker

Even during the relatively short timescale of the UKOER programme, CETIS has seen projects choose a wide range of solutions to increase access to their OERs. During phases 1 and 2, projects released resources at varying levels of granularity, from individual images to whole courses. Many projects used multiple platforms [39] to host these, and some made their OERs available on a combination of national, institutional and subject repositories, social sites and content management systems. Projects were encouraged to use feeds to ensure that resources stored on different hosts were displayed on other sites, which is in marked contrast to the early X4L content and increases visibility of, and access to, the OERs. Projects were aware that the range of different potential users needed different levels of granularity, different levels of additional content, metadata and presentation methods. It is interesting to see projects take advantage of the affordances of formal repositories, such as more effective content management, version and licence control and metadata, and balance these with informal web-based approaches which appear to offer flexibility and choice, tagging and commenting, although repositories are increasingly offering similar functionality.

Where we are going…

CETIS will continue to work with the UK HE and FE community to encourage discussion, innovation and experimentation with technologies and standards to support OERs and open practices, and to feed this into broader national and international contexts. In 2010, in the US, the Learning Registry was established as an international open-source technical system offering an alternative approach to learning resource discovery and sharing, and as a community for people sharing resources [40]. This initiative seems to exemplify how far the learning and teaching community has come in the last decade in terms of aspirations and approaches. It remains to be seen how successful the technical approaches will be, but Lorna Campbell, Assistant Director at CETIS, sees the potential…

If the Learning Registry is successful in creating a “light-weight learning resource information sharing network” it will be a major step forward in terms of facilitating access to the wealth of educational content that is scattered across the web. Lorna Campbell

CETIS advises JISC about the Learning Registry and also advises the JISC JLeRN project [41], which is the experimental node in the UK.

Codebashes, hack days and the annual CETIS conference events provide spaces for JISC funded projects and the wider community to extend the conversations into new areas and continue innovation. The third phase of UKOER will continue to provide real-use studies of a range of different technologies and standards [42] and includes some interesting work with publishers. As the educational community worldwide focuses on open accreditation and assessment, and on the digital literacies needed to produce and use OERs, the CETIS team is likely to have a role in drawing together different communities to cross boundaries and share knowledge and experience from other aspects of work.


About Lou

Lou McGill is currently working independently and has recently been involved in synthesis and evaluation activities for the HE Academy/JISC UKOER programme and the JISC Transforming Curriculum Delivery Through Technology programme. She led the team that produced the Good Intentions report on business cases for sharing and worked on the LLiDA study (Learning Literacies in a Digital Age). She has experience of working in a range of HE institutions as a librarian, learning technologist and project manager and used to be a JISC Programme Manager on the eLearning team. In the distant past she worked for NIACE (the adult learning organisation) and in health education for the NHS. Her interests and experience include digital literacy, information literacy, open education, distance learning, managing organisational change, and effective use of technologies to support learning. Further information on Lou’s work can be found at: http://loumcgill.co.uk

Curriculum Delivery: Dynamic Learning Maps

In this second post Lou McGill focusses on the DLM project which produced a dynamic map of the entire Medical Curriculum at Newcastle University.


The Dynamic Learning Maps (DLM) project at Newcastle University was funded by JISC as part of the Transforming Curriculum Delivery Through Technology programme. This programme saw a very diverse set of projects use technology to address a number of institutional challenges and ultimately transform their approaches to enhancing the student experience.

The DLM project aimed to make visible the complex medical ‘spiral curriculum’ where topics are revisited with increasing depth over a 5 year programme, and to reveal connections across curricula to support modular learning. High level goals included a desire to promote lifelong learning, enhance employability and enable personalisation for students. The diverse nature of stakeholders with unique views of the curricula increased the complexity of the project and led to some interesting paths and choices as the team developed the maps. Simply getting agreement on what constitutes a curriculum map proved challenging and required a highly flexible approach and agile development. Like many technical development projects DLM had to balance the need to develop a common understanding with the need to reflect different stakeholder requirements. Agreeing on what elements to include and the level of detail was important as well as the fundamental issues around how they might be used by both staff and students.

The project stands out in this programme for a few reasons – not least that it has much in common with the ongoing Institutional Approaches to Curriculum Design programme, which has several projects engaged with mapping the curriculum. DLM has developed an interesting mix of formal curriculum maps, personal learning records and community-driven maps which have proved to be as useful for curriculum teams as for the medical, psychology and speech and language students they support. Another distinguishing feature of the project was that it had a more technical focus than other projects – engaging with neural networks, a range of institutional systems, data feeds and mashups, e-portfolios as well as a range of web 2.0 approaches.

A curriculum map from the DLM project


Curriculum maps
Curriculum maps are diagrammatic representations of the curriculum and can be seen as operational tools for staff to support course management and administration, or as aids to support curriculum design and delivery. They offer a window on to the curriculum and can include learning outcomes, content, assessment and information about people. They can be complex and are usually very labour intensive to develop and embed into institutional processes. Making curriculum maps open to students gives them navigational aids through their courses, providing clarity around different modules, identifying connections between them and supporting module choices. Adding the capacity for students to personalise their pathways through the maps, link to individual portfolio records and add their own resources can turn curriculum maps into powerful learning aids. This is what makes the DLM project stand out for me.

It is challenging enough to visually represent or describe a curriculum, but using established standards and technologies to link data in a meaningful way takes things to another level as connections can be made with a range of institutional systems.

Institutional Systems linked in DLM project


DLM pulls in data from a variety of institutional systems including repositories, library and curriculum databases, and student information systems, as well as having a two-way transfer with e-portfolio systems. This means individuals can navigate the formal curriculum map and see links between elements, such as learning outcomes, timetables, etc. They can also add to the map in the form of personal reflections or notes and external resources, which can be rated and discussed. It is important to note that the project highlighted that existing curriculum data will not necessarily plug in easily to other systems. They identified a range of issues that had to be addressed, including governance, QA processes, and data migration and translation. This is one example described in the final report:
A separate tool was developed to manage programme-level learning outcomes and link these to units/modules; this provides a feed used in Learning Maps. This was necessary because existing data from an MBBS study guides database was in non-standardised text; this tool enables curriculum managers to map specific programme-level outcomes, but to still use context-specific language of the module.

Web 2.0 mash-ups
Technically DLM is very interesting – the team built on previous work around personalisation and Web 2.0 approaches. The DLM concept was inspired by neural networks, where nodes can be connected to any other node in the network, and those connections vary in strength. They used open standards and existing corporate and programme data to support sustainability, transferability and reduce dependencies on specific systems.
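The node-and-connection idea can be sketched in a few lines of Python. This is purely illustrative – the class and method names below are hypothetical, not part of the DLM codebase – but it shows the core of the neural-network-inspired model: any curriculum topic can link to any other, and connections carry a variable strength.

```python
# Illustrative sketch of a weighted topic graph in the spirit of DLM's
# neural-network-inspired model. All names here are hypothetical.
from collections import defaultdict

class LearningMap:
    def __init__(self):
        # adjacency: topic -> {neighbouring topic: connection strength}
        self.links = defaultdict(dict)

    def connect(self, a, b, strength=1.0):
        """Link two curriculum topics; any node may connect to any other."""
        self.links[a][b] = strength
        self.links[b][a] = strength

    def related(self, topic, min_strength=0.5):
        """Topics connected to `topic` above a threshold, strongest first."""
        return sorted(
            (t for t, s in self.links[topic].items() if s >= min_strength),
            key=lambda t: -self.links[topic][t],
        )

m = LearningMap()
m.connect("Cardiology", "Pharmacology", 0.9)
m.connect("Cardiology", "Anatomy", 0.6)
m.connect("Cardiology", "Medical Ethics", 0.3)
print(m.related("Cardiology"))  # ['Pharmacology', 'Anatomy']
```

Varying connection strength is what lets a map surface the most relevant neighbouring topics first, rather than presenting every link as equally important.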

The project team took good advantage of work being done within the institution as part of other initiatives. The e-portfolio element of DLM drew on work done with the Leap2A specification for e-portfolio portability and interoperability, as Newcastle University has been involved in developing the specification and the JISC EPICS-2 project (e-Portfolios and PDP in the North East). Taking a web service approach, when learners add their reflections and notes to curriculum topics these generate an XML file which is stored remotely in their individual portfolio using Leap2A and becomes part of their portable portfolio record. This approach means that even if departments have different e-portfolio systems the standardised data can be stored in any of them. For more information on Leap2A see the recent article by Christina Smart from JISC CETIS.
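As a rough illustration of that web-service pattern, a learner's note might be serialised to XML along these lines before being posted to the portfolio store. The element layout here is a simplified Atom entry (Leap2A builds on Atom); consult the Leap2A specification for the real profile, and treat the function and URL below as hypothetical.

```python
# Hedged sketch: serialising a learner reflection as a minimal Atom entry,
# in the spirit of Leap2A's Atom-based records. Not the actual DLM code.
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"

def reflection_entry(title, content, topic_url):
    """Build an Atom-style entry linking a reflection to a curriculum topic."""
    ET.register_namespace("", ATOM)
    entry = ET.Element(f"{{{ATOM}}}entry")
    ET.SubElement(entry, f"{{{ATOM}}}title").text = title
    ET.SubElement(entry, f"{{{ATOM}}}content").text = content
    # link the reflection back to the curriculum topic it annotates
    ET.SubElement(entry, f"{{{ATOM}}}link", rel="related", href=topic_url)
    return ET.tostring(entry, encoding="unicode")

xml = reflection_entry("ECG practice", "Interpreted 10 traces today.",
                       "https://learning-maps.example/topics/ecg")
```

Because the record is standardised XML rather than a database row tied to one product, any compliant e-portfolio system can store and later re-import it – which is the portability point made above.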

DLM also benefitted from the experience of using XCRI-CAP in the mini project North East XCRI testbed (NEXT), which reported in April 2010. As an example of using XCRI feeds within a learning application, support for them was added to DLM: embedding XCRI feeds inside learning maps reveals course information relating to specific parts of the curriculum via RSS/Atom feeds, with an HTTP request sent to the course database and the HTTP response rendered in DLM. This experience was of significant value to both projects. Other initiatives built on by DLM were the use of CAS authentication, an outcome of the JISC Iamsect single-sign-on project, and the use of corporate data flows from the JISC ID-Maps project.
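That request/response pattern can be sketched as follows. The feed layout is a generic Atom feed rather than Newcastle's actual XCRI-CAP output, and both function names are hypothetical helpers, not DLM's API.

```python
# Sketch of the pattern described above: a learning map asks the course
# database for an Atom feed over HTTP and extracts course entries from
# the response. Illustrative only; element layout is plain Atom.
import urllib.request
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

def parse_course_feed(feed_xml):
    """Pull (title, link) pairs out of an Atom feed of course information."""
    root = ET.fromstring(feed_xml)
    courses = []
    for entry in root.iter(f"{ATOM}entry"):
        title = entry.findtext(f"{ATOM}title")
        link = entry.find(f"{ATOM}link")
        courses.append((title, link.get("href") if link is not None else None))
    return courses

def fetch_courses(url):
    """HTTP request to the course database; the parsed response feeds the map."""
    with urllib.request.urlopen(url) as resp:
        return parse_course_feed(resp.read())
```

The benefit of the feed approach is loose coupling: the map only needs the feed URL and the agreed format, not direct access to the course database itself.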

The project's final report describes their development approach in detail.

The project maintained a blog which covers a whole range of interesting aspects but my favourite post was one by Tony McDonald called DLMS as a substrate for academic content where he provided a real glimpse into the possibilities of taking existing data (eg a study guide) and re-presenting this using DLMs, and the kinds of detailed considerations that affect this such as metadata and context. Here is a brief snippet of his conversation with himself…

Well, we would need to deconstruct the study guide into something which is ‘node’-sized for the DLMs machinery. This could be at the paragraph level or smaller. That isn’t so bad to do, we have a lot of contextual information on the guide itself (where it sits in the curriculum, who is the module leader etc) which would contribute to over-arching metadata on the document. We would then need to add DLM-specific metadata on each node. The metadata is quite varied, from simple one word descriptions (eg simple tags) through to multiple-selections for licence usage of the material itself (we very much believe in OER!). The metadata also helps us to decide how the content should be rendered – eg as simple HTML, as something which is only released in a specific time frame, something that is only seen by particular categories of user, etc. This deconstruction is certainly doable, and the DLMs team has already done this for small sections of study guide material. (Tony McDonald)

Impact so far
Evaluation occurred throughout the process and early feedback shaped the visual representation and elements included in the maps. Students revealed an almost 50/50 split in preference for visual (concept map style) representation and hierarchical lists (text-based) so DLM has both styles of display, as well as tagcloud views.

Ongoing challenges have emerged that are relevant to any curriculum mapping process, such as changing curricula – sometimes the restructuring of whole courses – and the fact that the student journey through the curriculum changes over time. One particular issue is that each cohort has a different experience of the curriculum; the team faced a decision around mapping these differences, but chose to maintain only the current map, as this links to up-to-date developments and guidelines, which are crucial in the healthcare field. Other challenges include managing stepped/timed availability of resources, and the fact that not all data is available in a usable or consistent form. A major challenge lies in balancing automated data with data requiring moderation or contextual information, which affects choices around granularity of content and specificity.

DLM offers different things to a range of stakeholders. For learners it offers an overview of their learning – a reminder of prior learning, a view of current learning and opportunities to consider future learning choices. The maps offer interactive opportunities for sharing, rating and reviewing resources, as well as facilities to add personal reflective notes and portfolio records and to evidence learning outcomes. Different student groups described different ways of using DLM, i.e. for revision, contextualisation, planning curriculum choices or career choices.

For staff, DLM offers mechanisms to review curricula and identify connections across modules. In addition, the maps highlight gaps in provision, duplication and issues around consistency of terminology. Staff are able to see how a student may perceive and engage with their curriculum, monitor access and equality of learning opportunities, and consider the alignment of teaching, learning and assessment. They will also be able to identify which resources emerge as popular, offering a glimpse into learning that may happen outside the formal curriculum.

At Newcastle, thanks to interest at strategic level, the team are planning to extend DLM to geography and dentistry. It will be very interesting to see how well it transfers to other subject areas. There are also quite a few challenges in transferring the model to other institutions, although the team have received expressions of interest: the extent of customisation required takes commitment and considerable technical support. A public demonstrator and software download are available via the project website and, thanks to the use of open standards, other institutions could potentially take this forward if they too are prepared to develop a shared understanding of curriculum mapping and take the time to share data across their systems.

This excerpt from the project final report nicely sums up the potential of DLM – it is definitely a report worth reading and I hope other institutions consider a similar model or take some of the steps towards making it work within their own context.

DLM is a flexible and interactive tool, which can be readily aligned with an institution’s Teaching and Learning Strategy, whilst at the same time supporting a diverse range of specific programme requirements. It can be used to increase transparency in the curriculum in an interactive and participative way that more closely matches the changing experience and expectation of many modern learners. It also has potential to help address sector-wide drivers for PDP, employability, greater personalisation and student involvement in the curriculum. DLM final report


Range of outputs

  • https://learning-maps.ncl.ac.uk/docs/
  • Demonstration version of Dynamic Learning Maps (needs registration)
  • Project final report

Curriculum Delivery: Let’s Get Real

In the first of two posts on the Transforming Curriculum Delivery Through Technology programme Lou McGill discusses how a number of projects used technologies to recreate “real” learning experiences. This post first appeared on Lou’s blog in April this year.


Following on from my post about the final synthesis report of the Transforming Curriculum Delivery Through Technology programme, I thought it might be useful to focus on a few of the key themes to emerge from the programme in more detail.

One of the aspects that I found most interesting was the number of projects that were using technologies to support authentic, situated learning experiences. In the past I worked at the University of Strathclyde on the DIDET project, part of the JISC-funded Digital Libraries in the Classroom programme, which used a wiki and a variety of technologies to support design engineering students during the design process. What we aimed to do was provide the technologies to replicate a global product design experience (with our partner Stanford University in the US) where students created, managed and shared design artifacts. I think this project was very forward thinking as it started in 2003, when wikis were not widely used in higher education contexts – so definitely worth a plug here ;)

I was particularly interested, then, that two of the Transforming Curriculum Delivery projects focussed on design students. A common need for design students across a range of disciplines is to experience the reality of working collaboratively in teams, to tight deadlines, following a fairly well-established design process. The Atelier-D project, based at the Open University, aimed to replicate a traditional atelier-style environment in which distance learning students could learn collaboratively with their tutors and other students. They used a range of technologies including Flickr, Facebook, video conferencing, concept mapping, Second Life and social networking sites. The project faced significant challenges in implementing sometimes complex technologies with distance learners, who faced problems with both access and usability of services where the base technical requirements and learning curve for new users can be high.

Also focussed on design students was the Information Spaces for Creative Conversations project, led by the Middlesex University Interaction Design Centre and partnered by the Centre for HCI Design at City University London. This team wanted to make sure that technologies supported creative conversations between design students rather than distracting from them. They also used technologies to help students record and conserve these conversations for later reflection and, like the DIDET project, utilised technologies to manage the range of artifacts that emerge during this part of the design process – such as sketches, photographs, recorded conversations, and later reflections that inform the next stages of design work.

Other projects offering authentic professional or work-based experiences included Generation 4 at St George’s University of London which developed interactive Online Virtual Patient cases with options and consequences. Students work in groups on a virtual patient problem where they could see the impact of their decisions, without damaging a real person. The Duckling project at the University of Leicester was also focussed on distance learning students and they utilised and adapted an existing oil rig in Second Life for occupational psychology students to use.

The MoRSE project led by Kingston University and De Montfort University worked with two different groups of learners using mobile technologies to provide a practice based curricula.

‘The delivery of a situated curriculum for students working beyond the institution in practice based environments is critical along with the ability to be active contributors in real world problem solving. The ability of both institutional and personal technologies to effectively and appropriately enhance this situated curriculum and experience is crucial. For example fieldwork experience in real problem environments for students has been crucial to student understanding to all aspects of real world scenarios from the collection of primary data through its processing, interpretation and analysis to the completion of an output. This experience can be lessened through the student having to split work on a project between the field and institutional laboratories because of time and access to technologies and resources. In addition basic data processing tasks can take a significant period of limited fieldtrip time that could otherwise be spent on analysis and interpretation, and increases the time between data collection and its analysis.’ (MoRSE final report)

This type of approach required quite a lot of learner support, and MoRSE used student mentors, which provided useful experience for the mentors' CVs and portfolios as well as low-cost field and placement support.

The Springboard TV project at West Anglia College focused on giving learners the opportunity to use state-of-the-art technologies to develop their own internet TV station.

“Creating an identity and branding has been a very powerful agent in developing a ‘learner centred approach’, where learners now respond as professionals, working in a ‘real life’ production company “
Jayne Walpole, Head of Faculty, Creative Arts, Springboard TV

Wikis were used by the INTEGRATE project at the University of Exeter to provide an authentic international group experience for a very large cohort (465 students from 40 countries) to stimulate international co-operation and international management skills. As well as providing an opportunity to practice a professional role it also provided a small group setting where students with a wide range of language and cultural differences could support each other, creating a collegiate environment and culture.

These brief descriptions are just snippets that are more fully explained in the Design Studio and project websites and reports. I think several of these approaches and activities should be of interest to others trying to create authentic learning experiences.

XCRI-CAP – now is the time

In her third post on Curriculum Design, Lou McGill reflects on the challenges and opportunities surrounding the effective use of course data in institutions.


JISC have recently released a call entitled ‘Course Data: Making the most of Course Information’. This is a different style of call which offers funding for a review and planning stage, during which institutions will develop an implementation plan to improve course data flows within the institution as well as producing feeds for external agencies. The second phase will see some of those institutions funded to take the implementation plan forward. JISC are hoping to fund a range of examples using different kinds of courses – online, postgraduate, distance and CPD courses so we should learn a lot from programme activities. A national XCRI showcase was held in June 2011 and highlighted some really useful exemplars. These are detailed on the JISC Benefits Realisation blog post which also documents some interesting discussions.

The call nicely reflects an increased interest in the role of course information across institutional processes and systems, as the post-16 education sector prepares for increasing demands on course data from both students and from government agencies requiring increased transparency from publicly funded bodies. As I mentioned in my last post, HEFCE requirements for institutions to provide KIS (Key Information Sets) for all courses from September 2012, and to feed into the HEAR (Higher Education Achievement Report) recording student achievement, mean that institutions need to collate, manage and provide consistent and complete data. These drivers provide the impetus for institutions to finally embrace and take forward the XCRI specification (Exchanging Course Related Information), which, up to now, has not been taken up widely in institution-wide contexts. This new impetus, and the results of ground-building work done by pioneer individuals and institutions, means that there is now an excellent infrastructure of supporting information and knowledge to move forward.

Lisa Corley from CETIS has written an informative overview blog post which charts the development of XCRI, recognises the work of these pioneers and provides a very useful description of what it is and what benefits it offers to institutions. This, coupled with the excellent XCRI Knowledge Base should provide anyone interested in the call with the basic information to take this forward. Scott Wilson from CETIS has also written a more technically focussed blog post entitled XCRI – the end of the beginning.

One of the most useful things for those about to embark on this process is what they can learn from people and institutions that have already been through it – they can highlight challenges, pitfalls and good practice, and also illustrate benefits. The latter is particularly useful with reluctant stakeholders who may need convincing. This post focuses on the work of projects involved in the Institutional Approaches to Curriculum Design programme. Two earlier posts describe the business process approaches adopted by projects and looked in detail at course information.

Sheila McNeill, from CETIS has been working closely with these projects and produced a blog post in April 2011 which provided some excellent visual representations of the technologies being used by them. This wordle, reproduced from that post, illustrates just how significant XCRI is to these projects.

 


Wordle of techs & standards used in Curriculum Design Prog, April 11

However, as Sheila points out, ‘we are still some way off all 12 projects actually implementing the specification. From our discussions with the projects, there isn’t really a specific reason for them not implementing XCRI, it’s more that it isn’t a priority for them at the moment.’

This reflects what I was saying above, although some notable exceptions are the Supporting Responsive Curricula (SRC), Predict and Co-educate projects, which have engaged significantly with XCRI implementation and development. Early conversations among projects highlighted some shortcomings in the specification, which also reflected a wider community concern that XCRI-CAP (the Course Advertising Profile) concentrated on marketing elements and did not support pedagogical information. The recognition of the CAP profile in the European Metadata for Learning Opportunities (MLO) standard in 2011 is a major step towards consolidating XCRI's place in the wider course information landscape. Publishing course information in the standard format means that it can be found and aggregated by services such as UCAS, and offers potential for collation in a range of ways.

Although appearing to focus on a fairly narrow aspect of course information (advertising and marketing), the elements that make up XCRI-CAP are central to a much wider range of institutional processes and systems that require accurate and up-to-date course data. This links to wider course information, inputs into institutional systems such as VLEs, and can be connected to student data. The notion of having one accurate, definitive source of data should appeal to many stakeholders in an institution – fundamental for administrators and marketing staff, supporting decision making for senior managers, easing the burden for teaching staff and better informing students – but also to people outside the institution, offering clarity for prospective students, employers and other interested agencies as well as fulfilling requirements from funders. The implementation process should highlight the different elements of course information and how they connect. It should also help institutions articulate which information is relevant for which stakeholder.

Implementing XCRI-CAP
We learned from the JISC XCRI mini projects (2007-2009) that there are no major technical difficulties in implementing the specification, but as Sheila says in her blog post ‘As with many education specific standards/specifications, unless there is a very big carrot (or stick) widespread adoption and uptake is sporadic however logical the argument for using the spec/standard is.’

So if the KIS and HEAR requirements represent the stick, then I think the outcomes and outputs from the Institutional Approaches to Curriculum Design programme illustrate the carrot – the rewards for taking on this challenge. I describe it as a challenge not for technical reasons, but because it relates back to issues discussed in my first two posts – the challenge and the benefits that come from having institution-wide conversations. It is time consuming and demanding for institutions to take a ‘big picture’ view of the many processes that link together, to rethink some of these processes and to articulate where they all connect and which data is central to this. However, the benefit of this approach has been strongly emphasised by all of the project staff that I have spoken to. In early stages, projects typically found a lack of articulation between course review, approval, advertising and enrolment/reporting, and between quality assurance, marketing and student records.

Whilst these projects have a focus on curriculum design processes all have had to take a broad view of whole institutional processes involving course information and student data. Many of the projects worked in parallel with other institution-wide initiatives (such as the University of Bolton Co-Educate project which linked to the development of a module database) reflecting the breadth of scale of their activities. It is hard to tease out the benefits of implementing XCRI-CAP from the benefits of those wider scale activities, because they naturally augment each other. Benefits include:

  • Increased understanding across the institution of how processes connect and how the data and systems facilitate or hinder these processes.
  • Improved efficiencies – such as less duplication of data, time savings, one accurate source of data that feeds into several systems, less paperwork.
  • Transparency of information for registered students, prospective students and external agencies (e.g. government bodies and employers), which has the potential to increase student intake and enhance the experience of students once they register with the course(s).
  • Automatic feeds to comply with funder requirements.

 

There is a consensus that implementing XCRI-CAP is fairly straightforward – once the data is in a database it is fairly uncomplicated to maintain – but when institutions try to broaden this to develop a definitive set of course information, linked to key processes such as quality control or curriculum design activities, then it becomes much more challenging. The Institutional Approaches to Curriculum Design projects have been documenting their experiences and producing some really useful outputs that should be of interest to the wider community. There is a particularly well written report from the University of Bolton Module database project which describes how they took experience from a JISC mini XCRI project and the Co-Educate curriculum design project to redesign and implement their module database.
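To illustrate why the "data in a database" step makes feed generation the easy part, here is a minimal sketch that renders course rows as simplified XCRI-CAP-style XML. The element names are simplified stand-ins chosen for readability; a real feed must follow the published XCRI-CAP schema and namespaces.

```python
# Simplified sketch: once course records live in one definitive database,
# rendering an advertising feed is mostly mechanical. Element names are
# illustrative stand-ins, not the real XCRI-CAP vocabulary.
import xml.etree.ElementTree as ET

courses = [  # stand-in for rows from the definitive course database
    {"id": "MOD101", "title": "Introduction to Design", "credits": "20"},
    {"id": "MOD205", "title": "Interaction Design", "credits": "20"},
]

catalog = ET.Element("catalog")
for row in courses:
    course = ET.SubElement(catalog, "course")
    ET.SubElement(course, "identifier").text = row["id"]
    ET.SubElement(course, "title").text = row["title"]
    ET.SubElement(course, "credit").text = row["credits"]

feed = ET.tostring(catalog, encoding="unicode")
```

The hard work, as the projects found, is upstream of this loop: agreeing the definitive source, its governance and its links to quality and curriculum design processes, so that the rows being serialised are actually accurate and complete.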

‘The resulting system supports the capture of information from course inception, the development of modules, through the validation process, to approved and published status. A database application has been implemented with the functionality to support collaborative development of courses and manage the version control for module specifications subjected to minor modification. The system provides the capability to assemble pre-validated modules into new courses using the University’s pre-validated IDIBL course framework. The use of XCRI-CAP 1.1 to render module specifications in standards-based XML enables module details to be accessed and reused without having to use the database application. This opens up new possibilities for the reuse of module information. The University’s JISC funded Co-educate curriculum design project will be developing further tools for collaborative curriculum specification and design that will now use the XCRI capability.’

The report is really worth reading and they describe their approach and highlight the lessons learned.

The SRC project at Manchester Metropolitan University ran alongside an institutional initiative to Enhance the Quality of Assessment for Learning (EQAL) which is introducing a new curriculum framework, new administrative systems and processes, revised quality assurance processes and new learning systems to transform the student experience. The SRC project has been led by Professor Mark Stubbs, Managed Learning Environment Project Director who has been affectionately described as ‘The Godfather of XCRI’. Mark talks eloquently in a recent presentation on the origins of XCRI. In the video Mark re-iterates the fact that the technology behind the standard is not complex and describes how the Curriculum Delivery and Design programmes have highlighted the business process challenges that need to be worked through to ensure that it is possible on an institution-wide scale.

The project has produced some excellent resources which map and describe their journey, and some of these have recently been added to the JISC Design Studio. One of these is a game called Accreditation!, a training resource for those trying to encourage stakeholder engagement when embarking on a major change process involving programme design and approval.

Screen shot of Accreditation board game

They have also produced a case study outlining academic database stakeholder requirements which includes some useful visual representations of their processes.

So the consensus is that ‘now is the time to embrace XCRI’, and the JISC call presents a really great opportunity to get started. The first phase simply requires a Letter of Commitment from eligible institutions, providing evidence of support from senior managers responsible for teaching and learning, marketing, management information systems/IT and the institutional course websites, by 12:00 noon UK time on Wednesday 7 September 2011. There is an Elluminate recording of the live briefing session in case you missed it, and lots of information described here to convince these various stakeholders of the benefits.