Relationship Management infoKit now live!

On Valentine's Day, Andy Stewart from JISC Infonet launched the new Relationship Management infoKit. The infoKits are excellent practical resources which highlight the lessons learned and resources from many large-scale Jisc programmes and make these both accessible and digestible. The Infonet team are great at developing and managing these really valuable assets.

This new infoKit describes some of the challenges faced by institutions when seeking to improve and maintain relationships with a range of different stakeholders, all of whom have different needs and expectations.

It highlights the different approaches that institutions can adopt and the kinds of infrastructure and cultural change that are needed to facilitate and support sustainable relationships. This infoKit provides an insight into some of the emerging technologies and professional practices explored as part of Jisc’s Relationship Management programme.

I contributed to the writing of the Relationship Management infoKit but the main credit for this great resource really belongs with other people. Sharon Perry, as a member of the CETIS team at Bolton, provided ongoing support and synthesis for the programme – her blog really offered a flavour of key lessons as they emerged. I worked with Sharon as a consultant on the final synthesis, with a focus on the alumni engagement strand. As with any synthesis work, the main challenge was in taking the extensive work and evaluation outcomes from several project teams and bringing them together into a cohesive format. We worked closely with two Jisc Programme Managers, Simon Whittemore and Myles Danson, who brought the overarching vision and passion to the Programme.

I really would recommend that you take a look at the work done by the programme. In these changing and challenging times successful collaboration is key to ensuring that educational institutions develop sustainable and mutually beneficial relationships. As Sir Tim Wilson says in his foreword to the infoKit:

This resource recognises the importance of drivers and motivations for institutions in this area, and illustrates how partnership management can be supported through institutional infrastructure and technologies, and managed through communication and networking approaches. The resource addresses common barriers and constraints and proposes approaches to avoid these pitfalls, based on institutional experiences. It provides approaches through which to derive the key benefits of enhanced student employability, engaged and supportive alumni, and professional service design and monitoring. However, for sustainable business value to be derived from these partnerships for the institution, agreed strategic priorities and supporting policies which bring together information/data, departments and stakeholders are necessary.

I learned a lot from doing this work and really hope that alumni engagement becomes much more than a series of fundraising activities. The potential goes much further than that – involving alumni in supporting existing students has far-reaching positive consequences for both of these groups, for the wider community and for the institutions. Several other Jisc programmes are illustrating some great student partnerships (UKOER and DIGILIT) and this is one way to capitalise on these and take them forward outside the institution.

If you don’t believe me then go and check out the infoKit.

Lou McGill

http://loumcgill.co.uk

eXchanging course related information – XCRI timeline

One of the advantages of having been involved with JISC for a number of years (as a project and a service) is the opportunity to reflect on some activities that we’ve been involved in for some time. We thought it would be interesting to take the long view of some of our involvement with OER, XCRI and Learning Environments and reflect on what has worked and why, and where we think these activities are going next.

In this second story Lou McGill traces the history of the eXchanging Course Related Information (XCRI) specification, which is currently becoming a European and British standard (See Adam Cooper’s recent post). XCRI is a fine example of how the right people working together can develop interoperability standards that are truly fit for purpose.

A CETIS perspective of the XCRI story

Scott Wilson, Assistant Director at CETIS, describes XCRI as being ‘a community that formed to develop specifications related to course information’ (Wilson, 2010 [1]). This really captures the central aspect of the XCRI story: the community that came together, with the support of CETIS and JISC funding, to address a significant problem for institutions in managing their course information. Universities and colleges produce and utilise course information for several different purposes, and duplication is a costly problem. The XCRI community drove forward the development of a shared vocabulary and domain map in the early days, which ultimately led to the development of an internationally recognised specification. Their focus was on developing specifications to make generic aspects of a course description publicly available so that they can be transferred easily between information systems. The formal outcome of this work is XCRI-CAP (Course Advertising Profile) [2]. Lisa Corley from CETIS has written a blog post [3] which charts the development of XCRI, recognises the work of these early pioneers and provides a very useful description of what it is and what benefits it offers to institutions. There is also extensive information available in the very well maintained XCRI Knowledge Base [4]. This community is still very active and now has fresh impetus from recent national HEFCE initiatives that require improved data exchange and transparency of institutional systems [5], [6].

The XCRI and JISC CETIS timeline [7] has been developed to highlight the various activities that make up the XCRI landscape and includes JISC and CETIS activities since 2003. It also highlights some wider national and international initiatives which illustrate trends and changes in the last decade in this part of the Enterprise domain.


How we got here

The XCRI community emerged from the Enterprise SIG [8], a CETIS Special Interest Group established in 2003 that focussed on a range of standards, technologies and activities to facilitate the business processes of educational institutions. The Enterprise SIG was, essentially, a community of practice for people interested in, or involved with:

• the IMS Enterprise Specification and Enterprise Services Specification

• exchanging data about students and courses between educational systems (VLEs, Student Records, etc)

• joining up college systems to create Managed Learning Environments

• e-learning frameworks, architecture and web services

At the 2004 CETIS Conference the SIG identified a need for both a cohesive approach to standardising course descriptions and an agreed vocabulary, and in March 2005 the Enterprise SIG XCRI sub-group was formed. This group, led by Professor Mark Stubbs and Alan Paull, with the support of Scott Wilson from CETIS, became the XCRI community and has driven developments in this area in the UK since that date. In 2005 JISC funded XCRI as one of their Reference Model projects [9] to define a vocabulary and appropriate (XML) technology bindings for describing course-related information. During this period XCRI produced an R1.0 schema, a repository demonstrator and a survey of 161 prospectus websites. This work happened alongside another JISC Reference Model project – COVARM (Course Validation Reference Model) [10]. Both of these projects substantially mapped their respective domains, and their outputs fed into the eLearning Framework [11].

XCRI originally intended to ‘bridge the worlds of course marketing and course validation/quality assurance’ but, as Mark Stubbs describes [12], this became unwieldy:

“producing a course definition for validation/modification involves assembling fragments of information into a whole, whereas marketing a course involves communicating a serialized version of a subset of that assembly”

Feedback from the community, after testing the R1.0 schema in different contexts, led to a focus on a limited set of elements that supported course advertising, and by 2006 XCRI-CAP was released as an XML specification. This was a very pragmatic outcome: it presented the community with something that responded to their needs and offered a usable schema for JISC to take forward with the wider HE and FE communities.
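
To make that concrete, here is a minimal sketch, in Python, of what publishing a course advert in an XCRI-CAP style XML format might look like. The element names (provider, course, presentation) follow the vocabulary described above, but the namespace, structure and course details are illustrative rather than a faithful rendering of the published schema.

    # A minimal, illustrative XCRI-CAP style course advert generator.
    # The namespace and all course details below are placeholders.
    import xml.etree.ElementTree as ET

    NS = "http://xcri.org/profiles/catalog"  # assumed namespace for this sketch
    ET.register_namespace("", NS)

    catalog = ET.Element(f"{{{NS}}}catalog")
    provider = ET.SubElement(catalog, f"{{{NS}}}provider")
    ET.SubElement(provider, f"{{{NS}}}title").text = "Example University"

    course = ET.SubElement(provider, f"{{{NS}}}course")
    ET.SubElement(course, f"{{{NS}}}title").text = "BSc (Hons) Computing"
    ET.SubElement(course, f"{{{NS}}}description").text = (
        "A three-year undergraduate programme covering software development."
    )
    # A presentation is one advertised run of the course (dates, venue, etc.)
    presentation = ET.SubElement(course, f"{{{NS}}}presentation")
    ET.SubElement(presentation, f"{{{NS}}}start").text = "2012-09-24"

    print(ET.tostring(catalog, encoding="unicode"))

The point of the design is that generic advertising elements like these can be generated once from an institutional course database and consumed by any aggregator that understands the profile.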

In 2007 JISC funded a range of projects [13] as part of the eLearning Capital programme around Course Management to build on the work of the XCRI and COVARM reference projects. Within the course description and advertising strand a number of institutions specifically aimed to trial and refine XCRI-CAP. The resulting case studies from these projects [14] offer really valuable insight into the challenges and approaches that different types of institutions might encounter when implementing the specification, and offer perspectives from a range of stakeholders such as policy makers, managers, administrators and technical staff.

JISC also funded a support project made up of members from CETIS, Manchester Metropolitan University and Kainao Ltd, who had been involved in the original XCRI sub-group. This project had a remit to further develop the specification, provide technical support to projects implementing XCRI-CAP, provide a prototype online aggregation service, and promote the specification towards submission to an appropriate open standards process. The support project continued the work of the XCRI reference project and proved so effective that its funding was extended until March 2011. Mark Power from CETIS describes [15] the various activities and achievements of this team and notes that this, and the work of the JISC funded projects, demonstrated the value of XCRI-CAP so successfully that it was placed on the strategic agenda of national agencies.

In 2008 the XCRI Support Project team engaged with other European initiatives in course information through the European Committee for Standardization (CEN) Workshop on Learning Technologies. Following this, CEN endorsed a Workshop Agreement for Metadata for Learning Opportunities (MLO), which defines a model for expressing information about learning opportunities, including course information. MLO includes XCRI-CAP from the UK as well as other European specifications, and is an attempt to unify these by offering a common subset whilst still enabling local extensions and implementation architectures. This was ratified as a European Norm (EN 15982) in 2009 and published in 2011.

Scott Wilson wrote in 2010 [16]:

The formal standard defines the majority of the core concepts and information model used by XCRI. The engagement of XCRI in CEN standards development has provided an opportunity and an impetus for the XCRI community to progress to formal standardization. The current roadmap of XCRI is to develop a British Standard for its course syndication format as a conforming binding and application profile of CEN Metadata For Learning Opportunities: Advertising.

In 2009 XCRI-CAP 1.1 [17] was approved by the Information Standards Board for Education, Skills and Children’s Services (ISB) as the UK eProspectus standard, and on 1st March 2012 BS 8581 XCRI-CAP was released for public comment, a step towards a British Standard that is consistent with the European MLO-Advertising standard (EN 15982).

So far XCRI-CAP has enabled several institutions to transform practice around producing course information, especially for prospectuses, with reports of huge reductions in data duplication [18]. In 2009 the ISB estimated that a new standard for course information (XCRI-CAP) could save the sector in the region of £6 million per annum by removing the need to re-enter data into course handbooks and websites.

Where we are now…

Scott Wilson from CETIS wrote a blog post in June 2011 entitled XCRI – the end of the beginning [19]. In this Scott notes a shift in the timeline of XCRI – taking us from a period of designing the specification and beta testing into ‘adoption, use and adaption’. This is a significant achievement for those involved in mapping and defining the terrain and testing out the specification across institutional systems. The community now has working exemplars which not only deliver proof of economies of scale, through reduced duplication of data, but also articulate the value of re-thinking a range of business processes affected by course information. It has long been recognised that the barriers for institutions in adopting XCRI-CAP lie not in the technical complexities of the schema but in the challenges of managing their data and processes to support course information, many of which involve several incompatible systems.

A range of current national drivers require educational institutions to consider how they manage and surface some of their information sets, and those institutions that have engaged with XCRI-CAP are likely to find it easier to respond to some of these requirements. In early 2011 a report to HEFCE from the Online Learning Task Force [20] highlighted the challenges that students face due to insufficient, and hard to find, information about courses not dealt with by UCAS. As part of the Government Transparency agenda, educational institutions are being required to provide KIS (Key Information Sets) [21] for the majority of undergraduate courses from September 2012 and to feed into the HEAR (Higher Education Achievement Report) [22] recording student achievement. Each of these initiatives provides significant impetus for institutions to consider how their course information is managed and how it links to other processes, and implementing XCRI-CAP can be a valuable way to do this [23]. Towards the end of 2011 JISC launched a new programme called Course Data: making the most of Course Information [24]. The programme has two stages: the first, which ran from September to November 2011, gave 93 institutions £10k each to prepare an implementation plan to improve their course data flows and produce feeds for external agencies. 63 institutions were then selected for stage two to implement these plans; implementation began in January 2012 and will end in 2013. Outcomes and outputs of this programme are being synthesised on the XCRI Knowledge Base [25].

Where we are going…

Whilst the 2011 JISC programme will result in larger numbers of courses being advertised in XCRI-CAP format, Scott argues that we need to see it taken up by major course aggregation and brokerage services. This was one of the themes discussed at the XCRI eXchange [26] in 2011, a national showcase for XCRI organised by the SAMSON [27] and MUSKET [28] projects, funded under the JISC Institutional Innovation Programme. Scott concludes his blog post with a suggestion that establishing an alliance could be the key to encouraging high-profile ownership and promotion of XCRI.

I think it would have to have the major aggregators on board (UCAS, Hotcourses), plus curriculum management solution providers (Unit4, Akari) and larger learning providers (e.g. the Open University, University of Manchester, MMU, Nottingham) as well as some of the smaller tools and services companies that are already working with XCRI (APS, IGSL, Smartways). It would develop the brand identity under which XCRI-CAP adoption would be recognised (not necessarily retaining anything of the original brand) and promote it beyond the reach of funded programmes into large-scale use.

Is this the future for XCRI?

I believe an alliance like this would be a fitting development in the story of XCRI – a community-driven specification gaining ongoing support and recognition from key stakeholders. It would be a testament to the XCRI community and their achievements over the last decade.


About Lou

Lou McGill is currently working independently and has recently been involved in synthesis and evaluation activities for the HE Academy/JISC UKOER programme and the JISC Transforming Curriculum Delivery Through Technology programme. She led the team that produced the Good Intentions report on business cases for sharing and worked on the LLiDA study (Learning Literacies in a Digital Age). She has experience of working in a range of HE institutions as a librarian, learning technologist and project manager and used to be a JISC Programme Manager on the eLearning team. In the distant past she worked for NIACE (the adult learning organisation) and in health education for the NHS. Her interests and experience include digital literacy, information literacy, open education, distance learning, managing organisational change, and effective use of technologies to support learning. Further information on Lou’s work can be found at: http://loumcgill.co.uk

Open Educational Resources timeline

A CETIS perspective of the OER story

One of the advantages of having been involved with JISC for a number of years (as a project and a service) is the opportunity to reflect on some activities that we’ve been involved in for some time. We thought it would be interesting to take the long view of some of our involvement with OER, XCRI and Learning Environments and reflect on what has worked and why, and where we think these activities are going next.

This first story looks at the development of the Open Educational Resources area. Lou McGill talked to Phil Barker and Lorna Campbell about how the OER field has evolved in the last ten years.

Open Educational Resources

CETIS has been engaged with technical aspects of educational resource management and use since its early days. This includes contributing to the development and implementation of standards and technologies for creating and making learning resources discoverable [1], managing and sharing learning resources [2] and technologies for learning design [3]. Each of these areas has a rich history of activities and technological development. In 2008, following on from involvement in the areas of open access to research [4] and open technologies [5], CETIS highlighted the potential of open content for learning and teaching and how this might challenge and transform approaches to educational practice [6]. The term Open Educational Resources (OERs) emerged in the early 2000s and can be understood broadly to mean “digitised materials offered freely and openly for educators, students and self-learners to use and reuse for teaching, learning and research” (OECD, 2007). Since then the UK has seen considerable activity relating to OERs and more recently in open educational practices (OEP). The CETIS wiki [7] has a section dedicated to the technical aspects of OERs providing background information and links to further resources.

How we got here…

JISC funded OER activities in the UK have been shaped by the history of learning resources in national Further and Higher education contexts and by what we have learned from a number of development programmes. The OER and JISC CETIS timeline [8] has been developed to illustrate the various activities that make up the learning resources and OER landscape and includes JISC and CETIS activities since 2002. It also highlights some wider national and international initiatives which illustrate trends and changes in the last decade.


JISC CETIS and OER timeline in tikitoki

Historical perspective in the UK

Cultural aspects and practices around sharing learning resources have been a particular focus for several UK studies [9], [10], [11] and continue to be a focus for UKOER activities. JISC has funded a number of programmes since 2002 to investigate issues around developing, managing and accessing learning resources, and these have surfaced issues relating to institutional policies and practices, business models, teaching practices, legal issues and technical aspects. In addition to providing technical and strategic support and guidance to the programmes, CETIS contributes to scoping the technical requirements and reports on wider trends across the decade.

JISC funded a large scale programme in the early 2000s called eXchange for Learning (X4L) (2002-2006) [12] to support the development, repurposing and sharing of learning resources. This ran in parallel to the establishment of a national learning resources repository, Jorum [13]. Following on from X4L, the RePRODUCE Programme (2008-2009) [14] focussed on developing courses using repurposed and re-usable learning resources. In the mid to late 2000s JISC funded two other programmes focussed on establishing technical infrastructure within institutions and across the sector – Digital Repositories (2005-2007) [15] and Repositories and Preservation (2006-2009) [16].

These programmes were informed by a strategic and technical vision which was expressed through initiatives including the e-Learning Framework [17], the e-Framework [18], the Information Environment Technical Architecture [19] and the Digital Repositories Roadmap [20].

Projects in the X4L programme were required to explore the process of integrating interoperable learning objects with VLEs. A small number of tools projects were funded to facilitate this task: an assessment management system (TOIA), a content packaging tool (RELOAD) and a learning object repository (Jorum). Projects were given a strong steer to use interoperability standards such as IMS QTI, IMS Content Packaging and IEEE LOM. A mandatory application profile of the IEEE LOM was developed for the programme, and formal subject classification vocabularies were identified, including JACS and Dewey. Projects were strongly recommended to deposit their content in the Jorum repository, and institutions were required to sign formal licence agreements before doing so. Access to content deposited in Jorum was restricted to UK F/HE institutions only.
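
As a rough illustration of the kind of record projects were producing, the Python sketch below generates a minimal IEEE LOM-style entry with a title and a formal subject classification. It follows the general shape of the LOM information model (general/title, classification/taxonPath) but is not the actual X4L application profile, and the JACS code shown is only an example.

    # Illustrative IEEE LOM-style metadata record; not the X4L profile itself.
    import xml.etree.ElementTree as ET

    NS = "http://ltsc.ieee.org/xsd/LOM"
    ET.register_namespace("", NS)
    lom = ET.Element(f"{{{NS}}}lom")

    # general/title holds a language-tagged string, per the LOM model
    general = ET.SubElement(lom, f"{{{NS}}}general")
    title = ET.SubElement(general, f"{{{NS}}}title")
    ET.SubElement(title, f"{{{NS}}}string", {"language": "en"}).text = (
        "Introduction to Circuit Analysis"
    )

    # Formal subject classification, e.g. a JACS code as mentioned above
    classification = ET.SubElement(lom, f"{{{NS}}}classification")
    taxon_path = ET.SubElement(classification, f"{{{NS}}}taxonPath")
    taxon = ET.SubElement(taxon_path, f"{{{NS}}}taxon")
    ET.SubElement(taxon, f"{{{NS}}}id").text = "H610"  # example JACS code

    print(ET.tostring(lom, encoding="unicode"))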

These conditions meant that projects required significant support to engage with and implement the various standards, and invested considerable time on these elements. Depositing learning and teaching materials in formal repositories raised very different issues to those of depositing research outputs, as these materials came in a wide range of formats and levels of granularity, and sometimes incorporated accompanying pedagogical guidance. A particular focus for projects and the wider community at this time was the debate about how the granularity of learning resources impacted on aggregation/disaggregation, and how this affected flexibility and reuse. The X4L review report highlighted the fact that repurposing and reuse are affected by much more than granularity:

“Effectively ease of use, improving the learning experience, and improving design are all interrelated, and all will be underpinned by an understanding of who will actually engage in repurposing (or reuse) and why.” [21]

Several X4L projects encountered problems with resources that incorporated items with various original licences, and highlighted the fact that teachers had often not previously acknowledged ownership of content they had used, or understood the need to do so. The JISC programmes in the mid 2000s did much to challenge the perception within the community that licensing and copyright were overly complex, but did little to generate positive attitudes towards this and remove barriers.

“the licences used, and hence the access authorization policy for the JORUM repository, focussed more on restricting access and use than on permitting it.” Phil Barker

CETIS and another JISC Innovation Support Centre, UKOLN [22], were funded to run the Repositories Research Team, as part of the JISC Digital Repositories and Repositories and Preservation Programmes. The remit of this team included: helping projects find and exploit synergies across the programme and beyond, gathering scenarios and use cases from projects, liaising with other national and international repositories activities, including liaison with the e-Framework, synthesizing project and programme outcomes, and engaging with interoperability standards activity and repository architectures. This team were able to draw together key messages from the programmes [23].

Whilst there was an increase in institutional repositories providing access to scholarly works at this time, there was less success in supporting and facilitating access to teaching and learning materials. One of the final conclusions of the Repositories and Preservation Advisory Group, which advised the JISC repositories programmes, was that teaching and learning resources had not been served well by the debate about institutional repositories seeking to cover both open access to research outputs and management of teaching and learning materials, as the issues relating to their use and management are fundamentally different [24].

The late Rachel Heery also commented that greater value may be derived from programmes that focus more on achieving strategic objectives (e.g. improving access to resources) and less on a specific technology to meet these objectives (e.g. repositories). An example of this kind of approach is the International JISC/NSF Digital Libraries in the Classroom Programme (2003-2008) [25] which investigated institutional, technical and social aspects to developing, sharing and managing content to support learning activities. Projects in this programme were led by academic departments and focussed on the strategic objectives of using the content with learners. Although specific technologies and standards were not mandated for this programme, projects brought together formal repositories, workflows, copyright, metadata issues and learning design with web 2.0 approaches, tagging, digital literacies and student content, all within real learning and teaching contexts.

Rather than a radical shift in policy, these conclusions should be regarded as reflecting a gradual development in policy, licensing and technology right across the web.

Wider context

The emergence of a highly networked ‘social web’ has impacted on how people find, create, manage, share and use content, for personal, professional and learning activities. This includes the advent of web 2.0, the appearance of media-specific dissemination platforms such as SlideShare, YouTube, Flickr and iTunesU, interaction through RESTful APIs, OpenID, OAuth and other web-wide technologies, and the increasing acceptance of Creative Commons licences. These services and changing practices are not always open and, it could be argued, openness is not always appropriate. So whilst the open web and the social web are not co-dependent, there is a move towards open social web approaches to learning and teaching, of which MOOCs (massive open online courses) are one example [26]. This is transforming how learners interact with educational content and is challenging traditional models of educational provision and scholarly activity. This affects institutional policies and strategies, particularly around technologies to support learning and teaching.

“As a result there has been a movement away from developing centralised education specific tools services and towards the integration of institutional systems with applications and services scattered across the web. Furthermore there has been growing awareness of the importance of the web itself as a technical architecture as opposed to a simple interface or delivery platform.” Lorna Campbell, Phil Barker and R. John Robertson

This has been reflected in recent JISC funded programmes where specific technologies and standards are not mandated and projects are encouraged to adopt technologies that suit their purpose and context.

Internationally, various models have emerged to release open content. These models are often shaped by how they have been funded and by the various, and sometimes quite different, motivations to release content as OERs. Community-based models offer sustainable approaches based on practice and resource sharing, whilst some educational institutions recognise the potential of OERs as marketing opportunities. Recent initiatives such as University of the People [27] and OER University [28] reflect both the fundamental aspiration of providing access to learning for students around the world and the need for educational institutions to find ways to respond to the changing needs of learners. Issues around accreditation and assessment in an open context are emerging as an important focus for the community. Activities have predominantly concentrated on releasing OERs, with less focus on how these are being used, or who is using them, although the increasing focus on open learning experiences, and the fact that there is now a significant corpus of OERs, is starting to change this. Technologies are increasingly being utilised to track use and feed relevant content to individuals, and this has been a focus of recent activities in UKOER projects.

Moving towards open in the UK

The launch of the Open University’s OpenLearn [29] and the University of Nottingham’s u-Now Open Educational Repository [30] in the mid 2000s marked the UK’s first formal steps with OERs, although individual academics and teachers were already experimenting with open technologies to make some of their content openly available [31]. Despite these and other international initiatives, the RePRODUCE Programme concluded that projects had significantly underestimated the difficulty of finding high quality teaching and learning materials that were suitable for copyright clearance and reuse. In 2008 JISC funded a research study [32] into the sharing of learning materials which provided a history of sharing and managing learning resources in the UK, described business models and benefits, and focussed on open and community sharing. This report concluded that open approaches to producing and making learning materials accessible were likely to have significant impact on the sharing and exchange of resources in both national and global contexts. Also in 2008 CETIS produced a briefing paper on OERs and held a scoping session at their annual conference. These reports fed into the development and scoping of the jointly funded HE Academy/JISC UKOER programme [33].

The UKOER Pilot Programme (2009-2010) involved a range of OER providers including individual educators, discipline-based consortia and institutions.

Given this diversity it was recognised from the outset that no single technical solution would fit all projects, and therefore no specific tools, descriptive standards, exchange or dissemination mechanisms were mandated (apart from a requirement that the resources produced be represented in a national repository of learning materials). [34]

CETIS has supported this diversity by encouraging discussions at meetings or through blog posts to identify which technical choices have been made by individual projects, recording these openly [35] and responding to issues as they arose.

Where we are now…

The UKOER programme has progressed through phase two into phase three, and Jorum has continued to be developed as an open national repository, with CETIS providing input on issues around bulk upload of resources and syndication. Projects have highlighted a range of technical issues relating to building collections, and providing rich descriptions, of learning resources. CETIS has been involved in supporting this by exploring issues around packaging, describing, tracking and aggregating resources. Project technical approaches have included RSS aggregation and techniques similar to podcasting; presentation of resources through novel interfaces such as timelines and maps using geolocation data; representation of relationships between resources; and cross-search, upload and metadata harvesting through the use of third-party host APIs. As well as supporting projects, CETIS provided opportunities to discuss these issues with the wider community at its annual conferences [36] and other events.
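
As a flavour of the RSS aggregation approach mentioned above, the following Python sketch merges OER records from several feeds into one list. The feed URLs are placeholders, and feedparser is simply one common library for this kind of task, not anything mandated by the programme.

    # A minimal feed-aggregation sketch: pull OER records from several
    # hosting platforms and merge them into one list for display.
    # Requires: pip install feedparser. The URLs below are placeholders.
    import feedparser

    FEEDS = [
        "https://example.ac.uk/oer/feed.rss",        # institutional repository
        "https://example.org/collections/oer.atom",  # subject collection
    ]

    resources = []
    for url in FEEDS:
        feed = feedparser.parse(url)  # handles both RSS and Atom
        for entry in feed.entries:
            resources.append({
                "title": entry.get("title", ""),
                "link": entry.get("link", ""),
                "source": url,
            })

    # Sort the merged list by title for a simple aggregated view
    for item in sorted(resources, key=lambda r: r["title"].lower()):
        print(item["title"], "-", item["link"])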

An important aspect of CETIS’s work is that of providing a unique space for technically focussed staff to have conversations across institutional boundaries and also offering opportunities for innovation and experimentation. A joint CETIS/UKOLN DevCSI OER Hack Day event proved to be highly productive and stimulating as it brought together software developers, project managers, academics, learning technologists, researchers and users to work in multi-disciplinary teams on ideas for developing tools and solutions to OER problems. Towards the end of 2011 CETIS commissioned two technical OER Mini-projects which adopted the rapid innovation funding model and aimed to encourage openness and innovation. This provides a contrast to longer-term, large-scale programmes [37].

The CETIS team blogs continue to provide an ongoing dialogue around technical issues, identifying emerging trends as well as providing programme level synthesis. There has been a value in taking a team approach to programme support as CETIS staff brought a wealth of experience from their involvement with the earlier JISC funded work around learning resources and repositories. CETIS staff who have been involved in this area include Lorna Campbell, Phil Barker, R. John Robertson, Li Yuan, Sheila MacNeill and Sarah Currier.

“Although we have seen a significant shift in focus from formal repository standards, protocols & procedures, learning objects and controlled access to repositories to lightweight web-wide specifications and social sharing platforms, there is still plenty to discuss regarding resource description, levels of openness, resource discovery, student content and quality” [38]. Phil Barker

Even during the relatively short timescale of the UKOER programme CETIS has seen projects choose a wide range of solutions to increase access to their OERs. During phases 1 and 2, projects released resources of varying levels of granularity, from individual images to whole courses. Many projects used multiple platforms [39] to host these, and some projects made their OERs available on a combination of national, institutional and subject repositories, social sites and content management systems. Projects were encouraged to use feeds to ensure that resources stored on different hosts were displayed on other sites, which is in marked contrast to the early X4L content and increases visibility of and access to the OERs. Projects were aware that the range of different potential users needed different levels of granularity, additional content, metadata and presentation methods. It is interesting to see projects take advantage of the affordances of formal repositories, such as more effective content management, version and licence control and metadata, and balance these with informal web-based approaches which appear to offer flexibility and choice, tagging and commenting, although repositories are increasingly offering similar functionality.

Where we are going…

CETIS will continue to work with the UK HE and FE community to encourage discussion, innovation and experimentation with technologies and standards to support OERs and open practices, and to feed this into broader national and international contexts. In 2010, in the US, the Learning Registry was established as an international open source technical system offering an alternative approach to learning resource discovery and sharing, and as a community for people sharing resources [40]. This initiative seems to exemplify how far the learning and teaching community has come in the last decade in terms of aspirations and approaches. It remains to be seen how successful the technical approaches will be, but Lorna Campbell, Assistant Director at CETIS, sees the potential…

If the Learning Registry is successful in creating a “light-weight learning resource information sharing network” it will be a major step forward in terms of facilitating access to the wealth of educational content that is scattered across the web. Lorna Campbell

CETIS advises JISC about the Learning Registry and also advises the JISC JLeRN project [41], the experimental node in the UK.

Codebashes, hack days and the annual CETIS conference provide spaces for JISC funded projects and the wider community to extend the conversations into new areas and continue innovation. The third phase of UKOER will continue to provide real-use studies of a range of different technologies and standards [42] and includes some interesting work with publishers. As the educational community worldwide focuses on open accreditation and assessment, and on digital literacies to produce and use OERs, the CETIS team is likely to have a role in drawing together different communities to cross boundaries and share knowledge and experience from other aspects of work.



Curriculum Delivery: Dynamic Learning Maps

In this second post Lou McGill focusses on the DLM project, which produced a dynamic map of the entire medical curriculum at Newcastle University.


The Dynamic Learning Maps (DLM) project at Newcastle University was funded by JISC as part of the Transforming Curriculum Delivery Through Technology programme. This programme saw a very diverse set of projects use technology to address a number of institutional challenges and ultimately transform their approaches to enhancing the student experience.

The DLM project aimed to make visible the complex medical ‘spiral curriculum’, where topics are revisited with increasing depth over a five-year programme, and to reveal connections across curricula to support modular learning. High-level goals included a desire to promote lifelong learning, enhance employability and enable personalisation for students. The diverse nature of stakeholders, each with unique views of the curricula, increased the complexity of the project and led to some interesting paths and choices as the team developed the maps. Simply getting agreement on what constitutes a curriculum map proved challenging and required a highly flexible approach and agile development. Like many technical development projects, DLM had to balance the need to develop a common understanding with the need to reflect different stakeholder requirements. Agreeing on what elements to include and the level of detail was important, as well as the fundamental issues around how the maps might be used by both staff and students.

The project stands out in this programme for a few reasons – not least that it has much in common with the ongoing Institutional Approaches to Curriculum Design programme, which has several projects engaged with mapping the curriculum. DLM has developed an interesting mix of formal curriculum maps, personal learning records and community-driven maps which have proved to be as useful for curriculum teams as for the medical, psychology and speech and language students they support. Another distinguishing feature of the project was that it had a more technical focus than other projects – engaging with neural networks, a range of institutional systems, data feeds and mashups, and e-portfolios, as well as a range of web 2.0 approaches.

A curriculum map from the DLM project

Curriculum maps
Curriculum maps are diagrammatic representations of the curriculum and can be seen as operational tools for staff to support course management and administration, or as aids to support curriculum design and delivery. They offer a window onto the curriculum and can include learning outcomes, content, assessment and information about people. They can be complex and are usually very labour-intensive to develop and embed into institutional processes. Making curriculum maps open to students gives them navigational aids through their courses, providing clarity around different modules, identifying connections between these and supporting module choices. Adding the capacity for students to personalise their pathways through the maps, link to individual portfolio records and add their own resources can turn curriculum maps into powerful learning aids. This is what makes the DLM project stand out for me.

It is challenging enough to visually represent or describe a curriculum, but using established standards and technologies to link data in a meaningful way takes things to another level as connections can be made with a range of institutional systems.

Institutional Systems linked in DLM project

DLM pulls in data from a variety of institutional systems including repositories, library and curriculum databases, and student information systems, as well as having a two-way transfer with e-portfolio systems. This means individuals can navigate the formal curriculum map and see links between elements, such as learning outcomes, timetables, etc. They can also add to the map in the form of personal reflections or notes and external resources, which can be rated and discussed. It is important to note that the project highlighted that existing curriculum data will not necessarily plug in easily to other systems. They identified a range of issues that had to be addressed, including governance, QA processes, and data migration and translation. This is one example described in the final report:
A separate tool was developed to manage programme-level learning outcomes and link these to units/modules; this provides a feed used in Learning Maps. This was necessary because existing data from an MBBS study guides database was in non-standardised text; this tool enables curriculum managers to map specific programme-level outcomes, but to still use the context-specific language of the module.
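
A rough sketch of the kind of mapping that tool manages is shown below: programme-level outcomes are held once, and each module links to them in its own context-specific wording. All of the identifiers, outcomes and phrasings here are invented for illustration and do not reflect the actual DLM data model.

    # Illustrative outcome-mapping sketch; all identifiers are hypothetical.
    PROGRAMME_OUTCOMES = {
        "PO1": "Apply clinical reasoning to patient assessment",
        "PO2": "Communicate effectively with patients and colleagues",
    }

    MODULE_LINKS = {
        "MBBS-Y2-CARD": [  # a hypothetical year-2 cardiology module
            {"outcome": "PO1", "local_text": "Interpret a 12-lead ECG"},
            {"outcome": "PO2", "local_text": "Explain a diagnosis to a patient"},
        ],
    }

    def feed_for_module(module_code):
        """Return outcome links for one module in a feed-friendly shape."""
        return [
            {
                "module": module_code,
                "programme_outcome": PROGRAMME_OUTCOMES[link["outcome"]],
                "module_language": link["local_text"],
            }
            for link in MODULE_LINKS.get(module_code, [])
        ]

    for row in feed_for_module("MBBS-Y2-CARD"):
        print(row)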

Web 2.0 mash-ups
Technically DLM is very interesting – the team built on previous work around personalisation and Web 2.0 approaches. The DLM concept was inspired by neural networks, where nodes can be connected to any other node in the network, and those connections vary in strength. They used open standards and existing corporate and programme data to support sustainability and transferability, and to reduce dependencies on specific systems.
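
To illustrate the neural-network analogy, here is a minimal Python sketch of a map in which any node can connect to any other node with a given strength. The node names and weights are invented; the real DLM model is considerably richer.

    # A toy weighted-graph model of a learning map, inspired by the
    # neural-network analogy above. Names and strengths are invented.
    from collections import defaultdict

    class LearningMap:
        def __init__(self):
            # adjacency map: node -> {connected node: connection strength}
            self.links = defaultdict(dict)

        def connect(self, a, b, strength=1.0):
            """Create (or reinforce) a bidirectional weighted connection."""
            self.links[a][b] = strength
            self.links[b][a] = strength

        def strongest_neighbours(self, node, n=3):
            """Nodes most strongly connected to `node`, strongest first."""
            neighbours = self.links[node].items()
            return sorted(neighbours, key=lambda kv: kv[1], reverse=True)[:n]

    dlm = LearningMap()
    dlm.connect("Cardiovascular system", "ECG interpretation", 0.9)
    dlm.connect("Cardiovascular system", "Pharmacology: beta blockers", 0.6)
    dlm.connect("ECG interpretation", "Clinical skills placement", 0.4)

    print(dlm.strongest_neighbours("Cardiovascular system"))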

The project team took good advantage of work being done within the institution as part of other initiatives. The e-portfolio element of DLM drew on work done with the Leap2A specification for e-portfolio portability and interoperability, as Newcastle University has been involved in developing the specification and in the JISC EPICS-2 project (e-Portfolios and PDP in the North East). Taking a web service approach, when learners add their reflections and notes to curriculum topics these generate an XML file which is stored remotely in their individual portfolio using Leap2A and becomes part of their portable portfolio record. This approach means that even if departments have different e-portfolio systems the standardised data can be stored in any of them. For more information on Leap2A see the recent article by Christina Smart from JISC CETIS.
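
The flow might look something like the sketch below: a learner's note is wrapped as an Atom entry (Leap2A builds on the Atom format) and posted to a remote portfolio service. The portfolio endpoint, the entry fields and the omission of Leap2A-specific elements are all simplifications for illustration, not the actual DLM or EPICS-2 implementation.

    # Simplified Leap2A-style flow: wrap a learner note as an Atom entry
    # and POST it to a portfolio service. The endpoint is hypothetical.
    import urllib.request
    import xml.etree.ElementTree as ET

    ATOM = "http://www.w3.org/2005/Atom"
    ET.register_namespace("", ATOM)

    entry = ET.Element(f"{{{ATOM}}}entry")
    ET.SubElement(entry, f"{{{ATOM}}}title").text = "Reflection: ECG interpretation"
    ET.SubElement(entry, f"{{{ATOM}}}updated").text = "2012-03-01T10:00:00Z"
    ET.SubElement(entry, f"{{{ATOM}}}content").text = (
        "Found the axis-deviation examples hard; revisit before the OSCE."
    )

    xml_payload = ET.tostring(entry, encoding="utf-8")

    # POST the entry to the learner's remote portfolio (fictional endpoint)
    req = urllib.request.Request(
        "https://portfolio.example.ac.uk/api/entries",
        data=xml_payload,
        headers={"Content-Type": "application/atom+xml"},
    )
    # urllib.request.urlopen(req)  # not executed here: the endpoint is fictional

Because the payload is standardised XML rather than a system-specific format, the same entry could be stored in any portfolio system that understands the specification, which is the portability point made above.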

DLM also benefitted from the experience of using XCRI-CAP from the mini project North East XCRI testbed (NEXT), which reported in April 2010. As an example of using XCRI feeds within a learning application, the NEXT team added support for DLM. By embedding XCRI feeds inside learning maps, course information relating to specific parts of the curriculum is revealed using RSS/Atom feeds: an HTTP request is sent to the course database and the HTTP response is rendered into DLM. This experience was of significant value to both projects. Other initiatives built on by DLM were the use of CAS authentication, an outcome of the JISC Iamsect single-sign-on project, and the use of corporate data flows from the JISC ID-Maps project.
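
In outline, that request/response flow could look like the Python sketch below, which fetches an XCRI-CAP document over HTTP and lists the advertised course titles for display against a map node. The URL and namespace are placeholders, and the real integration worked through RSS/Atom feeds rather than this simplified fetch.

    # Simplified fetch of an XCRI-CAP feed over HTTP; URL and namespace
    # are placeholders, not the actual DLM/NEXT endpoints.
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "https://courses.example.ac.uk/xcri-cap.xml"  # hypothetical
    NS = {"xcri": "http://xcri.org/profiles/catalog"}        # assumed namespace

    with urllib.request.urlopen(FEED_URL) as response:  # the HTTP request...
        tree = ET.parse(response)                       # ...and the response

    # Find every advertised course in the feed and list its title
    for course in tree.getroot().iter(f"{{{NS['xcri']}}}course"):
        title = course.find("xcri:title", NS)
        if title is not None:
            print(title.text)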


The project maintained a blog which covers a whole range of interesting aspects, but my favourite post was one by Tony McDonald called ‘DLMs as a substrate for academic content’, where he provided a real glimpse into the possibilities of taking existing data (e.g. a study guide) and re-presenting it using DLMs, and the kinds of detailed considerations that affect this, such as metadata and context. Here is a brief snippet of his conversation with himself…

Well, we would need to deconstruct the study guide into something which is ‘node’-sized for the DLMs machinery. This could be at the paragraph level or smaller. That isn’t so bad to do, we have a lot of contextual information on the guide itself (where it sits in the curriculum, who is the module leader etc) which would contribute to over-arching metadata on the document. We would then need to add DLM-specific metadata on each node. The metadata is quite varied, from simple one word descriptions (eg simple tags) through to multiple-selections for licence usage of the material itself (we very much believe in OER!). The metadata also helps us to decide how the content should be rendered – eg as simple HTML, as something which is only released in a specific time frame, something that is only seen by particular categories of user, etc. This deconstruction is certainly doable, and the DLMs team has already done this for small sections of study guide material. (Tony McDonald)
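
A toy version of the node-level metadata he describes might look like the following Python sketch, with tags, a licence, a rendering hint, a release window and an audience rule attached to each paragraph-sized node. The field names and rules are invented for illustration.

    # Illustrative node metadata for a deconstructed study guide;
    # all field names and rules are hypothetical.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class DLMNode:
        text: str
        tags: list = field(default_factory=list)
        licence: str = "CC-BY"          # reflecting the OER leanings quoted above
        render_as: str = "html"         # rendering hint for the display layer
        visible_from: date = None       # None means always visible
        audience: str = "all"           # e.g. "staff", "year-2"

        def visible_to(self, user_role, today=None):
            """Apply the time-window and audience rules for one user."""
            today = today or date.today()
            if self.visible_from and today < self.visible_from:
                return False
            return self.audience in ("all", user_role)

    node = DLMNode(
        text="Beta blockers reduce myocardial oxygen demand...",
        tags=["pharmacology", "cardiology"],
        visible_from=date(2012, 10, 1),
        audience="year-2",
    )
    print(node.visible_to("year-2", today=date(2012, 11, 5)))  # True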

Impact so far
Evaluation occurred throughout the process and early feedback shaped the visual representation and the elements included in the maps. Students revealed an almost 50/50 split in preference between visual (concept map style) representations and hierarchical (text-based) lists, so DLM has both styles of display, as well as tagcloud views.

Ongoing challenges have emerged that are relevant to any curriculum mapping process, such as changing curricula – sometimes restructuring of whole courses – and the fact that the student journey through the curriculum changes over time. One particular issue is that each cohort has a different experience of the curriculum, and the team were faced with a decision around mapping these differences; however, they chose to map only the current curriculum, as this links to up-to-date developments and guidelines, which are crucial in the healthcare field. Other challenges include managing stepped/timed availability of resources, and the fact that not all data is available in a usable or consistent form. A major challenge lies in balancing automated data with that requiring moderation or contextual information – impacting on choices around granularity of content and specificity.

DLM offers different things to a range of stakeholders. For learners it offers an overview of their learning – a reminder of prior learning, a view of current learning and opportunities to consider future learning choices. The maps offer interactive opportunities for sharing, rating and reviewing resources, as well as facilities to add personal reflective notes and portfolio records and to evidence learning outcomes. Different student groups described different ways of using DLM – i.e. for revision, contextualisation, planning curriculum choices or career choices.

For staff, DLM offers mechanisms to review curricula and identify connections across modules. In addition the maps highlight gaps in provision, duplication and issues around consistency of terminology. Staff are able to see how a student may perceive and engage with their curriculum, monitor access and equality of learning opportunities, and consider alignment of teaching, learning and assessment. They will also be able to identify which resources emerge as popular, offering a glimpse into learning that may happen outside the formal curriculum.

At Newcastle, thanks to interest at a strategic level, the team are planning to extend DLM to geography and dentistry. It will be very interesting to see how well it transfers to other subject areas, but there are quite a few challenges in transferring the model to other institutions, although the team have received expressions of interest. The extent of customisation required to make this work takes commitment and considerable technical support. A public demonstrator and a software download are available via the project website and, thanks to the use of open standards, other institutions could potentially take this forward if they too are prepared to develop a shared understanding of curriculum mapping and take the time to share data across their systems.

This excerpt from the project final report nicely sums up the potential of DLM – it is definitely a report worth reading and I hope other institutions consider a similar model or take some of the steps towards making it work within their own context.

DLM is a flexible and interactive tool, which can be readily aligned with an institution’s Teaching and Learning Strategy, whilst at the same time support a diverse range of specific programme requirements. It can be used to increase transparency in the curriculum in an interactive and participative way that more closely matches the changing experience and expectation of many modern learners. It also has potential to help address sector-wide drivers for PDP, employability, greater personalisation and student involvement in the curriculum. DLM final report


Range of outputs

https://learning-maps.ncl.ac.uk/docs/
Demonstration version of Dynamic Learning Maps (needs registration)
Project final report

Curriculum Delivery: Let’s Get Real

In the first of two posts on the Transforming Curriculum Delivery Through Technology programme, Lou McGill discusses how a number of projects used technologies to recreate “real” learning experiences. This post first appeared on Lou’s blog in April this year.


Following on from my post about the final synthesis report of the Transforming Curriculum Delivery Through Technology programme, I thought it may be useful to focus on a few of the key themes to emerge from the programme in more detail.

One of the aspects that I found most interesting was the number of projects that were using technologies to support authentic, situated learning experiences. In the past I worked at the University of Strathclyde on the DIDET project, part of the JISC funded Digital Libraries in the Classroom programme, which used a wiki and a variety of technologies to support design engineering students during the design process. What we aimed to do was provide the technologies to replicate a global product design experience (with our partner Stanford University in the US) where students created, managed and shared design artifacts. I think this project was very forward thinking as it started in 2003, when wikis were not widely used in higher education contexts – so definitely worth a plug here ;)

I was particularly interested, then, that two of the Transforming Curriculum Delivery projects were focussed on design students. A common need for design students across a range of disciplines is to experience the reality of working collaboratively in teams, to tight deadlines, and to a fairly well established design process. The Atelier-D project, based at the Open University, aimed to replicate a traditional atelier-style environment for distance learning students to learn in a collaborative way with their tutors and other students. They used a range of technologies including Flickr, Facebook, video conferencing, concept mapping, Second Life and social networking sites. This project faced significant challenges in implementing sometimes complex technologies with distance learners, who faced problems with both access and usability of services where the base technical requirements and learning curve for new users can be high.

Also focussed on design students was the Information Spaces for Creative Conversations project, led by the Middlesex University Interaction Design Centre and partnered by the Centre for HCI Design at City University London. This team wanted to make sure that technologies supported creative conversations between design students rather than distracting from them. They also used technologies to help students record and conserve these conversations for later reflection and, like the DIDET project, utilised technologies to manage the range of artifacts that emerge during this part of the design process – such as sketches, photographs, recorded conversations, and later reflections that inform the next stages of design work.

Other projects offering authentic professional or work-based experiences included Generation 4 at St George’s, University of London, which developed interactive online virtual patient cases with options and consequences. Students work in groups on a virtual patient problem where they can see the impact of their decisions without damaging a real person. The Duckling project at the University of Leicester was also focussed on distance learning students; the team utilised and adapted an existing oil rig in Second Life for occupational psychology students to use.

The MoRSE project, led by Kingston University and De Montfort University, worked with two different groups of learners, using mobile technologies to provide a practice-based curriculum.

‘The delivery of a situated curriculum for students working beyond the institution in practice based environments is critical along with the ability to be active contributors in real world problem solving. The ability of both institutional and personal technologies to effectively and appropriately enhance this situated curriculum and experience is crucial. For example fieldwork experience in real problem environments for students has been crucial to student understanding to all aspects of real world scenarios from the collection of primary data through its processing, interpretation and analysis to the completion of an output. This experience can be lessened through the student having to split work on a project between the field and institutional laboratories because of time and access to technologies and resources. In addition basic data processing tasks can take a significant period of limited fieldtrip time that could otherwise be spent on analysis and interpretation, and increases the time between data collection and its analysis.’ (MoRSE final report)

This type of approach required quite a lot of learner support, and MoRSE used student mentors, which provided useful experience for the mentors’ CVs and portfolios as well as low-cost field and placement support.

The Springboard TV project at West Anglia College focused on giving learners the opportunity to use state-of-the-art technologies to develop their own internet TV station.

“Creating an identity and branding has been a very powerful agent in developing a ‘learner centred approach’, where learners now respond as professionals, working in a ‘real life’ production company.”
Jayne Walpole, Head of Faculty of Creative Arts, Springboard TV

Wikis were used by the INTEGRATE project at the University of Exeter to provide an authentic international group experience for a very large cohort (465 students from 40 countries), to stimulate international co-operation and develop international management skills. As well as providing an opportunity to practise a professional role, it also provided a small group setting where students with a wide range of language and cultural differences could support each other, creating a collegiate environment and culture.

These brief descriptions are just snippets; the projects are more fully explained in the Design Studio and in the project websites and reports. I think several of these approaches and activities should be of interest to others trying to create authentic learning experiences.

Curriculum Design: X marks the spot?

In her second post on Curriculum Design, Lou McGill considers how institutions connect and manage course information, and the role that XCRI can play.


This ‘middle earth’ style map, produced by Professor Mark Stubbs, Managed Learning Environment Project Director at Manchester Metropolitan University (MMU), shows the extent of the information about courses that exists in further and higher education institutions. What is missing on the map, and sometimes within institutions, are the paths which connect the breadth of processes and systems that link up this data. It would certainly make for a more complex picture, but this is actually what many of the Institutional Approaches to Curriculum Design projects are doing through business process mapping and through their baselining and early stakeholder engagement activities. The previous post introduced some of the approaches adopted by projects to map their processes. This post offers a bit more detail of the ways they managed their information and made sure that systems share and utilise it effectively.

Course information comprises a range of data from several fundamental processes, including course creation, approval, validation, documentation, QA, resource management (timetabling, resource allocation), modification and review. It seems rather obvious to say that the management of this information presents many challenges, but feedback from the projects has seen common use of terminology such as ‘grappling’ and ‘wrestling’ to describe their efforts to prevent duplication and disconnected silos of data. Projects also highlighted a need for different views of, and pathways into, course information for different stakeholders. However, course information does not exist in isolation, and projects really benefitted from taking a broad view of the whole institutional landscape and thinking about how the different processes and data across other functions connect. This rich picture, which emerged through conversations with several projects, nicely highlights the need for joined-up thinking across organisational boundaries between Student Records, Quality Assurance, Marketing and Course Teams.

[Image: rich picture of course information discussions across Student Records, Quality Assurance, Marketing and Course Teams]

Course approval, as an example, is a key activity in curriculum design, and during baselining activities several projects identified challenges with existing processes, which involved formal, well-established paper-based methods with a strong emphasis on QA. The format of this activity shaped the kinds of information collected and resulted in the need for augmentation and modification at later stages when inputting the data into different systems. Projects highlighted that many staff responded to the process as a ‘form-filling’ exercise rather than an opportunity to think about and reconsider their practice. At a practical level, course-related documents (such as handbooks and online module descriptions) were usually developed locally and quite separately from the course approval process. At a more strategic level, course-related information to support decision making and planning was often poorly collated and managed. How institutions utilise diverse course feedback information, such as external examiners’ reports, evaluation data and broad market research, is often subject to localised departmental approaches.

The UG-Flex project team carried out a business process review of the University of Greenwich’s existing Programme Approval and Review process in order to identify stakeholders and issues and to inform system requirements. The resulting documents are now informing ongoing system review at Greenwich. Course process maps from other projects are included in the previous post, and more will eventually be available in the JISC Design Studio.

Linking student and course information

Student data, which includes information about enrolment, admissions, registration, progression, assessment, records and e-portfolios, links to course information at several points, and project activities have also included work in this area. A significant issue to emerge during the review process was that many existing institutional systems reflect the more traditional standard academic year course patterns. Departments which had adopted more flexible teaching approaches and models to respond to changing learner demands were frustrated by centralised systems that did not fit their needs, and were using workarounds to fit their students in or developing parallel local systems. UG-Flex project stakeholders described cases where some students on short courses had actually finished their course before gaining access to the VLE, which highlighted the need to organise information differently to ensure timely access for different student cohorts.
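To make the fix implied here concrete, below is a minimal sketch in Python, with invented dates and lead times, of keying VLE access to each course presentation’s own start date rather than to a fixed academic-year calendar:

    from datetime import date, timedelta

    VLE_LEAD_TIME = timedelta(days=7)  # open access a week before teaching starts

    def vle_access_opens(presentation_start: date) -> date:
        """Key the VLE access window to the presentation, not the academic year."""
        return presentation_start - VLE_LEAD_TIME

    # A short course starting mid-year: under a legacy academic-year rule the
    # student could finish before ever seeing the VLE; here access opens in time.
    print(vle_access_opens(date(2012, 2, 6)))  # 2012-01-30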

Claire Eustance, Project Manager at UG-Flex, emphasises the need to establish and maintain trust when undertaking business process mapping: initial talks with staff must be followed up, staff should be shown the outcomes and the solutions to problems, and they should be kept involved throughout the process.

‘When we first started talking to our stakeholders their perception tended to be that problems lay in the systems and what they could, or couldn’t do. Eventually though more people are beginning to understand that the systems merely reflect existing institutional mechanisms which have either simply evolved over a period of years or have been based on the needs of ‘mainstream’ students. I can’t stress enough how significant this has been. Now at Greenwich, at the highest level, there are moves to ensure that our organisational mechanisms and processes reflect the needs of all of our students. Once we have these in place then the systems will be redesigned around them.’

UG-Flex’s efforts to reveal and map Greenwich’s institutional processes and how systems support these have helped strategic and operational managers recognise and articulate the need for change. UG-Flex aspires to see this approach to business process review and mapping embedded into mainstream strategic planning at Greenwich, anticipating long term benefits as systems and processes develop through cycles of change and review. Whilst this can appear to be about efficiencies such as reducing the administrative burden, duplication and clerical error, the crux is the real value that comes from being able to enhance the experience for all students.

Where VLEs link into wider systems

The VLE is one of the main places where learners connect with institutional systems, and whilst they may not be interested in the underlying processes and systems, their learning experience can be significantly affected by them. VLEs are just one of the systems that benefit from well-managed course and student information.

The PREDICT project at City University has been looking at how course information links to student data, with a particular emphasis on the admissions process and on linking information about student module choices to the VLE. The new student registration system at City has led to improved quality of information and a significant reduction in administrative time (from 3 hours to 10 minutes), with about 90% of students registering online before their courses started. The vision is that on day one a student at City University will log into the VLE and see their own space with appropriate course information, discussion areas and content, and ultimately assessment and grading information. This will be achieved through the use of middleware to establish automatic links between the VLE, the admissions system, course information systems, the student record system, the library, finance and identity management functions to facilitate the sharing of data.
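As an illustration of this middleware pattern – a sketch only, with hypothetical system names and payloads rather than City’s actual implementation – a completed registration can publish an event that immediately triggers provisioning in each downstream system, replacing an overnight batch transfer:

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Event:
        name: str
        payload: dict

    class MessageBus:
        """Tiny in-memory publish/subscribe hub standing in for real middleware."""
        def __init__(self):
            self.subscribers: dict[str, list[Callable[[Event], None]]] = {}

        def subscribe(self, event_name: str, handler: Callable[[Event], None]) -> None:
            self.subscribers.setdefault(event_name, []).append(handler)

        def publish(self, event: Event) -> None:
            for handler in self.subscribers.get(event.name, []):
                handler(event)

    def provision_vle(event: Event) -> None:
        # A real integration would call the VLE's enrolment API here.
        print(f"VLE: space ready for {event.payload['id']} on {event.payload['modules']}")

    def provision_library(event: Event) -> None:
        print(f"Library: borrower account activated for {event.payload['id']}")

    bus = MessageBus()
    bus.subscribe("registration.completed", provision_vle)
    bus.subscribe("registration.completed", provision_library)

    # Fired by online registration the moment a student completes sign-up, so
    # their VLE space exists on day one rather than after a scheduled data dump.
    bus.publish(Event("registration.completed",
                      {"id": "S1234567", "modules": ["IN1001", "IN1002"]}))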

Quite apart from the cost efficiencies (saving £20K per year on printing, postage and data entry), time is freed for staff to do other activities, and the student experience at the very beginning of their relationship with the University is significantly enhanced. The way that information is now made available across systems immediately is a vast improvement on the traditional scheduled data-dump transfer that those of us who have been around for a while are very familiar with.

With an imperative to reduce duplication and the drive to enhance the student experience, it is also possible to personalise the student view of their modules through the creation of rules which link content within the VLE. So, for example, a student registered on one course may automatically have relevant content from a different course or module revealed. There are also plans to incorporate module information so that students can select elective modules online, requiring links with timetabling and resource planning information. This can be quite a drawn-out process, as their selections may depend on results due in several months’ time – hence a need to incorporate attainment data. This currently necessitates some backward data exchange, as marks entered into the VLE need to be seen by the student records system.
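What such rules might look like in code is sketched below; the module codes and rule structure are invented for illustration, not drawn from City’s system:

    from dataclasses import dataclass

    @dataclass
    class Rule:
        if_registered_on: str  # module the student is enrolled on
        also_reveal: str       # related VLE content to surface for them

    RULES = [
        Rule(if_registered_on="MATH101", also_reveal="STATS100 revision pack"),
        Rule(if_registered_on="LAW201", also_reveal="LAW105 case-law archive"),
    ]

    def personalised_content(registered_modules: set[str]) -> list[str]:
        """Return the extra content items unlocked by a student's registrations."""
        return [r.also_reveal for r in RULES if r.if_registered_on in registered_modules]

    print(personalised_content({"MATH101", "ENG110"}))  # ['STATS100 revision pack']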

Like other projects, the PREDICT team’s activities have resulted in revised documentation and data collection mechanisms. One example is their module and programme revision documents, which are available on the project website. In the longer term the institution is also considering linking the VLE to the student application and enquiry processes, and exploring the potential of using university OERs to feed into broader marketing processes. Future plans also include addressing staff and research data. The pragmatic, incremental approach taken by City has much merit: it secured some quick wins and used those to push forward more challenging tasks.

The student experience can be greatly enhanced by quite pragmatic approaches to incorporating information into VLEs and making it transparent to learners. Some of the projects in the sister programme, Transforming Curriculum Delivery Through Technology, saw significant enhancement through these kinds of approaches, such as course enrolment and payment, timetabling, attendance data and assignment handling. At the heart of these achievements is the need to create core sets of data that can be exposed in a number of places. Perhaps the most significant example of this in relation to course information is XCRI.

XCRI (Exchanging Course Related Information)

The value of XCRI as a standard to facilitate the exchange of course-related information is fairly obvious, but an imperative for implementing it on a wide scale in institutions has been lacking. However, the HEFCE requirements for institutions to provide KIS (Key Information Sets) for all courses from September 2012 and to feed into the HEAR (Higher Education Achievement Report) recording student achievement have both provided strong drivers for the implementation of XCRI.

Mark Stubbs, mentioned earlier, has been involved both with the SRC Project (Supporting Responsive Curricula), which was featured in the earlier blog post, and with the development of XCRI. He said: “Although being able to make prospectus information available on course comparison websites without retyping will doubtless become a plus, the real value of XCRI lies in re-thinking the business processes used to manage course-related information so that definitive data are available freely for re-use: for transcripts, for course approval, to provide context for VLE activities, to support personal development and for business intelligence-driven continuous improvement.”
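To give a flavour of what exposing definitive course data can look like, here is a minimal Python sketch that serialises a single authoritative course record into an XCRI-CAP-style XML feed. The element names loosely follow the XCRI-CAP vocabulary (catalog, provider, course, presentation), but namespaces and many required fields are omitted, so treat it as an illustration rather than a validated feed:

    import xml.etree.ElementTree as ET

    # One definitive record, reusable for the prospectus, the VLE and reporting.
    course_record = {
        "provider": "Example University",
        "title": "BSc (Hons) Environmental Science",
        "description": "Field-based degree with an embedded placement year.",
        "presentations": [{"start": "2012-09-24", "duration": "3 years"}],
    }

    catalog = ET.Element("catalog")
    provider = ET.SubElement(catalog, "provider")
    ET.SubElement(provider, "title").text = course_record["provider"]
    course = ET.SubElement(provider, "course")
    ET.SubElement(course, "title").text = course_record["title"]
    ET.SubElement(course, "description").text = course_record["description"]
    for p in course_record["presentations"]:
        pres = ET.SubElement(course, "presentation")
        ET.SubElement(pres, "start").text = p["start"]
        ET.SubElement(pres, "duration").text = p["duration"]

    print(ET.tostring(catalog, encoding="unicode"))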

I plan to talk in more detail about XCRI in a future post…

It’s hard to capture the range of activities of a whole programme in a few blog posts but some key issues to emerge from talking to projects and reading their outputs so far are:

  • need for creation of core data sets that can be exposed in a variety of places
  • technical systems are sometimes perceived as the root of problems but simply reflect traditional and sometimes outmoded practice
  • integrated technical solutions can have significant impact on reducing inefficiencies and duplication if based on institution-wide dialogue and examination of processes, and through streamlining systems to share data more effectively
  • need for changes to documentation to facilitate better data collection – and best done after business process modelling has been undertaken
  • value of integrating business process review into ongoing core practice
  • stakeholder engagement and ongoing involvement
  • utilising modelling methods that suit the organisation
  • curriculum design and delivery can become embedded into core institution planning by making sure that people involved in making key decisions start with the learning and teaching requirements
  • informed curriculum planning can result from streamlined systems and from people who understand, engage with, view and contribute to curriculum development processes in more meaningful ways

Perhaps it is better said by the SRC Project from MMU when talking about the development of their central academic database…

‘The change to an authoritative single source of courses information from pre-validation through advertising, enrolment, teaching and learning, and to production of HEAR records and even alumni support is a powerful one. It involves breaking down self-standing silos of information and addressing information technology and process issues across the whole institution. It results in a deeper knowledge and understanding of curricula by staff in the institution, and potentially by partners, by students and other learners, by employers and employees. This knowledge and understanding can be used to develop new curriculum elements, both pro-active and reactive to demands from inside and outside the institution, from learners in general and from employers in particular. At the heart of this is a customer-centric view that sees the organisation’s processes from a student viewpoint, the student customer journey being an end-to-end lifecycle that cuts across institutional functional silos.’

Excerpt from SRC case study.


Curriculum Design: The Big Picture

Guest Post

As part of our series of articles on the technical aspects of the e-Learning Programme, over the next few weeks Lou McGill, e-learning consultant, will be focussing on the Institutional Approaches to Curriculum Design projects. Now three years into the four-year programme, these twelve projects have been exploring the convoluted and sometimes opaque processes universities use to design, validate and deliver courses. In this first post Lou discovers just how useful modelling approaches have been in helping projects clarify these curriculum processes.

Engaging with institutional processes and practice

Learning and teaching is, of course, the core business of our universities and colleges, but the processes around how courses are designed and developed are sometimes fragmented. Curriculum design connects with several processes and systems within an educational institution and impacts on a range of stakeholders. It is difficult to engage with institutional processes without resorting to ‘business’ language – and talking about curriculum design in this way can easily alienate the very people you need to engage. Taking a business process view of educational activities, however, can help to highlight technical and system requirements as well as supporting strategic planning and development. Similarly, focussing on efficiencies, reducing duplication and saving time can result in real enhancements to both the staff and student experience, as highlighted by the JISC Transforming Curriculum Delivery Through Technology programme, which has just completed (see the end of this post for more information).

Projects taking part in the ongoing JISC funded Institutional approaches to Curriculum Design programme have become very familiar with the challenges of taking an institution-wide business process approach. How can you get academic staff who often see the course approval process as a ‘form-filling exercise’ to use it as an opportunity to re-imagine and re-think their curricula? How do you convince staff to see the bigger institution-wide picture and understand the links between the seemingly separate processes that support teaching and learning? How do you utilise technologies to take some of the burden of these processes away, leaving staff time to focus on more important activities?

Projects have been responding to these challenges by looking at ways to integrate institutional systems effectively to prevent duplication and streamline processes, particularly in relation to learning and teaching technologies such as institutional VLEs (Virtual Learning Environments) or e-portfolio systems. They have had to articulate and demonstrate the added value that linking these systems brings to a range of different stakeholders. Their experiences in identifying which technologies and standards can meet institutional needs and, perhaps more importantly, which people in the institution have the knowledge to inform these decisions, are of value to other institutions. Key to streamlining these processes and integrating systems is the need to identify which data is central and how institutions collate, share and manage that information.

Like teams in many other universities, the project teams have also been grappling with the need to align curriculum design with external drivers such as employer needs, the widening participation agenda, or practical things like UCAS or HERA requirements; with institutional requirements for increased efficiency; and with the flexibility to respond to changing learner needs. Many institutional systems reflect a time when standard three-year degrees were the norm. Modularisation of courses and increasing numbers of part-time, distance and work-based students have resulted in the need for more agile systems that can reflect changing learning patterns and the need for more flexible support mechanisms.

Sarah Knight, Programme Manager for the JISC e-Learning Team says ‘This is a difficult time for educational institutions as they struggle to make sure that they continue to offer high quality learning and teaching whilst responding to drivers for increased efficiency and the need to offer flexible learning choices. The projects in this programme are making excellent use of business process modelling and other innovative approaches to engage stakeholders, highlight their strengths and adapt their systems to be more effective. The wider HE and FE communities should find much to inform their own practice.’

For many projects the programme timing mirrored an institutional desire to review existing systems, which provided an ideal opportunity to re-examine the processes affected by these systems.
The processes covered so far by the programme include:

  • course creation, approval, validation, documentation, QA, management (timetabling, resource allocation), modification;
  • student data – enrolment, admissions, registration, progression and assessment, records and e-portfolios;
  • marketing and advertising – recruitment

This range of processes is underpinned by a number of systems and sets of data, many of which will have been developed over time in response to specific needs, often without institution-wide consideration. At City University the PREDICT Project (Promoting Realistic and Engaging Discussions in Curriculum Teams) proved timely, as the institution had identified a need to review its IS systems. The obvious practical value of the two things happening at the same time has been augmented by long-term benefits in raising the profile and understanding of curriculum planning requirements. John Gallagher, Enterprise Architect at City, said:

‘Historically with IS our projects have ignored the impact of both Curriculum design and delivery unless that was the specific focus of the project. However now, thanks to the profile of PREDICT, we do try to assess any possible impact. An example of this recently was the introduction of a new Student Module Feedback system to provide additional information in order to assess academic performance. However this feedback is also essential for staff to review the design and delivery of the module. Prior to PREDICT we would have ignored this angle as it was not a specific deliverable of the project!’

Where to start?

With a four-year programme, projects had time to carry out a comprehensive baseline review and to invest significant time in engaging stakeholders. Baselining or benchmarking activities are of high value to any change process: they offer time to look at and record where an institution is at the start of the process and to really examine existing practice, identifying strengths and highlighting weaknesses. Having to explain in detail to others something you do provides an opportunity to question activities that seem obvious but may be based on historical or traditional approaches that are no longer relevant.

When baselining is done across a programme of activities, a picture of the sector begins to emerge. This one has highlighted issues relating to marketing, quality assurance and enhancement, understanding and responding to employer and learner needs and demands, assessment and feedback, and managing course information. These areas are discussed in more detail in a briefing paper produced in February 2011 and will, of course, be described further in the final synthesis reports at the end of the programme. This series of blog posts will focus on issues around managing course information and how institutions link it to other data through a range of processes and systems.

Business process modelling

During the baselining stage many diagrams were developed to capture the various processes and data that result from these. One example is the process workflow diagram from the PIP project (Principles in Patterns) at the University of Strathclyde.

[Image: process workflow diagram from the PIP project, University of Strathclyde]

It can be useful to see how other institutions visualise their activities and processes, even if these visualisations tend to be context specific, but it is the process of developing the diagrams that has been of most significance to the institutions involved – as in the curriculum approval process mapping described by the PALET Project (Programme Approval Lean Electronic Toolset) from Cardiff University. The dialogue and learning to be had from identifying, recording and sharing how the various business processes link up has been noted by several projects. The SRC Project (Supporting Responsive Curricula) has produced an excellent case study detailing Manchester Metropolitan University’s stakeholder requirements for an Academic Database of programmes and units; this document describes the activities and includes examples of process diagrams. Engaging stakeholders through the use of rich pictures – a diagramming technique developed to capture stakeholder views in a non-confrontational way – was a method adopted by the UG-Flex Project at the University of Greenwich. See this example – computer says no!

[Image: ‘computer says no’ rich picture from the UG-Flex Project]

(A further explanation of developing rich pictures is available at: http://jiscdesignstudio.pbworks.com/w/page/24763278/Rich-Pictures)

It is clear even from these brief examples that projects have adopted a range of modelling approaches. JISC CETIS ran a workshop in May 2009 to highlight the ArchiMate enterprise architecture modelling language for curriculum design processes. Discussions with projects at the end of the CETIS workshop highlighted that:

‘In terms of cost-benefit, adopting a modelling approach for those projects that didn’t already use it, opinions differed. Some felt that the investment in software and skills acquisition were only worth it if an institution took the strategic decision to adopt a modelling approach. Others felt that a lightweight, iterative use of Archimate, perhaps using common drawing tools such as Visio during the trial stage, was a good first step. Yet others thought that modelling was worth doing only with considerable investment in tools and skills right at the beginning.’ Wilbert Kraan, JISC CETIS

The SRC Project used ArchiMate components to present visualisations during baselining activities and has also blogged about this. The team presented at the CETIS workshop on their experiences with ArchiMate, which they decided to try after finding the Course Validation Reference Model (COVARM) useful but a bit complex for most stakeholders. The SRC project also used UML (Unified Modelling Language) to develop more detailed diagrams, and examples are available in the SRC case study.

A range of project diagrams and process maps reflecting the different modelling approaches will be made available in the JISC Design Studio which is being added to as projects progress. The JISC CETIS Architecture and Modelling page provides an ongoing picture of developments in this area. It seems likely that these projects have significant potential to further our understanding of some key interoperability standards such as XCRI, learning design specifications, competency standards and qualification frameworks, particularly in relation to how these support data sharing across the range of institutional systems.

A shorter, two-year sister programme ran in parallel to the Institutional Approaches to Curriculum Design programme, focussed on curriculum delivery – the space where students engage with the curriculum. The two programmes naturally overlapped, curriculum design and delivery having close synergies. The Transforming Curriculum Delivery Through Technology programme has now completed, and its outcomes (lessons learned) and outputs (case studies, guidelines, etc.) are incorporated into the JISC Design Studio. Both programmes are feeding into this resource, which was created during the programmes to provide both a resource for projects and ultimately a source for the wider community. http://jiscdesignstudio.pbworks.com/w/page/40379712/Transforming-Curriculum-Delivery-through-Technology

http://jiscdesignstudio.pbworks.com/w/page/40489793/Institutional-Approaches-to-Curriculum-Design

About Lou

Lou McGill is currently working independently and has recently been involved in synthesis and evaluation activities for the HE Academy/JISC UKOER programme and the JISC Transforming Curriculum Delivery Through Technology programme. She led the team that produced the Good Intentions report on business cases for sharing and worked on the LLiDA study (Learning Literacies in a Digital Age). She has experience of working in a range of HE institutions as a librarian, learning technologist and project manager, and used to be a JISC Programme Manager on the eLearning team. In the distant past she worked for NIACE (the adult learning organisation) and in health education for the NHS. Her interests and experience include digital literacy, information literacy, open education, distance learning, managing organisational change, and effective use of technologies to support learning. Further information on Lou’s work can be found at: http://loumcgill.co.uk

Welcome to OtherVoices

Other Voices is the JISC CETIS guest blog. It gives us a chance to invite others to join the conversation and bring their perspective and expertise to our discussions.

We may not always agree with everything guest bloggers write but we think they are worth listening to and consider it important to ensure they are part of the conversation.