De-regulation, data and learning design

Data, to borrow a phrase from the fashion industry, is the new black, isn’t it? Open data, linked data, shared data – the list goes on. With the advent of the KIS, gathering key aspects of institutional data is becoming an increasing strategic priority for HE institutions (particularly in England).

Over the past couple of weeks I’ve been to a number of events where data has been a central theme, albeit from very different perspectives. Last week I attended the Deregulating higher education: risks and responsibilities conference. I have to confess that I was more than a bit out of my comfort zone at this meeting. The vast majority of delegates were Registrars, Financial Managers and Quality Assurance staff. Unsurprisingly there were no major insights into the future, apart from a sort of clarification that the new “level playing field” for HE institutions is actually, in reality, going to be more of a series of playing fields. Sir Alan Langlands’ presentation gave an excellent summary of the challenges facing HEFCE as its role evolves “from grant provider to targeted investor”.

Other keynote speakers explored the risks and benefits posed by the suggested changes to the HE sector – particularly around measurements for private providers. Key concerns from the floor seemed to centre around greater clarity over the status of universities, i.e. they are not public bodies but are expected to deal with FOI requests in the same way, which is very costly; whilst conversely having to complete certain corporation tax returns when they don’t actually pay corporation tax. Like I said, I was quite out of my comfort zone – and slightly dismayed about the lack of discussion around teaching, learning and research activities.

However, as highlighted by John Craven (University of Plymouth), good auditable information is key for any competitive market. There are particular difficulties (or challenges?) in reaching a consensus around key information for the education sector. The KIS is a start at trying to do exactly this. But, and here’s the rub, is the KIS really the key information we need to collect? Is there a consensus? How will it enhance the student experience – particularly around the impact of teaching and learning strategies and the effective use of technology? And (imho) most crucially, how will it evolve? How can we ensure KIS data collection is more than a tick-box exercise?

Of course I don’t have any of the answers, but I do think a key part of this lies in continued educational research and development, particularly learning analytics. We need to find ways to empower students and academics to effectively use and interact with tools and technology which collect data, and also to help them understand where, how and what data is collected, used and represented in activities such as the KIS collection.

As these thoughts were mulling around in my head, I was at the final meeting of the LDSE project earlier this week. During Diana Laurillard’s presentation the KIS featured again, this time in the context of how a tool such as the Learning Designer could be used as part of the data collection process. The Learning Designer allows a user to analyse a learning design in terms of its pedagogical structure and time allocation, both for teaching and for preparation, as the screenshot below illustrates.

Learning Designer screenshot

The tool is now also trying to encourage re-use of materials (particularly OERs) by giving a comparison of preparation time between creating a resource and reusing an existing one.
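To make the kind of comparison the tool surfaces a little more concrete, here is a minimal sketch. The Learning Designer’s real data model and estimates are not reproduced here; the function and figures below are invented purely for illustration.

```python
# Illustrative sketch only: the Learning Designer's actual estimates are not
# public here, so this function and the example figures are hypothetical.

def prep_time_saving(create_hours: float, adapt_hours: float) -> float:
    """Hours saved by adapting an existing OER rather than authoring from scratch."""
    return create_hours - adapt_hours

# e.g. a resource that might take 6 hours to author but 1.5 hours to adapt
print(prep_time_saving(6.0, 1.5))  # -> 4.5
```

Even a simple side-by-side figure like this makes the case for re-use visible to a busy course team in a way that abstract exhortations to “use OERs” do not.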

The development of tools with this kind of analysis is crucial in helping teachers (and learners) understand more about the composition and potential impact of learning activities. I’d also hope that by encouraging teachers to use these tools (and similar ones developed by the OULDI project, for example) we could start to engage in a more meaningful dialogue around what types of data about teaching and learning activities should be included in collections such as the KIS. Simple analysis of bottom-line teacher contact time does our teachers and learners an injustice – not to mention potentially negating innovation.

The Learning Designer is now at the difficult transition point from being a tool developed as part of a research project to something that can actually be used “in anger”. It struck me that it might be useful to tap into the work of current JISC elearning programmes and hold one (or perhaps a series) of design bashes where we could look more closely at the Learning Designer and explore potential further developments. This would also provide an opportunity for some more holistic discussions around the wider workflow issues of integrating design tools not only into the design process but also into other data-driven processes such as KIS collection. I’d welcome any thoughts anyone may have about this.

Exploring learning in transition, latest JISC Radio Show

In the run-up to this year’s JISC online conference, a selection of the keynote speakers have contributed to the latest JISC radio show, JISC Online Conference explores learning in transition. As well as giving some insights into their views on some of the key topics of the conference, during the show the keynotes also share some of their experiences of being a participant in an online conference.

Touching on topics from open education and the use and development of OERs, to curriculum design, to increasing learner engagement, the podcast gives a tantalising taster of some of the issues these keynote speakers will be raising. For example, Ewan MacIntosh poses the challenge to universities and colleges of providing learning maps, compasses or ultimately GPS devices for students on their learning journeys, whilst Mike Sharples highlights the importance of the “co-evolution of learning and technology” in creating truly engaging and effective learning experiences. All in all, a great way to warm up and get thinking about the discussions and debates which will take place during the conference week.

The podcast (and transcript) is available from the JISC website, and it’s not too late to register for the conference itself; more information is again available from the JISC website. If you’re still in two minds about participating in an online conference, there’s also a nice little video from past participants sharing their experiences at the bottom of the main conference page.

Sustaining and Embedding Change: Curriculum Design Programme meeting overview

The penultimate Curriculum Design Programme meeting took place earlier this week in Nottingham. Three and a half years into the funding cycle, the meeting focused on life after the programme. What are the most effective ways to share, embed and build on the changes instigated by projects within and across institutions?

I’ll be writing a more reflective post over the coming days but here is a summary of the two days, based on the #jisccdd twitter stream.

[View the story “Sustaining and embedding changes to curriculum design practices and processes” on Storify]

Developing Digital Literacies Programme Start Up Meeting

The 12 successfully funded projects in the JISC Developing Digital Literacies programme met yesterday (4 October) in Birmingham for the programme start-up meeting.

The aim of the programme is to:

“. . . promote the development of coherent, inclusive and holistic institutional strategies and organisational approaches for developing digital literacies for all staff and students in UK further and higher education.”

with projects:

“. . . working across the following stakeholder groupings in their plans for developing digital literacies: students, academic staff, research staff, librarians and learning resources and support staff, administrators and managers and institutional support staff . . .”

The programme has developed from previous user-centred work funded by the JISC Elearning programme, starting back in 2008 with the Learners’ experiences of e-learning programme, the 2009 Learning Literacies for a Digital Age study, the 2010 Supporting Learners in a Digital Age study and the series of Digital Literacy workshops being run this year.

To help everyone get to know a bit more about each other, the projects gave three-minute elevator pitches (which included a very entertaining poem from Pat Parslow of the Digitally Ready project, University of Reading). Although all have different approaches, as highlighted by Helen Beetham (part of the programme synthesis team) there are a number of commonalities across the projects, including:

*common access and opportunity
*impacts of technology on core practice 
*new demands on the sector

Helen also highlighted that, at a programme level, JISC wants to be able to move forward practice and thinking around digital literacies – to build on what we know and not repeat what has gone before. From the short presentations given by the projects, I think there will be a lot of rich information coming from all of the projects over the next two years.

As part of the CETIS input, I will be providing programme-level support around the technologies being used in the programme and collating information into our PROD database. Although the projects are very user-centric, I am particularly interested in surfacing issues around the preferred technologies for the different stakeholder groups and how they are being provisioned at an institutional level. And, at a more holistic level, what does it mean to be a truly digitally literate institution? In parallel with staff/student skills development, what are the technical infrastructure developments that need to be enabled? What are the key messages and workflows that need to be truly embedded and understood by everyone in an institution?

I can already see links with the approaches being taken by the DVLE programme in terms of lightweight widgets/apps and mobile integrations with VLEs and other admin processes; and the DIAL project at the University of the Arts also highlighted links to its OER work as part of its elevator pitch. I’ll be writing this up initially as a series of blog posts.

Building on the model developed through the Curriculum Design and Delivery programmes, the Design Studio will also be used as an open collation and sharing space for project outputs. The programme is also going to work with a number of related professional bodies and related membership organisations to help share and promote common sector-wide experience and best practice.

Quick overview of Design Bash 2011

We had another excellent Design Bash event on Friday 30 September at the University of Oxford. There was lots of discussion and sharing of ideas, practice and tools. I’ll be writing a more in-depth overview of the event over the coming week, but in the meantime, this twitter story gives a taster of the day.

View “Design Bash 2011” on Storify

Design bash 11 pre-event ponderings and questions

In preparation for this year’s Design Bash, I’ve been thinking about some of the “big” questions around learning design and what we actually want to achieve on the day.

When we first ran a design bash, four years ago as part of the JISC Design for Learning Programme, we outlined three areas of activity/interoperability that we wanted to explore:
*System interoperability – looking at how the import and export of designs between systems can be facilitated;
*Sharing of designs – ascertaining the most effective way to export and share designs between systems;
*Describing designs – discovering the most useful representations of designs or patterns and whether they can be translated into runnable versions.

And to be fair, I think these are still valid and summarise the main areas where we still need more exploration and sharing – particularly the translation into runnable versions.

Over the past three years, there has been lots of progress in the wider context of learning design in course and curriculum design contexts (i.e. through the JISC Curriculum Design and Delivery programmes) and also in how best to support practitioners to engage with, develop and reflect on their practice. The evolution of the pedagogic planning tools from the Design for Learning programme into the current LDSE project is a key exemplar. We’ve also seen progress each year as a direct result of discussions at previous Design Bashes, e.g. the embedding of LAMS sequences into Cloudworks (see my summary post from last year’s event for more details).

The work of the Curriculum Design projects in looking at the bigger picture – the processes involved in formal curriculum design and approval – is making progress in bridging the gaps between formal course descriptions and their representations/manifestations in areas such as course handbooks and marketing information, and what actually happens at the point of delivery to students. There is a growing set of tools emerging to help provide a number of representations of the curriculum. We also have a more thorough understanding of the wider business processes involved in curriculum approval, as exemplified by this diagram from the PiP team, University of Strathclyde.

PiP Business Process workflow model

Given the multiple contexts we’re dealing with, how can we make the most of the day? Well, I’d like to try to move away from the complexity of the PiP diagram and concentrate a bit more on the “runtime” issue, i.e. transforming and importing representations/designs into systems which can then be used by students. It still takes a lot to beat the integration of design and runtime in LAMS, imho. So, I’d like to see some exploration around potential workflows between the systems represented and how far inputs and outputs from each can actually go.

Based on some of the systems I know will be represented at the event, the diagram below makes a start at illustrating some workflows we could potentially explore. N.B. This is a very simplified diagram and is meant as a starting point for discussion – it is not a complete picture.

Design Bash Workflows

So, for example, starting from some initial face-to-face activities – such as the workshops being so successfully developed by the Viewpoints project, the Accreditation! game from the SRC project at MMU, or the various OULDI activities – what would be the next step? Could you then transform the mostly paper-based information into a set of learning outcomes using the Co-genT tool? Could the file produced there then be imported into a learning design tool such as LAMS, LDSE or CompendiumLD? And/or could the file be imported into the MUSKET tool and transformed into XCRI CAP, which could then be used for marketing purposes? Can the finished design then be imported into a course database and/or a runtime environment such as a VLE or LAMS?
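The chain of steps just described can be thought of as a pipeline of transformations. The sketch below is purely illustrative: none of the tools named (Viewpoints, Co-genT, LAMS, LDSE, MUSKET) actually expose these functions, and every name in the code is hypothetical – it only shows the shape of the workflow being proposed.

```python
# Purely illustrative pipeline: all function names and data shapes are
# invented stand-ins for the real tools mentioned in the post.

def workshop_to_outcomes(prompts):
    """Stand-in for the Co-genT step: turn workshop prompts into draft outcomes."""
    return [f"Learners will be able to {p}" for p in prompts]

def outcomes_to_design(outcomes):
    """Stand-in for import into a design tool (LAMS / LDSE / CompendiumLD)."""
    return {"activities": outcomes, "target_runtime": "VLE"}

prompts = ["evaluate sources critically", "design a survey instrument"]
design = outcomes_to_design(workshop_to_outcomes(prompts))
print(len(design["activities"]))  # -> 2
```

The open question for the day is how much of each hand-off can be automated via shared formats (e.g. XCRI CAP for the marketing step), and how much will remain manual re-keying.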

Or alternatively, working from the starting point of a course database – e.g. SRC, where they have developed a set template for all courses – would using the learning-outcome-generating properties of the Co-genT tool enable staff to populate that database with “better” learning outcomes which are meaningful to the institution, teacher and student? (See this post for more information on the Co-genT toolkit.)

Or, another option: what is the scope for integrating some of these tools/workflows with other “hybrid” runtime environments such as PebblePad?

These are just a few suggestions, and hopefully we will be able to start exploring some of them in more detail on the day. In the meantime if you have any thoughts/suggestions, I’d love to hear them.

Understanding, creating and using learning outcomes

How do you write learning outcomes? Do you really ensure that they are meaningful to you, to your students, to your academic board? Do you sometimes cut and paste from other courses? Are they just something that has to be done – a bit opaque, but they do the job?

I suspect for most people involved in the development and teaching of courses, it’s a combination of all of the above. So, how can you ensure your learning outcomes are really engaging with all your key stakeholders?

Creating meaningful discussions around developing learning outcomes with employers was the starting point for the CogenT project (funded through the JISC Lifelong Learning and Workforce Development Programme). Last week I attended a workshop where the project demonstrated the online toolkit they have developed. Initially designed to help foster meaningful and creative dialogue during co-curricular course developments with employers, as the tool has developed and others have started to use it, a range of uses and possibilities have emerged.

As well as fostering creative dialogue and common understanding, the team wanted to develop a way to evidence discussions for QA purposes which showed explicit mappings between the expert employer language and academic/pedagogic language and the eventual learning outcomes used in formal course documentation.

Early versions of the toolkit started with the inclusion of a number of relevant (and available) frameworks and vocabularies for level descriptors, from which the team extracted and contextualised key verbs into a list view.

List view of CogenT toolkit

(Ongoing development hopes to include the import of competency frameworks and the use of XCRI CAP.)

Early feedback found that the list view was a bit off-putting so the developers created a cloud view.

Cloud view of CogenT toolkit

and a Bloom’s view (based on Bloom’s Taxonomy).

Bloom’s view of CogenT toolkit

By choosing verbs, the user is directed to a set of recognised learning outcomes and can start to build and customise these for their own specific purpose.

CogenT learning outcomes

As the tool uses standard frameworks, early user feedback started to highlight its potential for other uses, such as: APEL; HEAR reporting; working with adult returners to education to help identify experience and skills; writing new learning outcomes; and an almost natural progression to creating learning designs. Another really interesting use of the toolkit has been with learners. A case study at the University of Bedfordshire has shown that students found the toolkit very useful in helping them understand the differences in, and expectations of, learning outcomes at different levels – for example, to paraphrase student feedback after using the tool: “I didn’t realise that evaluation at level 4 was different from evaluation at level 3”.
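The verb-driven approach described above can be sketched roughly as follows. To be clear, the framework levels, verbs and outcome template below are invented for illustration – they are not CogenT’s actual data or API.

```python
# Hypothetical sketch of a verb-to-outcome step; the levels, verbs and
# template are illustrative only, not taken from the CogenT toolkit.

BLOOMS_VERBS = {
    "remember":   ["define", "list", "recall"],
    "understand": ["explain", "summarise", "classify"],
    "apply":      ["demonstrate", "implement", "solve"],
    "analyse":    ["compare", "differentiate", "examine"],
    "evaluate":   ["appraise", "critique", "justify"],
    "create":     ["design", "construct", "formulate"],
}

def draft_outcome(level: str, verb: str, topic: str) -> str:
    """Turn a chosen verb into a draft learning outcome the user can then customise."""
    if verb not in BLOOMS_VERBS.get(level, []):
        raise ValueError(f"'{verb}' is not listed at the '{level}' level")
    return f"On completion, the learner will be able to {verb} {topic}."

print(draft_outcome("evaluate", "critique", "a peer's learning design"))
```

Tying each verb to an explicit level is what lets a tool like this show a student why “evaluate” at level 4 is not the same expectation as “evaluate” at level 3.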

Unsurprisingly it was the learning design aspect that piqued my interest, and as the workshop progressed and we saw more examples of the toolkit in use, I could see it becoming another part of the curriculum design tools and workflow jigsaw.

A number of the Design projects now have revised curriculum documents, e.g. PALET and SRC, which clearly define the type of information that needs to be input. The design workshops the Viewpoints project is running are proving very successful in getting people started on the course (re)design process (and, like Co-genT, use key verbs as discussion prompts).

So, for example, I can see potential for course design teams, after taking part in a Viewpoints workshop, then using the Co-genT tool to progress those outputs into specific learning outcomes (validated by the frameworks in the toolkit and/or ones they wanted to add) and then completing institutional documentation. I could also see the toolkit being used in conjunction with a pedagogic planning tool such as Phoebe or the LDSE.

The Design projects could also play a useful role in helping to populate the toolkit with any competency or other recognised frameworks they are using. There could also be potential for using the toolkit as part of the development of XCRI to include more teaching and learning related information, by helping to identify common education fields through surfacing commonly used and recognised level descriptors and competencies and the potential development of identifiers for them.

Although JISC funding is now at an end, the team are continuing to refine and develop the tool and are looking for feedback. You can find out more from the project website. Paul Bailey has also written an excellent summary of the workshop.

Transforming curriculum delivery through technology: New JISC guide and radio show launched

A new JISC guide, “Transforming curriculum delivery through technology: Stories of challenge, benefit and change”, has been launched today.

The guide, a mini-guide to the outcomes of the JISC Transforming Curriculum Delivery Through Technology programme, “summarises the headline benefits of technology in curriculum delivery made evident by the work of the 15 projects in the programme. The outcomes of these projects provide a rich insight into the ways in which institutions and individual curriculum areas can make use of technology to respond more robustly to the demands of a changing world.”

You can access PDF and text-only versions of the guide, or order a print copy, by following this link.

The latest instalment of the JISC on Air series, Efficiencies, enhancements and transformation: how technology can deliver, includes interviews with two projects involved in the programme (Making the New Diploma a Success and eBioLabs), discussing the impact achieved in two very different contexts and disciplines.

If the mini-guide whets your appetite for more information about the programme, the Programme Synthesis report provides more in-depth analysis of the lessons learned, and further information and access to project outputs is available from the Design Studio.