Design bash 11 post-event ponderings and questions

Following on from my pre-event ponderings and questions, this post reflects on some of the outcomes from our recent Design Bash in Oxford. A quick summary post based on tweets from the day is also available.

Below is an updated workflow diagram, which I created to encourage discussion around potential workflows between some of the systems represented at the event.

Potential learning design workflows

As I pointed out in my earlier post, this is not a definitive view; rather, it is a starting point for discussion, and there are obvious and quite deliberate gaps, not least the omission of content sources. As learning design is primarily about the structure, process and sequencing of activities, not just content, I didn’t want to make content explicit and add yet another layer of complexity to an already crowded picture. What I was keen to see was some more investigation of the links between the more staff-development-oriented, face-to-face processes and the various systems. To quote myself:

“starting from some initial face-to-face activities such as the workshops being so successfully developed by the Viewpoints project, or the Accreditation! game from the SRC project at MMU, or the various OULDI activities, what would be the next step? Could you then transform the mostly paper-based information into a set of learning outcomes using the Co-genT tool? Could the file produced there then be imported into a learning design tool such as LAMS or LDSE or Compendium LD? And/or could the file be imported to the MUSKET tool and transformed into XCRI CAP – which could then be used for marketing purposes? Can the finished design then be imported into a course database and/or a runtime environment such as a VLE or LAMS?”

Well, we maybe didn’t get to quite as long a chain as that; however, one of the several break-out groups did identify an alternative workflow:

potential workflow tweet

During the lightning presentation session, Alejandro Armellini (University of Leicester) gave an overview of the Carpe Diem learning design process they have developed. Ale outlined how learning design had provided a backbone for their OER work. More information on the process is available in this post.

In the afternoon James Dalziel demo’d another workflow: he took a pattern from the LDSE Learning Designer (a “predict, observe, explain” pattern shown in the lightning session by Diana Laurillard), converted it into a LAMS sequence, shared it in the LAMS community and embedded it into Cloudworks. A full overview of how James went about this, with reflections on the process and a PowerPoint walkthrough, is available on Cloudworks. The recent sharing and embedding features of LAMS are another key development in re-use.

Although technical interoperability is a key driver for integrating systems, with learning design, pedagogical interoperability is just as important. Sharing (and shareable) designs is akin to the holy grail of learning design research, but there is always an element of human translation needed.

thoughts on design process

However, James’ demo did show how much closer we are now to being able to share design patterns effectively and easily. You can see another example of an embedded LAMS sequence here.

The day generated a lot of discussion and hopefully stimulated some new workflows for participants to work on. In terms of issues coming out of the discussions, below is a list of some of the common themes which emerged from the feedback session:

*how to effectively combine f2f activities with more formal institutional processes
*useful to see connections between module and course level designs being articulated more
*emerging interoperability of systems
*looking at potential integrations has raised even more questions
*links to OER
*capturing commonalities and mapping of vocabularies and tools, role of semantic technologies and linked data approaches
*surfacing elements of course, module and activity design and the potential impact on learners as well as teachers
*what are “good enough” descriptions/representations of designs to allow real teachers to use them

So, plenty of food for thought. Over the coming months I’ll be working on a mapping of the process/tools/guides etc we know of in this space. I’ll initially focus on JISC funded work, so if you know of other learning design tools, or have a shareable workflow, then please let me know.

The future of technology in education (FOTE11)

What is the future of technology in education? This is the premise for the FOTE conference which was held on 7 October at UCL. And the answer is . . . . 42, a piece of string? Well of course there isn’t a single answer, and I don’t think there should be one, but parts of the complex jigsaw puzzle were highlighted over the day.

A few suggestions which were aired during the morning sessions included: it’s the standards and EA approaches on the latest Gartner education hype cycle; it’s “cool stuff” combining the physical and digital world to create engaging, memorable experiences (as exemplified by Bristol Uni); it’s predictive analytics; it’s flipped and naked; it’s games; it’s data objects; it’s the user – versus – we don’t know, as we haven’t figured out the purpose of education yet; it’s about better communication between IT departments and students. It’s about providing ubiquitous, reliable wifi access on campus and plenty of power sockets.

It’s probably a combination of all of these and more. But if we in education are to truly reap the benefits of the affordances of technology then we also need to be ensuring our culture is developing in parallel. As James Clay pointed out, people inherently don’t like change and this can be exacerbated in educational contexts. Why change when we’ve “always done it this way” or “it works, why change it?”. Students are powerful change agents – but only if our institutional processes allow them to be. Although there was knowing laughter around the room when he pointed out that “students are dangerous”, there was a serious underlying message. We need to be working more effectively with students to really uncover their needs for technology, and have meaningful interactions so that those in charge can make the most effective decisions about the services/hardware and software institutions provide. James rightly pointed out that we need to be asking students “what do you want to do” not “what do you want”.

There was also a lot of discussion over the day about students and “BYOD” (bring your own device). I think there is a general assumption now that students going to University will have a laptop and at least one other mobile internet-enabled device (probably a phone), which raises the question of institutional provision. I have to say I felt on the day that the panel session on this topic didn’t work that well; however, it is actually the session/topic that I have spent most time thinking about since Friday.

On several occasions the student reps (and others) brought up the fact that often students don’t actually know if, where and when they can use their own devices in HE/FE. Given that in school all hardware is provided and personal devices are openly discouraged, this uncertainty isn’t that surprising, but I was glad to be reminded of it. Again this relates to the importance of recognising and allowing for cultural change, and the importance of communication. Is it made clear to students when, where and how they can use their own devices (mobile, laptop and/or tablet)? How easy is it for students to find out about logging in to institutional services such as email, printers etc? How safe is it to carry your laptop/iPad to Uni? Do staff encourage or discourage use of personal devices in their classes? I’m sure that even amongst the technology-savvy audience on Friday there were a few people wishing others weren’t constantly staring at their phones, laptops and (predictably) iPads and were listening to what the speakers were saying :-) After spending Tuesday at the Developing Digital Literacies Programme start-up meeting, I’d add that the issue of digital literacies is also key to the future of technology in education.

All in all I found the day very engaging and thought provoking and the organisers should be congratulated for bringing together such a diverse range of speakers. I wonder what the future will look like this time next year?

Developing Digital Literacies Programme Start Up Meeting

The 12 successfully funded projects in the JISC Developing Digital Literacies programme met yesterday (4 October) in Birmingham for the programme start-up meeting.

The aim of the programme is to:

” . . .promote the development of coherent, inclusive and holistic institutional strategies and organisational approaches for developing digital literacies for all staff and students in UK further and higher education.”

with projects:

“. . . working across the following stakeholder groupings in their plans for developing digital literacies: students, academic staff, research staff, librarians and learning resources and support staff, administrators and managers and institutional support staff . . .”

The programme has developed from previous user-centred work funded by the JISC e-Learning programme, starting back in 2008 with the Learners’ experiences of e-learning programme, the 2009 Learning Literacies for a Digital Age study, the 2010 Supporting learners in a Digital Age study and the series of Digital Literacy workshops being run this year.

To help the projects get to know a bit more about each other, each gave a three-minute elevator pitch (which included a very entertaining poem from Pat Parslow of the Digitally Ready project, University of Reading). Although all the projects have different approaches, Helen Beetham (part of the programme synthesis team) highlighted a number of commonalities across them, including:

*common access and opportunity
*impacts of technology on core practice 
*new demands on the sector

Helen also highlighted that at a programme level JISC wants to be able to move forward practice and thinking around digital literacies, building on what we know and not repeating what has gone before. From the short presentations given by the projects, I think there will be a lot of rich information coming from all of them over the next two years.

As part of the CETIS input, I will be providing programme-level support around the technologies being used in the programme and collating information into our PROD database. Although the projects are very user-centric, I am particularly interested in surfacing issues around what the preferred technologies for the different stakeholder groups are, and how they are being provisioned at an institutional level. And, at a more holistic level, what does it mean to be a truly digitally literate institution? In parallel with staff/student skills developments, what are the technical infrastructure developments that need to be enabled? What are the key messages and workflows that need to be truly embedded and understood by everyone in an institution?

I can already see links with the approaches being taken by the DVLE programme in terms of lightweight widgets/apps and mobile integrations with VLEs and other admin processes; and the DIAL project at the University of the Arts also highlighted links to its OER work as part of its elevator pitch. I’ll be writing this up initially as a series of blog posts.

Building on the model developed through the Curriculum Design and Delivery programmes, the Design Studio will also be used as an open collation and sharing space for project outputs. The programme is also going to work with a number of related professional bodies and membership organisations to help share and promote common sector-wide experience and best practice.

Quick overview of Design Bash 2011

We had another excellent Design Bash event on Friday 30 September at the University of Oxford. There was lots of discussion, sharing of ideas, practice and tools. I’ll be writing a more in-depth overview of the event over the coming week, but in the meantime, this twitter story gives a taster of the day.

View “Design Bash 2011” on Storify

Design bash 11 pre-event ponderings and questions

In preparation for this year’s Design Bash, I’ve been thinking about some of the “big” questions around learning design and what we actually want to achieve on the day.

When we first ran a Design Bash, four years ago as part of the JISC Design for Learning Programme, we outlined three areas of activity/interoperability that we wanted to explore:
*System interoperability – looking at how the import and export of designs between systems can be facilitated;
*Sharing of designs – ascertaining the most effective way to export and share designs between systems;
*Describing designs – discovering the most useful representations of designs or patterns and whether they can be translated into runnable versions.

And to be fair, I think these are still valid and still summarise the main areas where we need more exploration and sharing – particularly the translation into runnable versions.

Over the past three years there has been lots of progress in terms of the wider context of learning design in course and curriculum design contexts (i.e. through the JISC Curriculum Design and Delivery programmes), and also in terms of how best to support practitioners in engaging with, developing and reflecting on their practice. The evolution of the pedagogic planning tools from the Design for Learning programme into the current LDSE project is a key exemplar. We’ve also seen progress each year as a direct result of discussions at previous Design Bashes, e.g. the embedding of LAMS sequences into Cloudworks (see my summary post from last year’s event for more details).

The work of the Curriculum Design projects in looking at the bigger picture, in terms of the processes involved in formal curriculum design and approval, is making progress in bridging the gaps between formal course descriptions and representations/manifestations in such areas as course handbooks and marketing information, and what actually happens at the point of delivery to students. There is a growing set of tools emerging to help provide a number of representations of the curriculum. We also have a more thorough understanding of the wider business processes involved in curriculum approval, as exemplified by this diagram from the PiP team, University of Strathclyde.

PiP Business Process workflow model

Given the multiple contexts we’re dealing with, how can we make the most of the day? Well, I’d like to try and move away from the complexity of the PiP diagram and concentrate a bit more on the “runtime” issue, i.e. transforming and importing representations/designs into systems which can then be used by students. It still takes a lot to beat the integration of design and runtime in LAMS, imho. So, I’d like to see some exploration around potential workflows between the systems represented and how far inputs and outputs from each can actually go.

Based on some of the systems I know will be represented at the event, the diagram below makes a start at illustrating some workflows we could potentially explore. N.B. This is a very simplified diagram and is meant as a starting point for discussion – it is not a complete picture.

Design Bash Workflows

So, for example, starting from some initial face-to-face activities such as the workshops being so successfully developed by the Viewpoints project, or the Accreditation! game from the SRC project at MMU, or the various OULDI activities, what would be the next step? Could you then transform the mostly paper-based information into a set of learning outcomes using the Co-genT tool? Could the file produced there then be imported into a learning design tool such as LAMS or LDSE or Compendium LD? And/or could the file be imported to the MUSKET tool and transformed into XCRI CAP – which could then be used for marketing purposes? Can the finished design then be imported into a course database and/or a runtime environment such as a VLE or LAMS?
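To make the shape of that chain a little more concrete, here is a minimal, purely illustrative Python sketch of the pipeline. The data structures and function names (outcomes_from_workshop, to_marketing_record and so on) are my own placeholders rather than real Co-genT, MUSKET, LAMS or VLE interfaces; the point is simply that each step consumes and produces a structured representation that the next step could, in principle, import.

from __future__ import annotations

# Hypothetical sketch of the workflow chain described above. None of these
# functions are real Co-genT, MUSKET, LAMS or VLE APIs; they are placeholders
# to make the shape of the pipeline concrete.

from dataclasses import dataclass, field


@dataclass
class LearningOutcome:
    verb: str      # e.g. "evaluate", drawn from a level-descriptor framework
    subject: str   # what the learner acts on
    level: int     # academic level the outcome is written for


@dataclass
class CourseDesign:
    title: str
    outcomes: list[LearningOutcome] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)


def outcomes_from_workshop(notes: list[str]) -> list[LearningOutcome]:
    """Stand-in for the Co-genT step: turn paper-based workshop notes
    into structured learning outcomes."""
    return [LearningOutcome(verb="evaluate", subject=note, level=4) for note in notes]


def build_design(title: str, outcomes: list[LearningOutcome]) -> CourseDesign:
    """Stand-in for authoring in a design tool such as LAMS, LDSE or Compendium LD."""
    return CourseDesign(title=title, outcomes=outcomes)


def to_marketing_record(design: CourseDesign) -> dict:
    """Stand-in for a MUSKET-style transformation towards an XCRI CAP-like
    course advertising record (field names are illustrative only)."""
    return {
        "title": design.title,
        "learning_outcomes": [f"{o.verb} {o.subject}" for o in design.outcomes],
    }


def export_to_runtime(design: CourseDesign) -> str:
    """Stand-in for import into a course database or a runtime VLE/LAMS."""
    return f"Imported '{design.title}' with {len(design.outcomes)} outcomes"


# Chaining the steps end-to-end, as the questions above ask:
notes = ["the strengths and weaknesses of a research design"]
design = build_design("Research Methods", outcomes_from_workshop(notes))
print(to_marketing_record(design))
print(export_to_runtime(design))

Whether each real tool could actually accept the previous tool’s output in this way is, of course, exactly the question the Design Bash sets out to explore.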

Or alternatively, working from the starting point of a course database, e.g. SRC, where they have developed a set template for all courses: would using the learning-outcome-generating properties of the Co-genT tool enable staff to populate that database with “better” learning outcomes which are meaningful to the institution, teacher and student? (See this post for more information on the Co-genT toolkit.)

Or another option, what is the scope for integrating some of these tools/workflows with other “hybrid” runtime environments such as Pebblepad?

These are just a few suggestions, and hopefully we will be able to start exploring some of them in more detail on the day. In the meantime if you have any thoughts/suggestions, I’d love to hear them.

My memory of eAssessment Scotland

Along with around another 270 people, I attended the eAssessment Scotland Conference on 26 August at the University of Dundee. It was a thought-provoking day, with lots of examples of some innovative approaches to assessment within the sector.

Steve Wheeler got the day off to a great start, talking us through some of the “big questions” around assessment – for example, is it knowledge or wisdom that we should be assessing, and what are the best ways to do this? Steve also emphasised the evolving nature of assessment and the need to share best practice, and introduced many of us to the term “ipsative assessment”. The other keynotes complemented this big-picture view, with Becka Coley sharing her experiences of the student perspective on assessment and Pamela Kata taking us through some of the really innovative serious games work she is doing with medical students. The closing keynote from Donald Clark again went back to some of the more generic issues around assessment, in particular assessment in schools and the current UK government’s obsession with maths.

There is some really great stuff going on in the sector, and there is a growing set of tools and, more importantly, evidence of the impact of using e-assessment techniques (as highlighted by Steve Draper, University of Glasgow). However, it does still seem quite small scale. As Peter Hartley said, e-assessment does seem to be a bit of a cottage industry at the moment, and we really need more institution-wide buy-in for things to move up a gear. I particularly enjoyed the wry, slightly self-deprecating presentation from Malcolm MacTavish (University of Abertay Dundee) about his experiments with giving audio feedback to students. Despite his now being able to evidence the impact of audio feedback and show that there were some cost efficiencies for staff, the institution has implemented a written-feedback-only policy.

Perhaps we are on the cusp of a breakthrough, and certainly the new JISC Assessment and Feedback programme will be allowing another round of innovative projects to get some more institutional traction.

I sometimes joke that twitter is my memory of events – an “I tweet therefore I am” mentality :-) Those of you who read my blog will know I have experimented with the Storify service for collating tweets from events, but for a change, here is my twitter memory of the day via the memolane service.

Understanding, creating and using learning outcomes

How do you write learning outcomes? Do you really ensure that they are meaningful to you, to your students, to your academic board? Do you sometimes cut and paste from other courses? Are they just something that has to be done, a bit opaque, but they do the job?

I suspect for most people involved in the development and teaching of courses, it’s a combination of all of the above. So, how can you ensure your learning outcomes really engage all your key stakeholders?

Creating meaningful discussions around developing learning outcomes with employers was the starting point for the Co-genT project (funded through the JISC Lifelong Learning and Workforce Development programme). Last week I attended a workshop where the project demonstrated the online toolkit they have developed. Initially designed to help foster meaningful and creative dialogue during collaborative course development with employers, as the tool has developed and others have started to use it, a range of uses and possibilities have emerged.

As well as fostering creative dialogue and common understanding, the team wanted to develop a way to evidence discussions for QA purposes which showed explicit mappings between the expert employer language and academic/pedagogic language and the eventual learning outcomes used in formal course documentation.

Early versions of the toolkit started with the inclusion of a number of relevant (and available) frameworks and vocabularies for level descriptors, from which the team extracted and contextualised key verbs into a list view.

List view of Co-genT toolkit

(Ongoing development hopes to include the import of competency frameworks and the use of XCRI CAP.)

Early feedback found that the list view was a bit off-putting so the developers created a cloud view.

Cloud view of Co-genT toolkit

and a Bloom’s view (based on Bloom’s Taxonomy).

Bloom’s view of Co-genT toolkit

By choosing verbs, the user is directed to a set of recognised learning outcomes and can start to build and customise these for their own specific purpose.

Co-genT learning outcomes
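As a toy illustration of the verb-to-outcome idea (and emphatically not the Co-genT implementation itself), the short Python sketch below maps a handful of commonly cited Bloom’s Taxonomy levels to characteristic verbs and uses a chosen verb to draft an outcome statement that could then be customised. The outcome template is invented for this example.

# Toy illustration of the "choose a verb, get a draft outcome" idea described
# above. The verb lists follow commonly cited Bloom's Taxonomy levels; the
# outcome template is invented for this example, not taken from Co-genT.

BLOOMS_VERBS = {
    "understand": ["describe", "explain", "summarise"],
    "apply": ["demonstrate", "implement", "use"],
    "analyse": ["compare", "differentiate", "examine"],
    "evaluate": ["appraise", "critique", "justify"],
    "create": ["design", "construct", "formulate"],
}


def draft_outcome(verb: str, topic: str, level: int) -> str:
    """Draft a customisable learning outcome statement from a chosen verb."""
    blooms_level = next(
        (lvl for lvl, verbs in BLOOMS_VERBS.items() if verb in verbs), None
    )
    if blooms_level is None:
        raise ValueError(f"'{verb}' is not in the verb lists above")
    return (
        f"On completion of this level {level} module, the learner will be able "
        f"to {verb} {topic} ({blooms_level})."
    )


# Example: the same topic pitched at different cognitive levels.
print(draft_outcome("describe", "the main theories of motivation", 1))
print(draft_outcome("critique", "the main theories of motivation", 3))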

As the tool uses standard frameworks, early user feedback started to highlight the potential for other uses, such as: APEL; use as part of HEAR reporting; use with adult returners to education to help identify experience and skills; writing new learning outcomes; and an almost natural progression to creating learning designs. Another really interesting use of the toolkit has been with learners. A case study at the University of Bedfordshire has shown that students have found the toolkit very useful in helping them understand the differences and expectations of learning outcomes at different levels; for example, to paraphrase student feedback after using the tool, “I didn’t realise that evaluation at level 4 was different from evaluation at level 3”.

Unsurprisingly it was the learning design aspect that piqued my interest, and as the workshop progressed and we saw more examples of the toolkit in use, I could see it becoming another part of the curriculum design tools and workflow jigsaw.

A number of the Design projects, e.g. PALET and SRC, now have revised curriculum documents which clearly define the type of information that needs to be entered. The design workshops the Viewpoints project is running are proving to be very successful in getting people started on the course (re)design process (and, like Co-genT, use key verbs as discussion prompts).

So, for example, I can see potential for course design teams, after taking part in a Viewpoints workshop, using the Co-genT tool to progress those outputs into specific learning outcomes (validated by the frameworks in the toolkit and/or ones they wanted to add) and then completing institutional documentation. I could also see the toolkit being used in conjunction with a pedagogic planning tool such as Phoebe or the LDSE.

The Design projects could also play a useful role in helping to populate the toolkit with any competency or other recognised frameworks they are using. There could also be potential for using the toolkit as part of the development of XCRI to include more teaching and learning related information, by helping to identify common education fields through surfacing commonly used and recognised level descriptors and competencies, and potentially developing identifiers for them.

Although JISC funding is now at an end, the team are continuing to refine and develop the tool and are looking for feedback. You can find out more from the project website. Paul Bailey has also written an excellent summary of the workshop.

Transforming curriculum delivery through technology: New JISC guide and radio show launched

A new JISC guide, “Transforming curriculum delivery through technology: Stories of challenge, benefit and change”, has been launched today.

“. . . a mini-guide to the outcomes of the JISC Transforming Curriculum Delivery Through Technology programme, [which] summarises the headline benefits of technology in curriculum delivery made evident by the work of the 15 projects in the programme. The outcomes of these projects provide a rich insight into the ways in which institutions and individual curriculum areas can make use of technology to respond more robustly to the demands of a changing world.”

You can access PDF and text-only versions of the guide, or order a print copy, by following this link.

The latest instalment of the JISC on Air series, “Efficiencies, enhancements and transformation: how technology can deliver”, includes interviews with two projects involved in the programme (Making the New Diploma a Success and eBioLabs), discussing the impact achieved in two very different contexts and disciplines.

If the mini-guide whets your appetite for more information about the programme, the Programme Synthesis report provides more in-depth analysis of the lessons learned, and further information and access to project outputs are available from the Design Studio.

Approaching The Learning Stack case study

Over the past couple of years, I’ve seen a number of presentations by various colleagues from the Universitat Oberta de Catalunya (UOC) about the development of their learning technology provision. And last September I was privileged to join other international colleagues for their OpenEd Tech summit.

Eva de Lera (Senior Strategist at UOC) has just sent me a copy of a case study they have produced for Gartner (Case Study: Approaching the Learning Stack: The Third Generation LMS at Universitat Oberta de Catalunya). The report gives an overview of how and why UOC have moved from a traditional monolithic VLE to their current “learning stack”, which is based on a SOA approach. NB you do have to register to access the report.

The key findings and recommendations are salient and resonate with many of the findings that are starting to come through from, for example, the JISC Curriculum Design programme and many (if not all) of the JISC programmes which we at CETIS support. The findings and recommendations focus on the need to develop the community collaboration which UOC has fostered, both in terms of the internal staff/student community and in terms of the community-driven nature of open source software development. Taking this approach has ensured that their infrastructure is flexible enough to incorporate new services whilst still maintaining tried and trusted ones, and has given them the flexibility to implement a range of relevant standards and web 2.0 technologies. The report also highlights the need to accept failure when supporting innovation – and, importantly, the need to document and share any failures. It is often too easy to forget that much (if not most) of the best innovation comes from the lessons learned from the experience of failure.

If we want to build flexible, effective systems (both in terms of user experience and cost) then we need to ensure that we foster a culture which supports open innovation. I certainly feel that that is one thing which JISC has enabled the UK HE and FE sectors to do, and long may it continue.

From challenge to change: how technology can transform curriculum delivery

A recording of the online presentation “From challenge to change: how technology can transform curriculum delivery” by Lisa Gray (JISC Programme Manager), Marianne Sheppard (Researcher/Analyst, JISC infoNet and project co-ordinator for the Support and Synthesis project) and myself is now available online.

Session Synopsis:
During 2008–2010, the JISC Transforming Curriculum Delivery through Technology Programme investigated the potential of technology to support more flexible and creative models of curriculum delivery in colleges and universities. The 15 projects within the programme sought to address a wide range of challenges such as: improving motivation, achievement and retention; managing large cohorts; supporting remote and distance learners; engaging learners with feedback; responsiveness to changing stakeholder needs; delivering resource efficiencies which enhance the quality of the learning experience. Through the various project investigations, the programme has learned how and where technology can not only add value but can transform the way in which the curriculum is delivered in different contexts.

This session summarised the key messages and findings emerging from the work of the projects and demonstrated some of the outputs from the projects available from the Design Studio.

For more detailed information, I can thoroughly recommend the programme synthesis report by Lou McGill, which provides detailed information on the programme themes, key lessons learnt and project outputs.