Communicating technical change – the trojan horse of technology

As the JISC funded Curriculum Design Programme is now entering its final year, the recent Programme meeting focused on effective sharing of outputs. The theme of the day was “Going beyond the obvious, talking about challenge and change”.

In the morning there were a number of breakout sessions exploring different methods and approaches for effectively telling project stories. I co-facilitated the “Telling the Story – representing technical change” session.

Now, as anyone who has been involved in a project implementing or changing technology systems will know, one of the keys to success is actually not to talk too much about the technology itself – but to highlight the benefits of what it does or will do. Of course there are times when projects need to have in-depth technical conversations, but in terms of the wider project story, the technical details don’t need to be at the forefront. What is vital is that the project can articulate change processes in both technical and human workflow terms.

Each project in the programme undertook an extensive base-lining exercise to identify the processes and systems (human and technical) involved in the curriculum design process (the PiP Process workflow model is a good example of an output from this activity).

Most projects agreed that this activity had been really useful in allowing wider conversations around the curriculum design and approval process, as there weren’t actually any formal spaces for these types of discussions. In the session there was also a feeling that technology was, in effect, the Trojan horse around which the often trickier human process issues could be discussed. As with all educational technology related projects, every project has had issues with language and common understandings.

So what are the successful techniques or “stories” for communicating technical change? Peter Bird and Rachael Forsyth from the SRC project shared their experiences of using an external consultant to run stakeholder engagement workshops around the development of a new academic database. They have also written a comprehensive case study on their experiences. The screenshot below captures some of the issues the project had to deal with – and I’m sure it could represent views in practically any institution.
[Screenshot: issues raised during the SRC stakeholder engagement workshops]

MMU have now created their new database and have documentation which is being rolled out. You can see a version of it in the Design Studio. There was quite a bit of discussion in the group about how they managed to get down to a relatively minimal set of fields (5 learning outcomes, 2 assessments) – some of that was down to that well-known BOAFP (back of a fag packet) methodology . . .
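To make the “minimal set of fields” idea a little more concrete, here is a small sketch of how a module specification with those limits might be represented and checked. The only figures taken from the post are the 5 learning outcomes and 2 assessments; the field names and everything else are hypothetical illustrations, not MMU’s actual database schema.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical limits: the post mentions a minimal set of 5 learning outcomes
# and 2 assessments; everything else here is illustrative only.
MAX_LEARNING_OUTCOMES = 5
MAX_ASSESSMENTS = 2

@dataclass
class ModuleSpec:
    code: str
    title: str
    credits: int
    learning_outcomes: List[str] = field(default_factory=list)
    assessments: List[str] = field(default_factory=list)

    def validate(self) -> List[str]:
        """Return a list of problems with this specification (empty if it passes)."""
        problems = []
        if not self.title:
            problems.append("Missing module title")
        if len(self.learning_outcomes) > MAX_LEARNING_OUTCOMES:
            problems.append(
                f"{len(self.learning_outcomes)} learning outcomes (maximum {MAX_LEARNING_OUTCOMES})")
        if len(self.assessments) > MAX_ASSESSMENTS:
            problems.append(
                f"{len(self.assessments)} assessments (maximum {MAX_ASSESSMENTS})")
        return problems

spec = ModuleSpec(code="ABC101", title="Introduction to Curriculum Design", credits=20,
                  learning_outcomes=["LO1", "LO2", "LO3"],
                  assessments=["coursework", "exam"])
print(spec.validate())  # [] – within the minimal field set
```

The appeal of a minimal field set is exactly this: the fewer fields a form has, the easier it is to validate, to explain to academic staff, and to keep consistent across systems.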

Conversely, the PALET team at Cardiff are now having to add more fields to their programme and module forms as they integrate with SITS and gather more feedback from students. Again, you can see examples of these in the Design Studio. The T-Sparc project have also undertaken extensive stakeholder engagement (using a number of techniques, including video, which was part of another breakout session) and are now starting to work with a dedicated SharePoint developer to build their new webforms. To aid collaboration the user interface will have discussion tabs, and the system will then create a definitive PDF for a central document store; it will also be able to route the data into other relevant places such as course handbooks, KIS returns etc.
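The “approve once, route everywhere” pattern described above can be summarised in a few lines. The destinations and field names below are hypothetical stand-ins, not the actual T-Sparc SharePoint implementation; the point is simply that a single definitive record feeds the document store, the handbooks and the KIS return.

```python
# A minimal sketch, assuming the data flow described above: one approved module
# record is pushed to several consumers. Stand-in structures, not SharePoint.
import json

document_store = {}   # stand-in for the central document store (definitive record)
handbook_feed = []    # stand-in for course handbook content
kis_return = []       # stand-in for data routed into KIS returns


def publish_approved_module(module: dict) -> None:
    """Once a module form is approved, push the definitive record to each consumer."""
    definitive = json.dumps(module, indent=2)        # stand-in for the definitive PDF
    document_store[module["code"]] = definitive      # central document store
    handbook_feed.append({"code": module["code"],    # subset needed by handbooks
                          "title": module["title"],
                          "summary": module["summary"]})
    kis_return.append({"code": module["code"],       # subset needed by KIS returns
                       "assessments": module["assessments"]})


publish_approved_module({
    "code": "ABC101",
    "title": "Introduction to Curriculum Design",
    "summary": "An illustrative module record.",
    "assessments": ["coursework", "exam"],
})
```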

As you can see from the links in the text, we are starting to build up a number of examples of course and module specifications in the Design Studio, and this will only grow as more projects share their outputs in this space over the coming year. One thing the group discussed, and which the support team will work with the projects to try to create, is some kind of checklist for course documentation creation based on the findings of all the projects. There was also a lot of discussion around the practical issues of course information management and general data management e.g. data creation, storage, workflow, versioning, instances.

As I pointed out in my previous post about the meeting, it was great to see so much sharing going on, and these experiences are now being disseminated via a number of routes, including the Design Studio.

Crib sheet for 2011 Educause Horizon Report

The 2011 Horizon Report from Educause again provides some clear indicators of key trends and drivers for technology in education. As ever, the report outlines key trends, critical challenges and short- and long-term forecasts, as well as providing an excellent set of resources for each of the identified trends. But if you haven’t time even to read the executive summary, here are the main points.

Key trends (building very much on previous years):

*The abundance of resources and relationships made easily accessible via the Internet is increasingly challenging us to revisit our roles as educators in sense-making, coaching, and credentialing.
*People expect to be able to work, learn, and study whenever and wherever they want.
*The world of work is increasingly collaborative, giving rise to reflection about the way student projects are structured.
*The technologies we use are increasingly cloud-based, and our notions of IT support are decentralized.

Critical Challenges:
*Digital media literacy continues its rise in importance as a key skill in every discipline and profession.
*Appropriate metrics of evaluation lag behind the emergence of new scholarly forms of authoring, publishing, and researching.
*Economic pressures and new models of education are presenting unprecedented competition to traditional models of the university.
*Keeping pace with the rapid proliferation of information, software tools, and devices is challenging for students and teachers alike.

Technologies to watch – near-term horizon (technologies expected to gain widespread adoption within the next year):
*electronic books – as they develop they are changing “our perception of what it means to read”
*mobiles – “increasingly a user’s first choice for Internet access”

Second adoption horizon (technologies expected to gain widespread adoption in 2 to 3 years from now):
*Augmented reality
*Game-based learning

Far-term horizon (technologies expected to gain widespread adoption in 4 to 5 years from now):
*Gesture based computing
*Learning analytics

Assessment and Feedback – the story from 2 February

Many thanks to my colleague Rowin Young and the Making Assessment Count project at the University of Westminster for organising a thoroughly engaging and thought-provoking event around assessment and feedback yesterday. I just got my Storify invite through this morning, so to give a flavour of the event, here is a selected tweet story from the day.

What technologies have been used to transform curriculum delivery?

The Transforming Curriculum Delivery through Technology (aka Curriculum Delivery) Programme is now finished. Over the past two years, the 15 funded projects have all been on quite a journey and have between them explored the use of an array of technologies (over 60), from Excel to Skype to Moodle to Google Wave.

The bubblegram and treegraph below give a couple of different visual overviews of the range of technologies used.

As has been reported before, there’s not been anything particularly revolutionary or cutting edge about the technologies being used. The programme did not mandate any particular standards or technical approaches. Rather, the projects have concentrated on staff and student engagement with technology, which of course is the key to having real impact on teaching and learning. The technologies themselves can’t do it alone.

The sheer number of technologies being used does, I think, show an increasing confidence and flexibility not only among staff and students but also in developing institutional systems. People are no longer looking for the magic out-of-the-box solution and are more willing to develop their own integrations based on their real needs. The ubiquity of the VLE does come through loud and clear.

There are still some key lessons coming through.

* Simple is best – don’t try and get staff (and students) to use too many new things at once.
* Have support in place for users – if you are trying something new, make sure you have the appropriate levels of support in place for users.
*Tell people what you are doing – talk about your project, wherever you can and share your objectives as widely as possible. Show people the benefits of what you are doing. Encourage others to share too.
*Talk to institutional IT support teams about what you are planning – before trying to use a new piece of software, make sure it will work within your institutional network. IT teams can provide invaluable information and advice about what will and won’t work. They can also provide insights into scalability issues for future developments. A number of the projects have found that although web 2.0 technologies can be implemented relatively quickly, there are issues when trying to increase the scale of trial projects.

A full record of the technologies in use for the projects is available from our PROD project database. More information on the projects and a selection of very useful shareable outputs (including case studies and resources) is available from the Design Studio.

Thoughts so far on LAK11

Along with around 400 others worldwide, I’ve signed up for the LAK11 (Learning and Knowledge Analytics) MOOC run by George Siemens and colleagues at the Technology Enhanced Knowledge Research Institute (TEKRI) at Athabasca University. We’re now into week 2, and I think I’m just about getting into the swing of things.

When George was in the UK late last year, I managed to catch his presentation at Glasgow Caledonian, and I was intrigued by the concept of learning analytics, and in particular how we can start to use data in meaningful ways for teaching and learning. I wanted to know more about what learning analytics are, and so signed up for the course. I’ve also been intrigued by the concept of MOOCs, so this seemed like the ideal opportunity to try one out for myself.

In her overview paper, Tanya Elias provides a useful description: “Learning analytics is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study including business intelligence, web analytics, academic analytics, educational data mining, and action analytics.” (Elias, T. (2011) Learning Analytics: Definitions, Processes, Potential)

The course outcomes are:
*Define learning and knowledge analytics
*Map the developments of technologies and practices that influence learning and knowledge analytics as well as developments and trends peripheral to the field.
*Evaluate prominent analytics methods and tools and determine appropriate contexts where the methods would be most effective.
*Describe how “big data” and data-driven decision making differ from traditional decision making and the potential future implications of this transition.
*Design a learning analytics implementation plan at a course level. 
*Evaluate the potential impact of the semantic web and linked data on learning resources and curriculum.
*Detail various elements organizational leaders need to consider to roll out an integrated knowledge and learning analytics model in an organizational setting.
*Describe and evaluate developing trends in learning and knowledge analytics and develop models for their potential impact on teaching, learning, and organizational knowledge.

You can check out the full course syllabus here.

The fact that the course is open and non-accredited really appealed to me as, to be honest, I am a bit lazy and not sure if I wanted to commit to a formal course. The mix of online resources, use of tags, aggregation etc. fits right in with my working practices. I blog, I tweet, I’m always picking up bits of useful (and useless) information from my streams – so having a bit of focus for some activity sounded perfect. I’m a self-motivated kind of person, aren’t I?

But it’s never that simple, is it? Old habits die hard – particularly that nagging feeling of guilt about signing up for a course and then not reading all the suggested texts, not reading all the forum messages, not doing all the suggested activities. Is it just me that suffers from the tension between trying to be an engaged, self-motivated learner and everyday distractions and procrastination? I’ve had some very circular discussions with myself about why I’m not actually looking at the course material at times.

However, George and the team have been particularly good at reassuring people and emphasising that we need to “let go of traditional boundaries”. With a cohort this large it’s pretty near impossible to keep up with everything, so people are actively encouraged only to do what they can and to concentrate on what really interests them. The team promote “skim and dive” techniques – skim all the resources and dive into what catches your eye or interest. If you’ve been thinking about doing one of the MOOCs then I would recommend having a listen to the introductory Elluminate session (another great thing about open courses is that all the resources are available to everyone, anytime).

I’ve found the Elluminate sessions the most interesting so far. Not because the other resources provided aren’t as engaging – far from it. I think it’s more to do with the synchronous element and actually feeling part of a community. All the speakers so far have been very engaging, as has the chat from participants.

Last week, as an introduction to learning analytics, John Fritz of UMBC gave an overview of the work he’s leading in trying to help students improve their performance by giving them access to data about their online activity. They built a Blackboard building block called Check My Activity (CMA); you can read more about it here. John and colleagues are also now trying to use data from their LMS to help teachers design more effective online activities.
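The core comparison behind a “Check My Activity” style report is simple: show a student their own activity alongside the course average. Here is a minimal sketch of that calculation; the data structure and figures are hypothetical illustrations, not the actual Blackboard building block.

```python
from statistics import mean

# Hypothetical data: number of LMS logins per student for one course.
logins_by_student = {"student_a": 40, "student_b": 25, "student_c": 10, "student_d": 55}


def activity_report(logins: dict, student_id: str) -> str:
    """Compare one student's activity count with the course average."""
    course_avg = mean(logins.values())
    mine = logins[student_id]
    ratio = mine / course_avg if course_avg else 0
    return (f"You logged in {mine} times; the course average is {course_avg:.1f} "
            f"({ratio:.0%} of the average).")


print(activity_report(logins_by_student, "student_c"))
```

The interesting design question, as John discussed, is less the arithmetic and more what students then do with the comparison.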

This week’s topic is “The Rise of Big Data”, and on Tuesday Ryan Baker from Worcester Polytechnic Institute was in the Elluminate hot seat, giving us an introduction to Educational Data Mining (EDM). EDM draws heavily on data mining methodologies, but in the context of educational data. Ryan explained it as the distillation of data for human judgement – in other words, making complex data understandable and useful for non-information scientists. EDM and learning analytics are both growing research areas, and there are a number of parallels between them. We did have quite a bit of discussion about what exactly the differences were, which boiled down to the fact that both are concerned with the same deep issues, but learning analytics is perhaps broader in scope, uses more qualitative approaches to data, and is not as dedicated to data mining methodology as EDM. Ryan gave an overview of the work he has been doing around behaviour modelling with data generated by some of Carnegie Mellon’s Cognitive Tutor programmes, and how they are using the data to redesign activities to reduce, for example, students going “off task”. Again, you can access the talk from the course Moodle site.
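To give a feel for what behaviour modelling from tutor logs can look like, here is a deliberately simple, rule-based sketch that flags interactions which might indicate off-task behaviour or guessing. The thresholds, field names and rules are hypothetical illustrations; the detectors Ryan described are trained statistical models, not hand-written rules like these.

```python
# A minimal sketch, assuming a log of tutor steps with response times (seconds)
# and correctness. Thresholds are arbitrary values chosen for illustration.
from typing import List


def label_off_task(events: List[dict], long_pause: float = 80.0, rapid_fire: float = 1.0) -> List[tuple]:
    """Label each logged step from its response time and correctness."""
    labels = []
    for e in events:
        if e["response_time"] > long_pause:
            labels.append((e["step"], "possible off-task pause"))
        elif e["response_time"] < rapid_fire and not e["correct"]:
            labels.append((e["step"], "possible guessing/gaming"))
        else:
            labels.append((e["step"], "on task"))
    return labels


log = [{"step": 1, "response_time": 12.0, "correct": True},
       {"step": 2, "response_time": 0.6, "correct": False},
       {"step": 3, "response_time": 140.0, "correct": True}]
print(label_off_task(log))
```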

Next week I’m hoping to be doing a bit more diving, as the topic is the Semantic Web, Linked Data and Intelligent Curriculum. Despite the promise, there really isn’t that much evidence of linked data approaches being used in teaching and learning contexts, as we found with the JISC funded SemTech report and more recently when Lorna Campbell and I produced our briefing paper on The Semantic Web, Linked and Open Data. I think that there are many opportunities for using linked data approaches – the Dynamic Learning Maps project at the University of Newcastle is probably the best example I can think of. However, linking data within many institutions is a key problem. The data is invariably not in a standard form, and even when it is there’s a fair bit of housekeeping to be done. So finding linkable data to use is still a key challenge, and I’m looking forward to finding out what others are doing in this area.
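As a small illustration of the “housekeeping” step, here is a sketch of lifting a plain course record into RDF so it can be linked to other data. It uses the rdflib Python package; the course vocabulary namespace is a hypothetical example, not an agreed standard for course information.

```python
# A minimal sketch, assuming rdflib is installed (pip install rdflib).
# The example.org identifiers and the "course" vocabulary are illustrative only.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

COURSE = Namespace("http://example.org/vocab/course#")

g = Graph()
g.bind("dcterms", DCTERMS)
g.bind("course", COURSE)

module = URIRef("http://example.org/modules/ABC101")
g.add((module, RDF.type, COURSE.Module))
g.add((module, DCTERMS.title, Literal("Introduction to Curriculum Design")))
g.add((module, COURSE.credits, Literal(20)))

print(g.serialize(format="turtle"))
```

Once course records are expressed as triples like this, linking them to other datasets, internal or external, becomes a matter of sharing identifiers rather than exporting and re-importing spreadsheets.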

Design Studio video walkthrough

Finding resources from JISC programmes is a perennial problem. Websites wither and die once funding ends, people move on, we forget project names, and resources become increasingly difficult to track down. The current JISC Curriculum Design and Delivery Programmes are trying to help solve this problem through the development of the Design Studio.

The Design Studio is a wiki-based resource which links and contextualizes resources created by the projects in both programmes, and other related resources from previous JISC and HEA funded activities. As part of a session for the upcoming JISC Online conference, Marianne Sheppard (JISC Infonet) has created a short introductory video to the Design Studio – if you are interested in tracking down resources related to innovative teaching and learning practice then this is a good place to start, and the video provides an excellent introduction to the resource.

Challenging times, challenging curriculum(s)

The fact that we are living in increasingly challenging times is becoming ever more apparent. With the release of the Browne Report on HE funding and student finance, and the results of the Comprehensive Spending Review imminent, we are faced with radical changes to the current models of funding for our universities. This is raising fundamental questions about the nature of teaching and learning provision, the role and relationship of students to institutions, the role and relationship of institutions and government, and how institutions work with industry (in the widest sense of the word). It was against this complex backdrop that the current JISC funded Curriculum Delivery and Design programmes held a joint programme meeting last week in Nottingham. The projects in these programmes are all grappling with issues around the effective use of technology to enhance curriculum design and delivery processes and to provide a range of more flexible, adaptable curricula.

The meeting began with a very timely keynote from Peter Finlay of the QAA. Dispelling some of the current myths around the point of, and processes involved in, QAA audits, Peter illustrated how the inter-dependencies of what he described as the “triad” forces (State, Institutions and National Agencies) influence quality assurance processes. The triad tends to work in a cyclical fashion, with the interactions and developments of each stakeholder oscillating between extremes of autonomy within institutions and extremes of regulation from the State, the latter most noticeably enforced by QA procedures. Peter highlighted how forward-thinking institutions can use the QA process to create and foster institutional cultures of enquiry, based on informed reflection, which should allow planned enhancement strategies.

The work of both the curriculum design and delivery programmes is already helping the institutions involved to take this approach, as the projects are fundamentally about transforming course delivery and the course design and validation processes. Peter encouraged projects to promote and enhance the work they are doing. The current political context is unpredictable. However, by being proactive, institutions can influence the practice of QA. Peter finished by restating that he felt the programmes, and the work already highlighted within the Design Studio, are of great relevance and a major asset to the wider community.

The rest of the first day was then divided into a number of breakout sessions centred around barriers and drivers to institutional change. Notes from each of the sessions will be available from the Circle website later this week. The day culminated with the Great Exhibition Awards Ceremony. Each of the Delivery projects set up their stall (you can get a feel for the stands from the pre-event adverts for each project in the Design Studio). Delegates had time to visit each stand and then vote. The two runaway winners were Springboard TV (College of West Anglia) and Integrate (University of Exeter). Both teams thoroughly deserved their outrageous chocolate prizes.

The second day started with another timely keynote, this time from Professor Betty Collis. Betty’s talk focused on her experiences of learning from a workplace perspective – in particular through some of the key trends from her experience of working with Shell. Taking us on a journey through some of the stages in the development of task-orientated, work-based learning activities, Betty explained how they had developed a culture change from “I learn from myself, through to I learn with my group, to I learn in order to contribute to the learning of others throughout the enterprise.” Quite a leap – even for highly qualified professionals. Shell had identified that their new graduate staff (even those at PhD level) had little experience of multidisciplinary, high-pressured team-working situations. By introducing a framework encapsulated by three verbs – “ask, share, learn” – Betty and her team fostered the notion of coaching and effective organisational knowledge sharing. The use of a wiki as a common platform for knowledge sharing was fundamental to this process.

Betty encouraged the audience to think about formal education settings in a similar way: by designing more cross-discipline activities to help develop sharing, coaching and team-working skills, and by starting to think of e-portfolios not just as individual collation tools but as shared learning resources. She also challenged the programme’s definition of design for learning, which “refers to the complex processes by which practitioners devise, structure and realise learning for others”, and asked us to reframe our thinking: is it ultimately the task of formal education to foster methods for learners (and teachers) to work with others and become more mature members of a learning organisation?

A number of the breakout sessions again highlighted some of the inroads projects are making in these areas. Student engagement was high on the agenda, and the Integrate project from the University of Exeter has some excellent examples of students acting as real change agents.

The meeting finished with a panel session, which unsurprisingly focused on many of the issues the Browne report highlighted – particularly around fees and contact hours. Today’s education space is more complicated than ever. At a sectoral level we need to get politicians to understand the complexities, and we need to be able to provide accurate, up-to-date information about courses at a range of levels for a range of stakeholders. We are of course making good inroads with the work of XCRI in particular, but we need to do more, and think more about how we can harness the principles of linked data to share information internally and externally. Peter Finlay also highlighted the need for greater clarity about when students are part of the learning partnership and when they are more service-based customers, i.e. paying for halls of residence as opposed to choosing a course of study. We need to ensure that students are able to commit to a learning partnership, as co-creators of knowledge and not just passive recipients.

We live in challenging times. However, there is a huge amount of experience within these two programmes (and across a range of JISC funded projects and beyond). We need to ensure that the lessons learned about the effective use of technology throughout the curriculum design and delivery process are being used as positive change agents to help us ensure the quality of our sector.

More information about the programme meeting is available from the Circle website, and resources from the projects are available from the Design Studio. A timeline of the event’s Twitter activity is also available online.

Design Bash update

Due to holidays etc. I’ve been a bit late in reporting back on the Design Bash we held in conjunction with the 2010 European LAMS conference last month at the University of Oxford.

This is the third Design Bash I’ve been involved in organising, and they’re probably closest in style and structure to an unconference. There is no pre-set agenda, and the main aim of the day is to foster meaningful, extended dialogue between delegates – in other words, just allowing people to speak to each other. This year, the groups divided along a number of lines. One group spent most of the day discussing the “critical success factors for curriculum design”. Paul Bartholomew, from the T-SPARC project at BCU, helpfully created a mindmap of the discussion.

In contrast to these more cerebral discussions, there were a number of mini-demonstrations of tools and systems, including the GLO tool, LdShake, CompendiumLD and Wookie. Again, links to all the tools are available online from the Design Bash Cloudworks site.

James Dalziel demoed a number of new features of the LAMS system, such as embedding, which many of the delegates were interested in. At last year’s Design Bash, embedding and previewing of designs was a key theme of many of the discussions, so it was great to see how, over the year, the discussion has developed into an actual implementation.

Members of the LDSE project team attended, and the day provided a great opportunity for the team to discuss and develop potential integrations with others. For example, Bill Olivier and Diana Laurillard had a very fruitful discussion about LDSE using the IDIBL framework that the University of Bolton have developed.

Unlike last year’s event there wasn’t very much activity around the sharing of designs, and I’m not sure if that was due to the size of this year’s event – there were quite a few more people in attendance – or if it was simply down to the overriding interests of participants this year. If we run the event again next year, we may have a slightly more structured agenda, with dedicated demo slots and a more structured technical stream. We also discussed the possibility of running a similar event online. This is something we may well investigate further, and it certainly has possibilities. The Cloudworks site itself does allow for a level of interactivity; however, I did notice that there wasn’t as much external contribution this year compared with last. Again, this may just be down to the fact that we had more people there in person.

Overall though, there was very positive feedback from delegates on the day. You can view (and comment on and contribute to) all the resources from the day on Cloudworks.

Making assessment count, e-Reflect SUM released

Gunter Saunders and his team on the Making Assessment Count project (part of the current JISC Curriculum Delivery programme) have just released a SUM (service usage model) describing the process they have introduced to engage students (and staff) in the assessment process.

“The SUM presents a three stage framework for feedback to students on coursework. The SUM can act to guide both students and staff in the feedback process, potentially helping to ensure that both groups of stakeholders view feedback and its use as a structured process centred around reflection and discussion and leading to action and development.”

You can access the e-Reflect SUM here.