REAPing the benefits of transformation

Attendees at last September’s SIG meeting will remember Martin Hawksey’s lively presentation on the Re-Engineering Assessment Practices in Scottish Higher Education (REAP) project.  Funded by the Scottish Funding Council and supported by JISC, the project explored ways in which technology can be used to enhance and transform assessment practice in large first year university classes, resulting in enhanced learner skills, greater achievement rates, and deeper engagement.

A final report on the project is now available.  It covers project achievements and lessons learned; preparing for, managing and coping with large-scale organisational change; the pedagogic principles underlying transformation; and a study of electronic voting systems (EVS) and the surprising impact they can make on learning and achievement.

The figures reported are impressive: one course saw mean pass marks rise from 51.1% to 57.4%, another’s examination failure rate dropped from 24% to 4.6%, and a third recorded a 10.4% gain in mean examination marks.  Hundreds of hours of staff time were saved through reductions in lectures and tutorials and the use of online assessments, while students actually spent more time ‘on task’, and staff-student contact became more supportive and facilitative.  Self-assessment and peer assessment gave students greater responsibility for, and ownership of, their learning, to which students generally responded positively.

As the report concludes, ‘these findings suggest that these processes of transformation are a plausible prospect more generally in the HE sector’.

The Horizon Report 2008

The 2008 Horizon Report, fifth in the annual series, is now available online.  A collaboration between the New Media Consortium (NMC) and the EDUCAUSE Learning Initiative, the report examines six emerging technologies that the authors predict ‘will likely enter mainstream use in learning-focused organizations … over the next one to five years’.

The report focuses on six technologies or practices in particular: grassroots video and collaboration webs, predicted to enter the mainstream over the next year; mobile broadband and data mashups (two to three years); and collective intelligence and social operating systems (four to five years).  The emphasis is on educational applications of these technologies, with a range of example projects and products illustrating them in action.

Earlier Horizon reports and other publications can also be freely downloaded from the NMC site.

Joint CETIS Assessment and Educational Content SIGs meeting announced

Registrations are now open for our next Assessment SIG meeting, and you’re warmly invited to book your place for this event hosted by the University of Cambridge.  It’s a joint meeting with the CETIS Educational Content SIG, something we’ve been planning to do for some time, looking in particular at two standards of interest to assessment: IMS Common Cartridge and IMS Tools Interoperability.

Common Cartridge hasn’t even been released yet, but it has already generated significant interest among content vendors and publishers and has been heavily promoted by IMS.  It combines profiles of a number of different standards, including IMS Content Packaging v1.1.4, IMS Question and Test Interoperability v1.2.1, IMS Tools Interoperability Guidelines v1.0, IEEE LOM v1.0, and SCORM v1.2 and 2004.  The resulting learning object package, or ‘cartridge’, is intended to be fully interoperable across any compliant system, allowing content to be delivered to any authorised individual.
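In practical terms a cartridge is a zip archive with a manifest at its root describing the resources it carries. The sketch below illustrates that package-plus-manifest idea only: the manifest content, resource types and file names here are simplified inventions, not the normative Common Cartridge schema.

```python
import io
import xml.etree.ElementTree as ET
import zipfile

# A minimal, illustrative manifest -- NOT the normative Common Cartridge
# schema, just a sketch of the 'zip plus manifest' packaging idea.
MANIFEST = """<?xml version="1.0" encoding="UTF-8"?>
<manifest identifier="sample-cartridge">
  <resources>
    <resource identifier="item1" type="imsqti_xmlv1p2" href="items/item1.xml"/>
    <resource identifier="page1" type="webcontent" href="web/intro.html"/>
  </resources>
</manifest>
"""

def build_cartridge() -> bytes:
    """Assemble a toy 'cartridge': a zip with a manifest at its root."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as cc:
        cc.writestr("imsmanifest.xml", MANIFEST)
        cc.writestr("items/item1.xml", "<questestinterop/>")
        cc.writestr("web/intro.html", "<html><body>Welcome</body></html>")
    return buf.getvalue()

def list_resources(cartridge: bytes) -> list:
    """An importer starts by reading the manifest's resource hrefs."""
    with zipfile.ZipFile(io.BytesIO(cartridge)) as cc:
        root = ET.fromstring(cc.read("imsmanifest.xml"))
    return [r.get("href") for r in root.iter("resource")]

print(list_resources(build_cartridge()))  # ['items/item1.xml', 'web/intro.html']
```

The point of the manifest is that a receiving system never needs to guess at the zip’s layout: everything deliverable is declared up front, which is what makes the same cartridge portable across compliant systems.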

The appeal of Common Cartridge, coupled with authentication and digital rights management systems, to content publishers is clear, and the specification is particularly suited to the American educational system, where the relationship between content vendors and courses is closer than in UK Higher Education; in the UK, its primary impact may be in the schools and Further Education sectors, which have more of a history of buying content from publishers than HE does.  The inclination of many UK HE lecturers to produce their own content and the bespoke nature of many higher level courses are issues we’ve already encountered when looking at topics such as open content and item banking, but there is some interest within UK education, in particular from the Open University.  For a major content producer whose resources are used far beyond its own courses, Common Cartridge has clear potential, and Ross McKenzie and Sarah Wood of OU OpenLearn will offer an insight into their experiences of implementing the specification and developing cartridges.  There has been very little work to date on the delivery of assessment material through Common Cartridge, a topic which will be addressed by Niall Barr of NB Software.  Our own Wilbert Kraan and Adam Cooper will update delegates on the current position of Common Cartridge.

IMS Tools Interoperability has received rather less fanfare, but is a valuable specification which takes a web services approach to seamlessly integrating different tools.  It allows specialist tools to be ‘plugged in’ to a learning management system, such as integrating a sophisticated assessment management system with a VLE which only provides limited native support for assessment, or discipline-specific tools such as simulators.  It also supports accessibility requirements through the (optional) incorporation of the user’s IMS Accessibility Learner Information Package profile to allow silent interface configuration.  Warwick Bailey of Icodeon will be discussing his experiences with the specification.
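The core idea is that the learning management system keeps a registry of external tools and hands activities off to whichever tool is registered for them. The toy sketch below illustrates that delegation pattern only; all the names in it (`ToolRegistry`, `launch`, the context dictionary) are hypothetical illustrations, not the actual IMS Tools Interoperability wire format, which exchanges web-service messages between the systems.

```python
# A toy sketch of the 'plug a specialist tool into an LMS' idea behind
# Tools Interoperability. Every name here is hypothetical; the real
# specification defines a web-services message exchange, not Python calls.
from typing import Callable, Dict

class ToolRegistry:
    """Stand-in for the LMS side: a registry of pluggable external tools."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable[[dict], dict]] = {}

    def register(self, activity_type: str, launcher: Callable[[dict], dict]) -> None:
        # The LMS learns that activities of this type are handled externally.
        self._tools[activity_type] = launcher

    def launch(self, activity_type: str, context: dict) -> dict:
        # The LMS passes user/course context to the tool and gets an
        # outcome back (in real TI this round trip is a web-service call).
        return self._tools[activity_type](context)

def assessment_tool(context: dict) -> dict:
    """Stand-in for a sophisticated assessment engine plugged into a VLE."""
    return {"user": context["user"], "outcome": "score", "value": 0.8}

registry = ToolRegistry()
registry.register("assessment", assessment_tool)
result = registry.launch("assessment", {"user": "s123", "course": "CS1"})
print(result["value"])  # 0.8
```

The benefit, as the paragraph above notes, is that a VLE with only limited native assessment support can delegate that work to a specialist system without the learner ever leaving the VLE.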

In the morning, Steve Lay of CARET, University of Cambridge, will be providing an update on the current state of IMS QTI v2.1.  Steve is co-chair of the IMS QTI working group (with Pierre Gorissen of SURF and Fontys University).

The afternoon will feature a presentation by Linn van der Zanden of the Scottish Qualifications Authority on the use of wikis and blogs in education and assessment, picking up on an increasing interest in the use and potential of Web 2.0 technologies in this domain.

The meeting is co-located with a workshop run by the three JISC Capital Programme projects focusing on assessment, to which you are also invited.

Capital Programme dissemination workshop announced

The three JISC Capital Programme projects working on assessment will be hosting a dissemination workshop at the University of Cambridge the day before the joint CETIS Assessment and Educational Content SIGs meeting at the same venue.  The workshop will feature demonstrations of the tools and discussion of future directions for the programme, and will explore ways of building an open source development community to support it.

In the morning, participants will have a chance to see the tools demonstrated and the role of web services in delivering an end-to-end assessment process. The afternoon session will split into two tracks.  Track I, Building an Open Source Community to support QTI-based tools, will have a technical focus, incorporating discussion on implementation issues and introducing participants to the projects’ open source development support.  Track II, Innovation and Interoperability in Assessment, will look at some of the issues around assessment and evaluation software within the community together with more innovative and imaginative uses of QTI.

The projects on display will be of considerable interest.  As the first implementations of IMS Question and Test Interoperability v2.1 freely available to the community, they provide functionality to support assessment from authoring to delivery.  AQuRate, based at Kingston University, supports item authoring, with its attractive and friendly user interface a particularly notable feature.  Minibix, based at the University of Cambridge, provides item banking functionality suitable both for high stakes private item banks for summative assessment and for low stakes item banks for resource sharing and formative assessment.  The trio is completed by ASDEL, based at the University of Southampton, which provides a range of small web-based tools for test delivery, test validation, test management and basic test construction.
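What flows between these tools is the QTI item itself: an XML document that declares the correct response separately from the interaction presented to the candidate, so that authoring, banking and delivery systems can all process it. The fragment below is a heavily simplified sketch of that shape, with namespaces and many required attributes omitted, so it should not be read as a valid QTI 2.1 document.

```python
import xml.etree.ElementTree as ET

# A heavily simplified QTI 2.1-style multiple-choice item: namespaces and
# many required attributes are omitted, so this is a sketch of the
# declare-response / present-interaction structure, not valid QTI.
ITEM = """<assessmentItem identifier="capital-q1">
  <responseDeclaration identifier="RESPONSE" cardinality="single">
    <correctResponse><value>B</value></correctResponse>
  </responseDeclaration>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" maxChoices="1">
      <prompt>Which city is the capital of Scotland?</prompt>
      <simpleChoice identifier="A">Glasgow</simpleChoice>
      <simpleChoice identifier="B">Edinburgh</simpleChoice>
      <simpleChoice identifier="C">Aberdeen</simpleChoice>
    </choiceInteraction>
  </itemBody>
</assessmentItem>
"""

def score(item_xml: str, candidate_choice: str) -> bool:
    """Compare a candidate's choice with the declared correct response."""
    root = ET.fromstring(item_xml)
    correct = root.find("./responseDeclaration/correctResponse/value").text
    return candidate_choice == correct

print(score(ITEM, "B"), score(ITEM, "A"))
```

Because the correct response lives in the item rather than in any one system, an item authored in one tool can be banked in a second and scored by a third, which is exactly the end-to-end pipeline the three projects demonstrate.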

As with the SIG meeting, registration is free and open to all.

Assessment in 2008: looking forward

Gales are howling, trains in chaos, so it must be January and time to look ahead to what 2008 has in store…

The final release of QTI v2.1 should be out this spring, and it’ll be interesting to see what uptake is like.  This will be the most stable and mature version of the specification to date, supported by a long public draft stage and a number of implementations.  Angel Learning are a significant commercial early adopter, and other vendors are bound to be watching their experiences and whether Angel’s embrace of the specification affects demand for QTI 2.1 among their own customers.

Other significant implementers of 2.1 are the JISC Capital Programme projects, which will be concluding around March.  AQuRate offers an item authoring tool, Minibix provides support for a range of item banking functions, and ASDEL is an assessment delivery engine which supports both standalone use and integration with a VLE.  These projects should deliver quality resources to the community and provide a firm foundation for use of the specification.  There was a sneak preview of these projects at our last SIG meeting.

Talking of SIG meetings, dates for the next two meetings can now be confirmed. 

On 19 February there will be a joint meeting with the CETIS Educational Content SIG in Cambridge.  This meeting will cover a range of shared concerns, including new content-related specifications such as Common Cartridge and Tools Interoperability, and innovative approaches to educational material and assessment.  Information about this meeting and online registration will be available very soon.  The meeting will be preceded by a workshop hosted by the Capital Programme projects discussed above.

The focus shifts from assessment as content to assessment as process with another joint meeting on 1 May in Glasgow.  This meeting will be a joint meeting with the CETIS Portfolio and Enterprise SIGs and will offer an opportunity to explore some of the shared issues in these domains.  Again, information on the event will be available on the mailing lists, on this blog and on the website in due course.

Another event of note is the annual International Computer Assisted Assessment Conference on 8 and 9 July at Loughborough.  The call for papers is already out, with submissions due by 29 February.  As always, this should be a lively and important event in the CAA calendar.  Alt-C 2008, Rethinking the Digital Divide, will be held in Leeds on 9 – 11 September; again, the closing date for submissions is 29 February.  There’s also a regularly updated list of non-CETIS assessment related events on the wiki.

And what about the trends for eassessment in 2008?  The results of Sheila’s poll, with their strong emphasis on Web 2.0 technologies and possibilities, do seem to reflect the comments on the last meeting’s evaluation forms, which suggested increasing interest in innovative technologies, significant concern with transforming and enhancing the assessment experience, and direct engagement with teaching and learning rather than the more abstract issues of standards and specifications for their own sake.  It will be interesting to see how the more ‘traditional’ XML-based QTI v2.1 fares in the light of the increasing popularity of mashups and web services in 2008.

Assessment SIG meeting, 26 September 2007

Academics and developers met in Glasgow recently to participate in the most recent Assessment SIG meeting. The very full agenda covered a range of topics, both technical and pedagogic, and presentations led to some lively discussions.

Myles Danson of JISC opened the day by presenting JISC’s views and priorities for eassessment, as well as pointing to some future work they will be undertaking in the domain.

Yongwu Miao of the Open University of the Netherlands discussed work undertaken by the TENCompetence Project, with a particular focus on the relationship between IMS QTI and IMS Learning Design and the work they have done in this area. Dick Bacon of the University of Surrey and the HEA discussed the relationship between different varieties or ‘dialects’ of QTI, exploring some of the implementation and interpretation issues that hinder or break interoperability between systems nominally implementing the same version of the specification. CAL Consultant Graham Smith pleased the audience with news that a new Java version of his QTI demonstrator will be available shortly with updated support for QTI 2.0 items, which should help in the identification and resolution of implementation problems.

Martin Hawksey of the University of Strathclyde presented the work of the Re-Engineering Assessment Practices project. With a focus on real world assessment experiences, including an impressive collection of case studies exploring the impact of transformation within assessment practices, the REAP project was of particular interest to participants. Also of great interest, and perhaps unsurprisingly sparking the greatest amount of debate, was the exploration of ‘Assessment 2.0’ presented by Bobby Elliott of the Scottish Qualifications Authority. Bobby looked at ways in which Web 2.0 technologies can be used to enhance and modernise assessment in ways which engage and appeal to increasingly digitally literate learners.

The day also featured several demonstrations of tools under development. Niall Barr of NB Software demonstrated his current work, an assessment tool which utilises the IMS QTI, Content Packaging and Common Cartridge specifications, while Steve Bennett of the University of Hertfordshire demonstrated MCQFM, a JISC-funded tool which provides a simple text-based format for converting and editing items between formats. Two more JISC projects closed the day. AQuRate, presented by Alicia Campos and David Livingstone of Kingston University, is an elegant item authoring tool, while ASDEL, presented by Jon Hare of the University of Southampton, is an assessment delivery tool which builds on the R2Q2 project to provide a fuller test tool. A third project, Minibix (University of Cambridge), on item banking, is working closely with AQuRate and ASDEL.

Links to presentations (via slideshare), project websites and other information can all be found on our wiki: http://wiki.cetis.org.uk/JISC_CETIS_Assessment_SIG_meeting%2C_26_September_2007.

Assessment for Learner Responsibility

On Monday, I attended a Learning Enhancement Network event here at Strathclyde on Assessment for Learner Responsibility.  Strathclyde is in the process of revising its assessment policy, and this event brought together staff from across the university, together with a number of student representatives and some colleagues from other universities. 

The University of Edinburgh has recently completed a similar process, and Nigel Seaton from Edinburgh’s College of Science and Engineering presented the outcomes.  It was particularly interesting to see that all assessment in the College is now formative, in that students receive feedback for all the work they do – including formal examinations.  I really like this: as both a student and a tutor I always found it hugely frustrating not being able to get or give feedback from the most important assessments beyond a bald grade or classification.  This is particularly important for students who perform significantly worse in these assessments than in earlier coursework, and who are often bewildered, demoralised and demotivated by the lack of information provided.  The Data Protection and Freedom of Information Acts make the disclosure of examiners’ comments a legal obligation, but it’s nice to see this spun positively and used as a real learning opportunity rather than just a warning to markers not to make rude comments on exam scripts.

The College will also be introducing an eportfolio system, not for PDP but for use as a subject-specific learning and reflection aid for students, another very appealing idea. 

Jim Baxter from Strathclyde’s Department of Psychology gave a very entertaining presentation on collaborative WebCT-based activities introduced to the first year course in collaboration with the REAP project.  This involved a lot of collaborative work, something which had already been raised by Nigel and which was returned to in the afternoon’s breakout discussions.  I’ve never been a fan of group assessment, peer assessment, and other activities which force people into social models regardless of whether that is what is right for them, and I was pleased to see that I wasn’t the only one who felt some unease about compelling students to take part in such activities.  There was genuine agreement that there’s a need to respect all different learning styles and that there may be a tension between fashionable approaches and what’s actually best for an individual.  Staff reported that the majority of students themselves said that they dislike group work, although it’s regarded far more positively in post-graduation surveys – which is rather interesting in itself; perhaps extroverts are more likely to complete surveys…?

One issue which concerned me is the observation that a student can be prevented from sitting the final examination for a course if they haven’t participated in group activities.  The justification for this is that these activities are detailed in the course materials, so students knew what they were signing up for – but surely students should be studying a course because they’re interested in the subject, not because they can cope with the teaching style, and really shouldn’t be prevented from studying their chosen subject because they don’t suit a particular learning style.  The principle that students should have a choice in the methods and timing of assessment is therefore very welcome.

David Nicol presented the eleven principles of good assessment practice which were the initial outcomes from the university’s working group examining the assessment policy and from David’s long interest in this area.  These covered engagement: stimulating learning by clarifying what constitutes good practice, encouraging ‘time and effort’ on educationally purposeful tasks, providing high quality feedback, offering opportunities to close the feedback loop, encouraging positive motivational beliefs and self-esteem, and encouraging dialogue about learning between all stakeholders.  Other principles focused on empowerment: sharing responsibility for learning with students by facilitating self-assessment and reflection, giving learners choice in the nature, methods, criteria and timing of assessment, involving students in policy and practice decisions, and supporting the development of learner communities and social integration.

A particular strength of the day was the involvement of Strathclyde students and the opportunity the day gave for dialogue between staff and learners.  The greatest concerns which emerged from the afternoon breakout discussions were time, feedback and over-examining.

Deadline coordination is perhaps a particular issue at Strathclyde which has a very broad first year curriculum: in the Faculty of Law, Arts and Social Sciences, for example, students have to study five potentially quite disparate subjects, all of which tend to set the same deadlines for work.  There were occasional examples of the principle of ‘giving learners choice in the timing of assessment tasks’, for example the lecturer who negotiates deadlines with his classes.  Our group spent some time lusting after a hypothetical ‘assessment booking system’ which course coordinators could use to pick ‘slots’ for assessment deadlines – perhaps something the Enterprise SIG could look at :-)

Feedback was another major issue for the students.  They appreciate detailed constructive feedback, particularly where it explains marks in relation to published assessment criteria and offers suggestions for improving weak areas as well as highlighting strengths.  There were a few examples of bad practice, with ‘feedback’ consisting of a smiley face and a mark being a particular low point, but not many examples of truly detailed feedback.  The timeliness of feedback was also a concern: as one student observed, ‘how can I learn from my feedback if I don’t get it until after I’ve submitted my next assignment?’

Students and staff were both concerned that students are being over-examined.  Virtually every piece of work a student submits contributes towards the final mark for the course, meaning that students don’t have a space to fail.  The students in our group actually wanted more formative assessment – practice exercises, particularly when undertaking a new type of task, and the opportunity to experiment and learn before being summatively assessed.

Other issues that emerged were the desire to offer staff incentives and rewards for innovative and engaging projects, training for staff in how to write content that will actually engage learners, and a real desire to share innovation throughout the university.

Massively Multi Learner

The HEA Information and Computer Science subject centre recently ran a workshop, ‘Massively Multi Learner’, on learning in multi user virtual environments which I was fortunate enough to be able to attend.

Perhaps inevitably, the presentations on the day were heavily skewed towards Second Life, a fact that I was glad to see the organisers themselves acknowledged as not necessarily ideal.  Unfortunately, Carl Potts, who had been scheduled to speak on learning within guilds in World of Warcraft, was unable to attend, but Laz Allen of TPLD (standing in for Helen Routledge) provided a non-SL and more game-orientated perspective on emerging technologies.  Of particular interest was this presentation’s emphasis on the assessment of game-based learning and of gaming activities, both through reflection and debriefing and through the logging and interpretation of in-game activities with reference to an identified set of skills.  Unlike commercial off-the-shelf (COTS) games and other resources such as SL, games specifically designed for learning can offer a more effective balance of learning objectives, subject matter content and gameplay, with assessment – often itself highly innovative – integrated from the outset.

The rest of the presentations all referenced SL to a greater or lesser extent.  I hugely enjoyed Aleks Krotoski’s work on social networking in virtual worlds, in particular her identification of 75 avatars (“they know who they are”) who form “the feted [fetid?] inner core of Second Life”.  Unlike either single-player or MMO games, MUVEs such as SL are inherently socially orientated rather than goal-orientated; ‘success’ comes not necessarily from the accumulation of in-game objects or from PvP or PvE pwnage but from occupying key, extremely powerful positions within social networks.  As an infrequent and rather ‘resistive’ SLer, I feel strongly that the lack of scaffolding within SL – in contrast to the carefully balanced quest structure in games such as WoW, which directs players through the game world and encourages casual grouping – makes social relationships within SL disproportionately important.

Other presentations explored some of the many purposes to which SL is being put.  Dave Taylor of the National Physical Laboratory discussed some very exciting international collaboration which has been taking place in the Space Island cluster, while Peter Twining demonstrated the Schome island pilot on the teen grid, which is trialling SL as a learning space for a group of ‘gifted and talented’ learners.  Jeremy Kemp discussed Sloodle, an integration of SL and Moodle which uses mashups to connect the two systems.  The integration of SL and Moodle also offers the potential for resolving accessibility issues around SL by offering meaningful real time alternatives to inworld communications.

The final three speakers had all integrated SL closely into their teaching practice.  Mike Hobbs of Anglia Ruskin University described scripting tasks undertaken by second year Computing Science students to create learning resources used to explain computing concepts to first year students, while Annabeth Robinson (well known in SL as AngryBeth for her creative and practical objects) described the options her Design for Digital Media students had for working in SL and particularly for using it as a tool for machinima.  Mike Reddy provided an entertaining end to the day, looking at various ways in which Second Life can be integrated into a range of courses.

Digital Literacy, Podcasting and eLearning – trainerspod webinar

Yesterday afternoon saw my second webinar in two days, this time a session on digital literacy, podcasting and elearning led by Graham Attwell of Pontydysgu.  Around 40 people took part from many countries.  Because of the way in which the session was run (which I’ll discuss in yet another blog post), the following is just a few impressions from the event rather than a proper report.  The archived session will be available online soon for those who would like to learn more about this topic.

The session was split into three sections: digital literacies and new pedagogic approaches; what is a podcast and how should it be used in education; and how to make a podcast.

Graham pointed out that traditional LMSs use a traditional, didactic, ‘push’ approach to learning.  In the new era of ‘elearning 2.0’, this should change to a more constructivist approach; however, there are many activities around at the moment that are constructivist in name only – as always, there is a need to examine what we’re actually doing instead of just optimistically applying labels like plasters and hoping they stick.

One quote, from Henry Jenkins, which particularly struck me was: ‘We need to shift the focus of the conversation about the digital divide from questions of technological access to those of opportunities to participate and to develop the cultural competencies and social skills needed for full involvement’.  Graham cited the example of George W Bush discussing what he’s ‘used on the Google‘ to help illustrate this point, and Senator Ted Stevens’ own personal internet is also always worth remembering (no, it’s not funny).

Clarence Fisher’s ‘eleven skills for participation’ were mentioned, and are worth repeating: play, performance, simulation, appropriation, multitasking, distributed cognition, collective intelligence, judgement, transmedia navigation, networking and negotiation. 

It was interesting to learn that I’m not the only person who dislikes the term ‘podcasting’ because of its close relationship to Apple and the iPod.  A few alternatives were suggested, with ‘audio report’ being the most popular, but – as with the whole Web 2.0 business – none has that snappy, cliquey, in-the-know connotation, which will probably leave us with ‘podcasting’ for some time to come.

It’s also worth mentioning that enhanced podcasts (enhanced audio reports?), which allow users to embed still images such as the ubiquitous PowerPoint slides, can be created with tools such as GarageBand, Audacity and Divicast; we made presentations from our joint Assessment and MDR SIGs meeting available as Breeze presentations, integrating MP3 recordings with the PowerPoint slides, and received a generally favourable reaction.

A key part of the new models of education and elearning is sharing, yet I felt that Graham made one of the most important points of all when he said that it’s about ‘learning to share, learning how to share, and learning how to have the right not to share’.  That’s something that will be very relevant to the TrainersPod webinar he’ll be leading on eportfolios in the new year: I’ll certainly be there.