2020 vision

The third Future of the Internet survey sponsored by Pew Internet will be available online for the next few weeks and is well worth participating in.  Participants are encouraged to express their views on topics such as digital inclusion, DRM, privacy and digital identity, and virtual and mirror worlds in the year 2020, and can remain anonymous or identify themselves as they wish.  It’s a stimulating and thought-provoking exercise, as well as an opportunity to contribute to a significant study on perceptions of our online futures.

You can also check out the results from previous surveys and a range of other internet-related resources; particularly fun are the predictions from the early 90s.  One that stood out for me was Eric Hughes’s 1992 comment, ‘In the world of the future, people will use low-cost Radio Shack equipment to spy on themselves to find out who they are’: in the world of FOAF and Facebook fakes, we need to spy on ourselves to find out who we’ve been constructed as.  No comments about tinfoil hats, thank you very much.

Assessment in 2008: looking forward

Gales are howling, trains in chaos, so it must be January and time to look ahead to what 2008 has in store…

The final release of QTI v2.1 should be out this spring, and it’ll be interesting to see what uptake is like.  This will be the most stable and mature version of the specification to date, supported by a long public draft stage and a number of implementations.  Angel Learning are a significant commercial early adopter, and other vendors are bound to be watching their experience to see whether Angel’s embrace of the specification affects demand for QTI 2.1 among their own customers.

Other significant implementers of 2.1 are the JISC Capital Programme projects, which will be concluding around March.  AQuRate offers an item authoring tool, Minibix provides support for a range of item banking functions, and ASDEL is an assessment delivery engine which supports both standalone use and integration with a VLE.  These projects should deliver quality resources to the community and provide a firm foundation for use of the specification.  There was a sneak preview of these projects at our last SIG meeting.
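For readers who haven’t looked inside the specification these tools implement, the sketch below shows roughly what a minimal QTI 2.1 multiple choice item looks like on the wire; the identifiers and question content here are invented for illustration, and items produced by tools like AQuRate will of course be richer.

```xml
<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
    identifier="scotCapital" title="Capital of Scotland"
    adaptive="false" timeDependent="false">
  <!-- Declares the expected response and which choice is correct -->
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse>
      <value>ChoiceA</value>
    </correctResponse>
  </responseDeclaration>
  <!-- The score outcome set by response processing -->
  <outcomeDeclaration identifier="SCORE" cardinality="single" baseType="float"/>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="true" maxChoices="1">
      <prompt>Which city is the capital of Scotland?</prompt>
      <simpleChoice identifier="ChoiceA">Edinburgh</simpleChoice>
      <simpleChoice identifier="ChoiceB">Glasgow</simpleChoice>
      <simpleChoice identifier="ChoiceC">Stirling</simpleChoice>
    </choiceInteraction>
  </itemBody>
  <!-- Standard template: score by matching the declared correct response -->
  <responseProcessing
      template="http://www.imsglobal.org/question/qti_v2p1/rptemplates/match_correct"/>
</assessmentItem>
```

The standard response processing templates (like match_correct above) are part of what makes interoperable delivery feasible: an authoring tool, an item bank and a delivery engine can all agree on scoring behaviour without exchanging custom code.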

Talking of SIG meetings, dates for the next two meetings can now be confirmed. 

On 19 February there will be a joint meeting with the CETIS Educational Content SIG in Cambridge.  This meeting will cover a range of shared concerns, including new content-related specifications such as Common Cartridge and Tools Interoperability, and innovative approaches to educational material and assessment.  Information about this meeting and online registration will be available very soon.  The meeting will be preceded by a workshop hosted by the Capital Programme projects discussed above.

The focus shifts from assessment as content to assessment as process on 1 May in Glasgow, with a joint meeting with the CETIS Portfolio and Enterprise SIGs offering an opportunity to explore some of the shared issues in these domains.  Again, information on the event will be available on the mailing lists, on this blog and on the website in due course.

Another event of note is the annual International Computer Assisted Assessment Conference on 8 and 9 July at Loughborough.  The call for papers is already out, with submissions due by 29 February.  As always, this should be a lively and important event in the CAA calendar.  Alt-C 2008, Rethinking the Digital Divide, will be held in Leeds on 9 – 11 September; again, the closing date for submissions is 29 February.  There’s also a regularly updated list of non-CETIS assessment related events on the wiki.

And what about the trends for eassessment in 2008?  The results of Sheila’s poll, with their strong emphasis on Web 2.0 technologies and possibilities, do seem to reflect the comments on the last meeting’s evaluation forms, which suggested increasing interest in innovative technologies, significant concern with transforming and enhancing the assessment experience, and direct engagement with teaching and learning rather than the more abstract issues of standards and specifications for their own sake.  It will be interesting to see how the more ‘traditional’ XML-based QTI v2.1 fares in the light of the increasing popularity of mashups and web services in 2008.

Assessment SIG meeting, 26 September 2007

Academics and developers met in Glasgow recently to participate in the most recent Assessment SIG meeting. The very full agenda covered a range of topics, both technical and pedagogic, and presentations led to some lively discussions.

Myles Danson of JISC opened the day by presenting JISC’s views and priorities for eassessment, as well as pointing to some future work they will be undertaking in the domain.

Yongwu Miao of the Open University of the Netherlands discussed work undertaken by the TENCompetence Project, with a particular focus on the relationship between IMS QTI and IMS Learning Design and the work they have done in this area. Dick Bacon of the University of Surrey and the HEA discussed the relationship between different varieties or ‘dialects’ of QTI, exploring some of the implementation and interpretation issues that hinder or break interoperability between systems nominally implementing the same version of the specification. CAL Consultant Graham Smith pleased the audience with news that a new Java version of his QTI demonstrator will be available shortly with updated support for QTI 2.0 items, which should help in the identification and resolution of implementation problems.

Martin Hawksey of the University of Strathclyde presented the work of the Re-Engineering Assessment Practices project. With a focus on real world assessment experiences, including an impressive collection of case studies exploring the impact of transformation within assessment practices, the REAP project was of particular interest to participants. Also of great interest, and perhaps unsurprisingly sparking the greatest amount of debate, was the exploration of ‘Assessment 2.0’ presented by Bobby Elliott of the Scottish Qualifications Authority. Bobby looked at ways in which Web 2.0 technologies can be used to enhance and modernise assessment in ways which engage and appeal to increasingly digitally literate learners.

The day also featured several demonstrations of tools under development. Niall Barr of NB Software demonstrated his current work, an assessment tool which utilises the IMS QTI, Content Packaging and Common Cartridge specifications, while Steve Bennett of the University of Hertfordshire demonstrated MCQFM, a JISC-funded tool which provides a simple text-based format for converting and editing items between formats. Two more JISC projects closed the day. AQuRate, presented by Alicia Campos and David Livingstone of Kingston University, is an elegant item authoring tool, while ASDEL, presented by Jon Hare of the University of Southampton, is an assessment delivery tool which builds on the R2Q2 project to provide a fuller test tool. A third project, Minibix (University of Cambridge), on item banking, is working closely with AQuRate and ASDEL.

Links to presentations (via slideshare), project websites and other information can all be found on our wiki: http://wiki.cetis.org.uk/JISC_CETIS_Assessment_SIG_meeting%2C_26_September_2007.

Assessment for Learner Responsibility

On Monday, I attended a Learning Enhancement Network event here at Strathclyde on Assessment for Learner Responsibility.  Strathclyde is in the process of revising its assessment policy, and this event brought together staff from across the university, together with a number of student representatives and some colleagues from other universities. 

The University of Edinburgh has recently completed a similar process, and Nigel Seaton from Edinburgh’s College of Science and Engineering presented the outcomes of this process.  It was particularly interesting to see that all assessment in the College is now formative, in that students receive feedback for all the work they do – including formal examinations.  I really like this: as both a student and a tutor I always found it hugely frustrating not to be able to get or give feedback on the most important assessments beyond a bald grade or classification.  This is particularly important for students who perform significantly worse in these assessments than in earlier coursework, and who are often bewildered, demoralised and demotivated by the lack of information provided.  The Data Protection and Freedom of Information Acts make the disclosure of examiners’ comments a legal obligation, but it’s nice to see this spun positively and used as a real learning opportunity rather than just warning markers not to make rude comments on exam scripts.

The College will also be introducing an eportfolio system, not for PDP but for use as a subject-specific learning and reflection aid for students, another very appealing idea. 

Jim Baxter from Strathclyde’s Department of Psychology gave a very entertaining presentation on collaborative WebCT-based activities introduced to the first year course in collaboration with the REAP project.  This involved a lot of collaborative work, something which had already been raised by Nigel and which was returned to in the afternoon’s breakout discussions.  I’ve never been a fan of group assessment, peer assessment and other activities which force people into social models regardless of whether that is what is right for them, and I was pleased to see that I wasn’t the only one who felt some unease about compelling students to take part in such activities.  There was genuine agreement that there’s a need to respect all different learning styles and that there may be a tension between fashionable approaches and what’s actually best for an individual.  Staff reported that the majority of students themselves said that they dislike group work, although it’s regarded far more positively in post-graduation surveys – which is rather interesting in itself; perhaps extroverts are more likely to complete surveys…?

One issue which concerned me is the observation that a student can be prevented from sitting the final examination for a course if they haven’t participated in group activities.  The justification for this is that these activities are detailed in the course materials, so students knew what they were signing up for – but surely students should be studying a course because they’re interested in the subject, not because they can cope with the teaching style, and really shouldn’t be prevented from studying their chosen subject because they don’t have a particular learning style.  The principle that students should have a choice in the methods and timing of assessment is therefore very welcome.

David Nicol presented the eleven principles of good assessment practice which were the initial outcomes from the university’s working group examining the assessment policy and from David’s long interest in this area.  These covered engagement to stimulate learning: clarifying what constitutes good practice, encouraging ‘time and effort’ on educationally purposeful tasks, providing high quality feedback, offering opportunities to close the feedback loop, encouraging positive motivational beliefs and self-esteem, and encouraging dialogue about learning between all stakeholders.  Other principles focused on empowerment: sharing responsibility for learning with students by facilitating self-assessment and reflection, giving learners choice in the nature, methods, criteria and timing of assessment, involving students in policy and practice decisions, and supporting the development of learner communities and social integration.

A particular strength of the day was the involvement of Strathclyde students and the opportunity the day gave for dialogue between staff and learners.  The greatest concerns which emerged from the afternoon breakout discussions were time, feedback and over-examining.

Deadline coordination is perhaps a particular issue at Strathclyde which has a very broad first year curriculum: in the Faculty of Law, Arts and Social Sciences, for example, students have to study five potentially quite disparate subjects, all of which tend to set the same deadlines for work.  There were occasional examples of the principle of ‘giving learners choice in the timing of assessment tasks’, for example the lecturer who negotiates deadlines with his classes.  Our group spent some time lusting after a hypothetical ‘assessment booking system’ which course coordinators could use to pick ‘slots’ for assessment deadlines – perhaps something the Enterprise SIG could look at :-)

Feedback was another major issue for the students.  They appreciate detailed constructive feedback, particularly where it explains marks in relation to published assessment criteria and offers suggestions for improving weak areas as well as highlighting strengths.  There were a few examples of bad practice, with ‘feedback’ which consisted of a smiley face and a mark being a particular low point, but not many examples of truly detailed feedback.  The timeliness of feedback was also a concern: as one student observed, ‘how can I learn from my feedback if I don’t get it until after I’ve submitted my next assignment?’

Students and staff were both concerned that students are being over-examined.  Virtually every piece of work a student submits contributes towards the final mark for the course, meaning that students don’t have a space to fail.  The students in our group actually wanted more formative assessment: practice exercises, particularly when undertaking a type of task they hadn’t encountered before, and the opportunity to experiment and learn before being summatively assessed.

Other issues that emerged were the desire to offer staff incentives and rewards for innovative and engaging projects, training for staff in how to write content that will actually engage learners, and a real desire to share innovation throughout the university.

Massively Multi Learner

The HEA Information and Computer Science subject centre recently ran a workshop, ‘Massively Multi Learner’, on learning in multi user virtual environments which I was fortunate enough to be able to attend.

Perhaps inevitably, the presentations on the day were heavily skewed towards Second Life, a fact that I was glad to see the organisers themselves acknowledged as not necessarily ideal.  Unfortunately, Carl Potts, who had been scheduled to speak on learning within guilds in World of Warcraft, was unable to attend, but Laz Allen of TPLD (standing in for Helen Routledge) provided a non-SL and more game-orientated perspective on emerging technologies.  Of particular interest was the emphasis in this presentation on the assessment of game-based learning and of gaming activities, through reflection and debriefing, and through the logging and interpretation of in-game activities with reference to an identified set of skills.  Unlike commercial off-the-shelf (COTS) games and other resources such as SL, games specifically designed for learning can offer a more effective balance of learning objectives, subject matter content and gameplay, with assessment – often itself highly innovative – integrated from the outset.

The rest of the presentations all referenced SL to a greater or lesser extent.  I hugely enjoyed Aleks Krotoski’s work on social networking in virtual worlds, in particular her identification of 75 avatars (“they know who they are”) who form “the feted [fetid?] inner core of Second Life”.  Unlike either single-player or MMO games, MUVEs such as SL are inherently socially orientated rather than goal-orientated; ‘success’ doesn’t necessarily come from accumulation of in-game objects or from PvP or PvE pwnage but from occupying key, extremely powerful positions within social networks.  As an infrequent and rather ‘resistive’ SLer, I feel strongly that the lack of scaffolding within SL, in contrast to the carefully balanced quest structure in games such as WoW which directs players through the game world and encourages casual grouping, makes social relationships within SL disproportionately important.

Other presentations explored some of the many purposes to which SL is being put.  Dave Taylor of the National Physical Laboratory discussed some very exciting international collaboration which has been taking place in the Space Island cluster, while Peter Twining demonstrated the Schome island pilot on the teen grid, which is trialling SL as a learning space for a group of ‘gifted and talented’ learners.  Jeremy Kemp discussed Sloodle, an integration of SL and Moodle which uses mashups to connect the two systems.  The integration of SL and Moodle also offers the potential for resolving accessibility issues around SL by offering meaningful real time alternatives to inworld communications.

The final three speakers had all integrated SL closely into their teaching practice.  Mike Hobbs of Anglia Ruskin University described scripting tasks undertaken by second year Computing Science students to create learning resources used to explain computing concepts to first year students, while Annabeth Robinson (well known in SL as AngryBeth for her creative and practical objects) described the options her Design for Digital Media students had for working in SL and particularly for using it as a tool for machinima.  Mike Reddy provided an entertaining end to the day, looking at various ways in which Second Life can be integrated into a range of courses.

Assessment SIG meeting, 22 February 2007

I’ve finally managed to get the notes and reports from our most recent assessment SIG meeting at the University of Southampton up on the wiki.

It was a hugely enjoyable meeting which brought together people with a wide range of interests in the CAA area.  JISC activities were well represented, with three new one-year Capital Programme projects presenting a brief introduction to their work: AQuRate, for assessment authoring, Minibix for item banking and ASDEL for assessment delivery.

Other JISC-funded work presented included the FREMA semantic wiki, which includes a huge amount of information on the eassessment domain, PeerPigeon for services to support peer review (which would have won, had there been a prize for best project logo), MCQFM, which is developing a web-based question generator, and XMarks, which is working on web services for exchanging assessment-related information, including the development of an XML information model for handling this data.

Gillian Palmer of ElementE looked at some of the issues around assessment delivery in the wider European context, including the difficulty of getting different nations to adopt or support standards which are identified with particular nations, while our own Clive Church (who we share with Edexcel) looked at the challenges facing accreditation and assessment bodies in England, Wales and Northern Ireland with the introduction of the 14-19 specialised diploma. In both contexts, the use of agreed standards is crucial to the success of the assessment and accreditation process.

The importance of standards and differences between implementations were explored by Dick Bacon of the University of Surrey, who looked at some of the issues he encountered when developing a physical sciences question bank using content generated in a range of assessment tools.

Overall, it was a lively and stimulating day, and very encouraging to see how much activity and enthusiasm there is in the field.  Many thanks to all who took part, and to our hosts at Southampton for an excellent day.

Back in the blogosphere

Well, I’m back blogging again after my long silence.  The blog was getting overrun with comment spam and was becoming pretty unusable and pretty unpleasant.  Many thanks to Sam for installing Akismet antispam and solving the problem.

Serendipity

Endemol, creators of some of the most successful television formats of recent years, announced yesterday that they would be launching the first Virtual Big Brother within Second Life.  It’s a fascinating concept, and it’ll be intriguing to see how it’s received both by the Second Life community and by those who so far haven’t engaged with it.  Will the intrusive and often prurient appeal of ‘real’ Big Brother with ‘real’ people really transfer to avatars within a virtual world?  Whether it succeeds or fails, it should tell us a lot about how we negotiate our own and others’ identities within real and virtual communities.

And hopefully something to look forward to in January 2007: Boris Johnson, inimitable Shadow Minister for Higher Education, is widely rumoured to have been approached to appear in the next series of Celebrity Big Brother.  I will of course be watching carefully in case he has anything to say on eassessment and elearning in any potential future Tory government.

Now all I need is someone to pay me to play WoW and my life would be complete…

Digital Literacy, Podcasting and eLearning – trainerspod webinar

Yesterday afternoon saw my second webinar in two days, this time a session on digital literacy, podcasting and elearning led by Graham Attwell of Pontydysgu.  Around 40 people took part from many countries.  Because of the way in which the session was run (which I’ll discuss in yet another blog post), the following is just a few impressions from the event rather than a proper report.  The archived session will be available online soon for those who would like to learn more about this topic.

The session was split into three sections: digital literacies and new pedagogic approaches; what is a podcast and how should it be used in education; and how to make a podcast.

Graham pointed out that traditional LMSs use a traditional, didactic, ‘push’ approach to learning.  In the new era of ‘elearning 2.0′, this should change to a more constructivist approach; however, there are many activities around at the moment that are constructivist in name only – as always, there is a need to examine what we’re actually doing instead of just optimistically applying labels like plasters and hoping they stick.

One quote, from Henry Jenkins, which particularly struck me was: ‘We need to shift the focus of the conversation about the digital divide from questions of technological access to those of opportunities to participate and to develop the cultural competencies and social skills needed for full involvement’.  Graham cited the example of George W Bush discussing what he’s ‘used on the Google‘ to help illustrate this point, and Senator Ted Stevens’ own personal internet is also always worth remembering (no, it’s not funny).

Clarence Fisher’s ‘eleven skills for participation’ were mentioned, and are worth repeating: play, performance, simulation, appropriation, multitasking, distributed cognition, collective intelligence, judgement, transmedia navigation, networking and negotiation. 

It was interesting to learn that I’m not the only person who dislikes the term ‘podcasting’ because of its close relationship to Apple and the iPod.  A few alternatives were suggested, with ‘audio report’ being the most popular, but – as with the whole Web 2.0 business – none of them has that snappy, cliquey, in-the-know connotation, which will probably leave us with ‘podcasting’ for some time to come.

It’s also worth mentioning that enhanced podcasts (enhanced audio reports?), which allow users to embed still images such as the ubiquitous PowerPoint slides, can be created with tools such as GarageBand, Audacity and Divicast; we made presentations from our joint Assessment and MDR SIGs meeting available as Breeze presentations, integrating MP3 recordings and PowerPoint presentations, and received a generally favourable reaction.

A key part of the new models of education and elearning is sharing, yet I felt that Graham made one of the most important points of all when he said that it’s about ‘learning to share, learning how to share, and learning how to have the right not to share’.  That’s something that will be very relevant to the TrainersPod webinar he’ll be leading on eportfolios in the new year: I’ll certainly be there.