The MOOC just got better!

I’ve just finished Stanford University’s HCI (Human-Computer Interaction) MOOC (see my previous post MOOC is not a dirty word… at least for the student). Personally, I’ve found it a very enjoyable but challenging experience (due to my lack of skills, but isn’t that the whole point of learning?).

The course tutor rounded things off with a short video of his reflections. For those of you who like facts and figures:

  • 29,568 students watched at least some of the video lectures
  • 20,443 students did at least one of the automatically marked multiple choice quizzes
  • 3,203 students completed at least one of the assignments
  • 765 students completed all 5 assignments
  • students came from all around the world, with at least 130 countries being represented.

As students, we’ve had ample opportunity to provide feedback to the teaching team about the Coursera platform and the course as a whole. That feedback was acted on quickly, with tweaks made to class materials and assignments while students were still working on them. MOOCs can therefore offer an agile approach that takes students’ needs into account.

It hasn’t just been a one-way transaction. As a student, I’ve learned a tremendous amount from both the teaching team and my peers. The teaching team has also learnt from the students, who have shared resources, reading lists and articles, and helped other students. Taking an online course doesn’t mean that the student is isolated: many students have held their own meet-ups, either face-to-face or virtually. You could say, using the classic cybernetics term, that they were part of a self-organising system, building up communities that will support and help each other long after the course has finished.

Just one year ago, there was no Coursera, so everything I’ve used on the course has been created over a very short period of time. But you wouldn’t know it: aside from a few bugs and minor niggles, the whole thing ran very smoothly. It’s also worth noting that Stanford doesn’t need to run this course. It already has a great reputation, but that hasn’t stopped the teaching team from working hard to pull together the content and make it freely available to everyone.

And now the MOOC has just got better. I’ve just had an email from Coursera telling me that it now has a Career Service to help Coursera students find jobs. Should I wish to take part (and I may need to shortly), it will share my details with selected partner companies (likely to be US-based). This could be good for me as a student, although it’s not without concerns. In the (probably very near) future, a company could cherry-pick the best students from online courses, because it will be able to follow students with potential as they submit their coursework. It may even influence the course itself. Coursera will no doubt get its revenue from acting as a matchmaking service, but this needs to be handled carefully. Potential issues include companies bombarding students with advertising, a limited pool of companies being able to select students (though who wouldn’t be flattered to be offered a job by the likes of Google or Apple?), US-only companies, and companies that only support (financially?) Stanford or other Coursera universities. It’s not without its potential difficulties. From a student’s point of view, however, it seems like a great idea.

So did I finish the course? I certainly did and can now quite legitimately say that I have a Distinction from Stanford University!

Technologies in use in the JISC Assessment and Feedback programme Strand B (evidence and evaluation)

The JISC Assessment and Feedback Programme is now in its fifth month, looking at a wide range of technological innovations around assessment and feedback in HE and FE. Strand B focuses on the evaluation of earlier work, gathering and evaluating evidence on the impact of these innovations and producing guidelines and supporting material to facilitate their adoption in other subject areas and institutions. The projects cover a broad range of technologies, but are not themselves involved in technological development; rather, they examine and report on the impact of such developments.

The information here was gathered through fairly informal conversations with the projects, building on the information initially provided in their funding applications. Information from these calls is added to our project database (PROD); you can see some of the amazing uses this information can be put to in the series of blog posts Martin Hawksey has produced as part of his work on visualisations of the OER programme, as well as in some of the work by my colleagues David, Sheila and Wilbert.

This blog post is rather less ambitious than their work (!), and is intended to provide a quick snapshot of technologies that projects in this specific programme strand are finding valuable for their work.  For more information on the projects in general you can find all my posts on this programme linked from here.

Underlying technologies

Although the underlying technologies – that is, the technologies used by the innovations they’re evaluating – aren’t the direct focus of these projects, I’ve included them here as they’re obviously of interest. They also show the very broad range of approaches and methods being evaluated by these projects.

Several of the projects expressed a strong desire to reuse existing tools and resources, such as MS Office and other commercial software, rather than reinvent the wheel by developing new software. There were also compelling considerations around staff training for new systems, staff familiarity and comfort with existing systems, and strong pressure from staff, students and management to work within institutional VLEs.

The purposes and technologies reported were:

  • Feedback delivery: MS Word (annotated markup documents); eTMA (electronic tutor marked assignment) system
  • Assignment timetables (diaries): MS Access
  • VLE: Moodle; Blackboard
  • Online marking: GradeMark
  • Student generation of assessment content, with social network functionality: PeerWise
  • Plagiarism detection: Turnitin; Blackboard SafeAssign
  • Bug reporting: Pivotal Tracker
  • Improving the online marking process: surface tables; pen devices
  • Managing self-reflection workflow: eReflect; online learning diary
  • Automated evaluation of writing technique: Turnitin eRater
  • Communication with students, course news and deadline reminders: Facebook
  • Peer assessment: PeerMark
  • Centralised email account, blog and microblog for managing assignment submissions and communicating with students and staff: TQFE-Tutor
  • Communication with students: Twitter
  • Blog for discussion of common Q&As and general assignment feedback: WordPress
  • Webinars: Adobe Connect
  • EVS (electronic voting systems)

Evidence gathering

As these projects are about collecting and evaluating evidence, the approaches taken to this are of obvious interest.

There was a strong emphasis on interviewing as the main approach, with audio and video interviews being recorded for subsequent analysis and dissemination where appropriate approval has been given.  Jing was the main recording system cited for this.  Surveys (which can be considered a kind of asynchronous interview) were also mentioned, with Survey Monkey being the tool of choice for this.

Less structured impressions were also sought, with Jing again being cited as a valuable tool for capturing staff and student feedback.  Twitter was also mentioned for this purpose.

Evidence analysis

The emphasis of this strand is on qualitative rather than quantitative outcomes, with users’ experiences, case studies and the development of guidance documents and staff development resources being the main focus.

NVivo was cited as the tool of choice for the transcription and coding of audio and written feedback for subsequent analysis. Collaborative writing, analysis and version control are the main concerns for this part of the projects, and are being addressed through the use of Google Docs and SharePoint.

Standards referenced

The standards used by projects in this programme are fairly generic. Most of the projects are not using standards such as those produced by IMS, which were felt to be not really relevant to this level of work. One project, however, was looking at IMS Learning Tools Interoperability (LTI) as an approach to integrating its software development with the various VLEs in use at institutions across its consortium. Beyond this, the standards referenced were unremarkable: primarily MP3 and HTML.
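For readers unfamiliar with LTI, the sketch below shows roughly what an LTI 1.x “basic launch” involves: the VLE signs a set of form parameters with OAuth 1.0a (HMAC-SHA1) and POSTs them to the external tool, which is what lets one tool plug into many different VLEs. This is a minimal illustrative sketch in Python, not code from any of these projects; the launch URL, keys and identifiers are all hypothetical.

```
# A minimal sketch of an IMS LTI 1.x "basic launch" from the VLE side:
# the VLE signs the launch parameters with OAuth 1.0a (HMAC-SHA1) and
# POSTs them to the tool. All URLs, keys and identifiers are hypothetical.
import base64
import hashlib
import hmac
import time
import uuid
from urllib.parse import quote


def oauth_encode(value):
    # RFC 5849 percent-encoding: only A-Z, a-z, 0-9, '-', '.', '_', '~' are safe.
    return quote(str(value), safe="~")


def sign_lti_launch(url, params, consumer_key, consumer_secret):
    oauth_params = {
        "oauth_consumer_key": consumer_key,
        "oauth_nonce": uuid.uuid4().hex,
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_timestamp": str(int(time.time())),
        "oauth_version": "1.0",
    }
    all_params = {**params, **oauth_params}
    # Sort the parameters, percent-encode them into one string, then build
    # the signature base string from the method, URL and parameter string.
    param_string = "&".join(
        "%s=%s" % (oauth_encode(k), oauth_encode(v))
        for k, v in sorted(all_params.items())
    )
    base_string = "&".join(["POST", oauth_encode(url), oauth_encode(param_string)])
    # LTI launches have no token secret, so the signing key ends with a bare '&'.
    key = oauth_encode(consumer_secret) + "&"
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    all_params["oauth_signature"] = base64.b64encode(digest).decode()
    return all_params  # POST these as form fields to the tool's launch URL


# Hypothetical launch of an external tool for one student on one assignment.
launch = sign_lti_launch(
    "https://tool.example.ac.uk/lti/launch",  # hypothetical tool URL
    {
        "lti_message_type": "basic-lti-launch-request",
        "lti_version": "LTI-1p0",
        "resource_link_id": "module-42-assignment-1",  # hypothetical
        "user_id": "student-123",                      # hypothetical
        "roles": "Learner",
    },
    consumer_key="vle-key",        # hypothetical credentials issued by the tool
    consumer_secret="vle-secret",
)
print(launch["oauth_signature"])
```

The attraction for a consortium is that each VLE only needs a launch URL, key and secret per tool, rather than a bespoke integration for every VLE/tool pairing.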

Dissemination

All the projects have thorough dissemination plans in place to ensure that their findings are shared as widely as possible.  It was great to see that all the projects referenced the JISC Design Studio, a fantastic resource that is well worth digging around in.  Overall there is a wide range of technologies being used to ensure that the findings from these projects reach as broad an audience as possible.  Again, there is a clear mix between established, proprietary software and free services, reflecting the range of technologies in use within institutions and the different institutional contexts of these projects.

Again, the purposes and technologies reported were:

  • Recording seminars: Panopto
  • Publishing videos: YouTube
  • Dissemination: JISC Design Studio; reports; guidance documents; peer-reviewed publications; project website; Yammer; MS Office Communicator (now Lync); Google Docs
  • Workshops: Elluminate Live
  • Dissemination and community building: Cloudworks; case studies; Twitter
  • Sharing stable versions: SharePoint
  • Screen capture for staff development: Jing; Camtasia
  • Toolkits
  • Project blog: WordPress
  • Conference attendance

Evaluating Electronic Voting Systems for Enhancing Student Experience

The eighth project in Strand B (Evidence and Evaluation) of the JISC Assessment and Feedback Programme is Evaluating Electronic Voting Systems for Enhancing Student Experience (EEVS), based at the University of Hertfordshire. This one-year project is undertaking an extensive review of the use of electronic voting systems (EVS) in a range of schools across the institution, gathering testimony from both staff and students on their experiences, insights, identified issues and success factors.

Hertfordshire has invested substantially in assessment and feedback in recent years, with an extensive programme of innovations including the purchase of nearly four thousand EVS handsets for use in teaching across eight schools. The initial response to their introduction, from both staff and students, has been very positive, with the system seen as improving classroom interaction and easing both staff and student workloads.

The EEVS project will produce a thorough longitudinal study of the impact of EVS, drawing on audio and video interviews and reflective writing over the course of the academic year. This long-term view will enable the project team to examine key periods in the academic year, such as students’ initial encounters with the system, the perceived value and impact on exam performance of interactive revision lectures, and technological issues around introduction into new classroom environments.

The project will produce a number of outputs, including valuable evidence for the sector on the impact of such a large-scale implementation, detailed guidance on the installation and deployment of EVS, subject-specific case studies, and a series of vox pop snapshots from teaching staff, students and support staff on their experiences of EVS. You can follow their progress on their project blog.

Deterrents don’t deter?

A recent article in THES reports on research by Robert J. Youmans at California State University, Northridge, which found that

Students who are aware that their work will be checked by plagiarism-detection software are just as likely to cheat as those who are not.

Conventional wisdom – and intuition – suggests that the threat of discovery, and subsequent punishment, is an effective deterrent against plagiarism – indeed, one of the comments on the article points to another study that suggested that students’ awareness of the use of Turnitin on a course significantly reduced plagiarism.

It’s not always clear whether plagiarism is an intentional and cynical attempt to deceive, the result of bad time management and poor writing or referencing skills, or due to a genuine lack of understanding of the concept of plagiarism or differing cultural norms around it. Students in the first category are the most likely to resort to essay mills as a safer alternative where it’s made clear that plagiarism detection is in use. This suggests that, when their use is advertised as a supposed deterrent, the majority of students ‘caught’ by Turnitin and other text-matching techniques are those whose main problem is not a desire to cheat but academic or personal factors.

Findings like this seem to strengthen the arguments in favour of using Turnitin formatively, as part of a student’s academic development and the essay-writing process, rather than as a way of detecting problems once it’s too late to do anything about them and the student has entered the disciplinary process. Using plagiarism detection only after submission seems to rest on the assumption that plagiarism only occurs through a deliberate desire to cheat and, as I’ve argued before, positions all students as potential cheats rather than as developing academics who may need guidance and support to achieve their potential.

InterACT: modelling feedback flow

The InterACT project at the University of Dundee, part of the JISC Assessment and Feedback Programme Strand A (institutional change) is working on enhancing feedback dialogue, reflection and feed-forward in a large postgraduate online distance learning course in medical education.

The course is unusual in that progress is heavily learner-driven: as the students are working professionals, they can enrol and submit assignments at any time they choose rather than according to a predetermined course timetable. While this significantly increases the flexibility and accessibility of the course, the lack of external structure, together with the higher attrition rates seen in online distance learning in general, can impact student progress and retention.

Assessment feedback has traditionally been offered at the end of each module of study, when assignments are submitted, which clearly limits the potential for reflection and learning from feedback.  The InterACT project will transform this model through the integration of technology to support a more dynamic, ongoing feed-forward process that actively encourages learners to reflect and act on feedback and builds dialogue between learners and tutors.

The project team have now released the first draft of their proposed new model of feedback, and are actively seeking comment from the wider community. Dialogue between tutor and learner is focused around cover sheets appended to submitted work, which encourage self-evaluation and reflection on assessment performance as well as making explicit the intention that past feedback should influence future work. The use of private blogs or wikis as a personal reflective space is intended to reinforce this focus on the ongoing interplay between past and future performance.

Do get involved with the discussion, either via the blog post or through contacting the project team.

eAssessment Scotland 2012 call for posters, presentations and workshops

The call for posters, presentations and workshops for eAssessment Scotland 2012 – Feeding Back, Forming the Future is now available.  Proposals should be submitted by 1 May.

This annual conference has become a valuable part of the eassessment calendar, as can be seen by the rich and varied content in the archive of past events.  The conference is being held on 31 August at the University of Dundee, with an online conference running from 23 August – 6 September.  The conference will also host the annual Scottish eAssessment Awards, for which submissions open in early March.  Registration for both events will open in mid-March.

The e-Feedback Evaluation Project

Assessment of language learning naturally presents some unique challenges for both teaching staff and learners. Regular practice in producing both spoken and written language is a vital part of language training, and requires a significant amount of ongoing feedback to support the acquisition of competence in the subject. In a distance learning context in particular, but equally in any setting where feedback is provided asynchronously rather than face-to-face, providing meaningful feedback on spoken production is especially challenging, often requiring spoken feedback to correct pronunciation and structural errors.

There have been a number of exciting projects around audio feedback in recent years, including the Optimising Audio Feedback project at Aberystwyth University, Sounds Good at Leeds (both funded by JISC) and Audio Supported Enhanced Learning, a collaboration between the Universities of Bradford and Hertfordshire.  The focus of the eFeedback Evaluation Project (eFEP), however, is the impact of the combination of both spoken and written feedback on language learning.

The eFEP project is led by the Department of Languages at The Open University, an institution with unique experience in providing language training through distance learning, a large part of which involves teaching through both formative and summative assessment and feedback.  The OU has a mature and robust eTMA (electronic tutor marked assignment) system which supports assessment across the institution, and provides feedback either via MP3 files or marked-up MS Word documents, as appropriate for the individual assignment.  Each form of feedback is supplemented with an HTML form (an example of which can be seen on the poster submitted by the project to the programme’s startup meeting) containing administrative information, marks awarded and additional feedback.

The project will examine the ways in which students and tutors interact and engage with their feedback, identify common perceptions and issues, and recommend areas requiring further support and guidelines for good practice.  In order to examine the applicability of this feedback approach in traditional settings, the project will also look at the impact of audio feedback in Italian modules at the University of Manchester.

The insight into the use of audio feedback across a variety of environments, and the range of training and support materials to be produced, should make eFEP a valuable addition to our understanding of the value of audio feedback as well as offering clear practical guidance to those considering adopting it.

Online Coursework Management Evaluation

The University of Exeter has developed an entirely online end-to-end coursework management system which is the subject of the Online Coursework Management Evaluation (OCME) project funded by JISC as part of the Assessment and Feedback programme Strand B.

This system integrates Moodle and Turnitin within the university’s Exeter Learning Environment (ELE). Assignments are submitted through the ELE, assigned an originality score by Turnitin, and then made available for marking through GradeMark (a commercial online marking system within Turnitin) or MS Word markup. Feedback is returned to students via uploaded forms or bespoke feedback forms, and is made available for viewing by both individual students and the personal tutor assigned to support them. Initially deployed through a small 2011 pilot project funded by HEFCE, the system is now available institution-wide, although for practical reasons this evaluation project will concentrate on working with smaller groups across various disciplines.

Exeter’s Moodle support is provided by the University of London Computer Centre, which is developing the interface between Moodle and Turnitin. There is strong internal support for the system, which will be maintained and further developed well beyond the lifetime of this one-year project. What the OCME project will provide is a series of reports and briefing papers exploring the pedagogic, technological and institutional aspects of transforming practice, along with guidelines for future implementers and for those considering introducing such transformative technologies within their own institutions. The experiences and lessons learned from this project should be of value across the sector.

Evaluating the Benefits of Electronic Assessment Management

Examining the embedding of electronic assessment management (EAM) within both administrative and teaching and learning practice is the main focus of the Evaluating the Benefits of Electronic Assessment Management (EBEAM) project running at the University of Huddersfield as part of the JISC Assessment and Feedback Programme Strand B. This 18-month project will look at how Turnitin, incorporating GradeMark and eRater, addresses student, staff and institutional requirements for timely, individualised and focused feedback; reduced staff workloads and increased reflection on practice; and cost-effective, scalable and sustainable innovation.

The dual focus on administrative and pedagogic aspects is crucial for real uptake of any new technology or process. By providing a supportive administrative and technological infrastructure, institutions can enable academic staff to fully realise the benefits of innovative systems and practice, and provide a significantly enhanced learning environment for students. The dynamic interplay of these factors is vividly illustrated in the poster the project submitted for the programme kick-off meeting. The impact on student satisfaction, achievement and retention rates already apparent at Huddersfield reflects the success of such an approach.

Like the Evaluation of Assessment Diaries and GradeMark at the University of Glamorgan project, EBEAM is grounded in previous evaluation work investigating the benefits of Turnitin on staff and students.  As with other projects, the decision to adopt existing technologies incorporated through the institutional VLE (in this case, Blackboard) is a pragmatic choice, adopting known and proven technology rather than expending time and resources in developing yet more tools to do the same things.  Being able to pick up such tools as needed greatly increases institutional agility, and provides ready access to existing user groups and a wealth of shared practice.

EBEAM project staff also have a keen awareness of the need for meaningful and effective staff development to enable teaching staff to make full use of new technologies and achieve the integration of new approaches within their teaching practice, a theme covered in several posts on their excellent project blog.  The project will produce a wide range of development materials, including practically-focused toolkits, webinars and screencasts, which will be available through the project site and the JISC Design Studio.  In addition, they’re looking at ways of fully exploiting the extensive amount of data generated by these EAM systems to further enhance teaching and learning support as well as engaging administrative departments in discussions on topics such as data warehousing and change management.

The EBEAM project should provide an excellent study in the benefits of eassessment and of methods of integration that take a holistic approach to institutions and stakeholders.  I’m very much looking forward to seeing the outcomes of their work.

Clive 1st June 2007

It has been some time since I ‘blogged’

Since I last ‘put fingers to keyboard’ I have been boring folk with my view that, over the last year, the ‘centre of gravity’ of learning technology standards development has moved from HE to the schools and FE sectors. Driven by the e-Strategy and other government schools-based agendas such as ‘Every Child Matters’, Becta, MIAP and others have been obliged to deliver solutions to strictly imposed deadlines. The focus so far has been on two areas: joining social service systems up with school administration systems, and learning platforms for schools. Both have required standards-based development: the Schools Interoperability Framework (SIF), underpinned by the adoption of a Unique Learner Number (ULN), has been adopted for the former, and Becta has produced specifications (all around standards and extensibility) that suppliers of learning platforms have to satisfy. Additionally, the Qualifications and Curriculum Authority, in order to meet the requirements of the new vocational Specialised Diploma, will have to produce data standards by the autumn for course details and qualification achievements.

Obviously such developments will have an impact on the JISC communities. Firstly, there will be a concentration of minds around where standards are really needed (rather than where they ‘could be useful’), and there will be a requirement for JISC to focus on those areas of detail which could impede national projects if not attended to. Solutions to the problems of identity management and the scalability of SOA implementations are just two that need urgent attention.

So, to survive, JISC has to be sufficiently engaged in influencing and engaging with learning technology based solutions in the schools and tertiary sectors to anticipate those areas that need the efforts and expertise of our community.

So what have I been doing with my one day per week, in addition to boring my colleagues at management meetings with the above?

Well, I have been supporting Peter with ePortfolio development (around assessment) and, with the help of Nottingham University, finding out about Lifelong Learning Networks.