Technologies in use in the JISC Assessment and Feedback programme Strand B (evidence and evaluation)

The JISC Assessment and Feedback Programme is now in its fifth month, looking at a wide range of technological innovations around assessment and feedback in HE and FE.  Strand B is focused on the evaluation of earlier work, gathering and evaluating evidence on the impact of these innovations and producing guidelines and supporting material to facilitate their adoption in other subject areas and institutions.  These projects cover a broad range of technologies, but are not themselves involved in technological development; rather, they examine and report on the impact of such developments.

The information here was gathered through fairly informal conversations with the projects, building on the information initially provided in their funding applications.  Information from these calls is added to our project database (PROD) – you can see some of the amazing uses this information can be put to in the series of blog posts Martin Hawksey has produced as part of his work on visualisations of the OER programme, as well as some of the work by my colleagues David, Sheila and Wilbert.

This blog post is rather less ambitious than their work (!), and is intended to provide a quick snapshot of technologies that projects in this specific programme strand are finding valuable for their work.  For more information on the projects in general you can find all my posts on this programme linked from here.

Underlying technologies

Although the underlying technologies – that is, the technologies used by the innovations they’re evaluating – aren’t the direct focus of these projects, I’ve included them as they’re obviously of interest.  They also show the very broad range of approaches and methods being evaluated by these projects.

Several of the projects expressed a strong desire to reuse existing tools and resources such as MS Office and other commercial software solutions, rather than reinvent the wheel by developing new software; there were also compelling reasons to do so, including the cost of training staff on new systems, staff familiarity and comfort with existing systems, and strong pressure from staff, students and management to work within institutional VLEs.

| Purpose | Technology |
| --- | --- |
| Feedback delivery | MS Word (annotated markup documents); eTMA (electronic tutor marked assignment) system |
| Assignment timetables (diaries) | MS Access |
| VLE | Moodle; Blackboard |
| Online marking | GradeMark |
| Student generation of assessment content, with social network functionality | PeerWise |
| Plagiarism detection | Turnitin; Blackboard SafeAssign |
| Bug reporting | Pivotal Tracker |
| Improving the online marking process | Surface tables; pen devices |
| Managing self-reflection workflow | eReflect; online learning diary |
| Automated evaluation of writing technique | Turnitin eRater |
| Communication with students, course news, deadline reminders | Facebook |
| Peer assessment | PeerMark |
| Centralised email account, blog and microblog for managing assignment submissions and communicating with students and staff | TQFE-Tutor |
| Communication with students | Twitter |
| Blog for discussion of common Q&As and general assignment feedback | WordPress |
| Webinars | Adobe Connect |
| Classroom voting and interaction | EVS (electronic voting systems) |

Evidence gathering

As these projects are about collecting and evaluating evidence, the approaches taken to this are of obvious interest.

There was a strong emphasis on interviewing as the main approach, with audio and video interviews being recorded for subsequent analysis and dissemination where appropriate approval has been given.  Jing was the main recording system cited for this.  Surveys (which can be considered a kind of asynchronous interview) were also mentioned, with SurveyMonkey being the tool of choice for this.

Less structured impressions were also sought, with Jing again being cited as a valuable tool for capturing staff and student feedback.  Twitter was also mentioned for this purpose.

Evidence analysis

The emphasis of this strand is on qualitative rather than quantitative outcomes, with users’ experiences, case studies and the development of guidance documents and staff development resources being the main focus.

NVivo was cited as the tool of choice for the transcription and coding of audio and written feedback for subsequent analysis.  Collaborative writing, analysis and version control are the main concerns for this part of the projects, and are being addressed through the use of Google Docs and SharePoint.

Standards referenced

The standards used by projects in this programme are fairly generic.  Most of these projects are not using standards such as those produced by IMS, which were felt to be of little relevance to this level of work.  One project, however, was looking at the use of IMS Learning Tools Interoperability (LTI) as an approach to integrating its software development with the different VLEs in use by institutions within its consortium.  Beyond this, the standards referenced were unremarkable: primarily MP3 and HTML.
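
To make the LTI approach a little more concrete, here is a minimal sketch of what a VLE sends when launching an external tool under LTI 1.1: an OAuth 1.0a-signed form POST carrying the launch parameters.  It uses Python’s oauthlib package; the URL, key, secret and parameter values are placeholders of my own invention, not taken from any of the projects above.

```python
# Minimal LTI 1.1 launch sketch: the VLE (the "tool consumer") signs a
# form POST with OAuth 1.0a and submits it to the external tool.
# All credentials and URLs below are hypothetical placeholders.
from urllib.parse import urlencode
from oauthlib.oauth1 import Client, SIGNATURE_TYPE_BODY

launch_url = "https://tool.example.ac.uk/lti/launch"  # hypothetical tool endpoint

params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "assignment-42",  # required: identifies the placement
    "user_id": "s1234567",
    "roles": "Learner",
    "context_title": "Example Module",
}

# The key/secret pair is agreed between the VLE and the tool in advance.
client = Client("consumer-key", client_secret="shared-secret",
                signature_type=SIGNATURE_TYPE_BODY)

uri, headers, body = client.sign(
    launch_url,
    http_method="POST",
    body=urlencode(params),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)

# `body` now contains the LTI parameters plus the oauth_* signature
# fields, ready to be auto-submitted to the tool as a browser form POST.
print(body)
```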

Dissemination

All the projects have thorough dissemination plans in place to ensure that their findings are shared as widely as possible.  It was great to see that all the projects referenced the JISC Design Studio, a fantastic resource that is well worth digging around in.  Overall there is a wide range of technologies being used to ensure that the findings from these projects reach as broad an audience as possible.  Again, there is a clear mix between established, proprietary software and free services, reflecting the range of technologies in use within institutions and the different institutional contexts of these projects.

| Purpose | Technology |
| --- | --- |
| Recording seminars | Panopto |
| Publishing videos | YouTube |
| Dissemination | JISC Design Studio; reports; guidance documents; peer-reviewed publications; project website |
| Workshops | Elluminate Live |
| Dissemination and community building | Cloudworks; case studies |
| Dissemination | Yammer |
| Dissemination and community building | Twitter |
| Dissemination | MS Office Communicator (now Lync) |
| Dissemination | Google Docs |
| Sharing stable versions | SharePoint |
| Screen capture for staff development | Jing; Camtasia |
| Dissemination | Toolkits |
| Project blog | WordPress |
| Dissemination | Conference attendance |

Evaluating Electronic Voting Systems for Enhancing Student Experience

The eighth project in Strand B (Evidence and Evaluation) of the JISC Assessment and Feedback Programme is Evaluating Electronic Voting Systems for Enhancing Student Experience (EEVS), based at the University of Hertfordshire.  This one-year project is undertaking an extensive review of the use of electronic voting systems (EVS) in a range of schools across the institution, gathering testimony from both staff and students on their experiences and insights, and identifying issues and success factors.

Hertfordshire has invested substantially in assessment and feedback in recent years, with an extensive programme of innovations including the purchase of nearly four thousand EVS handsets for use in teaching across eight schools.  The initial response to their introduction, from both staff and students, has been very positive, with the system seen as improving classroom interaction and easing staff and student workloads.

The EEVS project will produce a thorough longitudinal study of the impact of EVS, gathering audio and video interviews and reflective writing over the course of the academic year.  This long-term view will enable the project team to examine key periods in the academic year such as students’ initial encounters with the system, the perceived value and impact on exam performance of interactive revision lectures, technological issues around introduction in new classroom environments, and so on.

The project will produce a number of outputs, including valuable evidence to the sector on the impact of such large-scale implementation, detailed guidance on the installation and deployment of EVS, subject-specific case studies, and a series of vox pop snapshots from teaching staff, students and support staff on their experiences of EVS.  You can follow their progress on their project blog.

Deterrents don’t deter?

A recent article in THES reports on research by Robert J. Youmans at California State University Northridge that found that

Students who are aware that their work will be checked by plagiarism-detection software are just as likely to cheat as those who are not.

Conventional wisdom – and intuition – suggests that the threat of discovery, and subsequent punishment, is an effective deterrent against plagiarism – indeed, one of the comments on the article points to another study that suggested that students’ awareness of the use of Turnitin on a course significantly reduced plagiarism.

It’s not always clear whether plagiarism is an intentional and cynical attempt to deceive, the result of bad time management and poor writing or referencing skills, or due to a genuine lack of understanding of the concept of plagiarism or differing cultural norms around it.  Students in the first category are those most likely to resort to essay mills as a safer alternative where it’s made clear that plagiarism detection is in use.  This suggests that the majority of students ‘caught’ by Turnitin and other text-matching techniques, when their use is advertised as a supposed deterrent, are those whose main problem is not a desire to cheat but academic or personal factors.

Findings like this seem to strengthen the arguments in favour of using Turnitin formatively, as part of a student’s academic development and the essay writing process, rather than as a way of detecting problems once it’s too late to do anything about them and the student has entered the disciplinary process.  The use of plagiarism detection only after submission seems to be based on the assumption that plagiarism only occurs through a deliberate desire to cheat, and as I’ve argued before, positions all students as potential cheats rather than as developing academics who may be in need of guidance and support to achieve their potential.

InterACT: modelling feedback flow

The InterACT project at the University of Dundee, part of the JISC Assessment and Feedback Programme Strand A (institutional change) is working on enhancing feedback dialogue, reflection and feed-forward in a large postgraduate online distance learning course in medical education.

The course is unusual in that progress is heavily learner-driven: as students are working professionals, they are able to enrol and submit assignments at any time they choose rather than according to a predetermined course timetable.  While this significantly increases the flexibility and accessibility of the course, the lack of external structure, together with the higher attrition rates noted in online distance learning in general, can impact on student progress and retention.

Assessment feedback has traditionally been offered at the end of each module of study, when assignments are submitted, which clearly limits the potential for reflection and learning from feedback.  The InterACT project will transform this model through the integration of technology to support a more dynamic, ongoing feed-forward process that actively encourages learners to reflect and act on feedback and builds dialogue between learners and tutors.

The project team have now released the first draft of their proposed new model of feedback, and are actively seeking comment from the wider community.  Dialogue between tutor and learner is focused around cover sheets appended to submitted work, which encourage self-evaluation and reflection on assessment performance as well as making explicit the intention that past feedback should inform future work.  The use of private blogs or wikis as a personal reflective space is intended to encourage this focus on the ongoing interplay of past and future performance.

Do get involved with the discussion, either via the blog post or through contacting the project team.

eAssessment Scotland 2012 call for posters, presentations and workshops

The call for posters, presentations and workshops for eAssessment Scotland 2012 – Feeding Back, Forming the Future is now available.  Proposals should be submitted by 1 May.

This annual conference has become a valuable part of the eassessment calendar, as can be seen by the rich and varied content in the archive of past events.  The conference is being held on 31 August at the University of Dundee, with an online conference running from 23 August – 6 September.  The conference will also host the annual Scottish eAssessment Awards, for which submissions open in early March.  Registration for both events will open in mid-March.

The e-Feedback Evaluation Project

Assessment of language learning naturally presents some unique challenges for both teaching staff and learners.  Regular practice of both spoken and written language production is a vital part of language training, and requires a significant amount of ongoing feedback to support the acquisition of competence in the subject.  In a distance learning context in particular, and indeed in any setting where feedback is provided asynchronously rather than face-to-face, providing meaningful feedback on spoken production is especially challenging, often requiring spoken feedback to correct errors of pronunciation and structure.

There have been a number of exciting projects around audio feedback in recent years, including the Optimising Audio Feedback project at Aberystwyth University, Sounds Good at Leeds (both funded by JISC) and Audio Supported Enhanced Learning, a collaboration between the Universities of Bradford and Hertfordshire.  The focus of the eFeedback Evaluation Project (eFEP), however, is the impact of the combination of both spoken and written feedback on language learning.

The eFEP project is led by the Department of Languages at The Open University, an institution with unique experience in providing language training through distance learning, a large part of which involves teaching through both formative and summative assessment and feedback.  The OU has a mature and robust eTMA (electronic tutor marked assignment) system which supports assessment across the institution, and provides feedback either via MP3 files or marked-up MS Word documents, as appropriate for the individual assignment.  Each form of feedback is supplemented with an HTML form (an example of which can be seen on the poster submitted by the project to the programme’s startup meeting) containing administrative information, marks awarded and additional feedback.

The project will examine the ways in which students and tutors interact and engage with their feedback, identify common perceptions and issues, and recommend areas requiring further support and guidelines for good practice.  In order to examine the applicability of this feedback approach in traditional settings, the project will also look at the impact of audio feedback in Italian modules at the University of Manchester.

The insight into the use of audio feedback across a variety of environments, and the range of training and support materials to be produced, should make eFEP a valuable addition to our understanding of audio feedback, as well as offering clear practical guidance to those considering adopting it.

Online Coursework Management Evaluation

The University of Exeter has developed an entirely online end-to-end coursework management system which is the subject of the Online Coursework Management Evaluation (OCME) project funded by JISC as part of the Assessment and Feedback programme Strand B.

This system sees the integration of Moodle and Turnitin within the university’s Exeter Learning Environment (ELE).  Assignments are submitted through the ELE, assigned an originality score by Turnitin, and then made available for marking through GradeMark (a commercial online marking system within Turnitin) or MS Word markup.  Feedback is returned to students either via uploaded forms or bespoke feedback forms, and is made available for viewing by both individual students and the personal tutor assigned to support them.  Initially deployed through a small 2011 pilot project funded by HEFCE, the system is now available institution-wide, although for practical reasons this evaluation project will concentrate on working with smaller groups across various disciplines.
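
As a purely illustrative aside, the stages an assignment passes through in an end-to-end system like this can be thought of as a simple state machine.  The sketch below is my own rough model of the workflow described above, with invented names and structure; it is not Exeter’s actual implementation.

```python
# Illustrative model of an end-to-end coursework workflow of the kind
# described above (submit -> originality score -> mark -> release).
# All names here are invented for illustration only.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class Stage(Enum):
    SUBMITTED = auto()   # uploaded by the student through the VLE
    SCORED = auto()      # originality score attached by the checker
    MARKED = auto()      # marked online or via document markup
    RELEASED = auto()    # feedback visible to student and personal tutor

@dataclass
class Assignment:
    student_id: str
    stage: Stage = Stage.SUBMITTED
    originality_score: Optional[float] = None
    feedback: Optional[str] = None

    def attach_score(self, score: float) -> None:
        self.originality_score = score
        self.stage = Stage.SCORED

    def mark(self, feedback: str) -> None:
        self.feedback = feedback
        self.stage = Stage.MARKED

    def release(self) -> None:
        # in a real system this step would notify both the student
        # and the personal tutor assigned to support them
        self.stage = Stage.RELEASED

# Example: walk one submission through the pipeline.
a = Assignment("s1234567")
a.attach_score(0.12)
a.mark("Well argued; check referencing in section 2.")
a.release()
```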

Exeter’s Moodle support is provided by the University of London Computer Centre, who are developing the interface between Moodle and Turnitin.  There is strong internal support for the system, which will be maintained and further developed well beyond the lifetime of this one-year project.  What the OCME project will provide is a series of reports and briefing papers exploring the pedagogic, technological and institutional aspects of transforming practice, along with guidelines for future implementers and for those considering introducing such transformative technologies within their own institutions.  The experiences and lessons learned from this project should be of value across the sector.

Evaluating the Benefits of Electronic Assessment Management

Examining the embedding of electronic assessment management (EAM) within both administrative and teaching and learning practice is the main focus of the Evaluating the Benefits of Electronic Assessment Management (EBEAM) project running at the University of Huddersfield as part of the JISC Assessment and Feedback programme Strand B.  This 18-month project will look at how Turnitin, incorporating GradeMark and eRater, addresses student, staff and institutional requirements for timely, individualised and focused feedback, reduced staff workloads and increased reflection on practice, and cost-effective, scalable and sustainable innovation.

The dual focus on administrative and pedagogic aspects is crucial for real uptake of any new technology or process.  By providing a supportive administrative and technological infrastructure, institutions can enable academic staff to fully realise the benefits of innovative systems and practice, and provide a significantly enhanced learning environment for students.  The dynamic interplay of these factors is vividly illustrated in the poster the project submitted for the programme kick-off meeting.  The impact on student satisfaction, achievement and retention rates already apparent at Huddersfield reflects the success of such an approach.

Like the Evaluation of Assessment Diaries and GradeMark at the University of Glamorgan project, EBEAM is grounded in previous evaluation work investigating the benefits of Turnitin on staff and students.  As with other projects, the decision to adopt existing technologies incorporated through the institutional VLE (in this case, Blackboard) is a pragmatic choice, adopting known and proven technology rather than expending time and resources in developing yet more tools to do the same things.  Being able to pick up such tools as needed greatly increases institutional agility, and provides ready access to existing user groups and a wealth of shared practice.

EBEAM project staff also have a keen awareness of the need for meaningful and effective staff development to enable teaching staff to make full use of new technologies and achieve the integration of new approaches within their teaching practice, a theme covered in several posts on their excellent project blog.  The project will produce a wide range of development materials, including practically-focused toolkits, webinars and screencasts, which will be available through the project site and the JISC Design Studio.  In addition, they’re looking at ways of fully exploiting the extensive amount of data generated by these EAM systems to further enhance teaching and learning support as well as engaging administrative departments in discussions on topics such as data warehousing and change management.

The EBEAM project should provide an excellent study in the benefits of eassessment and of methods of integration that take a holistic approach to institutions and stakeholders.  I’m very much looking forward to seeing the outcomes of their work.

Evaluating feedback for elearning: centralised tutors

Providing fast, focused feedback to a cohort of 200 busy professionals undertaking vocational distance learning, with tuition provided by a diminishing number of tutors, some of whom are part-time, is definitely a challenging undertaking, and one for which the TQFE-Tutor system at the University of Dundee provides an innovative centralised approach.  The Evaluating Feedback for eLearning: Centralised Tutors (EFFECT) project, part of the JISC Assessment and Feedback programme Strand B, will be exploring the impact of this system and considering ways of further refining the process to maximise efficiency and student benefits.

Students studying on the Teaching Qualification (Further Education) programme at Dundee since the start of the 2010-11 session have been supported by a centralised tutor system that enables consistency and timeliness of feedback across the entire programme.  TQFE-Tutor consists of a centralised email account, blog and microblogging site to which all tutors on the course have access.  Rather than students being assigned a personal tutor (who may have as little as 0.1 FTE allocated to the programme), support is provided by the entire team acting through the centralised account.  Students may email the TQFE-Tutor address or post comments via the programme blog, with duty staff picking up queries and assignments as they arrive.  Programme announcements can be disseminated via the programme Twitter account, offering savings in time and potentially in cost.

As well as significantly increasing efficiency – students are guaranteed a response to any submission within two days, and usually receive one much faster – there are more subtle but equally important pedagogic benefits.  Feedback and advice provided to an individual can then be disseminated, suitably anonymised, to the rest of the current cohort via the TQFE-Tutor blog; these entries also remain available for future years.  The accumulation of an effectively tagged bank of data supports independent learning, while peer interaction and support enrich the learning process.  The use of Blackboard SafeAssign in place of paper submission has also helped streamline the assessment process and reduce administrative workloads.  Student achievement rates have risen, and the system may well contribute to increased retention.

You can follow the project’s progress over the next few months via their project blog, from which project outputs will also be available in due course.

Student-generated content for learning

Finding ways of engaging learners and maximising their learning without adding to staff or student workloads, all within constrained institutional budgets, is no small task, but the Student-Generated Content for Learning: Enhancing Engagement, Feedback and Performance (SGC4L) project based at the University of Edinburgh is evaluating the use of technology that seems to do just that.

PeerWise, developed by the Department of Computer Science at the University of Auckland, is a free system designed not only to allow students to develop and publish their own questions but also to support a variety of social activities around those questions.  As well as answering others’ questions to test their own knowledge, students can comment on questions, rate them and develop discussions around them.  Of even more value is the ability for students to develop their own questions, which – as the site explains – provides a number of pedagogic benefits to learners.  Students rapidly developed a strong sense of ownership of the online space and actively maintained and nurtured it, developing a very strongly supportive environment based on collaboration and shared responsibility for information accuracy and quality.

Perhaps one of the greatest strengths of the system is the way in which it can be used outside scheduled class times, as it is an inherently asynchronous model of dialogue and interaction.  This has proven particularly effective in supporting distance and placement students as it provides a very real sense of engagement with their peers and with their academic studies even when out of regular physical contact.

Edinburgh’s use of PeerWise was initially piloted in undergraduate courses in Physics and Biology, and it is being extended to a wider range of courses and subject areas as part of this evaluation work.  In order to engage as many students as possible, a small amount of the overall course mark was allocated to activity in PeerWise, with student-authored questions forming the basis for a question on the final examination of the course.  Scaffolding activities around question design encouraged students to consider their own misunderstandings of course material, and resulted in exceptionally high-quality materials reflecting the depth of learning and quality of engagement.

The project’s website is a rich source of resources around this work, while the team’s blog provides reflection on the day-to-day progress of the project.  SGC4L is funded by JISC as part of the Assessment and Feedback programme Strand B.