The e-Feedback Evaluation Project

Assessment of language learning presents some unique challenges for both teaching staff and learners. Regular practice of spoken and written language production is a vital part of language training, and it requires a significant amount of ongoing feedback to support the acquisition of competence in the subject. In a distance learning context in particular, but equally in any setting where feedback is provided asynchronously rather than face-to-face, giving meaningful feedback on spoken work is especially challenging, often requiring spoken feedback to correct pronunciation and structural errors.

There have been a number of exciting projects around audio feedback in recent years, including the Optimising Audio Feedback project at Aberystwyth University, Sounds Good at Leeds (both funded by JISC) and Audio Supported Enhanced Learning, a collaboration between the Universities of Bradford and Hertfordshire.  The focus of the e-Feedback Evaluation Project (eFEP), however, is the impact of combining spoken and written feedback on language learning.

The eFEP project is led by the Department of Languages at The Open University, an institution with unique experience in providing language training through distance learning, a large part of which involves teaching through both formative and summative assessment and feedback.  The OU has a mature and robust eTMA (electronic tutor marked assignment) system which supports assessment across the institution, and provides feedback either via MP3 files or marked-up MS Word documents, as appropriate for the individual assignment.  Each form of feedback is supplemented with an HTML form (an example of which can be seen on the poster submitted by the project to the programme’s startup meeting) containing administrative information, marks awarded and additional feedback.
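
By way of illustration only, the sketch below models the kind of feedback bundle described above – a spoken or written attachment plus a summary form carrying the administrative details and mark. All the field names are hypothetical; they are not those of the actual eTMA system.

```python
from dataclasses import dataclass
from typing import Literal

# Hypothetical model of an eTMA-style feedback bundle: one attachment
# (spoken feedback as MP3, or a marked-up Word document) plus a summary
# form carrying administrative details, the mark and overall comments.
@dataclass
class FeedbackBundle:
    student_id: str
    assignment_id: str
    attachment_path: str                     # e.g. "feedback.mp3" or "marked.docx"
    attachment_type: Literal["mp3", "docx"]  # spoken or written feedback
    mark: int                                # mark awarded
    summary_comments: str                    # additional feedback from the summary form

bundle = FeedbackBundle(
    student_id="A1234567", assignment_id="L120-TMA01",
    attachment_path="TMA01_feedback.mp3", attachment_type="mp3",
    mark=72, summary_comments="Good fluency; watch adjective agreement.",
)
print(f"{bundle.assignment_id}: {bundle.mark}% via {bundle.attachment_type}")
```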

The project will examine the ways in which students and tutors interact and engage with their feedback, identify common perceptions and issues, and recommend areas requiring further support and guidelines for good practice.  In order to examine the applicability of this feedback approach in traditional settings, the project will also look at the impact of audio feedback in Italian modules at the University of Manchester.

Its insight into the use of audio feedback across a variety of environments, together with the range of training and support materials to be produced, should make eFEP a valuable contribution to our understanding of audio feedback, as well as a source of clear practical guidance for those considering adopting it.

Online Coursework Management Evaluation

The University of Exeter has developed an entirely online end-to-end coursework management system which is the subject of the Online Coursework Management Evaluation (OCME) project funded by JISC as part of the Assessment and Feedback programme Strand B.

The system integrates Moodle and Turnitin within the university’s Exeter Learning Environment (ELE).  Assignments are submitted through the ELE, assigned an originality score by Turnitin, and then made available for marking through GradeMark (a commercial online marking system within Turnitin) or MS Word markup.  Feedback is returned to students either via uploaded forms or bespoke feedback forms, and is made available for viewing by both the individual student and the personal tutor assigned to support them.  Initially deployed through a small 2011 pilot project funded by HEFCE, the system is now available institution-wide, although for practical reasons this evaluation project will concentrate on working with smaller groups across various disciplines.
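
To make the workflow easier to picture, here is a minimal sketch of that submission-to-feedback progression as a simple state machine. It is purely illustrative – the class and state names are my own, not those of the Exeter/ULCC implementation.

```python
from enum import Enum, auto

# Hypothetical stages an assignment might pass through in an online,
# end-to-end coursework workflow like the one described above.
class Stage(Enum):
    SUBMITTED = auto()            # uploaded through the VLE
    ORIGINALITY_CHECKED = auto()  # originality score returned by the text-matching service
    MARKED = auto()               # marked online or via document mark-up
    FEEDBACK_RELEASED = auto()    # feedback visible to the student and their personal tutor

class Submission:
    def __init__(self, student: str, module: str) -> None:
        self.student, self.module = student, module
        self.stage = Stage.SUBMITTED
        self.originality_score: float | None = None
        self.feedback: str | None = None

    def record_originality(self, score: float) -> None:
        self.originality_score = score
        self.stage = Stage.ORIGINALITY_CHECKED

    def record_marking(self, feedback: str) -> None:
        self.feedback = feedback
        self.stage = Stage.MARKED

    def release_feedback(self) -> None:
        # Feedback becomes visible to the student and the personal tutor.
        self.stage = Stage.FEEDBACK_RELEASED

sub = Submission("student-001", "MOD1001")
sub.record_originality(12.5)
sub.record_marking("Well structured; referencing needs attention.")
sub.release_feedback()
print(sub.stage)  # Stage.FEEDBACK_RELEASED
```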

Exeter’s Moodle support is provided by the University of London Computer Centre, who are developing the interface between Moodle and Turnitin.  There is strong internal support for the system, which will be maintained and further developed well beyond the lifetime of this one-year project.  What the OCME project will provide is a series of reports and briefing papers exploring the pedagogic, technological and institutional aspects of transforming practice, together with guidelines for future implementers and for those considering introducing such transformative technologies within their own institutions.  The experiences and lessons learned from this project should be of value across the sector.

Evaluating the Benefits of Electronic Assessment Management

Examining the embedding of electronic assessment management (EAM) within both administrative and teaching and learning practice is the main focus of the Evaluating the Benefits of Electronic Assessment Management (EBEAM) project running at the University of Huddersfield as part of the JISC Assessment and Feedback programme Strand B.  This 18-month project will look at how Turnitin, incorporating GradeMark and eRater, addresses student, staff and institutional requirements for timely, individualised and focused feedback, reduced staff workloads, increased reflection on practice, and cost-effective, scalable and sustainable innovation.

The dual focus on administrative and pedagogic aspects is crucial for real uptake of any new technology or process.  By providing a supportive administrative and technological infrastructure, institutions can enable academic staff to fully realise the benefits of innovative systems and practice, and provide a significantly enhanced learning environment for students.  The dynamic interplay of these factors is vividly illustrated in the poster the project submitted for the programme kick-off meeting.  The impact on student satisfaction, achievement and retention rates already apparent at Huddersfield reflects the success of such an approach.

Like the Evaluation of Assessment Diaries and GradeMark at the University of Glamorgan project, EBEAM is grounded in previous evaluation work investigating the impact of Turnitin on staff and students.  As with other projects, the decision to adopt existing technologies delivered through the institutional VLE (in this case, Blackboard) is a pragmatic one, building on known and proven technology rather than expending time and resources on developing yet more tools to do the same things.  Being able to pick up such tools as needed greatly increases institutional agility, and provides ready access to existing user groups and a wealth of shared practice.

EBEAM project staff also have a keen awareness of the need for meaningful and effective staff development to enable teaching staff to make full use of new technologies and achieve the integration of new approaches within their teaching practice, a theme covered in several posts on their excellent project blog.  The project will produce a wide range of development materials, including practically-focused toolkits, webinars and screencasts, which will be available through the project site and the JISC Design Studio.  In addition, they’re looking at ways of fully exploiting the extensive amount of data generated by these EAM systems to further enhance teaching and learning support as well as engaging administrative departments in discussions on topics such as data warehousing and change management.

The EBEAM project should provide an excellent study in the benefits of eassessment and of methods of integration that take a holistic approach to institutions and stakeholders.  I’m very much looking forward to seeing the outcomes of their work.

Evaluating feedback for elearning: centralised tutors

Providing fast, focused feedback to a cohort of 200 busy professionals undertaking vocational distance learning, with tuition provided by a diminishing number of tutors, several of whom are part-time, is a challenging undertaking, and one for which the TQFE-Tutor system at the University of Dundee provides an innovative centralised approach.  The Evaluating Feedback for eLearning: Centralised Tutors (EFFECT) project, part of the JISC Assessment and Feedback programme Strand B, will be exploring the impact of this system and considering ways of further refining the process to maximise efficiency and student benefits.

Students studying on the Teaching Qualification (Further Education) programme at Dundee since the start of the 2010-11 session have been supported by a centralised tutor system that enables consistency and timeliness of feedback across the entire programme.  TQFE-tutor consists of a centralised email account, blog and microblogging site to which all tutors on the course have access.  Rather than students being assigned a personal tutor (who may have as little as 0.1 FTE allocated to the programme), support is provided by the entire team acting through the centralised account.  Students may email the TQFE-Tutor email address or post comments via the programme blog, with duty staff picking up queries and assignments as they arrive.  Programme announcements can be disseminated via the programme Twitter account, offering time and potentially cost savings.

As well as significantly increasing efficiency – students are guaranteed a response to any submission within two days, and usually receive one much faster – there are more subtle but equally important pedagogic benefits.  Feedback and advice provided to an individual can then be disseminated, suitably anonymised, to the rest of the current cohort via the TQFE-Tutor blog; these entries also remain available for future years.  The accumulation of an effectively tagged bank of data supports independent learning, while peer interaction and support enrich the learning process.  The use of Blackboard SafeAssign in place of paper submission has also helped streamline the assessment process and reduce administrative workloads.  Student achievement rates have risen, and the system may well contribute to increased retention.
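
A tagged bank of reusable feedback like this maps onto a very small data store. The sketch below (all names hypothetical, and no relation to the actual TQFE-Tutor setup) shows one way anonymised entries might be tagged when published and retrieved by topic for later cohorts.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical store of anonymised feedback entries, tagged for reuse:
# advice given to one student can be republished for the whole cohort
# and retrieved by topic in later years.
@dataclass
class FeedbackEntry:
    text: str
    tags: set[str]
    published: date

class FeedbackBank:
    def __init__(self) -> None:
        self.entries: list[FeedbackEntry] = []

    def publish(self, text: str, tags: set[str]) -> None:
        # Assumes identifying details have already been removed from `text`.
        self.entries.append(FeedbackEntry(text, tags, date.today()))

    def by_tag(self, tag: str) -> list[FeedbackEntry]:
        return [e for e in self.entries if tag in e.tags]

bank = FeedbackBank()
bank.publish("Remember to relate your reflection to the unit outcomes.",
             {"reflection", "unit2"})
print(len(bank.by_tag("reflection")))  # -> 1
```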

You can follow the project’s progress over the next few months via their project blog, from which project outputs will also be available in due course.

Student-generated content for learning

Finding ways of engaging learners and maximising their learning, without adding to staff or student workloads and within constrained institutional budgets, is no small task, but the Student-Generated Content for Learning: Enhancing Engagement, Feedback and Performance (SGC4L) project based at the University of Edinburgh is evaluating the use of technology that seems to do just that.

PeerWise, developed by the Department of Computer Science at the University of Auckland, is a free system designed not only to allow students to develop and publish their own questions but also to support a variety of social activities around those questions.  As well as simply answering others’ questions to test their own knowledge, students can also comment on questions, rate them and develop discussions around questions.  Of even more value is the ability for students to develop their own questions which – as the site explains – provides a number of pedagogic benefits to learners.  Students rapidly developed a strong sense of ownership of the online space and actively maintained and nurtured it, developing a very strongly supportive environment based on collaboration and shared responsibility for information accuracy and quality.
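
The social layer PeerWise builds around each question – answers, ratings and comment threads – suggests a natural data model. The sketch below is a hypothetical illustration of that shape rather than PeerWise’s actual schema.

```python
from dataclasses import dataclass, field

# Hypothetical model of the social activity around a student-authored question:
# classmates can answer it, rate it and discuss it in comments.
@dataclass
class Question:
    author: str
    stem: str
    options: list[str]
    correct_index: int
    answers: dict[str, int] = field(default_factory=dict)          # student -> chosen option
    ratings: list[int] = field(default_factory=list)                # e.g. 0-5 quality ratings
    comments: list[tuple[str, str]] = field(default_factory=list)   # (student, comment text)

    def answer(self, student: str, choice: int) -> bool:
        self.answers[student] = choice
        return choice == self.correct_index

    def average_rating(self) -> float:
        return sum(self.ratings) / len(self.ratings) if self.ratings else 0.0

q = Question("s1", "Which force keeps a satellite in orbit?",
             ["Friction", "Gravity", "Magnetism"], correct_index=1)
q.answer("s2", 1)
q.ratings.append(5)
q.comments.append(("s3", "Nice distractors - maybe add air resistance?"))
print(q.average_rating())
```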

Perhaps one of the greatest strengths of the system is the way in which it can be used outside scheduled class times, as it is an inherently asynchronous model of dialogue and interaction.  This has proven particularly effective in supporting distance and placement students as it provides a very real sense of engagement with their peers and with their academic studies even when out of regular physical contact.

Edinburgh’s use of PeerWise was initially piloted in undergraduate courses in Physics and Biology, and it is being extended to a wider range of courses and subject areas as part of this evaluation work.  In order to engage as many students as possible, a small amount of the overall course mark was allocated to activity in PeerWise, with student-authored questions forming the basis for a question on the final examination of the course.  Scaffolding activities around question design encouraged students to consider their own misunderstandings of course material and resulted in exceptionally high-quality materials reflecting the depth of learning and quality of engagement.

The project’s website is a rich source of resources around this work while the team’s blog provides reflection on the day-to-day progress of the project.  SGC4L is funded by JISC as part of the Assessment and Feedback programme Strand B.

The evaluation of assessment diaries and GradeMark at the University of Glamorgan

Two major, institution-wide innovations introduced in recent years at the University of Glamorgan are the subject of this project, funded as part of the JISC Assessment and Feedback Programme Strand B.

Arising from a Change Academy project that ran from 2008 to 2010, assessment diaries for scheduling and planning assessment and GradeMark for online marking have been adopted across the institution to varying extents within different schools and faculties.  This new JISC project will examine the reasons for that variation in adoption, explore staff and student experiences of these technologies, and consider strategies for staff development to encourage wider uptake.

The assessment diary system is a very simple but elegant approach to the issue of assessment bunching, identified by Glamorgan students as a major problem undermining learning and assessment performance.  Initially the problem was addressed simply by sharing assessment due dates for all courses widely via staff Outlook calendars, but considerable development work since then has resulted in an MS Access database backend with a web interface in Blackboard.  Seamless integration with Blackboard through a building block provides a single point of access to this information, which can be fully personalised by both students and staff.  This provides a highly visual way of understanding the rhythms of course workload and has been very successful in helping students manage their time and plan their work in relation to assessment deadlines.  Staff have also found that the ease of access to detailed information on course timings has facilitated dialogue across a range of courses and helped them reflect more effectively on the student learning experience when redesigning or rescheduling modules.  Those departments that were early adopters of the diaries have seen significant improvement in their National Student Survey scores for feedback, and the diaries are also believed to have had a positive impact on student retention rates.
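
The underlying check – spotting several deadlines landing in the same week – is straightforward once due dates are held in one place. The snippet below is a minimal, purely illustrative sketch of that kind of bunching check; it bears no relation to the actual Access/Blackboard implementation, and the module names and dates are made up.

```python
from collections import defaultdict
from datetime import date

# Hypothetical deadline data for one student's modules.
deadlines = {
    "Module A essay": date(2012, 3, 12),
    "Module B report": date(2012, 3, 14),
    "Module C presentation": date(2012, 3, 16),
    "Module D exam": date(2012, 5, 21),
}

# Group deadlines by ISO week and flag any week with more than one:
# the "bunching" students reported as undermining their performance.
by_week: dict[tuple[int, int], list[str]] = defaultdict(list)
for name, due in deadlines.items():
    iso = due.isocalendar()
    by_week[(iso[0], iso[1])].append(name)

for (year, week), items in sorted(by_week.items()):
    if len(items) > 1:
        print(f"Week {week} of {year} is bunched: {', '.join(items)}")
```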

By contrast, GradeMark is a commercial online marking tool within Turnitin that is seeing increasing adoption in the UK.  In use across the University of Glamorgan since 2009, it is seen as addressing a number of student concerns about feedback, including timeliness, level of detail and, importantly, the development and maintenance of a meaningful dialogue between students and teachers, and amongst teachers themselves.  Both students and staff responded very positively to the tool in an initial evaluation of its impact, and the current project will be able to explore that impact in greater detail.

Project outputs will include a series of video interviews with students and both teaching and administrative staff which will be freely available as OERs on YouTube, together with Panopto recordings of staff development and training sessions for asynchronous viewing.  The team is also exploring the use of online avatars for staff development discussions and scenario roleplaying, an exciting approach that has worked very successfully at the University of Hong Kong and which I hope to follow up in a later post.

The project’s blog is well worth a read to see how the team go about their evaluation and some of the issues they encounter on their journey.

Making Assessment Count Evaluation project

The Making Assessment Count (MAC) project ran from November 2008 to October 2010, funded by JISC as part of their Transforming Curriculum Delivery through Technology programme and led by the University of Westminster.  Focused on the desire to engage students with the assessment feedback provided to them, it explored processes for encouraging and guiding student reflection on feedback and for developing a dialogue between learners and teachers around it.  Participants at a joint Making Assessment Count/JISC CETIS event on Assessment and Feedback back in February 2011 heard not only from project staff but also from students who were actively using the system, and whose enthusiasm for it and recognition of its impact on their development as learners were genuinely inspiring.

Although the project team developed the eReflect tool to support the Making Assessment Count process, the primary driver of the project was the conceptual model underlying the tool; the development of eReflect was a pragmatic decision based on the lack of suitable alternative technologies.  The team also contributed a service usage model (SUM) describing their feedback process to the eFramework.

The Making Assessment Count Evaluation (MACE) project will see the Westminster team overseeing the implementation of the MAC model in a number of partner institutions, with the eReflect tool in use at the Universities of Bedfordshire and Greenwich and at Cardiff Metropolitan University (formerly the University of Wales Institute Cardiff).  By contrast, City University London and the University of Reading will focus on implementing the MAC model within Moodle (City) and Blackboard (Reading), exploring how the components already provided within those VLEs can be used to support this very different process and workflow.

It is perhaps the experiences of City and Reading that will be of most interest.  It’s becoming increasingly evident that there are very strong drivers for using existing technologies, particularly extensive systems such as VLEs, for new activities: there is already clear institutional buy-in and commitment to these technologies, institutional processes will (or at least should!) have been adapted to reflect their integration in teaching practice, and embedding innovative practice within these technologies increases its sustainability beyond a limited funding or honeymoon period.  The challenges around getting MAC into Moodle and Blackboard are those that led to the need for the eReflect tool in the first place: traditional assessment and survey technology simply isn’t designed to accommodate the dialogue around feedback, and the student engagement with it, that the MAC process is intended to drive.  Of course, Moodle and Blackboard, representing community-driven open source software and proprietary commercial systems respectively, present very different factors in integrating new processes, and the project teams’ experiences and findings should be of great interest to future developers more generally.

Deployment of the MAC process in such a wide range of institutions, subject areas and technologies will provide a very thorough test of the model’s suitability and relevance for learners, as well as providing a range of implementer experiences and guidance.  I’m looking forward to following the progress of what should be a fascinating implementation and evaluation project, and hope to see learners in other institutions engage as enthusiastically and with such good outcomes as the participants in the original MAC project.

Under development: SWANI

Anyone who’s ever worked on a European-funded project or programme will be all too familiar with the volume of paperwork and the time spent on administration and auditing to meet European funding and reporting requirements.  Digital signatures, although highly time- and cost-efficient, are not acceptable for auditing purposes; only hand-signed documentation is permitted.

As part of a consortium providing a significant amount of European-funded work-based learning in Wales, Coleg Sir Gâr were keen to find a solution that would meet both European and Welsh Assembly Government requirements for handwritten signatures while providing the elegance and efficiency of the online learner management and learner support systems that colleges and tutors wished for.

The Secure Work-Based Learning Administration through Networked Infrastructure (SWANI) project, funded under the JISC Learning and Teaching Innovation Grants SWaNI FE programme, therefore set out to identify ways of addressing this tension and to establish a pilot as a proof of concept that could form the basis of a long-term solution.

After some research, the project team settled on the Fastdox digital document system as offering exactly the combination of hand-signed originals and timestamped digital copies necessary to meet the needs of all parties.

The documents to be signed are created in a MySQL database supported by a very user-friendly and remotely accessible web interface.  These are then printed using the Fastdox software, which applies a unique pattern of microscopic dots to the physical document to communicate with the digital pen.  The pen functions just like an ordinary pen, allowing trainers to sign the documents normally and so produce the required hand-signed physical document, but it also stores all the written information, timestamped, for later downloading into the online learner management and auditing system.  An excellent overview of the entire process is available from the product site itself, and an exploration of how it was put into practice can be found on the project’s blog.  At between £400 and £500 for each pen and software package, it represents a one-time investment that fulfils a long-term requirement, requires little training for tutors to use and meets all the requirements the project set out to address.  Indeed, the biggest problem the project team ran into was the lack of standardisation in documents across WBL providers, and changes to the document design part way through the project which required some revision.
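
To make the audit trail concrete, here is a small, entirely hypothetical sketch of the kind of record such a workflow might produce – a hand-signed paper original paired with a timestamped digital copy. It illustrates the idea only and is not the Fastdox data format.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical record pairing the hand-signed paper form with its
# timestamped digital copy, as captured by a digital pen workflow.
@dataclass
class SignedDocument:
    learner_id: str
    form_id: str          # identifies the printed form (unique dot pattern)
    signed_by: str
    signed_at: datetime   # timestamp stored by the pen
    pen_data_file: str    # captured pen strokes, downloaded for auditing

    def audit_line(self) -> str:
        return (f"{self.form_id}: signed by {self.signed_by} "
                f"for learner {self.learner_id} at {self.signed_at:%Y-%m-%d %H:%M}")

doc = SignedDocument("WBL-0042", "FORM-2011-0137", "J. Trainer",
                     datetime(2011, 11, 3, 14, 25), "strokes_0137.dat")
print(doc.audit_line())
```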

With the pilot now coming to a close, the project team will be adding further information to the project website and undertaking a series of dissemination activities.  Their solution should be useful not only to FE colleges with similar funding and auditing requirements but for anyone looking for efficient and effective digital document management and tracking.

Under development: xGames

The xGames project, a collaboration between Reid Kerr and Anniesland colleges, has been running for nearly a year and is in the final stages of piloting its innovative use of wireless Xbox 360 controllers for classroom quizzes.  Funded as part of the JISC Learning and Teaching Innovation Grants: SWaNI (Scotland, Wales and Northern Ireland) FE programme, the project has produced a highly user-friendly question editor that allows complete novices in quiz and game design to author questions easily.  These questions can then be played in one of several games designed by the project, displayed on a large screen linked to a standard Windows PC fitted with USB receivers for up to four wireless Xbox controllers.  Using wireless controllers is crucial, as the range of the sensors allows a great deal of flexibility in classroom set-up, permitting the use of breakout groups to discuss topics and feedback, for example.  Additionally, Xbox controllers are familiar to many learners, who are more confident using them than PC-based controls.

The video below demonstrates the system’s use in a primary school classroom, and the engagement and enthusiasm of the children is immediately obvious: there are lively discussions about the quiz questions and clear enjoyment of the session, with the immediate indication of correct and incorrect answers providing instant feedback to the pupils.  The large screen allows the teacher to maintain a clear overview of the progress of the entire class, helping her to identify topics that are generally not understood and require whole-class revision, as well as struggling individuals within the group.  Discussion amongst the older group of college students is more muted, but their focus on the game mechanics and subject matter is evident.

[youtube]http://www.youtube.com/watch?v=ZRZZj1u9KQ0[/youtube]

The games, screenshots of which can be found under the games menu on the project site (the software will be available from the site in due course), are designed using industry-standard software such as XNA Game Studio, 3D Studio Max, Fireworks and Illustrator, while the question editor uses a Visual Basic form to generate plain text files containing the question stem, distractors and correct response.  Unlike in a commercial system such as Quia, questions are stored in a shared public folder so they can easily be shared and reused by teachers in different institutions.  xGames has generated interest from FE and, particularly, schools, and may well see further uptake as an affordable and easily adopted way of bringing game-based learning into more classrooms.
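
Because the question files are just structured plain text (a stem, a correct response and some distractors), they are trivial to generate and to parse. The sketch below assumes a simple hypothetical layout – one field per line – rather than the project’s actual file format.

```python
from dataclasses import dataclass

# Hypothetical plain-text layout: stem on the first line, the correct
# answer on the second, then one distractor per remaining line.
SAMPLE = """What is the capital of Wales?
Cardiff
Swansea
Newport
Bangor
"""

@dataclass
class QuizQuestion:
    stem: str
    correct: str
    distractors: list[str]

def parse_question(text: str) -> QuizQuestion:
    lines = [line.strip() for line in text.splitlines() if line.strip()]
    return QuizQuestion(stem=lines[0], correct=lines[1], distractors=lines[2:])

q = parse_question(SAMPLE)
print(q.stem, "->", q.correct, "| distractors:", ", ".join(q.distractors))
```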

JISC Assessment and Feedback Programme Strand C

The final part of the current JISC Assessment and Feedback Programme, Strand C provides support for technical development work to ‘package a technology innovation in assessment and feedback for re-use (with associated processes and practice), and support its transfer to two or more named external institutions’.  This will see a number of innovative systems that have reached sufficient maturity, including some developed over recent years with direct support from JISC, adopted outside their originating institutions and used to directly enhance teaching and learning.

The Open Mentor Technology Transfer (OMtetra) project will see The Open University’s Open Mentor system packaged and transferred to the University of Southampton and King’s College London.  This unique system profiles tutor feedback to support the tutor’s reflection on the feedback she provides, enhancing both the quality of the feedback students receive and the tutor’s own professional development and understanding.

QTI Delivery Interaction (QTIDI), led by the University of Glasgow in partnership with the Universities of Edinburgh, Southampton and Strathclyde, Kingston University and Harper Adams University College, will package the JISC-funded MathAssessEngine – itself a development of earlier JISC-funded work – for deployment in the partner institutions via Moodle and other VLEs, using a thin Learning Tools Interoperability (LTI) layer to launch assessments from within the VLE and return scores to the VLE gradebook.  As the name suggests, the system will be fully compliant with the IMS Question and Test Interoperability (QTI) v2.1 specification.
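
As a rough sketch of what such a thin layer shuttles back and forth, the snippet below assembles the kind of LTI 1.1 launch parameters a VLE sends to an external tool and converts an engine score into the 0.0–1.0 result the LTI outcomes service expects. The real launch is an OAuth-signed POST (omitted here), the values shown are invented, and this is emphatically not the QTIDI code.

```python
# Rough sketch of the data a thin LTI 1.1 layer shuttles between a VLE and
# an external assessment engine. The real launch is an OAuth-signed form
# POST; signing is omitted here and all values are hypothetical.
launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "maths-quiz-3",       # which assessment to launch
    "user_id": "student-123",
    "roles": "Learner",
    # Where and how to post the result back to the VLE gradebook:
    "lis_outcome_service_url": "https://vle.example.ac.uk/lti/outcomes",
    "lis_result_sourcedid": "course42:maths-quiz-3:student-123",
}

def score_to_outcome(raw_score: float, max_score: float) -> dict:
    """Convert an engine score into the 0.0-1.0 result LTI outcomes expect."""
    return {
        "sourcedid": launch_params["lis_result_sourcedid"],
        "result": round(raw_score / max_score, 4),
    }

print(score_to_outcome(7, 10))  # -> {'sourcedid': '...', 'result': 0.7}
```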

Uniqurate is led by Kingston University, working in collaboration with the Universities of Southampton and Strathclyde, to produce and share a high-quality QTI assessment content authoring system.  Again building on extensive previous work in this area, the focus of this project will be to produce user-friendly interfaces that allow newcomers to eassessment, and those unfamiliar with QTI, to create high-quality interoperable content easily and confidently.  As a sister project to QTIDI, the two project teams are working closely together to provide a suite of tools for interoperable eassessment.

The Rogo OSS project will see the University of Nottingham package their in-house eassessment system and support its implementation at De Montfort University and the Universities of East Anglia, Bedford, Oxford and the West of Scotland.  This is an extensive and mature open source system, with over seven years of development and deployment behind it, offering a wide range of question types, QTI import and export, support for LaTeX, foreign languages, accessibility and a range of multimedia formats, LDAP authentication, invigilator support, and embedded workflows covering the range of activities within the eassessment lifecycle.

All four projects represent an exciting stage in the work JISC has been funding, directly and indirectly, for several years, and will see these tools become available to the wider, non-specialist HE community, supported by detailed resources and lively, engaged user communities.