The JISC Assessment and Feedback Programme is now in its fifth month, looking at a wide range of technological innovations around assessment and feedback in HE and FE. Strand B focuses on the evaluation of earlier work: gathering and evaluating evidence on the impact of these innovations and producing guidelines and supporting material to help other subject areas and institutions adopt them. The Strand B projects cover a broad range of technologies, but they are not themselves developing new technology; rather, they are examining and reporting the impact of such developments.
The information here was gathered through fairly informal conversations with the projects, building on the information initially provided in their funding applications. Information from these calls is added to our project database (PROD) – you can see some of the amazing uses this information can be put to in the series of blog posts Martin Hawksey has produced as part of his work on visualisations of the OER programme, as well as some of the work by my colleagues David, Sheila and Wilbert.
This blog post is rather less ambitious than their work (!), and is intended to provide a quick snapshot of technologies that projects in this specific programme strand are finding valuable for their work. For more information on the projects in general you can find all my posts on this programme linked from here.
Underlying technologies
Although the underlying technologies – that is, the technologies used by the innovations being evaluated – aren't the direct focus of these projects, I've included them here as they're obviously of interest. They also show the very broad range of approaches and methods being evaluated across the strand.
Several of the projects expressed a strong desire to reuse existing tools and resources such as MS Office and other commercial software, rather than reinvent the wheel by developing new software. The reasons for this were compelling: the cost of training staff on new systems, staff familiarity and comfort with existing systems, and strong pressure from staff, students and management to work within institutional VLEs.
Purpose | Technology
Feedback delivery | MS Word (annotated markup documents); eTMA (electronic tutor-marked assignment) system
Assignment timetables (diaries) | MS Access
VLE | Moodle; Blackboard
Online marking | GradeMark
Student generation of assessment content, with social network functionality | PeerWise
Plagiarism detection | Turnitin; Blackboard SafeAssign
Bug reporting | Pivotal Tracker
Surface tables to improve the online marking process | –
Pen devices to improve the online marking process | –
Managing self-reflection workflow | eReflect
Online learning diary | –
Automated writing technique evaluation tool | Turnitin eRater
Communication with students, course news, deadline reminders | Facebook
Peer assessment tool | PeerMark
Centralised email account, blog and microblog for managing assignment submissions and communicating with students and staff | TQFE-Tutor
Communication with students | Twitter
Blog for discussion of common Q&As and general assignment feedback | WordPress
Webinars | Adobe Connect
EVS (electronic voting system) | –
Evidence gathering
As these projects are about collecting and evaluating evidence, the approaches taken to this are of obvious interest.
There was a strong emphasis on interviewing as the main approach, with audio and video interviews recorded for subsequent analysis and dissemination where appropriate approval has been given; Jing was the main recording tool cited for this. Surveys (which can be considered a kind of asynchronous interview) were also mentioned, with SurveyMonkey being the tool of choice.
Less structured impressions were also sought, with Jing again being cited as a valuable tool for capturing staff and student feedback. Twitter was also mentioned for this purpose.
Evidence analysis
The emphasis of this strand is on qualitative rather than quantitative outcomes, with users’ experiences, case studies and the development of guidance documents and staff development resources being the main focus.
NVivo was cited as the tool of choice for transcribing and coding audio and written feedback for subsequent analysis. Collaborative writing, analysis and version control are the main concerns for this part of the projects, and are being addressed through the use of Google Docs and SharePoint.
Standards referenced
The standards used by projects in this programme are fairly generic. Most of the projects are not using standards such as those produced by IMS, as these were felt to have little relevance to this level of work. The exception is one project looking at IMS Learning Tools Interoperability (LTI) as an approach to integrating its software development with the different VLEs in use across the institutions in its consortium. Beyond this, the standards referenced were unremarkable: primarily MP3 and HTML.
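To make that LTI integration point a little more concrete, here is a minimal sketch of the consumer (VLE) side of an IMS LTI 1.x "basic launch": a form POST signed with OAuth 1.0a (HMAC-SHA1) using a key and secret shared with the tool. This is illustrative only, not the project's actual implementation; the URL, key, secret and parameter values below are all hypothetical.

```python
# Minimal sketch of a VLE initiating an IMS LTI 1.x basic launch.
# The VLE signs a form POST with OAuth 1.0a (HMAC-SHA1) using a consumer key
# and shared secret; the tool verifies the signature before trusting the data.
# All names, URLs and credentials here are hypothetical.
from oauthlib.oauth1 import Client, SIGNATURE_TYPE_BODY

launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "assignment-42",  # identifies the placement in the VLE
    "user_id": "student-123",
    "roles": "Learner",
    "context_id": "course-789",
}

client = Client(
    client_key="example-consumer-key",
    client_secret="example-shared-secret",
    signature_type=SIGNATURE_TYPE_BODY,  # oauth_* params travel in the POST body
)

# sign() returns the URI, headers and form-encoded body, with oauth_signature
# computed over the launch parameters.
uri, headers, body = client.sign(
    "https://tool.example.ac.uk/lti/launch",
    http_method="POST",
    body=launch_params,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
print(body)  # the signed form data the VLE would POST to the tool
```

On the tool side, verifying the oauth_signature with the same shared secret (for example via oauthlib's SignatureOnlyEndpoint) is what lets a single tool plug into Moodle, Blackboard and other VLEs without bespoke integration code for each one.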
Dissemination
All the projects have thorough dissemination plans in place to ensure that their findings are shared as widely as possible. It was great to see that all the projects referenced the JISC Design Studio, a fantastic resource that is well worth digging around in. Overall there is a wide range of technologies being used to ensure that the findings from these projects reach as broad an audience as possible. Again, there is a clear mix between established, proprietary software and free services, reflecting the range of technologies in use within institutions and the different institutional contexts of these projects.