Cetis Blogs - expert commentary on educational technology » jiscassess
http://blogs.cetis.org.uk
Specialists in educational technology and standards
Tue, 12 May 2015 11:45:38 +0000

Learning Analytics for Assessment and Feedback Webinar, 15 May
http://blogs.cetis.org.uk/sheilamacneill/2013/05/13/learning-analytics-for-assessment-and-feedback-webinar-15-may/
Mon, 13 May 2013 12:22:29 +0000

**update 16 May**
Link to session recording

Later this week I’ll be chairing a (free) webinar on Learning Analytics for Assessment and Feedback, featuring work from three projects in the current Jisc Assessment and Feedback Programme. I’m really looking forward to hearing first hand about the different approaches being developed across the programme.

“The concept of learning analytics is gaining traction in education as an approach to using learner data to gain insights into different trends and patterns but also to inform timely and appropriate support interventions. This webinar will explore a number of different approaches to integrating learning analytics into the context of assessment and feedback design; from overall assessment patterns and VLE usage in an institution, to creating student facing workshops, to developing principles for dashboards.”
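The intervention logic described in that blurb can be illustrated with a toy sketch (all data, names and thresholds below are invented for illustration, not drawn from any of the projects): flag students whose recent VLE activity drops below a level that might warrant a timely support intervention.

```python
# Illustrative only: a toy "early warning" rule of the kind learning
# analytics dashboards build on. Data and threshold are invented.
weekly_vle_logins = {
    "student_a": [5, 4, 6, 5],
    "student_b": [3, 1, 0, 0],
    "student_c": [2, 2, 3, 1],
}

def flag_for_intervention(logins_by_student, recent_weeks=2, threshold=1):
    """Flag students whose average logins over the most recent weeks
    fall below the threshold."""
    flagged = []
    for student, weeks in logins_by_student.items():
        recent = weeks[-recent_weeks:]
        if sum(recent) / len(recent) < threshold:
            flagged.append(student)
    return flagged

print(flag_for_intervention(weekly_vle_logins))  # ['student_b']
```

A real dashboard would of course combine many more signals (assessment patterns, submission timings) and present them to tutors rather than act automatically.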

The presentations will feature current thinking and approaches from teams from the following projects:
* TRAFFIC, Manchester Metropolitan University
* EBEAM, University of Huddersfield
* iTeam, University of Hertfordshire

The webinar takes place Wednesday 15 May at 1pm (UK time) and is free to attend. A recording will also be available after the session. You can register by following this link.

The post Learning Analytics for Assessment and Feedback Webinar, 15 May appeared first on Cetis Blogs - expert commentary on educational technology.

Question and Test tools demonstrate interoperability
http://blogs.cetis.org.uk/wilbert/2012/03/16/question-and-test-tools-demonstrate-interoperability/
Fri, 16 Mar 2012 13:32:31 +0000

As the QTI 2.1 specification gets ready for final release, and new communities start picking it up, conforming tools demonstrated their interoperability at the JISC – CETIS 2012 conference.

The latest version of the world’s only open computer aided assessment interoperability specification, IMS’ QTI 2.1, has been in public beta for some time. That was time well spent, because it allowed groups from at least eight nations across four continents to apply it to their assessment tools and practices, surface shortcomings with the spec, and fix them.
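For readers unfamiliar with the specification, a QTI 2.1 item is an XML document describing a question, its interactions and its response processing. The sketch below is a hand-written illustration (not one of the conference test packages): a minimal single-choice item, parsed with Python’s standard library to extract the declared correct response. The element names follow the QTI 2.1 default namespace.

```python
import xml.etree.ElementTree as ET

# Namespace used by QTI 2.1 item documents.
QTI = "http://www.imsglobal.org/xsd/imsqti_v2p1"

# A minimal, hand-written single-choice item (illustrative only).
ITEM_XML = f"""<assessmentItem xmlns="{QTI}" identifier="choice-demo"
    title="Capital city" adaptive="false" timeDependent="false">
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse>
      <value>ChoiceA</value>
    </correctResponse>
  </responseDeclaration>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="true" maxChoices="1">
      <prompt>Which city is the capital of France?</prompt>
      <simpleChoice identifier="ChoiceA">Paris</simpleChoice>
      <simpleChoice identifier="ChoiceB">Lyon</simpleChoice>
    </choiceInteraction>
  </itemBody>
</assessmentItem>"""

def correct_response(item_xml: str) -> str:
    """Return the choice identifier declared as the correct response."""
    root = ET.fromstring(item_xml)
    value = root.find(
        f"./{{{QTI}}}responseDeclaration/{{{QTI}}}correctResponse/{{{QTI}}}value"
    )
    return value.text

print(correct_response(ITEM_XML))  # ChoiceA
```

Because items are plain XML against a published schema, a player from one vendor can render and score an item authored in another vendor’s editor, which is exactly what the interoperability demonstrations exercised.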

Nine of these groups came together at the JISC – CETIS conference in Nottingham this year to test a range of QTI packages with their tools, ranging from the very simple to the increasingly specialised. In the event, only three interoperability bugs were uncovered in the tools, and those are being vigorously stamped on right now.

Where it gets more complex is who supports what part of the specification. The simplest profile, provisionally called CC QTI, was supported by all players and some editors in the Nottingham bash. Beyond that, it’s a matter of particular communities matching their needs to particular features of the specification.

In the US, the Accessible Portable Item Profile (APIP) group brings together major test and tool vendors that are building a profile for summative testing in schools. Their major requirement is the ability to finely adjust the presentation of questions to learners with diverse needs, which they have accomplished by building an extension to QTI 2.1. The material also works in QTI tools that haven’t been built explicitly for APIP yet.

A similar group has sprung up in the Netherlands, where the goal is to define all computer aided high stakes school testing in the country in QTI 2.1. That means that a fairly large infrastructure of authoring tools and players is being built at the moment. Since the testing material covers so many subjects and levels, there will be a series of profiles to cover them all.

An informal effort has also sprung up to define a numerate profile for higher education, which may yet be formalised. In practice, it already works in the tools made by the French MOCAH project and the JISC Assessment and Feedback sponsored QTI-DI and Uniqurate projects.

For the rest of us, it’s likely that IMS will publish something very like the already proven CC QTI as the common core profile that comes with the specification.

More details about the tools that were demonstrated are available at the JISC – CETIS conference pages.

The post Question and Test tools demonstrate interoperability appeared first on Cetis Blogs - expert commentary on educational technology.

Technologies in use in the JISC Assessment and Feedback programme Strand B (evidence and evaluation)
http://blogs.cetis.org.uk/rowin/2012/02/09/strand-b-synthesis/
Thu, 09 Feb 2012 12:31:42 +0000

The JISC Assessment and Feedback Programme is now in its fifth month, looking at a wide range of technological innovations around assessment and feedback in HE and FE. Strand B is focused on the evaluation of earlier work, gathering and evaluating evidence on the impact of these innovations and producing guidelines and supporting material to facilitate their adoption in other subject areas and institutions. These projects cover a broad range of technologies, but are not themselves involved in technological development; rather, they examine and report on the impact of such developments.

The information here was gathered through fairly informal conversations with the projects, building on the information initially provided in their funding applications.  Information from these calls is added to our project database (PROD) – you can see some of the amazing uses this information can be put to in the series of blog posts Martin Hawksey has produced as part of his work on visualisations of the OER programme, as well as some of the work by my colleagues David, Sheila and Wilbert.

This blog post is rather less ambitious than their work (!), and is intended to provide a quick snapshot of technologies that projects in this specific programme strand are finding valuable for their work.  For more information on the projects in general you can find all my posts on this programme linked from here.

Underlying technologies

Although the underlying technologies – that is, the technologies used by the innovations they’re evaluating – aren’t the direct focus of these projects, I’ve included them as they’re obviously of interest. They also show the very broad range of approaches and methods being evaluated by these projects.

Several of the projects expressed a strong desire to reuse existing tools and resources, such as MS Office and other commercial software solutions, rather than reinvent the wheel by developing new software. There were also compelling reasons to do so: the cost of staff training for new systems, staff familiarity and comfort with existing systems, and strong pressure from staff, students and management to work within institutional VLEs.

Purpose and technology:

* Feedback delivery: MS Word (annotated markup documents); eTMA (electronic tutor marked assignment) system
* Assignment timetables (diaries): MS Access
* VLE: Moodle; Blackboard
* Online marking: GradeMark
* Student generation of assessment content, with social network functionality: PeerWise
* Plagiarism detection: Turnitin; Blackboard Safe Assign
* Bug reporting: Pivotal Tracker
* Improving the online marking process: surface tables; pen devices
* Managing self-reflection workflow: eReflect
* Online learning diary
* Automated writing technique evaluation: Turnitin eRater
* Communication with students, course news, deadline reminders: Facebook
* Peer assessment: PeerMark
* Centralised email account, blog and microblog for managing assignment submissions and communicating with students and staff: TQFE-Tutor
* Communication with students: Twitter
* Blog for discussion of common Q&As and general assignment feedback: WordPress
* Webinars: Adobe Connect
* EVS

Evidence gathering

As these projects are about collecting and evaluating evidence, the approaches taken to this are of obvious interest.

There was a strong emphasis on interviewing as the main approach, with audio and video interviews being recorded for subsequent analysis and dissemination where appropriate approval has been given.  Jing was the main recording system cited for this.  Surveys (which can be considered a kind of asynchronous interview) were also mentioned, with Survey Monkey being the tool of choice for this.

Less structured impressions were also sought, with Jing again being cited as a valuable tool for capturing staff and student feedback.  Twitter was also mentioned for this purpose.

Evidence analysis

The emphasis of this strand is on qualitative rather than quantitative outcomes, with users’ experiences, case studies and the development of guidance documents and staff development resources being the main focus.

Nvivo was cited as the tool of choice for the transcription and coding of audio and written feedback for subsequent analysis.  Collaborative writing, analysis and version control are the main concern for this part of the projects, and are being addressed through the use of Google Docs and SharePoint.

Standards referenced

The standards used by projects in this programme are fairly generic. Most of these projects are not using standards such as those produced by IMS, which were felt to be of limited relevance at this level of work. One project, however, was looking at IMS Learning Tools Interoperability (LTI) as an approach to integrating its software development with the different VLEs in use across its consortium. Beyond this, the standards referenced were unremarkable: primarily MP3 and HTML.
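To give a sense of what an LTI integration involves, here is a minimal sketch under stated assumptions: an LTI 1.1-style basic launch, where the VLE (tool consumer) sends an OAuth 1.0a HMAC-SHA1 signed form POST to the tool. The endpoint URL, key, secret and parameter values below are all invented; the parameter names follow the LTI 1.1 specification.

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def sign_lti_launch(url, params, consumer_secret, http_method="POST"):
    """Compute an OAuth 1.0a HMAC-SHA1 signature for an LTI 1.1 launch POST."""
    # 1. Normalise parameters: percent-encode keys and values, sort, join.
    encoded = sorted((quote(k, safe=""), quote(v, safe="")) for k, v in params.items())
    param_str = "&".join(f"{k}={v}" for k, v in encoded)
    # 2. Signature base string: METHOD&URL&PARAMS, each percent-encoded.
    base = "&".join([http_method, quote(url, safe=""), quote(param_str, safe="")])
    # 3. Sign with HMAC-SHA1; key is consumer_secret + "&" (no token secret in LTI).
    key = quote(consumer_secret, safe="") + "&"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

# Hypothetical launch parameters (values are made up for illustration).
launch = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "assess-101",
    "oauth_consumer_key": "demo-key",
    "oauth_nonce": "abc123",
    "oauth_timestamp": "1328000000",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
}
launch["oauth_signature"] = sign_lti_launch(
    "https://vle.example.edu/lti/launch", launch, "demo-secret"
)
```

The appeal for a consortium is that the tool only has to implement this one launch contract once, rather than a bespoke plugin for each institutional VLE.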

Dissemination

All the projects have thorough dissemination plans in place to ensure that their findings are shared as widely as possible.  It was great to see that all the projects referenced the JISC Design Studio, a fantastic resource that is well worth digging around in.  Overall there is a wide range of technologies being used to ensure that the findings from these projects reach as broad an audience as possible.  Again, there is a clear mix between established, proprietary software and free services, reflecting the range of technologies in use within institutions and the different institutional contexts of these projects.

Purpose and technology:

* Recording seminars: Panopto
* Publishing videos: YouTube
* Dissemination: JISC Design Studio; reports; guidance documents; peer reviewed publications; project website
* Workshops: Elluminate Live
* Dissemination and community building: Cloudworks
* Case studies
* Dissemination: Yammer
* Dissemination and community building: Twitter
* Dissemination: MS Office Communicator (now Lync)
* Dissemination: Google Docs
* Sharing stable versions: SharePoint
* Screen capture for staff development: Jing; Camtasia
* Toolkits
* Project blog: WordPress
* Conference attendance

The post Technologies in use in the JISC Assessment and Feedback programme Strand B (evidence and evaluation) appeared first on Cetis Blogs - expert commentary on educational technology.

Evaluating Electronic Voting Systems for Enhancing Student Experience
http://blogs.cetis.org.uk/rowin/2012/01/26/eevs/
Thu, 26 Jan 2012 14:26:09 +0000

The eighth project in Strand B (Evidence and Evaluation) of the JISC Assessment and Feedback Programme is Evaluating Electronic Voting Systems for Enhancing Student Experience (EEVS), based at the University of Hertfordshire.  This one year project is undertaking an extensive review of the use of electronic voting systems (EVS) in a range of schools across the institution, gathering testimony from both staff and students on their experiences, insights and identified issues and success factors.

Hertfordshire has invested substantially in assessment and feedback in recent years, with an extensive programme of innovations including the purchase of nearly four thousand EVS handsets for use in teaching in eight schools.  The initial response to their introduction, from both staff and students, has been very positive, with the system seen as improving both classroom interaction and staff and student workloads.

The EEVS project will produce a thorough longitudinal study of the impact of EVS, drawing on audio and video interviews and reflective writing over the course of the academic year. This long term view will enable the project team to examine key periods in the academic year, such as students’ initial encounters with the system, the perceived value and impact on exam performance of interactive revision lectures, technological issues around introduction in new classroom environments, and so on.

The project will produce a number of outputs, including valuable evidence for the sector on the impact of such large scale implementation, detailed guidance on the installation and deployment of EVS, subject-specific case studies, and a series of vox pop snapshots from teaching staff, students and support staff on their experiences of EVS. You can follow their progress on their project blog.

The post Evaluating Electronic Voting Systems for Enhancing Student Experience appeared first on Cetis Blogs - expert commentary on educational technology.

InterACT: modelling feedback flow
http://blogs.cetis.org.uk/rowin/2012/01/25/interact-modelling-feedback-flow/
Wed, 25 Jan 2012 12:13:23 +0000

The InterACT project at the University of Dundee, part of the JISC Assessment and Feedback Programme Strand A (institutional change) is working on enhancing feedback dialogue, reflection and feed-forward in a large postgraduate online distance learning course in medical education.

The course is unusual in that progress is heavily learner-driven: as students are working professionals, they can enrol and submit assignments at any time they choose rather than according to a predetermined course timetable. While this significantly increases the flexibility and accessibility of the course, the lack of external structure, together with the higher attrition rates noted in online distance learning in general, can impact on student progress and retention.

Assessment feedback has traditionally been offered at the end of each module of study, when assignments are submitted, which clearly limits the potential for reflection and learning from feedback.  The InterACT project will transform this model through the integration of technology to support a more dynamic, ongoing feed-forward process that actively encourages learners to reflect and act on feedback and builds dialogue between learners and tutors.

The project team have now released their first draft of their proposed new model of feedback, and are actively seeking comment from the wider community.  Dialogue between tutor and learner is focused around cover sheets appended to submitted work which encourage self-evaluation and reflection on assessment performance as well as making explicit the intention that past feedback should impact on future work.  The use of private blogs or wikis as a personal reflective space is intended to encourage this focus on the ongoing interplay of past and future performance.

Do get involved with the discussion, either via the blog post or through contacting the project team.

The post InterACT: modelling feedback flow appeared first on Cetis Blogs - expert commentary on educational technology.
