Rowin Young – Cetis Blog
http://blogs.cetis.org.uk/rowin

Games animals play
Tue, 21 Feb 2012
http://blogs.cetis.org.uk/rowin/2012/02/21/games-animals-play/

Play is an important part of animal development, as it is of child development: animals learn to hunt and fight just as children learn to perform tasks and socialise.  And as with humans, animal play isn’t just limited to learning for future survival, but is a valuable part of day-to-day wellbeing.  Providing adequate mental stimulation and engagement is particularly important for captive animals, confined in relatively small environments where normal behaviours such as hunting are severely limited, and with feeding and other activities subject to external schedules.

The TOUCH (Technology to Orangutans for Understanding and Communicating cross-species for greater Harmony) project, based at the Hong Kong Polytechnic University’s School of Design, is working on the design of digital systems to enable humans and orangutans to play games together – and, in particular, games where orangutans will almost certainly beat their human competitors.  Orangutans perform particularly well on games similar to pelmanism that rely on visual memory, and will almost invariably out-perform any human challenger.
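For anyone who hasn’t come across pelmanism, it’s the memory game in which cards are laid out face down and players turn over two at a time, keeping any matched pairs.  The toy Python sketch below simulates that basic mechanic for a single player with perfect recall – purely an illustration of the kind of game the orangutans excel at, and nothing to do with the TOUCH project’s own software.

```python
# Toy illustration of pelmanism (the pairs/memory game), not TOUCH project code.
import random


def play_pelmanism(pairs: int = 8, seed: int = 0) -> int:
    """Simulate a single player with perfect recall; return the number of turns taken."""
    rng = random.Random(seed)
    deck = list(range(pairs)) * 2
    rng.shuffle(deck)
    face_down = dict(enumerate(deck))  # position -> hidden card value
    seen = {}                          # perfect visual memory of cards already revealed
    turns = 0
    while face_down:
        turns += 1
        # If a matching pair has already been seen, take it; otherwise flip two cards.
        by_card = {}
        for pos, card in seen.items():
            by_card.setdefault(card, []).append(pos)
        known_pair = next((p for p in by_card.values() if len(p) == 2), None)
        a, b = known_pair if known_pair else rng.sample(sorted(face_down), 2)
        seen[a], seen[b] = face_down[a], face_down[b]
        if face_down[a] == face_down[b]:  # a matched pair leaves the table
            for p in (a, b):
                del face_down[p]
                del seen[p]
    return turns


print(play_pelmanism())  # number of turns a perfect-memory player needs for 8 pairs
```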

The Hong Kong orangutans aren’t the first to engage with computer games: Sumatran orangutans at Zoo Atlanta have been using them for several years as researchers attempt to understand their cognitive processes and so help plan interventions to increase the species’ chances of survival in the wild.  Where the TOUCH project differs is in treating games primarily as entertainment for non-humans, and as a focal point for enhancing cross-species communication and interaction.

In both projects, as in others, tangible rewards such as food or ‘social praise’ from their human playmates are used to help train the animals to play within the rules or framework of the game, but many are content to continue playing even without such rewards: game play itself is ‘inherently rewarding’ for them.  Playing within the rules, or consciously transgressing them, is fundamental to a ludological view of games: the construction of the fourth wall, the acceptance that you can only go up ladders and down snakes and never up snakes or down ladders, is what gives play structure and meaning.  YouTube is full of wonderful clips of all kinds of animals interacting with digital games, but not playing in the sense of following rules; how much pleasure they actually get from them is also debatable.

Engaging cats in digital games, either solo or with a human partner, is the focus of Cat Cat Revolution, which is exploring the development of iPad games to enable this.  The project’s video, below, shows somewhat mixed results, but it’s clear that the game captures the attention and curiosity of the cats, in particular the youngest kitten in the study.  Similarly, iPad Game for Cats, a free game with paid-for additional levels, clearly provides great entertainment for cats of all sizes.  Unlike in TOUCH, where many of the orangutans were very happy to play purely for praise and interaction, the extent of the engagement between feline and human participants isn’t clear: while it’s obvious that the humans are getting a great deal of pleasure from playing with and watching their pets, the cats seem interested purely in the game, with the human interaction being incidental (but then, they are cats ;) ).

[youtube]http://www.youtube.com/watch?v=t0ytTQZ5-Kc[/youtube]

These studies are fascinating.  Positioning animals as digital gamers, and as knowing participants within multiplayer, multi-species games, can teach us so much more about them, about ourselves, and about the nature and universals of play.  Most of all, improving the welfare of captive animals, and potentially increasing their ability to survive in the wild through skills learned in digital play, would be the greatest outcomes of all.

Of course, like kids everywhere, sometimes it’s not the game but the box it came in that provides the most entertainment ;)

Technologies in use in the JISC Assessment and Feedback programme Strand B (evidence and evaluation)
Thu, 09 Feb 2012
http://blogs.cetis.org.uk/rowin/2012/02/09/strand-b-synthesis/

The JISC Assessment and Feedback Programme is now in its fifth month, looking at a wide range of technological innovations around assessment and feedback in HE and FE.  Strand B is focused on the evaluation of earlier work, gathering and evaluating evidence on the impact of these innovations and producing guidelines and supporting material to facilitate their adoption in other subject areas and institutions.  These projects cover a broad range of technologies, but they are not themselves involved in technological development; their role is to examine and report on the impact of such developments.

The information here was gathered through fairly informal conversations with the projects, building on the information initially provided in their funding applications.  Information from these calls is added to our project database (PROD) – you can see some of the amazing uses this information can be put to in the series of blog posts Martin Hawksey has produced as part of his work on visualisations of the OER programme, as well as some of the work by my colleagues David, Sheila and Wilbert.

This blog post is rather less ambitious than their work (!), and is intended to provide a quick snapshot of technologies that projects in this specific programme strand are finding valuable for their work.  For more information on the projects in general you can find all my posts on this programme linked from here.

Underlying technologies

Although the underlying technologies – that is, the technologies used by the innovations they’re evaluating – aren’t the direct focus of these projects, I’ve included them as they’re obviously of interest.  They also show the very broad range of approaches and methods being evaluated across the strand.

Several of the projects expressed a strong desire to reuse existing tools and resources such as MS Office and other commercial software solutions, rather than reinvent the wheel by developing new software; there were also compelling practical considerations around the cost of staff training for new systems, staff familiarity and comfort with existing systems, and strong pressure from staff, students and management to work within institutional VLEs.

Purpose: Technology

- Feedback delivery: MS Word (annotated markup documents); eTMA (electronic tutor marked assignment) system
- Assignment timetables (diaries): MS Access
- VLE: Moodle; Blackboard
- Online marking: GradeMark
- Student generation of assessment content, with social network functionality: PeerWise
- Plagiarism detection: Turnitin; Blackboard SafeAssign
- Bug reporting: Pivotal Tracker
- Improving the online marking process: surface tables; pen devices
- Managing self-reflection workflow: eReflect; online learning diary
- Automated evaluation of writing technique: Turnitin eRater
- Communication with students, course news, deadline reminders: Facebook
- Peer assessment: PeerMark
- Centralised email account, blog and microblog for managing assignment submissions and communicating with students and staff: TQFE-Tutor
- Communication with students: Twitter
- Blog for discussion of common Q&As and general assignment feedback: WordPress
- Webinars: Adobe Connect
- Electronic voting: EVS

Evidence gathering

As these projects are about collecting and evaluating evidence, the approaches taken to this are of obvious interest.

There was a strong emphasis on interviewing, with audio and video interviews recorded for subsequent analysis and dissemination where appropriate consent had been given.  Jing was the main recording tool cited for this.  Surveys (which can be considered a kind of asynchronous interview) were also mentioned, with Survey Monkey being the tool of choice.

Less structured impressions were also sought, with Jing again being cited as a valuable tool for capturing staff and student feedback.  Twitter was also mentioned for this purpose.

Evidence analysis

The emphasis of this strand is on qualitative rather than quantitative outcomes, with users’ experiences, case studies and the development of guidance documents and staff development resources being the main focus.

NVivo was cited as the tool of choice for the transcription and coding of audio and written feedback for subsequent analysis.  Collaborative writing, analysis and version control are the main concerns for this part of the projects, and are being addressed through the use of Google Docs and SharePoint.

Standards referenced

The standards used by projects in this programme are fairly generic.  For the most part the projects aren’t using specifications such as those produced by IMS, which were felt not to be directly relevant to this level of work, although one project was looking at IMS Learning Tools Interoperability (LTI) as an approach to integrating its software development with the different VLEs in use across its consortium.  Beyond this, the standards referenced were unremarkable: primarily MP3 and HTML.
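For readers unfamiliar with the specification, a basic LTI 1.x launch is essentially a form POST from the VLE (the ‘tool consumer’) to the external tool, signed with OAuth 1.0a HMAC-SHA1 using a key and secret shared between the two.  The Python below is a simplified, illustrative sketch of that signing step only – the endpoint URL, key, secret and identifiers are placeholders, not details taken from any of the projects.

```python
# Illustrative sketch of signing a basic LTI 1.x launch (OAuth 1.0a, HMAC-SHA1).
# The URL, key, secret and identifiers below are hypothetical placeholders.
import base64
import hashlib
import hmac
import time
import uuid
from urllib.parse import quote

LAUNCH_URL = "https://tool.example.ac.uk/lti/launch"  # hypothetical tool endpoint
CONSUMER_KEY = "vle-consumer-key"                     # hypothetical shared key
CONSUMER_SECRET = "shared-secret"                     # hypothetical shared secret


def percent_encode(value: str) -> str:
    # OAuth 1.0a requires strict RFC 3986 percent-encoding
    return quote(str(value), safe="~")


def sign_launch(params: dict) -> dict:
    """Return the LTI launch parameters with an oauth_signature added."""
    oauth_params = {
        "oauth_consumer_key": CONSUMER_KEY,
        "oauth_nonce": uuid.uuid4().hex,
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_timestamp": str(int(time.time())),
        "oauth_version": "1.0",
    }
    all_params = {**params, **oauth_params}
    # Build the OAuth signature base string from the sorted, encoded parameters.
    pairs = sorted((percent_encode(k), percent_encode(v)) for k, v in all_params.items())
    param_string = "&".join(f"{k}={v}" for k, v in pairs)
    base_string = "&".join(["POST", percent_encode(LAUNCH_URL), percent_encode(param_string)])
    signing_key = f"{percent_encode(CONSUMER_SECRET)}&"  # no token secret in a basic launch
    digest = hmac.new(signing_key.encode(), base_string.encode(), hashlib.sha1).digest()
    all_params["oauth_signature"] = base64.b64encode(digest).decode()
    return all_params  # the VLE POSTs these as form fields to LAUNCH_URL


launch = sign_launch({
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "assignment-42",  # hypothetical identifiers
    "user_id": "student-123",
    "roles": "Learner",
})
print(launch["oauth_signature"])
```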

Dissemination

All the projects have thorough dissemination plans in place to ensure that their findings are shared as widely as possible.  It was great to see that all the projects referenced the JISC Design Studio, a fantastic resource that is well worth digging around in.  Overall there is a wide range of technologies being used to ensure that the findings from these projects reach as broad an audience as possible.  Again, there is a clear mix between established, proprietary software and free services, reflecting the range of technologies in use within institutions and the different institutional contexts of these projects.

Purpose: Technology

- Recording seminars: Panopto
- Publishing videos: YouTube
- Dissemination: JISC Design Studio; reports; guidance documents; peer-reviewed publications; project websites
- Workshops: Elluminate Live
- Dissemination and community building: Cloudworks; case studies
- Dissemination: Yammer
- Dissemination and community building: Twitter
- Dissemination: MS Office Communicator (now Lync)
- Dissemination: Google Docs
- Sharing stable versions: SharePoint
- Screen capture for staff development: Jing; Camtasia
- Toolkits
- Project blog: WordPress
- Conference attendance

Evaluating Electronic Voting Systems for Enhancing Student Experience
Thu, 26 Jan 2012
http://blogs.cetis.org.uk/rowin/2012/01/26/eevs/

The eighth project in Strand B (Evidence and Evaluation) of the JISC Assessment and Feedback Programme is Evaluating Electronic Voting Systems for Enhancing Student Experience (EEVS), based at the University of Hertfordshire.  This one year project is undertaking an extensive review of the use of electronic voting systems (EVS) in a range of schools across the institution, gathering testimony from both staff and students on their experiences, insights and identified issues and success factors.

Hertfordshire has invested substantially in assessment and feedback in recent years, with an extensive programme of innovations including the purchase of nearly four thousand EVS handsets for use in teaching across eight schools.  The initial response to their introduction, from both staff and students, has been very positive, with the system seen as improving classroom interaction and easing both staff and student workloads.

The EEVS project will produce a thorough longitudinal study of the impact of EVS, drawing on audio and video interviews and reflective writing gathered over the course of the academic year.  This long term view will enable the project team to examine key periods in the academic year, such as students’ initial encounters with the system, the perceived value and impact on exam performance of interactive revision lectures, technological issues around introduction in new classroom environments, and so on.

The project will produce a number of outputs, including valuable evidence to the sector on the impact of such large scale implementation, detailed guidance on the installation and deployment of EVS, subject-specific case studies, and a series of vox pop snapshots from teaching staff, students and support staff on their experiences of EVS.  You can follow their progress on their project blog.

Deterrents don’t deter?
Thu, 26 Jan 2012
http://blogs.cetis.org.uk/rowin/2012/01/26/deterrents-dont-deter/

A recent article in THES reports on research by Robert J. Youmans at California State University Northridge that found that

Students who are aware that their work will be checked by plagiarism-detection software are just as likely to cheat as those who are not.

Conventional wisdom – and intuition – suggests that the threat of discovery, and subsequent punishment, is an effective deterrent against plagiarism – indeed, one of the comments on the article points to another study that suggested that students’ awareness of the use of Turnitin on a course significantly reduced plagiarism.

It’s not always clear whether plagiarism is an intentional and cynical attempt to deceive, the result of bad time management and poor writing or referencing skills, or due to a genuine lack of understanding of the concept of plagiarism or differing cultural norms around it.  Students in the first category are the most likely to turn to essay mills as a safer alternative where it’s made clear that plagiarism detection is in use, which suggests that the majority of students ‘caught’ by Turnitin and other text matching techniques, when their use is advertised as a supposed deterrent, are those whose main problem is not a desire to cheat but academic or personal factors.

Findings like this seem to strengthen the arguments in favour of using Turnitin formatively, as part of a student’s academic development and the essay writing process, rather than as a way of detecting problems once it’s too late to do anything about them and the student has entered the disciplinary process.  The use of plagiarism detection only after submission seems to be based on the assumption that plagiarism only occurs through a deliberate desire to cheat, and as I’ve argued before, positions all students as potential cheats rather than as developing academics who may be in need of guidance and support to achieve their potential.

InterACT: modelling feedback flow
Wed, 25 Jan 2012
http://blogs.cetis.org.uk/rowin/2012/01/25/interact-modelling-feedback-flow/

The InterACT project at the University of Dundee, part of the JISC Assessment and Feedback Programme Strand A (institutional change), is working on enhancing feedback dialogue, reflection and feed-forward in a large postgraduate online distance learning course in medical education.

The course is unusual in that progress is heavily learner-driven: as students are working professionals, they are able to enrol and submit assignments at any time they choose rather than according to a predetermined course timetable.  While this significantly increases the flexibility and accessibility of the course, the lack of external structure, together with the higher attrition rates seen in online distance learning in general, can impact on student progress and retention.

Assessment feedback has traditionally been offered at the end of each module of study, when assignments are submitted, which clearly limits the potential for reflection and learning from feedback.  The InterACT project will transform this model through the integration of technology to support a more dynamic, ongoing feed-forward process that actively encourages learners to reflect and act on feedback and builds dialogue between learners and tutors.

The project team have now released the first draft of their proposed new model of feedback, and are actively seeking comment from the wider community.  Dialogue between tutor and learner centres on cover sheets appended to submitted work, which encourage self-evaluation and reflection on assessment performance as well as making explicit the intention that past feedback should influence future work.  The use of private blogs or wikis as a personal reflective space is intended to encourage this focus on the ongoing interplay of past and future performance.

Do get involved with the discussion, either via the blog post or through contacting the project team.

eAssessment Scotland 2012 call for posters, presentations and workshops
Tue, 24 Jan 2012
http://blogs.cetis.org.uk/rowin/2012/01/24/eassessment-scotland-2012-cfp/

The call for posters, presentations and workshops for eAssessment Scotland 2012 – Feeding Back, Forming the Future is now available.  Proposals should be submitted by 1 May.

This annual conference has become a valuable part of the eassessment calendar, as can be seen by the rich and varied content in the archive of past events.  The conference is being held on 31 August at the University of Dundee, with an online conference running from 23 August – 6 September.  The conference will also host the annual Scottish eAssessment Awards, for which submissions open in early March.  Registration for both events will open in mid-March.

2011: a CETIS year in blogging
Tue, 17 Jan 2012
http://blogs.cetis.org.uk/rowin/2012/01/17/2011-cetis-review/

If you subscribe to any of our CETIS mailing lists you’ll probably be aware that each month I send out a newsletter summarising our blog posts and news stories over the previous month, as well as information on our publications, events and sector funding opportunities.  As part of this I always include a Top Five posts section, highlighting the five most popular posts of the month – a really interesting look at what our audiences are actually interested in.  So with the new year now firmly under way, it seemed like the ideal time to take a look back at what you enjoyed reading – and we enjoyed writing – in 2011…

What you liked reading

The top 20 most read posts of 2011 were:

  1. UKOER 2: Dissemination protocols in use and Jorum representation (26 August 2011) John Robertson
  2. Mobile Web Apps: a briefing paper (2 March 2011) Mark Power
  3. A TAACCCTful mandate? OER, SCORM and the $2bn grant (25 January 2011) Lorna Campbell
  4. Weak Signals and Text Mining II – Text Mining Background and Application Ideas (12 May 2011) Adam Cooper
  5. W3C Opens UK & Ireland Office (19 April 2011) Mark Power
  6. Analysis and structure of competence (4 January 2011) Simon Grant
  7. British Standards in ICT for Learning Education and Training – What of it? (24 January 2011) Adam Cooper
  8. Playing with canvas and webgl (21 April 2011) David Sherlock
  9. eBooks in Education – Looking at Trends (10 March 2011) Adam Cooper
  10. Google custom search for UKOER (20 January 2011) Phil Barker
  11. JISC CETIS OER Technical Interest Group (6 January 2011 ) Lorna Campbell
  12. ÜberStudent, Edubuntu – A sign of what is to come? (8 February 2011) Adam Cooper
  13. JISC CETIS OER Technical Mini Projects Call (2 March 2011) Phil Barker
  14. Crib sheet for 2011 Educause Horizon Report (9 February 2011) Sheila MacNeill
  15. Weak Signals and Text Mining I – An Introduction to Weak Signals (12 May 2011) Adam Cooper
  16. From Design to implementation – DVLE programme Strand A Showcase (31 January 2011) Sheila MacNeill
  17. Considering OAI-PMH (21 January 2011) John Robertson
  18. The Learning Registry: “Social Networking for Metadata” (22 March 2011) Dan Rehak (othervoices)
  19. Using video to capture reflection and evidence (17 March 2011) Sheila MacNeill
  20. Google Apps for Education UK User Group (16 February 2011) Sheila MacNeill

This information was generated by AWStats for our blogs.cetis.org.uk domain, although we’ve recently begun using Google Analytics for tracking, as discussed in David’s excellent post on developing a web analytics strategy for a distributed organisation such as CETIS.

The majority of the most popular posts are from the early part of 2011.  While this is unsurprising – the longer a post has been up, the more chance there is for people to find it – it’s also quite reassuring that the information we’re posting is still relevant and of interest to people after its original appearance!

As stats are collected only for our self-hosted blogs, those that are hosted elsewhere are unfortunately missing.  We don’t have any figures for Scott Wilson’s excellent blog which is always very well worth a read, or for Mark Power’s after May 2011 when he moved to his own domain – again, very well worth keeping track of.

What we enjoyed writing

While some of our posts are obviously of wider interest to our community than others, it’s not necessarily the ones with the most hits that are our personal favourites of the year.  I asked my colleagues which was their favourite story they blogged in 2011 and why…

Adam Cooper: Mine is Preparing for a Thaw – Seven Questions to Make Sense of the Future because I wrote it in the car park of Leeds University and because I enjoyed doing the visualisation which is still hidden in the comments (DOH!)

Christina Smart: My favourite post was Business Adopts Archi Modelling Tool which was an interview with Phil Beauvoir. I’ve done a number of interviews this year, and I always enjoy an excuse to chat to people who are so enthusiastic about what they do. I’ve picked this one because Archi had a great year last year, and to see a JISC funded tool gaining traction outside HE is quite rare, and clearly something to celebrate. (although no zombies)

David Sherlock: I’m not very good at writing and find blogging quite stressful so I’d say my favourite links are more to do with the things I was playing with that I found interesting behind the scenes rather than the writing bit.  I’m going to go with Playing with canvas and webgl because I found using the canvas element to draw shapes was fun, it reminded me of when coding was fun on my Commodore 64.

Li Yuan: Big Data and analytics in education and learning.  “Big Data” and “analytics” is one of the topics that the JISC observatory working group have agreed to further investigate and look at since they are being applied to all sectors, including government, health, business, etc. This blog post was just an introduction to the concept of Big Data and the implications in teaching, learning and administration in institutions, many aspects are worth further exploring, such as technical, pedagogical and organisational issues in relation to application of big data and analytics in education.

Lisa Corley: Well, i’m not a prolific blogger, but would probably choose What’s in a Word(le)? Lifelong Learning and Work Based Learner experiences… mainly because it brought together lots of work in the programme I had just finished supporting, and after reading all the final reports and summarising them it felt useful to have the summaries in the public domain rather than in some report hidden away somewhere. I also really liked doing the ‘visualisations’ as I think it helps to look at the information in a different way.

Lorna Campbell: Suppose it would have to be A TAACCCTful mandate? OER, SCORM and the $2bn grant which was an attempt to cut through the crap and provide a rational summary of a rather overheated situation.  *cough* It also happened to be nominated for the second annual “Downes Prize”.

John Robertson: Hmm, I have to admit that some of the posts from last year which I like the best are ones that never quite got properly started or finished. Perhaps partially because of that and partially because it was a “throw away” response to a tweet which ended up drawing together and developing some of my thinking about open ed.  There’s lots about it that I think is imperfect (e.g. using the word “manifesto”) but it got some things right and there was a certain serendipity to its creation which makes me smile: An OER manifesto in twenty minutes.

Phil Barker: Modern Art of Metadata.  Unexpected interest during a meeting of the advisory group of the Resource Discovery Task Force Vision Implementation Plan Management Framework (I kid you not).

Rowin Young: My favourite post is my look at the excitement that surrounded the Mozilla Open Badges Initiative after the announcement of a substantial prize fund for developments, Badges, identity and the $2million prize fund.  It touched on a number of areas that are of particular interest to me, including gaming achievement systems as both motivators and exploiters and the increasing trend for using elements from gaming in other contexts, identity management in both the technical and social aspects, assessment and accreditation.  Writing the post provided me with an opportunity to work out a lot of my thinking around the topic, and I really enjoyed working on it.

Scott Wilson: My personal fave is this one: Converting Chrome Installed Web Apps into W3C Widgets.  Not because it’s that great a post, but because of all the chaos that ensued at W3C and elsewhere. This got picked up by Opera, who used it to publicly berate Google and Mozilla about supporting open standards, which drew in Microsoft, and before the end of the month even Adobe had joined in. It actually led more or less directly to the “future of offline web apps” event which was a huge success, and so there may even be a positive outcome.

Sharon Perry: Although I’m not a prolific blogger (only 3 posts in 2011!), I did like the story about using crowdsourcing to highlight and help companies repair inaccessible websites (Crowdsourcing to Fix the Web). I think crowdsourcing is becoming a very important part of social interaction on the web.  Not only can it help solve larger problems by developing micro-solutions but it encourages people to interact and engage with the area concerned.  There is often no financial advantage for those who take part, but the pay-off is perhaps more intangible, i.e. a person who provides such support or help may in turn get that “feel good factor” and a greater sense of well-being for being involved in the greater good.  I suppose I’m also highlighting this story again because the “Fix the Web” cause is now running out of funds and is struggling to survive.  I hope it gets the funds it needs to continue and that it may act as an example to other social enterprises.  Long may crowdsourcing continue!

Sheila MacNeill: My favourite post of last year was called Betweenness Centrality – helping us understand our networks. There are a couple of reasons I’ve picked it. Firstly, what started out as a serendipitous twitter conversation introduced me to a new concept (betweenness centrality) which I was able to reflect on in terms of CETIS and its networks.  It also helped me to begin to consolidate some thoughts around SNA (social network analysis) and, in relation to CETIS, how we can visualise, share, understand and build our networks.  Over the past year I’ve been experimenting with Storify as a way to re-publish tweets into coherent stories, and this post allowed me to combine this technique within a more contextualised post.  And finally, the original conversation helped brighten up quite a dull bank holiday Monday, and legitimately referencing zombies in a work related post was just too hard to resist.

Simon Grant: I nominate Grasping the future (which I had completely forgotten about) for several reasons.  First, it wasn’t something I was thinking about self-consciously and deliberately, but thoughts that came to me from interaction with other people in IEC. I think often that’s the best tradition in blogging: something that would probably not see the light of day were it not for a convenient public platform. Second, because the comments it attracted are really interesting and stimulating in their own right. And third, because re-reading it makes me think, yes, there is something there that I or we really should take forward, something waiting to grasp in the future.

Many thanks to all my colleagues for their contributions to this post, and to all our readers for engaging with, commenting on and sharing what we write – here’s to a great 2012!

The e-Feedback Evaluation Project
Thu, 15 Dec 2011
http://blogs.cetis.org.uk/rowin/2011/12/15/efep/

Assessment of language learning naturally presents some unique challenges for both teaching staff and learners.  Regular practice of both spoken and written language production is a vital part of language training, and requires a significant amount of ongoing feedback to support the acquisition of competence in the subject.  In a distance learning context in particular, but similarly in any setting where feedback is provided asynchronously rather than face-to-face, providing meaningful feedback on spoken work is especially challenging, often requiring spoken feedback to correct errors of pronunciation and structure.

There have been a number of exciting projects around audio feedback in recent years, including the Optimising Audio Feedback project at Aberystwyth University, Sounds Good at Leeds (both funded by JISC) and Audio Supported Enhanced Learning, a collaboration between the Universities of Bradford and Hertfordshire.  The focus of the eFeedback Evaluation Project (eFEP), however, is the impact of the combination of both spoken and written feedback on language learning.

The eFEP project is led by the Department of Languages at The Open University, an institution with unique experience in providing language training through distance learning, a large part of which involves teaching through both formative and summative assessment and feedback.  The OU has a mature and robust eTMA (electronic tutor marked assignment) system which supports assessment across the institution, and provides feedback either via MP3 files or marked-up MS Word documents, as appropriate for the individual assignment.  Each form of feedback is supplemented with an HTML form (an example of which can be seen on the poster submitted by the project to the programme’s startup meeting) containing administrative information, marks awarded and additional feedback.

The project will examine the ways in which students and tutors interact and engage with their feedback, identify common perceptions and issues, and recommend areas requiring further support and guidelines for good practice.  In order to examine the applicability of this feedback approach in traditional settings, the project will also look at the impact of audio feedback in Italian modules at the University of Manchester.

The insight into the use of audio feedback across a variety of environments, and the range of training and support materials to be produced, should make eFEP a valuable addition to our understanding of audio feedback, as well as offering clear practical guidance to those considering adopting it.

Online Coursework Management Evaluation
Thu, 15 Dec 2011
http://blogs.cetis.org.uk/rowin/2011/12/15/ocme/

The University of Exeter has developed an entirely online, end-to-end coursework management system which is the subject of the Online Coursework Management Evaluation (OCME) project, funded by JISC as part of the Assessment and Feedback programme Strand B.

This system sees the integration of Moodle and Turnitin within the university’s Exeter Learning Environment (ELE).  Assignments are submitted through the ELE, assigned an originality score by Turnitin, and then made available for marking through GradeMark (a commercial online marking system within Turnitin) or MS Word markup.  Feedback is returned to students either via uploaded forms or bespoke feedback forms, and is made available for viewing by both individual students and the personal tutor assigned to support them.  Initially deployed through a small 2011 pilot project funded by HEFCE, the system is now available institution-wide, although for practical reasons this evaluation project will concentrate on working with smaller groups across various disciplines.
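Purely as an illustration of that workflow (and emphatically not Exeter’s actual code), the Python sketch below models the stages a submission passes through; all of the class, field and identifier names are hypothetical.

```python
# Hypothetical model of the end-to-end coursework workflow described above.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class MarkingRoute(Enum):
    GRADEMARK = "GradeMark (online marking within Turnitin)"
    WORD_MARKUP = "MS Word markup"


@dataclass
class Submission:
    student_id: str
    assignment_id: str
    originality_score: Optional[int] = None      # assigned by Turnitin on submission
    marking_route: Optional[MarkingRoute] = None
    feedback: Optional[str] = None               # uploaded or bespoke feedback form
    visible_to: List[str] = field(default_factory=list)


def run_workflow(sub: Submission, score: int, route: MarkingRoute, feedback: str) -> Submission:
    """Walk one submission through the stages described above."""
    sub.originality_score = score        # 1. originality check on submission
    sub.marking_route = route            # 2. marking via GradeMark or Word markup
    sub.feedback = feedback              # 3. feedback returned through the VLE
    # 4. released to the student and the personal tutor assigned to support them
    sub.visible_to = [sub.student_id, f"tutor-of-{sub.student_id}"]
    return sub


marked = run_workflow(Submission("s123", "essay-1"), 12, MarkingRoute.GRADEMARK, "feedback-form.docx")
print(marked.visible_to)
```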

Exeter’s Moodle support is provided by the University of London Computer Centre, who are developing the interface between Moodle and Turnitin.  There is strong internal support for the system, which will be maintained and further developed well beyond the lifetime of this one year project.  What the OCME project will provide is a series of reports and briefing papers exploring the pedagogic, technological and institutional aspects of transforming practice, along with guidelines for future implementers and for those considering introducing such transformative technologies within their own institutions.  The experiences and lessons learned from this project should be of value across the sector.

Evaluating the Benefits of Electronic Assessment Management
Wed, 14 Dec 2011
http://blogs.cetis.org.uk/rowin/2011/12/14/ebea/

Examining the embedding of electronic assessment management (EAM) within both administrative and teaching and learning practice is the main focus of the Evaluating the Benefits of Electronic Assessment Management (EBEAM) project, running at the University of Huddersfield as part of the JISC Assessment and Feedback programme Strand B.  This 18 month project will look at how Turnitin, incorporating GradeMark and eRater, addresses student, staff and institutional requirements for timely, individualised and focused feedback; reduced staff workloads and increased reflection on practice; and cost-effective, scalable and sustainable innovation.

The dual focus on administrative and pedagogic aspects is crucial for real uptake of any new technology or process.  By providing a supportive administrative and technological infrastructure, institutions can enable academic staff to fully realise the benefits of innovative systems and practice, and provide a significantly enhanced learning environment for students.  The dynamic interplay of these factors is vividly illustrated in the poster the project submitted for the programme kick off meeting.  The impact on student satisfaction, achievement and retention rates already apparent at Huddersfield reflects the success of such an approach.

Like the Evaluation of Assessment Diaries and GradeMark at the University of Glamorgan project, EBEAM is grounded in previous evaluation work investigating the benefits of Turnitin for staff and students.  As with other projects, the decision to adopt existing technologies incorporated through the institutional VLE (in this case, Blackboard) is a pragmatic one, building on known and proven technology rather than expending time and resources on developing yet more tools to do the same things.  Being able to pick up such tools as needed greatly increases institutional agility, and provides ready access to existing user groups and a wealth of shared practice.

EBEAM project staff also have a keen awareness of the need for meaningful and effective staff development to enable teaching staff to make full use of new technologies and achieve the integration of new approaches within their teaching practice, a theme covered in several posts on their excellent project blog.  The project will produce a wide range of development materials, including practically-focused toolkits, webinars and screencasts, which will be available through the project site and the JISC Design Studio.  In addition, they’re looking at ways of fully exploiting the extensive amount of data generated by these EAM systems to further enhance teaching and learning support as well as engaging administrative departments in discussions on topics such as data warehousing and change management.

The EBEAM project should provide an excellent study in the benefits of eassessment and of methods of integration that take a holistic approach to institutions and stakeholders.  I’m very much looking forward to seeing the outcomes of their work.
