Category Archives: Assessment
A new website from the LearnHigher Centre for Excellence in Teaching and Learning is now available, making resources on a range of topics freely available to support students’ learning development. LearnHigher is a collaboration between sixteen UK universities, led by Liverpool Hope, each with responsibility for a particular learning area; the University of York, for example, coordinates work on assessment. There are still some tweaks that I hope to see in the site over time, such as changes to the rather unhelpful descriptions attached to search results, but the use of Creative Commons licensing and the team’s clear commitment to the project suggest that this should be a useful resource for all involved in learning and teaching.
CAA Conference 2008 proceedings now available
Proceedings from this year’s CAA Conference are now available at the event archive. As always, this makes the latest research and knowledge on eassessment freely available to the wider community and will be an invaluable resource for all working in this area. Enjoy!
UCISA Technology Enhanced Learning survey outcomes
The Universities and Colleges Information Systems Association (UCISA) have just released a report on the outcomes of the 2008 survey on the use of technology enhanced learning in the UK Further and Higher Education sectors.
There’s a wealth of valuable information here, particularly when viewed in the context of the surveys conducted in 2001, 2003 and 2005. As in the 2008 Horizon Report, streaming media, mobile computing, podcasting and other Web 2.0 activities are identified as major emerging support priorities, with use of eassessment, eportfolios, blogs and wikis surprisingly prevalent.
The statistics for VLE use within institutions are of particular interest. In response to the question ‘what VLE, if any, is currently used in your institution?’, Moodle has a clear lead over second-placed Blackboard; when it comes to enterprise-wide adoption, however, Moodle falls to a distant third behind Blackboard and WebCT. This suggests that while community-based open source solutions effectively meet pedagogic needs at departmental level, institutional management may prefer the apparent security of lock-in to traditional vendor-and-client models.
Eassessment systems are also addressed. Eassessment tools are the most common centrally supported systems used by students, followed by blogs and podcasting, with eportfolios in fourth place. Respondents identified Blackboard as the most commonly used eassessment system, followed by Questionmark Perception, but this result should be viewed with caution: Blackboard was also identified as the most commonly used blogging tool despite the fact ‘that BlackBoard does not have a hosted blogging tool’. The report suggests that respondents may be confusing the VLE itself with third-party plugins such as Learning Objects, which does at least suggest a seamless user experience.
There’s a great deal more intriguing information in the report, including full data from the 2008 survey and a longitudinal analysis across the 2001, 2003, 2005 and 2008 surveys.
When is a fail not a fail?
The UCU is reporting on an employment tribunal at Bournemouth University over a dispute relating to the marking of examination scripts.
The case focused on the remarking of papers of fourteen students who had failed both their original examination in archaeology and the subsequent resit. The marks awarded by the complainant, Dr Paul Buckland, were confirmed in both diets by internal double marking and approved by the course’s exam board; nevertheless, the papers were subsequently remarked after intervention by the programme leader, with the result that a number of the previously failing students gained borderline marks that could give them an overall pass and allow them to remain on the course. The revised marks were accepted by the chair of the exam board on behalf of the board.
Is this situation, as the UCU representatives’ comments suggest, a consequence of the ‘marketisation of higher education’ and emerging ‘consumerist attitudes to degrees’, or simply the result of inevitable subjectivity in the marking process?
Testing the value of SATs
This article in The Progressive, One Teacher’s Cry: Why I Hate No Child Left Behind by Susan J Hobart, explores one teacher’s frustrations with an education system dominated by SATs, political policy and league tables. Although based on Hobart’s experiences within the American education system, the world of doublethink (‘teach to the tests we don’t teach to’) and inflexible policy that hampers the very children it’s supposed to benefit will be familiar to many.
Hobart’s concerns echo those raised by Mike Baker in the wake of the SATs chaos in England. Both Baker and Hobart highlight the way in which pressure on schools to achieve good results can narrow the range of content taught in those areas subject to assessment, specifically English, maths and science, with classroom time diverted from teaching subject matter to teaching children how to pass tests and accurately colour in the circles on multiple-choice test papers.
Tests obviously have a vital place within the education system, and a particular importance at the end of primary education, where they provide a record of a child’s competencies so that their abilities and needs can be catered for as effectively as possible at secondary school. In a culture where test results have as much to do with league tables and house prices as with the benefit to the pupil, however, it’s hard to see who the real winners are.
UKCDR assessment tools survey outcomes
BTL recently announced that their Surpass Suite of eassessment tools came out top of a survey conducted by the UK Collaboration for a Digital Repository for High Stakes Assessments (UKCDR) project.
UKCDR, which ran from June 2005 to May 2007, was funded by JISC to create a methodology for developing a UK-wide high stakes assessment infrastructure. The project was based at the University of Manchester, with partners from a range of institutions, and one of its outputs is a needs calculator which allows users to specify their requirements for an assessment system and identifies appropriate software for their needs. There are two sets of results, one based on vendor self-appraisal and the other on the UKCDR evaluation; selecting ‘tick all’ and viewing the results reveals some dramatic differences between the vendor and UKCDR evaluations. The UKCDR results show BTL as providing the most extensive functionality, with Questionmark and Thomson Prometric a good second and third respectively.
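For anyone curious about what a calculator like this is doing behind the scenes, it essentially matches the requirements a user ticks against each system’s claimed or evaluated feature set and ranks the candidates accordingly. Here’s a minimal sketch of that idea in Python; the feature names, products and scores are invented for illustration and are not taken from the UKCDR data.

```python
# A toy sketch of the kind of matching a needs calculator performs: the user
# ticks the features they require and each candidate system is ranked by how
# many of those features it offers. Feature names and products are invented
# for illustration; they are not the UKCDR criteria or results.

REQUIREMENTS = {"item banking", "adaptive testing", "offline delivery"}

# Two capability matrices could be held side by side, one from vendor
# self-appraisal and one from independent evaluation.
CAPABILITIES = {
    "System A": {"item banking", "adaptive testing", "offline delivery"},
    "System B": {"item banking", "offline delivery"},
    "System C": {"item banking"},
}


def rank_systems(requirements, capabilities):
    """Order systems by how many of the requested features they support."""
    scores = {
        name: len(requirements & features)
        for name, features in capabilities.items()
    }
    return sorted(scores.items(), key=lambda pair: pair[1], reverse=True)


if __name__ == "__main__":
    for name, score in rank_systems(REQUIREMENTS, CAPABILITIES):
        print(f"{name}: meets {score} of {len(REQUIREMENTS)} requirements")
```

Comparing the ranking produced from a vendor self-appraisal matrix with the one produced from an evaluators’ matrix is, in essence, where the dramatic differences mentioned above show up.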
There are also a lot of useful resources on the UKCDR site, including use cases, presentations and an item bank survey, which are all well worth exploring.
First IMS European TestFest to come to UK
Developers of tools that implement IMS specifications will have an opportunity to participate in a formal IMS TestFest as part of the next IMS Quarterly Meeting, being held in Birmingham on 15-18 September. Seven QTI v2.1 tools and products have already signed up for the event on Tuesday 16th September, and will also be covered in the open session on QTI the day before: RM, ASDEL, JAssess, QTI Constraints Editor, AQuRate, Onyx and the QTI Migration Tool.
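For those less familiar with QTI, the interoperability being tested here boils down to every tool reading and writing the same XML item format. The sketch below is purely illustrative and hand-written rather than taken from any of the products above: it shows a simplified QTI v2.1-style multiple-choice item and a few lines of Python that pull out its prompt, options and keyed answer.

```python
# A simplified, illustrative QTI v2.1-style multiple-choice item and a tiny
# reader for it. The fragment is hand-rolled for this example and omits much
# of what a real item (and a real tool such as those above) would include.
import xml.etree.ElementTree as ET

QTI_NS = "http://www.imsglobal.org/xsd/imsqti_v2p1"

ITEM_XML = f"""
<assessmentItem xmlns="{QTI_NS}" identifier="example01"
                title="Example item" adaptive="false" timeDependent="false">
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse><value>B</value></correctResponse>
  </responseDeclaration>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="false" maxChoices="1">
      <prompt>Which body publishes the QTI specification?</prompt>
      <simpleChoice identifier="A">W3C</simpleChoice>
      <simpleChoice identifier="B">IMS Global Learning Consortium</simpleChoice>
      <simpleChoice identifier="C">ISO</simpleChoice>
    </choiceInteraction>
  </itemBody>
</assessmentItem>
"""


def describe_item(xml_text: str) -> None:
    """Print the prompt, choices and keyed answer of a simple choice item."""
    ns = {"qti": QTI_NS}
    item = ET.fromstring(xml_text)
    prompt = item.find(".//qti:choiceInteraction/qti:prompt", ns)
    print("Prompt:", prompt.text)
    for choice in item.findall(".//qti:simpleChoice", ns):
        print(f"  {choice.get('identifier')}: {choice.text}")
    correct = item.find(".//qti:correctResponse/qti:value", ns)
    print("Keyed answer:", correct.text)


if __name__ == "__main__":
    describe_item(ITEM_XML)
```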
Amongst other items of interest on Wednesday 17th is the final of the regional Learning Impact competition – entries for which must be in by 25 August, so there’s still time to get your application in! The meeting closes on the 18th with a summit on Interoperability Now and Next, covering a broad range of issues and featuring a number of high profile speakers.
European research on eassessment
Spotted via René Meijer’s excellent blog is a new publication on eassessment from the EC’s Joint Research Centre. Towards a Research Agenda on Computer-Based Assessment is one of the outcomes of an expert workshop held last November in Italy to explore innovative approaches to large-scale skills assessment.
The fourteen papers in this volume cover a wide range of perspectives and experiences, with topics ranging from quality aspects of eassessment to innovative approaches and open source solutions, and with authors drawn from a range of sectors. This is an excellent resource worthy of a wide audience.
Legal challenge to MCQs
A report this morning discusses a dyslexic medical student’s proposed legal action against the use of multiple-choice tests on the grounds of discrimination.
Naomi Gadian, a student at the Peninsula Medical School in Devon, is taking action against the General Medical Council in a move which her solicitor suggests could force all providers and monitors of professional qualifications to adapt their examinations to remove MCQs. Although dyslexic candidates face challenges with most examination formats, the report cites John Stein as stating that MCQs are problematic for dyslexic candidates because of specific difficulties caused by confusing letter order.
The outcome of this case will be very interesting, though it seems likely that MCQs won’t be going anywhere for a while yet.
Update: The BBC has posted a useful exploration of some of the issues around dyslexia and MCQs – well worth a look.
JISC funding for eassessment
There’s still plenty of time to get bids in for JISC’s current assessment-related circular – the deadline is noon on 1 August.
Two parts of this circular, Calls I and II, directly relate to assessment and offer funding ranging from £45,000 for a six-month demonstrator project up to £200,000 for a two-year ‘transforming curriculum delivery’ project. Demonstrator projects also include an additional £15,000 of funding for the original developers of the toolkits with which the project will be working.
JISC have made a breakout room available at next week’s CAA Conference for potential bidders to use for scoping proposals and consortia. Myles Danson and John Winkley will be available on both days to provide advice and answer questions, and are likely to be found on the JISC stand at the event.
There are several invaluable presentations from last week’s community briefing event which prospective bidders will want to check out, in particular Myles’s presentation on the demonstrator projects, presentations by Sarah Knight and Lisa Gray on the transforming curriculum delivery call, and Sarah Davies’s extremely helpful guide to bidding. John Winkley’s presentation to our last SIG meeting on various JISC assessment funding activities also provides useful information on both this call and some other imminent ITTs.
Lead institutions must be HE institutions funded by HEFCE or HEFCW, or FE institutions with 400+ FTE HE students; institutions and organisations which do not meet these criteria are welcome to apply as part of a consortium led by an eligible partner. Please check the full text of the circular for details.
Update: An updated version of Resources for Writing Successful JISC Bids was posted on the eLearning Focus site as I was writing this post and is definitely worth a look.