Assessment think tank, at The Higher Education Academy, York, 31st January to 1st February 2008
Several of these events appear to have been arranged; this one was held jointly with the Subject Centres for History, Classics and Archaeology (HCA) and for Philosophy and Religious Studies (PRS).
Around 20 delegates were present, mostly from the relevant subject areas, but also including someone from the JISC’s Plagiarism Advisory Service. The only person I recognised beforehand was Sharon Waller from the HEA, though I had talked with Costas Athanasopoulos (PRS Subject Centre) at the JISC CETIS conference: he was the one who invited me.
I won’t try to document the whole event, but rather pick out a few things that were highlights for me.
The discussion around plagiarism was inspiring. There was very little on the mechanics and technology of plagiarism detection (Turnitin is popular now) and plenty on good practice that removes the motive for plagiarising in the first place. This overlaps greatly with other good practice – heartening, I felt. George MacDonald Ross gave us links to some of his useful resources.
George MacDonald Ross also gave an interesting treatment of multiple-choice questions: best used for formative self-assessment, avoiding factual questions, and focusing instead on different possible interpretations (in his example, within philosophy).
As I’m interested in definitions of ability and competence, I brought up the issue of subject benchmarks, but there was little interest in that specifically. However, for archaeology fieldwork, Nick Thorpe (University of Winchester) uses an assessment scheme with several practical criteria, each with descriptors for five levels. This perhaps comes closest to practice in vocational education and training, though to me it doesn’t quite reach the clarity and openness of UK National Occupational Standards. Generally, people don’t yet seem ready to define clearly the characteristics of graduates of their courses, or they feel that attempts to do so have been poor. And yet, what can be done to provide an overall positive vision, acceptable to staff and students alike, without a clear, shared view of aims? Just as MCQs don’t have to test factual knowledge, learning outcomes don’t have to be pitched at a prosaic, instrumental level. I’d be interested to see more attempts to define course outcomes at relatively abstract levels, as long as those remain assessable, formally or informally, by learner, educator and potential employer alike.
One of the overarching questions of the day was: what assessment-related resources are wanted, and could be provided through either the HEA or JISC? In one of our group discussions, the group I was in raised the question of what counts as a resource anyway, and at the end the question came back. Given the wide range of the discussion over the day and a half, there was no clear answer. But one thing came through in any case: teaching staff have a sense that much good, innovative practice around assessment is constrained by HEI (and sometimes wider) policies and regulations. Materials that help overcome these constraints would be welcome. Perhaps these could be case studies documenting how innovative good practice was reconciled with prevailing policy and regulations. Good examples here, presented somewhere easy to find, could disseminate quickly – virally, even. Elements of self-assessment, peer assessment, collaboration, relevance to life beyond the HEI, clarity of outcomes and assessment criteria, etc., if planted visibly in one establishment, could help others argue the case for better practice elsewhere.
Thank you for such an interesting post, Simon.
I agree with your thoughts on plagiarism. As a tutor I came across very few examples of outright copying-to-cheat, but there were instances of students producing work which was effectively, if unintentionally, plagiarised. I really don’t think there’s an intention to deceive when students submit work which includes unattributed extracts from textbooks and papers I’ve recommended they read (or even my own handouts!), and exploring the reasons why that happens and how to overcome it seems to me a much more positive and supportive route than harsh penalties and branding offenders as cheats right from the off. Some institutions allow students to upload their work to Turnitin before it’s formally submitted, so that the student can view, reflect on and act on the plagiarism reports; that seems like a very beneficial approach.
The observation that ‘much good, innovative practice around assessment is constrained by HEI … policies and regulations’ seems to echo a discussion happening on the ISL list at the moment around modules, semesterisation and learning outcomes as the be-all and end-all of course planning. This blog post by James Atherton (http://www.doceo.co.uk/reflection/2007/02/on-passing-tradition.htm) might be ‘unashamedly academic, elitist and reactionary’, but it makes some very pertinent points.
Thanks, Rowin!
The only complaint I have about the otherwise inspiring James Atherton post you cite is that he doesn’t even consider the possibility of assessing such interesting learning outcomes. My own inclination is to encourage educators to work hard at formulating, in earnest, learning outcomes in the same spirit as those Atherton jokingly dismisses, along with the related assessment strategies. Now that would be a really open mind, without the closure and rigidity of supposing that the really valuable things can’t be assessed. Can’t be assessed traditionally, or within a reductionist paradigm, I can well believe. Push the boundaries of assessment, sure. Let us be creative!
Fair point and it *would* be a very interesting exercise!