Category Archives: Standards and specifications
QTI in Korea
Things may be quiet at the moment on the release of the final version of IMS QTI 2.1, but there’s been quite a bit of activity in the background around implementations and integration with other systems.
Keris, the Korea Education and Research Information Service (similar to the UK’s JISC), have been quite active within IMS and in February signed a formal memorandum of understanding to launch IMS Korea. Amongst their activities is involvement with Teaching Mate, a commercial product which aims to support QTI export by the end of the year. Also from Keris is a QTI 2.1 player that can both import and export QTI 2.1 – it’s in Korean, but Adam cunningly pointed out that you can use the browser status bar to work out what each button actually does.
It’s worth having a look around the Keris site to see the extent to which they support ICT in education. Of particular interest is EDUNET, the National Teaching and Learning Center established in 1996 which, amongst a range of other services, provides a large range of school-level teaching materials and ‘an online testing service to evaluate students’ achievements’. Would teachers in the UK welcome a centrally provided eassessment service to support the government’s eassessment targets?
Thanks also go to Adam for pointing out RM’s Test Authoring System, which claims to be fully compliant with IMS QTI 2.1, making it one of the earliest commercial products to implement the revised specification. I couldn’t find a demo to try out, but it is good to see the specification finally being implemented in this type of system and market sector. Also on the RM website are a couple of research reports on the impact of ICT in the classroom which are well worth reading.
Joint CETIS Assessment and Educational Content SIGs meeting announced
Registrations are now open for our next Assessment SIG meeting, and you’re warmly invited to book your place for this event hosted by the University of Cambridge. It’s a joint meeting with the CETIS Educational Content SIG, something we’ve been planning to do for some time, looking in particular at two standards of interest to assessment: IMS Common Cartridge and IMS Tools Interoperability.
Common Cartridge hasn’t even been released yet, but has already generated significant interest amongst content vendors and publishers and has been heavily promoted by IMS. It combines profiles of a number of different standards, including IMS Content Packaging v1.1.4, IMS Question and Test Interoperability v1.2.1, IMS Tools Interoperability Guidelines v1.0, IEEE LOM v1.0 and SCORM v1.2 and 2004. The resultant learning object package or ‘cartridge’ is intended to be fully interoperable across any compliant system, allowing content to be delivered to any authorised individual.
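To make the packaging idea a little more concrete, here is a rough Python sketch of a cartridge as a zip archive containing an imsmanifest.xml that lists its resources, including a QTI assessment. It is purely illustrative: the schema label, resource types and file names are placeholders of my own, not the normative Common Cartridge values, so treat it as a picture of the idea rather than a valid cartridge.

```python
# Illustrative only: assembles a zip-based 'cartridge' with a simplified
# imsmanifest.xml. The schema label and resource types are placeholders,
# not the normative IMS Common Cartridge identifiers.
import zipfile

MANIFEST = """<?xml version="1.0" encoding="UTF-8"?>
<manifest identifier="sample_cartridge">
  <metadata>
    <schema>IMS Common Cartridge</schema>  <!-- placeholder label -->
    <schemaversion>1.0</schemaversion>
  </metadata>
  <organizations/>
  <resources>
    <!-- a web-content resource and a QTI assessment, both hypothetical -->
    <resource identifier="res1" type="webcontent" href="lesson/index.html">
      <file href="lesson/index.html"/>
    </resource>
    <resource identifier="res2" type="imsqti_assessment" href="assessment/quiz.xml">
      <file href="assessment/quiz.xml"/>
    </resource>
  </resources>
</manifest>
"""

def build_cartridge(path="sample.imscc"):
    """Write a minimal 'cartridge': manifest at the root plus the listed files."""
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as cc:
        cc.writestr("imsmanifest.xml", MANIFEST)
        cc.writestr("lesson/index.html", "<html><body>Lesson content</body></html>")
        cc.writestr("assessment/quiz.xml", "<!-- QTI assessment XML would go here -->")

if __name__ == "__main__":
    build_cartridge()
```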
The appeal to content publishers of Common Cartridge coupled with authentication and digital rights management systems is clear, and the specification is particularly suited to the American educational system, where there is a closer relationship between content vendors and courses than in UK Higher Education. In the UK, its primary impact may be in the schools and Further Education sectors, where there is more of a history of buying content from publishers than in HE. The inclination of many UK HE lecturers to produce their own content and the bespoke nature of many higher-level courses are issues we’ve already encountered when looking at topics such as open content and item banking, but there is some interest within UK education, in particular from the Open University. For a major content producer whose resources are used far beyond its own courses, Common Cartridge has clear potential, and Ross McKenzie and Sarah Wood of OU OpenLearn will offer an insight into their experiences of implementing the specification and developing cartridges. There has been very little work to date on the delivery of assessment material through Common Cartridge, a topic which will be addressed by Niall Barr of NB Software. Our own Wilbert Kraan and Adam Cooper will update delegates on the current position of Common Cartridge.
IMS Tools Interoperability has received rather less fanfare, but is a valuable specification which takes a web services approach to seamlessly integrating different tools. It allows specialist tools to be ‘plugged in’ to a learning management system, such as integrating a sophisticated assessment management system with a VLE which only provides limited native support for assessment, or discipline-specific tools such as simulators. It also supports accessibility requirements through the (optional) incorporation of the user’s IMS Accessibility Learner Information Package profile to allow silent interface configuration. Warwick Bailey of Icodeon will be discussing his experiences with the specification.
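To give a flavour of the idea – and only the idea – here is a hypothetical Python sketch of what a launch from a VLE to a plugged-in assessment tool might carry, including an optional accessibility profile so the tool can configure itself silently. None of the names or fields below come from the Tools Interoperability specification itself; they are invented for illustration.

```python
# Purely illustrative: this is NOT the IMS Tools Interoperability message
# format, just a sketch of an LMS delegating an activity launch to an
# external tool and optionally passing ACCLIP-style accessibility
# preferences so the tool can configure its interface silently.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AccessibilityPreferences:
    """Hypothetical subset of a learner's accessibility profile."""
    high_contrast: bool = False
    screen_reader: bool = False
    extra_time_factor: float = 1.0

@dataclass
class ToolLaunchRequest:
    """Hypothetical launch request from a VLE to a plugged-in tool."""
    tool_endpoint: str            # where the external tool listens
    user_id: str
    activity_id: str
    outcome_callback_url: str     # where the tool reports results back
    accessibility: Optional[AccessibilityPreferences] = None

def build_launch(user_id: str, activity_id: str,
                 prefs: Optional[AccessibilityPreferences] = None) -> ToolLaunchRequest:
    return ToolLaunchRequest(
        tool_endpoint="https://assessment.example.org/launch",    # hypothetical
        user_id=user_id,
        activity_id=activity_id,
        outcome_callback_url="https://vle.example.org/outcomes",  # hypothetical
        accessibility=prefs,
    )

if __name__ == "__main__":
    print(build_launch("learner42", "quiz-7", AccessibilityPreferences(high_contrast=True)))
```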
In the morning, Steve Lay of CARET, University of Cambridge, will be providing an update on the current state of IMS QTI v2.1. Steve is co-chair of the IMS QTI working group (with Pierre Gorissen of SURF and Fontys University).
The afternoon will feature a presentation by Linn van der Zanden of the Scottish Qualifications Authority on the use of wikis and blogs in education and assessment, picking up on an increasing interest in the use and potential of Web 2.0 technologies in this domain.
The meeting is co-located with a workshop run by the three JISC Capital Programme projects focusing on assessment, to which you are also invited…
Assessment in 2008: looking forward
Gales are howling, trains in chaos, so it must be January and time to look ahead to what 2008 has in store…
The final release of QTI v2.1 should be out this spring, and it’ll be interesting to see what uptake is like. This will be the most stable and mature version of the specification to date, supported by a long public draft stage and a number of implementations. Angel Learning are a significant commercial early adopter, and other vendors are bound to be watching their experiences and whether Angel’s embrace of the specification has an impact on demand for QTI 2.1 from their own customers.
Other significant implementors of 2.1 are the JISC Capital Programme projects, which will be concluding around March. AQuRate offers an item authoring tool, Minibix provides support for a range of item banking functions, while ASDEL is an assessment delivery engine which supports both standalone use and integration with a VLE. These projects should deliver quality resources to the community which will provide a firm foundation for use of the specification. There was a sneak preview of these projects at our last SIG meeting.
Talking of SIG meetings, dates for the next two meetings can now be confirmed.
On 19 February there will be a joint meeting with the CETIS Educational Content SIG in Cambridge. This meeting will cover a range of shared concerns, including new content-related specifications such as Common Cartridge and Tools Interoperability, as well as innovative approaches to educational material and assessment. Information about this meeting and online registration will be available very soon. The meeting will be preceded by a workshop hosted by the Capital Programme projects discussed above.
The focus shifts from assessment as content to assessment as process with another joint meeting on 1 May in Glasgow, this time with the CETIS Portfolio and Enterprise SIGs, which will offer an opportunity to explore some of the shared issues in these domains. Again, information on the event will be available on the mailing lists, on this blog and on the website in due course.
Another event of note is the annual International Computer Assisted Assessment Conference on 8 and 9 July at Loughborough. The call for papers is already out, with submissions due by 29 February. As always, this should be a lively and important event in the CAA calendar. Alt-C 2008, Rethinking the Digital Divide, will be held in Leeds on 9 – 11 September; again, the closing date for submissions is 29 February. There’s also a regularly updated list of non-CETIS assessment related events on the wiki.
And what about the trends for eassessment in 2008? The results of Sheila’s poll, with a strong emphasis on Web 2.0 technologies and possibilities, do seem to reflect to some extent the comments on the last meeting’s evaluation forms, which suggested increasing interest in innovative technologies, significant concern with transforming and enhancing the assessment experience, and direct engagement with teaching and learning rather than the more abstract issues of standards and specifications for their own sake. It will be interesting to see how the more ‘traditional’ XML-based QTI v2.1 fares in the light of the increasing popularity of mashups and web services in 2008.
Assessment SIG meeting, 26 September 2007
Academics and developers met in Glasgow recently to participate in the most recent Assessment SIG meeting. The very full agenda covered a range of topics, both technical and pedagogic, and presentations led to some lively discussions.
Myles Danson of JISC opened the day by presenting JISC’s views and priorities for eassessment, as well as pointing to some future work they will be undertaking in the domain.
Yongwu Miao of the Open University of the Netherlands discussed work undertaken by the TENCompetence Project, with a particular focus on the relationship between IMS QTI and IMS Learning Design and the work they have done in this area. Dick Bacon of the University of Surrey and the HEA discussed the relationship between different varieties or ‘dialects’ of QTI, exploring some of the implementation and interpretation issues that hinder or break interoperability between systems nominally implementing the same version of the specification. CAL Consultant Graham Smith pleased the audience with news that a new Java version of his QTI demonstrator will be available shortly with updated support for QTI 2.0 items, which should help in the identification and resolution of implementation problems.
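As a small, concrete illustration of why the ‘dialect’ question matters, even working out which flavour of QTI a file claims to be takes some care, before you get anywhere near the interpretation differences Dick described. The Python sketch below guesses the version from the root element and namespace; the namespace URIs are the published values as I understand them, but should be checked against the schemas, and the check says nothing about whether the content inside is actually interoperable.

```python
# Rough sketch: guess which QTI 'flavour' a file claims to be by looking at
# its root element and namespace. The namespace URIs below are my best
# understanding of the published values; verify them against the schemas.
import xml.etree.ElementTree as ET

QTI_NAMESPACES = {
    "http://www.imsglobal.org/xsd/imsqti_v2p1": "QTI 2.1",
    "http://www.imsglobal.org/xsd/imsqti_v2p0": "QTI 2.0",
}

def guess_qti_version(path: str) -> str:
    root = ET.parse(path).getroot()
    # ElementTree reports namespaced tags as '{namespace}localname'
    if root.tag.startswith("{"):
        ns, local = root.tag[1:].split("}")
    else:
        ns, local = "", root.tag
    if local == "questestinterop":
        return "QTI 1.x (questestinterop root)"
    if local in ("assessmentItem", "assessmentTest"):
        return QTI_NAMESPACES.get(ns, f"QTI 2.x-like, unrecognised namespace: {ns or 'none'}")
    return f"Not obviously QTI (root element: {local})"

if __name__ == "__main__":
    import sys
    print(guess_qti_version(sys.argv[1]))
```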
Martin Hawksey of the University of Strathclyde presented the work of the Re-Engineering Assessment Practices project. With a focus on real-world assessment experiences, including an impressive collection of case studies exploring the impact of transformation within assessment practices, the REAP project was of particular interest to participants. Also of great interest, and perhaps unsurprisingly sparking the greatest amount of debate, was the exploration of ‘Assessment 2.0’ presented by Bobby Elliott of the Scottish Qualifications Authority. Bobby looked at ways in which Web 2.0 technologies can be used to enhance and modernise assessment in ways which can engage and appeal to increasingly digitally literate learners.
The day also featured several demonstrations of tools under development. Niall Barr of NB Software demonstrated his current work, an assessment tool which utilises the IMS QTI, Content Packaging and Common Cartridge specifications, while Steve Bennett of the University of Hertfordshire demonstrated MCQFM, a JISC-funded tool which provides a simple text-based format for converting and editing items between formats. Two more JISC projects closed the day. AQuRate, presented by Alicia Campos and David Livingstone of Kingston University, is an elegant item authoring tool, while ASDEL, presented by Jon Hare of the University of Southampton, is an assessment delivery tool which builds on the R2Q2 project to provide fuller test delivery. A third project, Minibix (University of Cambridge), which addresses item banking, is working closely with AQuRate and ASDEL.
Links to presentations (via slideshare), project websites and other information can all be found on our wiki: http://wiki.cetis.org.uk/JISC_CETIS_Assessment_SIG_meeting%2C_26_September_2007.
Integrated Assessment – IMS Webinar
On Monday night I attended IMS’s webinar on ‘Integrated assessment products and strategies: gauging student achievement and institutional performance’. This was the first IMS webinar I’d attended, and I found it a useful session. Over 80 people participated in the session, which was run on Horizon Wimba.
Rob Abel, CEO of IMS, introduced the session by describing integrated assessment as assessment which is designed into and throughout the learning experience. He discussed the outcomes of a recent survey on satisfaction with elearning tools, which showed that tools for quizzing and assessment had the highest satisfaction ratings amongst users; of the 88 products surveyed, Respondus came top in terms of user satisfaction. This is a consequence both of the usefulness and maturity of this category of tool and of the availability and quality of the products on offer.
Rob also suggested that ‘standards are a platform for distributed innovation’, which is a nice phrase, although one of the criticisms often made of QTI is that it isn’t innovative. It’s hard to see, however, how true innovation could be standardised.
Neil Allison (Blackboard Director of Product Marketing), Sarah Bradford (eCollege Vice President of Product Management) and Dave Smetters (Respondus President) all spoke briefly about how their tools could be used for integrated assessment.
Neil illustrated how Blackboard can ‘make assessment easier and more systematic’ by integration with other elements of the VLE such as enterprise surveys, portfolios and repositories. One comment I found particularly interesting was an outcome from a December 2005 Blackboard Survey of Priorities in Higher Education Assessment, which found that portfolios are used in 86% of public institutions but only 43% of private ones, while interviews and focus groups are used by 78% of private institutions and only 48% of public ones. I’ve tried to find this online without success; it’s referenced in slide 20 of the presentation.
Sarah noted that eCollege users are using Respondus and Questionmark’s secure browser to assess their learners. Her talk focused on the eCollege outcome repository or database, which is linked to their content manager, and stressed the importance of a good tagging system. The eCollege Learning Outcome Manager addresses some of the problems of managing usage data for quality assurance, an important issue given the current interest in item banking.
Dave’s talk was the most wide-ranging, looking not only at the highly popular Respondus assessment authoring and management tool but at some of the wider issues around integrated eassessment. He referenced research which found that only 13–20% of courses with an online presence include one or more online assessments – yet market research consistently shows that online assessment capabilities are one of the most appealing elements in drawing users to esystems. As he said, once the system is in place, ‘reality kicks in’: online assessment takes work, effort and time, raises difficulties in converting or creating content, and raises fears of the potential for cheating. He argued that only a very small number of students have the desire to cheat, yet the impact can affect an entire class. Students themselves like a secure assessment environment that minimises the possibilities for cheating. Locked browsers are a big issue for Respondus at the moment; security of online assessments is also addressed by BS 7988, which is currently being adopted by ISO.
Colin Smythe, IMS’s Chief Specification Strategist, provided a brief survey of the standards context for integrated assessment. He noted that all specifications have some relevance for assessment, citing Tools Interoperability, Common Cartridge, ePortfolio, Content Packaging, LIP, Enterprise and Accessibility. He also posted a useful timeline (slide 69) which shows that QTI v2.1 is scheduled for final release in the second quarter of 2007, to be synchronised with the latest version of Content Packaging.
He also said that Common Cartridge provides ‘for the first time content integration with assessment’; how much this will be adopted remains to be seen, but IMS are marketing it quite forcefully.
There was time for a short question and answer session at the end. I asked about the commitment of the vendors to QTI 2.1 and the use of QTI 2.1 in Common Cartridge. The Common Cartridge specification uses an earlier version of QTI partly because there were some migration issues with 2.0 which have been resolved through transforms in 2.1, and also because IMS ‘didn’t want to force the marketplace to adopt a new specification’. As Rob said, interoperability requires the ‘commitment of the marketplace’, and it would be useful to know what commitment these vendors have to the newer version.
The session concluded with a reminder about the Learning Impact 2007 conference being held in Vancouver on 16 – 19 April 2007, which should be of interest to many.