- is the term in JACS3?
- is there evidence of use of the term in HESA data returns?
- is the term’s definition and scope sufficiently clear and comprehensive to allow classification?
- is the term reliably distinguishable from other terms?
When does a book become a web platform?
During last week’s CETIS conference I ran a session to assess how ebooks can function as an educational medium beyond the paper textbook.
After reminding ourselves that etextbooks are not yet as widespread as ebook novels, and that paper books generally are still most widely read, we examined what ebook features make a good educational experience.
Though many features could have been mentioned, the majority were still about the learning experience itself. Top of the bill: formative assessment at the end of a chapter. Whether online or offline, it needs to be interactive, and there need to be plenty of items readily available. Other notable features in this area include a desire for contextualised discussion about a text: global discussion is good, but chats limited to other learners on a course are better. A way of asking a teacher for clarification by highlighting text was another notable request.
Using standards to make assessment in e-textbooks scalable, engaging but robust
During last week’s EDUPUB workshop, I presented a demo of how an IMS QTI 2.1 question item could be embedded in an EPUB3 e-book in a way that is engaging, but also works across many e-book readers. Here’s the why and how.
One of the most immediately obvious differences between a regular book and an e-textbook is the inclusion of little quizzes at the end of a chapter that allow the learner to check their understanding of what they’ve just learned. Formative assessment matters in textbooks.
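To make the shape of such a quiz concrete, here is a minimal sketch (not the actual item from the workshop demo) of a QTI 2.1 single-choice item of the kind that might sit at the end of a chapter. The identifier, title and answer options are all illustrative assumptions; only the element names and namespace come from the QTI 2.1 specification.

```python
# Build a minimal QTI 2.1 choice item as XML. Identifiers and
# wording are hypothetical; the structure follows the spec's
# assessmentItem / responseDeclaration / itemBody pattern.
import xml.etree.ElementTree as ET

QTI_NS = "http://www.imsglobal.org/xsd/imsqti_v2p1"


def make_choice_item():
    ET.register_namespace("", QTI_NS)
    item = ET.Element(f"{{{QTI_NS}}}assessmentItem", {
        "identifier": "chapter1-q1",          # hypothetical id
        "title": "Check your understanding",
        "adaptive": "false",
        "timeDependent": "false",
    })
    # Declare the response variable and which choice is correct
    rd = ET.SubElement(item, f"{{{QTI_NS}}}responseDeclaration", {
        "identifier": "RESPONSE",
        "cardinality": "single",
        "baseType": "identifier",
    })
    correct = ET.SubElement(rd, f"{{{QTI_NS}}}correctResponse")
    ET.SubElement(correct, f"{{{QTI_NS}}}value").text = "B"
    # The interaction the learner actually sees
    body = ET.SubElement(item, f"{{{QTI_NS}}}itemBody")
    ci = ET.SubElement(body, f"{{{QTI_NS}}}choiceInteraction", {
        "responseIdentifier": "RESPONSE",
        "shuffle": "true",
        "maxChoices": "1",
    })
    for ident, text in [("A", "Plain text"), ("B", "EPUB3"), ("C", "PDF")]:
        choice = ET.SubElement(ci, f"{{{QTI_NS}}}simpleChoice",
                               {"identifier": ident})
        choice.text = text
    return ET.tostring(item, encoding="unicode")


item_xml = make_choice_item()
```

An item like this can be shipped alongside the EPUB3 content and rendered by any player that understands the profile, which is what makes the approach robust across readers.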
Question and Test tools demonstrate interoperability
As the QTI 2.1 specification gets ready for final release, and new communities start picking it up, conforming tools demonstrated their interoperability at the JISC – CETIS 2012 conference.
The latest version of the world’s only open computer aided assessment interoperability specification, IMS’ QTI 2.1, has been in public beta for some time. That was time well spent, because it allowed groups from at least eight nations on four continents to apply it to their assessment tools and practices, surface shortcomings with the spec, and fix them.
Nine of these groups came together at the JISC – CETIS conference in Nottingham this year to test a range of QTI packages, from the very simple to the highly specialised, with their tools. In the event, only three interoperability bugs were uncovered in the tools, and those are being vigorously stamped on right now.
Where it gets more complex is who supports what part of the specification. The simplest profile, provisionally called CC QTI, was supported by all players and some editors in the Nottingham bash. Beyond that, it’s a matter of particular communities matching their needs to particular features of the specification.
In the US, the Accessible Portable Item Profile (APIP) group brings together major test and tool vendors who are building a profile for summative testing in schools. Their major requirement is the ability to finely adjust the presentation of questions to learners with diverse needs, which they have accomplished by building an extension to QTI 2.1. The material also works in QTI tools that haven’t been built explicitly for APIP yet.
A similar group has sprung up in the Netherlands, where the goal is to define all computer aided high stakes school testing in the country in QTI 2.1. That means that a fairly large infrastructure of authoring tools and players is being built at the moment. Since the testing material covers so many subjects and levels, there will be a series of profiles to cover them all.
An informal effort has also sprung up to define a numerate profile for higher education, which may yet be formalised. In practice, it already works in the tools made by the French MOCAH project and the JISC Assessment and Feedback sponsored QTI-DI and Uniqurate projects.
For the rest of us, it’s likely that IMS will publish something very like the already proven CC QTI as the common core profile that comes with the specification.
More details about the tools that were demonstrated are available at the JISC – CETIS conference pages.
Approaches to building interoperability and their pros and cons
System A needs to talk to System B. Standards are the ideal to achieve that, but pragmatics often dictate otherwise. Let’s have a look at what approaches there are, and their pros and cons.
When I looked at the general area of interoperability a while ago, I observed that useful technology becomes ubiquitous and predictable enough over time for the interoperability problem to go away. The route to get to such commodification is largely down to which party – vendors, customers, domain representatives – is most powerful and what their interests are. Which describes the process very nicely, but doesn’t help solve the problem of connecting stuff now.
So I thought I’d try to list what the choices are, and what their main pros and cons are:
A priori, global
Also known as de jure standardisation. Experts, user representatives and possibly vendor representatives get together to codify whole or part of a service interface between systems that are emerging or don’t exist yet; it can concern the syntax, semantics or transport of data. Intended to facilitate the building of innovative systems.
Pros:
- Has the potential to save a lot of money and time in systems development
- Facilitates easy, cheap integration
- Facilitates structured management of network over time
Cons:
- Viability depends on the business model of all relevant vendors
- Fairly unlikely to fit either actually available data or integration needs very well
A priori, local
i.e. some type of Service Oriented Architecture (SOA). Local experts design an architecture that codifies syntax, semantics and operations into services. Usually built into agents that connect to each other via an ESB.
Pros:
- Can be tuned for locally available data and to meet local needs
- Facilitates structured management of network over time
- Speeds up changes in the network (relative to ad hoc, local)
Cons:
- Requires major and continuous governance effort
- Requires upfront investment
- Integration of a new system still takes time and effort
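The agents-on-a-bus pattern above can be sketched as a toy illustration (this is not a real ESB, and the service name and payload shape are invented): agents expose services under locally agreed names, and consumers call them through the bus without knowing which system provides them.

```python
# Toy in-process "bus": the locally codified service contract is
# just the agreed service name plus the shape of the payload.
class Bus:
    def __init__(self):
        self._services = {}

    def register(self, name, handler):
        # An agent exposes a service under a locally agreed name
        self._services[name] = handler

    def call(self, name, payload):
        # A consumer calls the service without knowing the provider
        return self._services[name](payload)


bus = Bus()
# "System A" (say, student records) exposes a lookup service
bus.register("student.lookup", lambda sid: {"id": sid, "name": "J. Doe"})
# "System B" (say, the VLE) consumes it via the bus
record = bus.call("student.lookup", "s123")
```

The governance cost noted above lives in keeping those names and payload shapes agreed as systems are added and changed.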
Ad hoc, local
Custom integration of whatever is on an institution’s network by the institution’s experts in order to solve a pressing problem. Usually built on top of existing systems using whichever technology is to hand.
Pros:
- Solves the problem owner’s problem fastest, in the here and now
- Results accurately reflect the data that is actually there, and the solutions that are really needed
Cons:
- Non-transferable beyond local network
- Needs to be redone every time something changes on the local network (considerable friction and cost for new integrations)
- Can create hard to manage complexity
Ad hoc, global
Custom integration between two separate systems, done by one or both vendors. Usually built as a separate feature or piece of software on top of an existing system.
Pros:
- Fast point-to-point integration
- Reasonable to expect upgrades for future changes
Cons:
- Depends on business relations between vendors
- Increases vendor lock-in
- Can create hard to manage complexity locally
- May not meet all needs, particularly cross-system BI
Post hoc, global
Also known as standardisation, consortium style. Service provider and consumer vendors get together to codify a whole service interface between existing systems; syntax, semantics, transport. The resulting specs usually get built into systems.
Pros:
- Facilitates easy, cheap integration
- Facilitates structured management of network over time
Cons:
- Takes a long time to start, and is slow to adapt
- Depends on business model of all relevant vendors
- Liable to fit either available data or integration needs poorly
Clearly, no approach offers instant nirvana, but it does make me wonder whether there are ways of combining approaches such that we can connect short term gain with long term goals. I suspect if we could close-couple what we learn from ad hoc, local integration solutions to the design of post-hoc, global solutions, we could improve both approaches.
Let me know if I missed anything!
ArchiMate modelling bash outcomes
What’s more effective than taking two days out to focus on a new practice with peers and experts?
Following the JISC’s FSD programme, an increasing number of UK Universities started to use the ArchiMate Enterprise Architecture modelling language. Some people have had some introductions to the language and its uses, others even formal training in it, others still visited colleagues who were slightly further down the road. But there was a desire to take the practice further for everyone.
For that reason, Nathalie Czechowski of Coventry University took the initiative to invite anyone with an interest in ArchiMate modelling (not just UK HE), to come to Coventry for a concentrated two days together. The aims were:
1) Some agreed modelling principles
2) Some idea whether we’ll continue with an ArchiMate modeller group and have future events, and in what form
3) The models themselves
With regard to 1), work is now underway to codify some principles in a document, a metamodel and an example architecture. These principles are based on the existing Coventry University standards and the Twente University metamodel, and the primary aim of them is to facilitate good practice by enabling sharing of, and comparability between, models from different institutions.
With regard to 2), the feeling of the ‘bash participants was that it was well worth sustaining the initiative and organising another bash in about six months’ time. The means of staying in touch in the meantime have yet to be established, but one will be found.
As to 3), a total of 15 models were made or tweaked and shared over the two days. Varying from some state of the art, generally applicable samples to rapidly developed models of real life processes in universities, they demonstrate the diversity of the participants and their concerns.
All models and the emerging community guidelines are available on the FSD PBS wiki.
Jan Casteels also blogged about the event on Enterprise Architect @ Work
And the Winner Is … The UK
I have spent this week at the IMS Learning Impact Conference in Long Beach, California. I’ve enjoyed the conference and sensed a remarkably fresh approach, amongst delegates and IMS alike, to standards and their role in educational technology. Overall I’d suggest a strong re-affirmation that the direction of travel we have been following in CETIS is very much on course. Lots of talk of openness, collaboration and learner-centred approaches (I’ll reflect on this in my next blog post). As is customary at this event, the final activity, before workshops and working group meetings, is the annual Learning Impact Awards. It was something akin to the British (music) invasion of the early 1960s, with the UK dominating the platinum awards across all categories. Winners included the BBC for their accessibility toolkit ASK, PebblePad, and the Nottingham Xerte online toolkit. Three out of the four main awards went to the UK, with two of these being accessibility tools.
Clive 1st June 2007
It has been some time since I ‘blogged’
Since I last ‘put fingers to keyboard’ I have been boring folk with my view that over the last year the ‘centre of gravity’ of learning technology standards development has moved from HE to the schools and FE sectors. Driven by the e-Strategy and other government schools-based agendas such as ‘Every Child Matters’, Becta, MIAP and others have been obliged to deliver solutions to strictly imposed deadlines. The focus so far has been in two areas: joining social service systems up with school administration systems, and learning platforms for schools. Both have required standards-based developments: the Schools Interoperability Framework (SIF), underpinned by the adoption of a Unique Learner Number (ULN), has been adopted for the former, and Becta has produced specifications (all around standards and extendability) that suppliers of learning platforms have to satisfy. Additionally, the Qualifications and Curriculum Authority, in order to meet the requirements of the new vocational Specialised Diploma, will have to produce data standards by the autumn for course details and qualification achievements.
Obviously such developments will have an impact on the JISC communities. Firstly, there will be a concentration of minds around where standards are really needed (rather than ‘could be useful’), and there will be a requirement for JISC to focus on those areas of detail which could impede national projects if not attended to. Solutions to the problems of identity management and the scalability of SOA implementations are just two that need urgent attention.
So to survive, JISC has to be sufficiently engaged with learning technology based solutions in the schools and tertiary sectors to anticipate those areas that need the efforts and expertise of our community.
So what have I been doing with my one day per week, in addition to boring my colleagues at management meetings with the above?
Well, I have been supporting Peter with ePortfolio development (around assessment) and, with the help of Nottingham University, finding out about Lifelong Learning Networks.