Packages from the cloud(s)

CETIS and Knowledge Integration are working together, with community input, to develop a content transcoder service prototype. The proposal is for a web service which will convert content into a variety of standard packaging formats (e.g. IMS CP, IMS CC and SCORM). The project also plans to look at the most frequently used proprietary formats, such as those used by WebCT, Blackboard and Moodle, and at significant UK application profiles such as NLN.
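To make the idea a bit more concrete, here's a rough sketch of how a client might talk to such a service. The endpoint, parameter names and content types below are all invented – the service is still at the proposal stage, so treat this as an illustration rather than a real API:

    import urllib.request

    # Hypothetical endpoint -- nothing like this exists yet.
    TRANSCODER_URL = "http://transcoder.example.org/convert?target=scorm_1_2"

    def transcode(package_path, out_path):
        """POST a content package (zip) and save the converted result."""
        with open(package_path, "rb") as f:
            data = f.read()
        req = urllib.request.Request(
            TRANSCODER_URL, data=data,
            headers={"Content-Type": "application/zip"})
        with urllib.request.urlopen(req) as resp, open(out_path, "wb") as out:
            out.write(resp.read())

    # e.g. convert an IMS CP package into SCORM 1.2
    transcode("ims_cp_package.zip", "scorm_package.zip")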

The first phase of the project will look at prioritising which formats and platforms the service should support, as well as general user interface issues. So, we are looking for input from the community to help us with:

*prioritising which formats should be transcoded
*supplying test-case packages
*verifying the quality of the transcoded results in your platform of choice.

If you’d like to get involved, or just find out a bit more about the project, detailed information including the project brief is available from the CETIS wiki.

Universal edit button for wikis – one more reason to install Firefox 3?

Some of the biggest wiki providers have launched a new icon to indicate in the browser bar that a page is universally editable. At the moment you need to have Firefox 3 installed to see the icon, but it’s hoped other browser platforms will come on board and the UEB (universal edit button) will become as ubiquitous as the RSS icon.
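For the curious: as I understand it, the button is driven by a simple link element in the page head with a type of application/x-wiki, pointing at the page’s edit URL. A toy script to sniff for it might look like this (the sample page head is made up):

    from html.parser import HTMLParser

    class EditLinkFinder(HTMLParser):
        """Collect hrefs of <link> elements that mark a page as wiki-editable."""
        def __init__(self):
            super().__init__()
            self.edit_urls = []

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "link" and a.get("type") == "application/x-wiki":
                self.edit_urls.append(a.get("href"))

    # Made-up page head carrying a universal edit button link.
    sample = ('<head><link rel="alternate" type="application/x-wiki" '
              'title="Edit this page" href="http://example.org/edit/Page"></head>')
    finder = EditLinkFinder()
    finder.feed(sample)
    print(finder.edit_urls)  # ['http://example.org/edit/Page']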

Nice idea, and good to see platform providers working together to produce a simple solution to encourage contributions to wikis. ReadWriteWeb have a good summary of the launch too.

Tweet Clouds

A post from Martin Weller put me onto Tweet Clouds – a new tag cloud generating service for Twitter. As someone who uses Twitter mainly for work purposes I was curious to see what kind of cloud my account would generate. As expected (particularly after a relatively heavy Twitter session at the OAI-ORE open day on Friday) there are a lot of “resources” and “aggregations” in my cloud :-)

I’m not sure just how much of a gimmick this is, and just how useful it is to have another view on what you are writing about. As Martin points out, the addition of more filtering and links would certainly help. But because I twitter in bursts at selected times, it may well be of more value to someone like me than to a more regular Twitter user, as any clouds I generate might be a bit more focussed. Then again, for a more regular user it may well be useful to get an overview of what you have been talking about… or is it just another ‘neat’ web 2.0 application that you use once, smile at the results and never use again?
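Under the hood, a service like this is really just word-frequency counting over your tweet archive. A toy version (with invented tweets) shows the basic idea:

    import re
    from collections import Counter

    STOPWORDS = {"the", "a", "and", "to", "of", "at", "in", "is", "on", "for"}

    def tweet_cloud(tweets, top=10):
        """Count word frequencies across tweets, skipping common stopwords."""
        words = []
        for tweet in tweets:
            words += [w for w in re.findall(r"[a-z'-]+", tweet.lower())
                      if w not in STOPWORDS]
        return Counter(words).most_common(top)

    tweets = [
        "Discussing resources and aggregations at the OAI-ORE open day",
        "Still thinking about aggregations of resources",
    ]
    print(tweet_cloud(tweets))  # 'resources' and 'aggregations' dominate

The real service presumably also scales each word by its count when rendering – but the counting, rather than the rendering, is the interesting bit.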

Overview of semantic technologies

ReadWriteWeb have produced a really concise guide to the use of semantic technologies – Semantic Web patterns: a guide to semantic technologies. They have also just introduced a new monthly podcast feature called “The Semantic Web Gang”. The first episode is called “readiness for the semantic web”. Although it takes a primarily business view of things, I’m sure there will be lots of crossover with the e-learning community, and it should be a good way to keep abreast of developments in the use of semantic technologies.

Pedagogy planners – where next?

A meeting was held on 4th March to get some ‘real world’ input into how the development of the two pedagogy planning tools in the current JISC Design for Learning programme should progress.

The audience was made up mainly of teaching practitioners, most of whom have an interest in staff development and e-learning. Introducing the day, Helen Beetham (consultant to the JISC e-Learning programme) outlined some of the challenges around the changing economic, technical and pedagogical issues that face the teaching and learning community today. The role of planning teaching and learning is becoming increasingly important, as is the recognition of the need to share and represent practice. Although technology offers tantalising visions of the potential of shared learning design practice, the tools we have available at the moment still seem to fall short of the vision. Very few (if any) tools can capture and deliver the myriad of teaching practices that exist. So, is it time to start thinking about a set of teacher tools and services instead of trying to develop more one-size-fits-all tools?

During the day participants had the opportunity for some “hands-on” time with both Phoebe and the London Pedagogy Planner (LPP). Grainne Conole (OU) has already written about the day, with reviews of Phoebe and LPP. The projects then presented their vision of how someone could use Phoebe to create an initial design, look for case studies and exemplars, and then export that design into LPP to start ‘fleshing out’ the plan with actual teaching contact time etc.

While both prototypes offer different (but complementary) approaches to planning, they are both very much at the prototype stage. A key question that keeps arising is: what is it that they actually produce? XML output allows a level of interoperability between the two just now, but this needs to be extended much further so that there is a useful output which can relate to other institutional systems such as VLEs, CMSs etc – “where’s the export to Moodle button?” was heard a few times during the day :-) During the feedback sessions it was clear that exporting and importing data between systems will be crucial if such tools are to have any chance of take-up in institutions.
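As a sketch of what such an export might involve: serialising a plan to XML is the easy part. The element names below are invented (neither tool publishes this exact schema) – the hard part is agreeing a vocabulary that both the planners and the VLEs understand:

    import xml.etree.ElementTree as ET

    # Invented schema -- purely to illustrate the shape of the problem.
    def export_design(title, activities):
        design = ET.Element("learningDesign", {"title": title})
        for name, minutes in activities:
            ET.SubElement(design, "activity",
                          {"name": name, "contactMinutes": str(minutes)})
        return ET.tostring(design, encoding="unicode")

    print(export_design("Intro seminar",
                        [("Lecture", 50), ("Group discussion", 30)]))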

intraLibrary repositories conference 2008

I managed to get along to the second day of the recent intraLibrary conference the other week. Although many of the presenters had links with Intrallect and/or were intraLibrary users, the discussions focused on much broader issues than the specifics of that particular system. As I missed the first day I can’t really give a complete overview of the whole conference, so I’ll concentrate on a couple of items that caught my interest. Neil Fegen has provided an excellent overview of the conference as a whole.

The first presentation of the day caused a lot of interest. Ian Watson (Institute for Research and Innovation in Social Services) gave a demo of a new web-based interface his team have developed for intraLibrary, called opensearch. Although at an early stage of development, this did look like a really useful tool with lots of user-friendly features. The team are hoping to extend the tool to incorporate federated searches and implement SWORD, and it will be released as open source. I’m sure this is one project my colleagues in the MDR SIG will be keeping an eye on.
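For anyone unfamiliar with OpenSearch: the spec works by having the repository publish a description document containing a URL template, and the client simply substitutes its query terms into that template. A minimal client-side sketch (the repository endpoint is invented) looks like this:

    import urllib.parse
    import urllib.request

    # Invented endpoint; a real deployment publishes its own template
    # in an OpenSearch description document.
    TEMPLATE = "http://repository.example.org/search?q={searchTerms}&page={startPage}"

    def search(terms, page=1):
        url = (TEMPLATE.replace("{searchTerms}", urllib.parse.quote(terms))
                       .replace("{startPage}", str(page)))
        with urllib.request.urlopen(url) as resp:
            return resp.read()  # typically Atom/RSS with OpenSearch extensions

    results = search("learning objects")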

There was also a presentation of a packaging tool called Compendle – which I had never heard of. The tool is basically a content aggregator/packager, and has quite a nice, user-friendly drag and drop interface. However, when probed a bit deeper, it really only offers quite basic functionality. That said, the team did seem keen to develop it further to allow for more advanced editing and sequencing functionality.

Probably the most interesting part of the day (for me anyway) was the workshop session led by John Casey (JORUM). John has recently joined the JORUM team and is leading the way in investigating ways to make the service more open – the ultimate goal is to have an open (at point of access) service, possibly called OpenJORUM. Plans are at a very early stage and John outlined some ideas he has been mulling over, including an intermediary phase (possibly called JORUM UK) which would be open only to those in the UK. This idea didn’t seem to go down too well with the audience, and parallels were drawn with the BBC’s experience of making certain services (e.g. the iPlayer) available only within the UK.

In terms of IPR and licensing it looks like there will be a move to a more Creative Commons approach. This would hopefully bring about a much-needed driver for greater clarity and leadership from institutions over IPR. Citing his previous work in the TrustDR project, John stressed that IPR is not the problem – it only becomes a problem for the teaching and learning community if there are no clear institutional guidelines. John did emphasise that no decisions have been made, and that the driving factor of any such extension of the JORUM service would be providing something that is quick and easy to use.

Any developments with JORUM are obviously of great interest to the CETIS community, and the next EC SIG meeting (end of May, Manchester – watch the list for more details or contact me about it) will feature a session from John and colleagues, and an opportunity for more community discussion around the open resources debate.

Presentations from the conference are available from the Intrallect website. Thanks to all at Intrallect for organising another stimulating conference.

A capital day for assessment projects

Last Monday CARET, University of Cambridge hosted a joint workshop for the current JISC Capital Programme Assessment projects. The day provided an opportunity for the projects to demonstrate how the tools they have been developing work together to provide the skeleton of a complete assessment system from authoring to delivery to storage. Participants were also encouraged to critically review progress to date and discuss future requirements for assessment tools.

Introducing the day, Steve Lay reminded delegates of some of the detail of the call under which the projects had been funded. This included a focus on “building and testing software tools, composite applications and/or implementing a data format and standards to a defined specification” – in this case QTI. The three funded projects have built directly on the outcomes of previous toolkit and demonstrator activities of the e-Framework.

The morning was given over to a demo from the three teams – from Kingston, Cambridge and Southampton Universities respectively – showing how their tools interoperate: a question was authored in AQuRAte, stored in Minibix, and finally delivered through ASDEL.
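To give a flavour of what actually passes between the three tools, here is a minimal QTI 2.1 multiple-choice item of the sort that AQuRAte authors and ASDEL renders – the question itself is invented, and parsing it here just shows the XML is well-formed:

    import xml.etree.ElementTree as ET

    ITEM = """<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
        identifier="demo001" title="Demo item" adaptive="false" timeDependent="false">
      <responseDeclaration identifier="RESPONSE" cardinality="single"
          baseType="identifier">
        <correctResponse><value>A</value></correctResponse>
      </responseDeclaration>
      <itemBody>
        <choiceInteraction responseIdentifier="RESPONSE" shuffle="false" maxChoices="1">
          <prompt>Which specification defines this item format?</prompt>
          <simpleChoice identifier="A">IMS QTI</simpleChoice>
          <simpleChoice identifier="B">SCORM</simpleChoice>
        </choiceInteraction>
      </itemBody>
      <responseProcessing template="http://www.imsglobal.org/question/qti_v2p1/rptemplates/match_correct"/>
    </assessmentItem>"""

    root = ET.fromstring(ITEM)
    print(root.get("identifier"))  # demo001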

Although the user interfaces still need a bit of work, the demo did clearly show that a standards-based approach does lead to interoperable systems, and that the shorter, more iterative development funding cycle introduced by JISC can actually work.

In the afternoon there were two breakout sessions: one dealing with the technical issues around developing and sustaining an open source community, the other looking at innovations in assessment. One message that came through from both sessions was the need for more detailed feedback on what approaches and technologies work in the real world – perhaps some kind of gap analysis between the tool-set we have just now and the needs of the user community, combined with more detailed use cases. I think that this approach would certainly help to roadmap future funding calls in the domain, as well as helping to inform actual practice.

From the techie side of the discussion there was a general feeling that there is still a lot of uncertainty about the development of an open source community. How/will/can the 80:20 rule of useful code be reversed? The JISC open source community is still relatively immature, and the motivations for being part of it are generally that developers are being paid to be part of it – not that it is the best option. There was a general feeling that more work is needed to help develop, extend and sustain the community, and that it is at quite a critical stage in its life-cycle. One suggestion to help with this was the need for a figurehead to lead the community – so if you fancy being Mr/Mrs QTI do let us know :-)

More notes from the day are available from the projects’ discussion list.

Assessment, Packaging – where, why and what is going on?

Steve Lay (CARET, University of Cambridge) hosted the joint Assessment and EC SIG meeting at the University of Cambridge last week. The day provided an opportunity to get an update on what is happening in the specification world, particularly in the content packaging and assessment areas, and to compare that with some real world implementations, including a key interest – IMS Common Cartridge.

Packaging and QTI are intrinsically linked – to share and move questions/items they need to be packaged, preferably in an interoperable format :-) However, despite recent developments in both the IMS QTI and CP specifications, there have been no public releases of either specification for well over a year, due to changes in the structure of IMS working groups. This is mainly down to the new requirement for at least two working implementations of a specification before public release. In terms of interoperability, general uptake and usability this does seem like a perfectly sensible change. But as ever, life is never quite that simple.
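In practice ‘packaged’ means zipping the item files up alongside an imsmanifest.xml that declares them as resources. A stripped-down manifest for a single QTI item might look like the one below – the identifiers are invented, and the resource type string is the one I believe is used for QTI 2.1 items:

    import zipfile

    MANIFEST = """<manifest xmlns="http://www.imsglobal.org/xsd/imscp_v1p1"
        identifier="MANIFEST-demo">
      <organizations/>
      <resources>
        <resource identifier="item-demo001" type="imsqti_item_xmlv2p1"
            href="demo001.xml">
          <file href="demo001.xml"/>
        </resource>
      </resources>
    </manifest>"""

    # Zip the manifest and the item into a minimal content package.
    with zipfile.ZipFile("qti_item_package.zip", "w") as zf:
        zf.writestr("imsmanifest.xml", MANIFEST)
        zf.writestr("demo001.xml", "<!-- a QTI item like the one shown earlier -->")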

IMS Common Cartridge has come along and turned into something of a flag-bearer for IMS. This has meant that an awful lot of effort from some of the ‘big’ (or perhaps ‘active’ would be more accurate) members of IMS has been concentrated on the development of CC rather than on pushing implementation of CP 1.2 or the latest version of QTI. A decision was taken early in the development of CC to use older, more widely implemented versions of specifications rather than the latest versions. (It should be noted that this looks likely to change as more demands are being made on CC that only the newer versions of the specs can meet.)

So, the day was also an opportunity to reflect on the current state of play with IMS and other specification bodies, and to discuss with the community which areas they feel are most important for CETIS to be engaging in. Profiling did surface as something that the JISC e-learning development community – particularly in the assessment domain – should be developing further.

In terms of specification updates, our host Steve Lay presented a brief history of QTI and future development plans, Adam Cooper (CETIS) gave a round-up from the IMS Quarterly meeting held the week before, and Wilbert Kraan (CETIS) gave a round-up of packaging developments, including non-IMS initiatives such as OAI-ORE and IEEE RAMLET. On the implementation side of things, Ross MacKenzie and Sarah Wood (OU) took us through their experiences of developing common cartridges for the OpenLearn project, and Niall Barr (NB Software) gave an overview of integrating QTI and Common Cartridge. There was also a very stimulating presentation from Linn van der Zanden (SQA) on a pilot project using wikis and blogs as assessment tools.

Presentations/slidecasts (including as much discussion as was audible) and MP3s are available from the wiki, so if you want to get up to speed on what is happening in the wonderful world of specifications – have a listen. There is also an excellent review of the day over on Rowin’s blog.

LETSI update

Alongside the AICC meetings last week in California, there was an ADL/AICC/LETSI Content Aggregation Workshop. Minutes from the meeting are available from the LETSI wiki. There seems to have been a fairly general discussion covering a range of packaging formats, from IMS CP to MPEG-21 and DITA.

As we have reported previously, ADL would like to see a transition to a community-driven version of SCORM, called core SCORM, by 2009/10. This meeting brought together some of the key players, although it looks like there was no official IMS representation. It does seem that things are still very much at the discussion stage, and there is still a way to go before consensus on which de jure standards core SCORM will include. There is another LETSI meeting in Korea in March, before the SC36 plenary meeting. One positive suggestion that appears at the end of the minutes is the development of a white paper with a clear conclusion or “call to action”. Until then it’s still difficult to see what impact this initiative will have.