JISC and MIT: comparing notes on ed tech

A few weeks ago I had an opportunity to join a conversation between JISC and MIT OEIT (http://oeit.mit.edu/) to exchange information about current initiatives and possible collaborations. The general themes of the conversation were openness and sustainability. There was an agreed sense that, currently, “Open is the new educational tech” (Vijay). The strategic interests, competencies, and knowledge of open institutes are now central to much educational development. JISC’s work in many diverse areas has contributed to the growth of openness, both through the successive programmes of work connected to repositories (including the cultivation of developer happiness) and, more recently, through the JISC and HEA OER programme.

Vijay outlined some of the thinking MIT OEIT are doing around innovation and sustainability, describing where they fit in the innovation cycle and the dependencies that limit innovation. In a four-stage innovation cycle, MIT OEIT are mostly involved in the initial incubation/development phase and the early implementation phase. They’re not in the business of running services, but they do need to ensure that their tools are designed and developed in ways which are congruous with sustainability. One key point in their analysis is that the limiting factor for innovation is not your organisational growth (whether the size of the project, design team, or facilities) but the growth of the nascent surrounding communities in other parts of the value chain.

As a result, MIT have found that sustaining and embedding innovation isn’t just about more resources; it’s about basic design choices and community development. Openness and open working allow the seeding of the wider community from the outset and allow a project to develop competencies and design skills in that wider community. This resonates with some of the observations made by OSS Watch and Paul Walk. We then discussed the success of the Wookie widget work carried out by Scott Wilson (CETIS) and how that has successfully developed from a JISC project into an Apache Foundation incubator project (http://incubator.apache.org/wookie/).

The conversation continued around the technology choices being made in the UKOER programme, noting the strength in the diversity of approaches and tools in use in the programme and the findings that appear to be emerging: there is no dominant software platform, and choices about support for standards are being driven, in part, by the software platforms rather than by a commitment to any particular standard. [I’ll blog more on this in January as the technical conversations with projects are completed.] We also noted upcoming work around RSS and deposit tools, both following on from the JISCRI deposit tools event and emerging from the UKOER programme [see Jorum’s discussion paper on RSS for ingest: http://blogs.cetis.org.uk/lmc/2009/12/09/oer-rss-and-jorumopen/].
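
To make the RSS-for-ingest idea a little more concrete, here’s a minimal sketch (not taken from Jorum or any UKOER project) of pulling basic item metadata out of an OER feed in the browser; the feed URL is a placeholder, and a real aggregator would also need to handle enclosures, Dublin Core extensions, deduplication and errors.

```typescript
// Minimal sketch: extract basic item metadata from an OER RSS feed.
// The feed URL used below is a placeholder, not a real Jorum/UKOER endpoint.
interface FeedItem {
  title: string;
  link: string;
  description: string;
}

async function fetchOerFeed(feedUrl: string): Promise<FeedItem[]> {
  const response = await fetch(feedUrl);
  const xml = await response.text();

  // DOMParser is available in the browser; a server-side aggregator
  // would use an XML parsing library instead.
  const doc = new DOMParser().parseFromString(xml, "application/xml");

  return Array.from(doc.querySelectorAll("item")).map((item) => ({
    title: item.querySelector("title")?.textContent ?? "",
    link: item.querySelector("link")?.textContent ?? "",
    description: item.querySelector("description")?.textContent ?? "",
  }));
}

// Usage (placeholder URL):
// fetchOerFeed("https://example.org/oer/feed.rss").then(console.log);
```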

Brandon then highlighted the SpokenMedia project (http://spokenmedia.mit.edu/), which is creating tools to automatically transcribe video of lectures, both to enable better search and to make materials more accessible and scannable. The tools achieve up to 60% base accuracy and are trainable up to 80% accuracy. MIT hope this will make lecture video significantly more browseable and are exploring the release of an API for this as an educational service.
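
No such API has been published yet, so the endpoint, parameters, and response shape below are invented for illustration only, but a transcription service of this kind would most likely be used along these lines: point it at a lecture recording and get back a time-coded transcript that makes the video searchable and scannable.

```typescript
// Hypothetical sketch only: SpokenMedia has not released an API, so this
// endpoint and response shape are invented to illustrate the idea.
interface TranscriptSegment {
  startSeconds: number; // where in the video this segment begins
  endSeconds: number;
  text: string;
}

async function requestTranscript(lectureVideoUrl: string): Promise<TranscriptSegment[]> {
  // Imagined endpoint for an "educational service" style transcription API.
  const response = await fetch("https://spokenmedia.example.org/api/transcribe", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ videoUrl: lectureVideoUrl, language: "en" }),
  });

  if (!response.ok) {
    throw new Error(`Transcription request failed: ${response.status}`);
  }
  // Time-coded segments are what would make lecture video browseable.
  return response.json() as Promise<TranscriptSegment[]>;
}
```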

We then discussed some projects working in areas that support bringing research data into the curriculum. MIT have a series of projects in this area under the general name of STAR (http://web.mit.edu/star/), which provide suites of tools for using research data in the classroom. One successful implementation of this is STARBioGene, which allows Biology students to use research tools and materials as part of the core curriculum. Some of the STAR tools are desktop applications and some are cloud-based; many have been made open source.
The wider uptake of the project has contributed to the development of communities outside MIT who are using these tools – as such, it’s also an example of growing the wider uptake community outlined in their innovation cycle. One consideration this has raised about communities of use is that some of the visualisation tools require high performance computing (even if only in small bursts). The trend towards computationally intensive science education may create other questions of access beyond the licence.

Another interesting tool that we discussed was the Folk Semantic tool from COSL at Utah State University (http://www.folksemantic.com/): on the one hand it’s another RSS aggregator for OERs; on the other, for users running Firefox and Greasemonkey, it’s a plugin that adds recommendations for OERs into any webpage (and runs off a single line of JavaScript).
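
I haven’t looked at Folk Semantic’s actual embed code, so the script URL and element id below are made up, but the general pattern behind this kind of “single line of JavaScript” recommendation widget is a user script (or page snippet) that injects one external script tag and lets it decorate the current page:

```typescript
// ==UserScript==
// @name     OER recommendations (illustrative sketch)
// @include  *
// ==/UserScript==
//
// Illustrative only: the script URL and container id are invented, not the
// real Folk Semantic embed.

(function injectRecommendations(): void {
  // A container the injected widget could render its recommendations into.
  const container = document.createElement("div");
  container.id = "oer-recommendations"; // hypothetical id
  document.body.appendChild(container);

  // The "single line of JavaScript" style embed: one external script tag.
  const script = document.createElement("script");
  script.src = "https://www.folksemantic.com/recommendations.js"; // placeholder path
  document.body.appendChild(script);
})();
```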

MIT: M.S. Vijay Kumar and Brandon Muramatsu; JISC: David Flanders and John Robertson (CETIS)

Notes from the web: Proposed NSF Repository and Open Textbook Act

Just a few quick points of interest from the web about developments in the US which are worth noting:

A new subject repository and OA mandate?

“In addition to the $20 million grant announced today, the Libraries received a $300,000 grant from NSF to study the feasibility of developing, operating and sustaining an open access repository of articles from NSF-sponsored research. Libraries staff will work with colleagues from the Council on Library and Information Resources (CLIR), and the University of Michigan Libraries to explore the potential for the development of a repository (or set of repositories) similar to PubMedCentral, the open-access repository that features articles from NIH-sponsored research. This grant for the feasibility study will allow Choudhury’s group to evaluate how to integrate activities under the framework of the Data Conservancy and will result in a set of recommendations for NSF regarding an open access repository.” (http://releases.jhu.edu/2009/10/02/sheridan-libraries-awarded-20-million-grant/)

(via Glyn Moody URL12124123431 and Open Education News http://openeducationnews.org/2009/10/06/nsf-considering-repository/).

This is great news! If, as a result of the feasibility study, all publications which result from NSF funding need to have, at the very least, Green OA licences – this will significantly impact OA in the sciences.

This comes in the same week as US Senator Durbin has introduced a bill proposing funding for the creation and sustained development of open textbooks. As David Wiley notes, the bill also includes a clause relating to federally funded materials:

“In General- Notwithstanding any other provision of law, educational materials such as curricula and textbooks created through grants distributed by Federal agencies, including the National Science Foundation, for use in elementary, secondary, or postsecondary courses shall be licensed under an open license.” http://www.govtrack.us/congress/billtext.xpd?bill=s111-1714

(via David Wiley http://opencontent.org/blog/archives/1103 and Open Education News http://openeducationnews.org/2009/10/01/the-open-college-textbook-act/)

I’m not sure what the odds are of this bill being passed, but alongside all the federal data being released and made openly available, there is a definite shift in US government policy towards the position that if public money pays for something, the public should have as much access to it as is reasonable.

More than this, if educational resources produced with NSF funding do end up carrying open licences, we’re going to see a lot more OERs and a lot more (US) institutions having to take a serious look at how they manage their learning materials.

Journals and the right choice of words

Erik Duval has just blogged about the first issue of IEEE Transactions on Learning Technologies.
His blog post is available at http://erikduval.wordpress.com/2008/10/01/ieee-transactions-on-learning-technologies/

I’m glad to see the launch of a new journal in the field of learning technology, and very glad to see that the content is going to be Open Access after a 12-month embargo during which it’s only available to subscribers. They’ve got a featured article that looks interesting: “Capture, Management, and Utilization of Lifecycle Information for Learning Resources” by Lasse Lehmann, Tomas Hildebrandt, Christoph Rensing, and Ralf Steinmetz (http://www.computer.org/portal/cms_docs_transactions/transactions/tlt/featured_article/featured.pdf).

<rant> I may just be grumpy because the train strike is making my life difficult, but there are two things about this journal (or, to be fair, IEEE’s journal system) that have already managed to frustrate me.

The first is that there is no publicly (non-subscriber) viewable table of contents – this may be a mix-up because it’s the first issue – but apart from the featured article, I have no idea what’s in this issue.

The second is that the journal provides access to PrePrints and RapidPosts:

‘PrePrints are papers accepted for publication in a future issue, but have not been fully edited. Their content may change prior to final publication. RapidPosts are articles that have been accepted for inclusion in a future issue. Content is final as presented, with the exception of pagination.’

Making ‘in press’ content available like this is great and adds value for subscribers – but please, find a different word! ‘PrePrint’ – as confusing as it may be as a term – has very strong associations and a lot of established use in Open Access repositories. When I see that word I expect to find something I can access, not a subscribers’ login screen.

[Image: IEEE login screen]

I know this choice of words has nothing to do with the editors, but frustrating potential readers isn’t going to incline me to return to the page. </rant>