JISC and MIT: comparing notes on ed tech

A few weeks ago I had an opportunity to join a conversation between JISC and MIT OEIT (http://oeit.mit.edu/) to exchange information about current initiatives and possible collaborations. The general themes of the conversation were openness and sustainability. There was an agreed sense that, currently, “Open is the new educational tech” (Vijay). The areas of strategic interest, competencies, and knowledge of open institutes are now central to much educational development. JISC’s work in many diverse areas has contributed to the growth of openness, both through the successive programmes of work connected to repositories (including the cultivation of developer happiness) and more recently through the JISC and HEA OER programme.

Vijay outlined some of the thinking MIT OEIT are doing around innovation and sustainability, where they fit in the innovation cycle, and the limiting dependencies of innovation. In a four-stage innovation cycle, MIT OEIT are mostly involved in the initial incubation/development phase and the early implementation phase. They’re not in the business of running services, but they need to ensure that their tools are designed and developed in ways which are compatible with sustainability. One key point in their analysis is that the limiting factor for innovation is not your organisational growth (whether the size of the project, design team, or facilities) but the growth of nascent surrounding communities in other parts of the value chain.

As a result, MIT have found that sustainability and embedding innovation aren’t just about more resources; they’re about basic design choices and community development. Openness and open working allow the seeding of the wider community from the outset and allow a project to develop competencies and design skills in that wider community. This resonates with some of the observations made by OSS Watch and Paul Walk. We then discussed the success of the Wookie widget work carried out by Scott Wilson (CETIS) and how that has successfully developed from a JISC project into an Apache Foundation incubator: http://incubator.apache.org/wookie/.

The conversation continued around the tech choices being made in the UKOER programme, noting the strength in the diversity of approaches and tools in use in the programme and the findings that appear to be emerging: there is no dominant software platform, and choices about support for standards are being driven, in part, by the software platforms rather than by a commitment to any particular standard. [I’ll blog more on this in January as the technical conversations with projects are completed.] We also noted upcoming work around RSS and deposit tools, both following on from the JISCRI deposit tools event and emerging from the UKOER programme [see Jorum’s discussion paper on RSS for ingest: http://blogs.cetis.org.uk/lmc/2009/12/09/oer-rss-and-jorumopen/].
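As a concrete illustration of the RSS-for-ingest idea, here is a minimal sketch (my own, not drawn from the programme or from Jorum) of harvesting an OER feed in Python using the feedparser library; the feed URL is hypothetical.

```python
import feedparser  # common Python library for parsing RSS/Atom feeds

# Hypothetical feed URL for illustration; a real ingest would use a
# project's actual OER feed.
FEED_URL = "http://example.ac.uk/oer/feed.rss"

feed = feedparser.parse(FEED_URL)

for entry in feed.entries:
    # The fields an ingest service might map into its own metadata.
    title = entry.get("title", "")
    link = entry.get("link", "")
    summary = entry.get("summary", "")
    # Enclosures are the usual way a feed points at the resource file itself.
    files = [enc.get("href") for enc in entry.get("enclosures", [])]
    print(title, link, files)
```

Even a sketch this small shows why platform diversity matters: what each platform puts in the title, summary, and enclosure fields varies, and that is exactly where ingest services have to make mapping decisions.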

Brandon then highlighted the SpokenMedia project (http://spokenmedia.mit.edu/), which is creating tools to automatically transcribe video of lectures, both to enable better search and to make materials more accessible and scannable. The tools achieve up to 60% base accuracy and are trainable up to 80% accuracy. MIT hope this will make lecture video significantly more browseable and are exploring the release of an API for this as an educational service.
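MIT were still exploring what such a service might look like, so, purely as an illustration, here is a hypothetical sketch of how a transcription API of this kind might be called; the endpoint, parameters, and response shape are all invented and are not SpokenMedia’s actual interface.

```python
import json
import urllib.request

# Entirely hypothetical endpoint: SpokenMedia had not published an API
# at the time of writing.
API_URL = "http://example.edu/transcription/api/jobs"

request_body = json.dumps({
    "media_url": "http://example.edu/lectures/physics-101.mp4",
    "language": "en",
    # A trained speaker model is the sort of thing that might lift base
    # accuracy (around 60%) towards the trained figure (around 80%).
    "speaker_model": "lecturer-1234",
}).encode("utf-8")

req = urllib.request.Request(
    API_URL,
    data=request_body,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as response:
    job = json.loads(response.read())
    print(job)  # e.g. a job id to poll for the finished transcript
```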

We then discussed some projects working in areas that support bringing research data into the curriculum. MIT have a series of projects in this area under the general name of STAR (http://web.mit.edu/star/) which provide suites of tools to use research data in the classroom. One successful implementation of this is STARBioGene, which allows biology students to use research tools and materials as part of the core curriculum. Some of the STAR tools are desktop applications and some are cloud-based; many have been made open source.
The wider uptake of the project has contributed to the development of communities outside MIT who are using these tools; as such it’s also an example of growing the wider uptake community outlined in their innovation cycle. One consideration this has raised about communities of use is that some of the visualisation tools require high-performance computing (even if only needed in small bursts). The trend toward computationally intensive science education may create other questions of access beyond the license.

Another interesting tool we discussed was the Folk Semantic tool from COSL at Utah State University (http://www.folksemantic.com/): on the one hand it’s another RSS aggregator for OERs; on the other, for users running Firefox and Greasemonkey, it’s a plugin that adds recommendations for OERs to any webpage (and runs off a single line of JavaScript).

MIT: M.S. Vijay Kumar & Brandon Muramatsu
JISC: David Flanders, John Robertson (CETIS)

Managing OERs: the problem of version control?

Proposal: those releasing OERs should not invest undue effort in attempting to maintain version control over copies of their material other than those they directly manage.

This post looks at one possible administrative or management challenge emerging from the technical side of working with Open Educational Resources. My response to this concern is (more than usual) opinion rather than advice, and it aims to provoke some debate.

There are plenty of reasons why version control of files is critical. These range from managing which version of a document you can safely delete to making sure you’re reading the right document or installing the most up-to-date patch. Good version control is a key part of content production, file management, and dissemination. Any repository, content management system, or other tool needs to clearly distinguish between current and older versions. Older versions may or may not be maintained (whether publicly or privately); in itself this creates a question of what those releasing resources should link to. At its simplest, version information about research papers is important to distinguish between pre-print and post-print. However, when a paper is published that usually represents a final version (and not many repositories are [currently] likely to make multiple versions of an article public).

Educational resources, on the other hand, are usually considered less finished, in that even once they are used for teaching they [in theory] evolve year by year to reflect feedback, changes in course content, and developments in teaching style. These iterative versions may often blur into each other, as in the lecturer’s mind they are the notes for topic ‘x’ rather than discrete intellectual works. Unless a course or class is completely restructured, these assets are likely to be considered one entity. [There is a case to be made that these materials are perhaps in need of more rigorous versioning.] For academics who’ve engaged with the idea of Open Education, or who simply appreciate the visibility it offers, this may create a desire to update the materials they’ve released and replace them with new versions. Indeed, if they discover an error in their materials, or their thinking shifts, they may be insistent on trying to manage the available copies of their work.

Local repositories or services managing OERs will doubtless develop their own policies and practices to support or address this concern, and it makes sense to keep the available learning resources updated. Those policies and practices will likely diverge over whether older versions of materials are kept and/or made available to the public. The process gets more interesting, though, when we consider what interaction projects have with other services which hold copies of (rather than just link to) their resources: to what extent do you try to version secondary copies of your resources?

I think there are several factors that shape how OER producers should respond to this question:

  1. strictly speaking you have no legal right to request the removal or update of such resources. Once released under an open license the content is out of your control as long as the license is complied with. [Note: I am not a lawyer, but part of the entire point of most open licenses is that they are non-transactional and irrevocable.]
  2. most services or individuals that have taken copies of your resources are likely to be very happy to take updated copies as well.
  3. although most notification or harvesting technologies or standards can support (in some form) the deletion, creation, or updating of records [and this would potentially support pointing to a new version], they deal with metadata rather than allowing remote file management [AFAIK]; and even if they did support remote file management, few people are going to enable such a feature (see the sketch after this list).
  4. manually distributing updated copies is a possibility, but it is time-intensive and also relies on the policy, procedure, and practice of the third-party service.
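As an aside on point 3: here is a minimal sketch, assuming a repository exposing a standard OAI-PMH endpoint (the base URL is hypothetical), of how a harvester learns that a record has been deleted. Note that it is the metadata record that carries the deleted status; nothing here reaches into the files held by the remote service.

```python
import urllib.request
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
# Hypothetical OAI-PMH endpoint for illustration.
BASE_URL = "http://repository.example.ac.uk/oai"

url = BASE_URL + "?verb=ListIdentifiers&metadataPrefix=oai_dc"
with urllib.request.urlopen(url) as response:
    tree = ET.parse(response)

for header in tree.iter(OAI_NS + "header"):
    identifier = header.findtext(OAI_NS + "identifier")
    # OAI-PMH marks withdrawn items with status="deleted" on the header;
    # a harvester can then drop its copy of the metadata, but the protocol
    # offers no way to manage files on the remote service.
    if header.get("status") == "deleted":
        print("deleted:", identifier)
    else:
        print("current:", identifier)
```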

Considering these factors, I don’t think that under normal circumstances OER distributors should be concerned about how their materials are versioned once they leave their local service. This does imply that there may always be some degree of confusion, but I’d suggest that on the web there always is (even when concerted efforts are made to reduce it), and that responding to the confusion requires consumers of OERs to exercise the same information literacy skills they need when interacting with any online resource.

This said, I think there are steps OER producers can take to promote the visibility of their current resources. One would be to include a PURL or other appropriate URI pointing to the latest version in both the resource and its metadata where possible, whether as a cover page, a subtitle at the start of a video, or some other such mechanism. There is a compelling case that resources should include information about where they’ve come from, not only to promote the latest versions but also to note the resource provider; I’ve talked previously about this idea that resources should be self-descriptive. There may be limit cases in which there is a compelling case to try to remove every trace of a resource, but these are unlikely to be common.

Technical challenges for managing Open Educational Resources

At the CETIS conference this year, Lorna organised a roundtable session about technical issues facing projects engaging with Open Educational Resources; most of the attendees were drawn from the UKOER programme. Although there will doubtless be more refined versions of this list, I’ve created a first pass of the issues. The full list is on Slideshare: http://www.slideshare.net/RJohnRobertson/cetis09-oer-technical-roundtable. The tweets from the session are at http://twapperkeeper.com/Cetis09find/. The issues list is probably more usable via Slideshare, but to give an impression:

[Image: nested list of brainstormed issues]

Notes from the web: proposed NSF repository and Open Textbook Act

Just a few quick points of interest from the web about developments in the US which are worth noting:

A new subject repository and OA mandate?

“In addition to the $20 million grant announced today, the Libraries received a $300,000 grant from NSF to study the feasibility of developing, operating and sustaining an open access repository of articles from NSF-sponsored research. Libraries staff will work with colleagues from the Council on Library and Information Resources (CLIR), and the University of Michigan Libraries to explore the potential for the development of a repository (or set of repositories) similar to PubMedCentral, the open-access repository that features articles from NIH-sponsored research. This grant for the feasibility study will allow Choudhury’s group to evaluate how to integrate activities under the framework of the Data Conservancy and will result in a set of recommendations for NSF regarding an open access repository.” (http://releases.jhu.edu/2009/10/02/sheridan-libraries-awarded-20-million-grant/)

(via Glynn Moody and Open Education News http://openeducationnews.org/2009/10/06/nsf-considering-repository/).

This is great news! If, as a result of the feasibility study, all publications resulting from NSF funding need to have at least Green OA licences, this will significantly impact OA in the sciences.

This comes in the same week as US Senator Durbin has introduced a bill proposing funding for the creation and sustained development of open textbooks. As David Wiley notes, the bill also includes a clause relating to federally funded materials:

“In General- Notwithstanding any other provision of law, educational materials such as curricula and textbooks created through grants distributed by Federal agencies, including the National Science Foundation, for use in elementary, secondary, or postsecondary courses shall be licensed under an open license.” http://www.govtrack.us/congress/billtext.xpd?bill=s111-1714

(via David Wiley http://opencontent.org/blog/archives/1103 and Open Education News http://openeducationnews.org/2009/10/01/the-open-college-textbook-act/)

I’m not sure what the odds are of this bill being passed, but alongside all the federal data being released and made openly available there is a definite shift in US government policy towards a position that, if public money pays for something, the public should have as much access to it as is reasonable.

More than this, if educational resources produced in relation to NSF funding do end up having open licences, we’re going to see a lot more OERs and a lot more (US) institutions having to take a serious look at how they manage their learning materials.

Open Education: project or process and practice?

I’m new enough to the Open Education world that I can’t tell how unusual the closure of Utah State’s OpenCourseWare initiative is (http://chronicle.com/blogPost/Utah-State-Us-OpenCourseWare/7913/ and http://opencontent.org/blog/archives/967). It’s probably the largest OCW initiative in the US after MIT’s, and in a summary of its successes earlier this year the Utah State University online news noted that it gets 50,000 visitors a month (http://www.usu.edu/ust/index.cfm?article=34468). To all intents and purposes it has been a success, but it’s now a success that’s, at best, on hold.

At the risk of simplifying the undoubtedly difficult and complex issues behind this decision, it strikes me that for USU Open Education has been a project. A successful project, and one that has been good for the institution, but a project nevertheless. But in academia and in recessions projects are always vulnerable: they are, almost by definition, additional to an institution’s core business.

For Open Education and for Open Access, USU’s decision reinforces the need for sustainable approaches – a need to embed the work of making things open into the normal processes and practice of the institution.

I’m not suggesting that Open Education can be done without any additional cost to the institution, that it’s easy, or that all those involved in the initiative at USU didn’t do all that they could (and way more than I could). There are plenty of things that you have to do on top of normal practice and process before you can make content freely available, and higher costs if you want to maximise the benefits to your institution. Open Education, like Open Access, is not free to provide.

Unfortunately I suspect that most OER (and most OA) initiatives face the challenge that they have to start from the position that institutions aren’t actively managing their digital stuff and that staff aren’t always considering IPR when they create materials. I’m not sure how many institutions would be able to say that they have organised copies of all their lecture materials and research papers (irrespective of whether they want to be Open or not). An OER initiative that tries to do all that work for the institution, without the institution undergoing some changes to the basic processes of how it does education, runs the risk of remaining a project (irrespective of its funding source). I’d suggest that the sustainability of OER and OA initiatives is entirely dependent on whether institutions choose to manage their digital stuff. [I also don’t know how much of an issue this was at USU.]

I’ll finish with three quotes drawn from the USU example that I take as a warning to any project:
“In the tradition of land grant universities, Utah State University OpenCourseWare assures that no individual who is prepared and who desires the opportunity to advance his or her education is turned away. USU OCW provides an unprecedented degree of free and open access to the knowledge and expertise of our faculty for the benefit of every citizen of the state of Utah and every person in the world. As we enter the 21st century, services like OpenCourseWare will enable land grant institutions to more fully accomplish their missions.” (Stan Albrecht, President, Utah State University on http://ocw.usu.edu/)

“The cause? The effort ran through grant money from the William and Flora Hewlett Foundation and $200,000 from the state Legislature. It needed $120,000 a year to keep going. But it failed to secure any more state or university money, Mr. Jensen said, despite being the third-most-visited Web site hosted by Utah State University.
“It’s just a bad timing issue,” Mr. Jensen told The Chronicle this morning. “The recession hit. People wanted to keep us up, but the economy was just such that we could not find money anywhere.””( “Utah State U.’s OpenCourseWare Closes Because of Budget Woes”, Marc Parry, The Chronicle of Higher Education http://chronicle.com/blogPost/Utah-State-Us-OpenCourseWare/7913/)

“2009-10 AUTHORIZED BASE BUDGET [of $] 226,327,800” (http://www.usu.edu/budget/documents/legislature/2009/budget%20summary_fy2009%20leg.pdf)

Sadly this suggests to me that for USU, Openness was a project.