MOOCs and Open Education Timeline (updated!)

This revised version of the evolution of MOOCs was developed for our paper ‘Partnership Model for Entrepreneurial Innovation in Open Online’, now published in eLearning Papers. Three years after the initial MOOC hype, and in line with our previous analysis, we looked at possible trends and the influence of MOOCs on the HE system in the contexts of face-to-face teaching, open education, online distance learning, and possible business initiatives in education and training. We expanded the diagram to cover 2012 to 2015 and explored some key ideas and trends around the following aspects:
  1. Open licence: Most MOOC content is not openly licensed, so it cannot be reused in different contexts. There are, however, a few examples of institutions using Creative Commons licences for their courses, meaning the content can be taken and re-used elsewhere. In addition, there is a trend for MOOCs to be made available ‘on demand’ after the course has finished, at which point they in effect become another source of openly available online content. Such OERs and online content can be used to develop blended learning courses or to support a flipped classroom approach in face-to-face teaching.
  2. Online learning pedagogy: New pedagogical experiments in online distance learning can be identified in addition to the c/xMOOC, with variants including SPOCs (Small Private Online Courses), DOCCs (Distributed Open Collaborative Courses) and SOOCs (Social Online Open Courses or Small Open Online Courses). It is likely that these will evolve to more closely resemble regular online courses with flexible learning pathways, providing a range of paid-for services, including learning support on demand, qualitative feedback on assignments, and certification and credits (Yuan and Powell 2014).
  3. New educational provision: The disruptive effect of MOOCs will be felt most significantly in the development of new forms of provision that go beyond the traditional HE market. For example, commercial MOOC providers such as Udacity and Coursera have moved into professional and corporate training, broadening their offerings to appeal to employers (Chafkin, 2013). In an HE context, platforms are creating space for exam-based credit and competency-based programmes, which will enable commercial online learning providers to produce a variety of convenient, customisable and targeted programmes for the emergent needs of the job market, backed by awards from recognised institutions.
  4. Add-on services: The development of online courses is an evolving model, with the market re-working itself to deliver a broader range of services at different price levels to different types of student. There is great potential for add-on content services and the creation of new revenue models through partnerships with institutions and other educational service providers. As these trends continue to unfold, we can expect to see even more entrepreneurial innovation and change in the online learning landscape.

Subject coding is changing from JACS3 to HECoS; here’s what’s different

From UCAS applications to HESA returns, and from league tables to the academic technology approval scheme, degree programmes and modules are classified by subject. JACS3 does that job now, but HECoS will do it in the future. Here are the main differences.

After many years of use, the Joint Academic Coding System (JACS) that’s pervasive in UK Higher Education data sets ran into some limits: it was running out of codes in some subject areas, and it was being used for many more purposes than it was originally designed to support. That’s why the Higher Education Data and Information Improvement Programme (HEDIIP) commissioned CETIS, in collaboration with APS and Aspire, to consult with the sector on a replacement for the vocabulary. The result of that work is the Higher Education Coding of Subjects (HECoS) vocabulary. HECoS has now reached the penultimate stage: a release candidate is out for consultation, as are proposals for the governance and adoption of the scheme. The whole vocabulary can be seen on our tematres development site, and reports on the development of HECoS, as well as the proposals for governance and adoption, are available from the consultation site. Here, though, are the main differences between JACS3 and HECoS in a nutshell.

One flat list, no hierarchies, and no memorable codes

This is easily the biggest and most noticeable change. HECoS itself is just a list of terms without any implied or given groupings. That doesn’t mean groupings and hierarchies aren’t important; quite the contrary: different organisations have different uses for subject information, and that means they can group subjects differently. In a way, that follows on from what’s already happening with JACS3 in practice. The definition of which subjects constitute biological sciences, for example, already differs between JACS3, HEFCE and what a typical university is likely to be able to offer. Different drivers and different contexts lead these organisations to group subjects differently, and HECoS is designed to enable different groupings to exist side by side, whilst still sharing the same subject terms.

HECoS with many hierarchies

A consequence of the approach is that the familiar JACS3 codes (“L3xx” is anything sociological, etc.) are no longer valid. From the perspective of HECoS, “sociolinguistics” therefore has no defined link with “sociology”, which is why the code for the former is “101016” (or a URI that encodes that number, such as http://hecos.hediip.ac.uk/terms/101016) and the code for the latter is “100505”. For ease of navigation, however, HECoS will come with some common groupings: there is a “sociology group” that has both “sociolinguistics” and “sociology” in it. This is just to help people find terms; nodes like “sociology group” cannot be used to classify a degree programme or module.

Terms are based on demonstrated use, need and distinguishability

While JACS was reviewed periodically, it hasn’t always had formal acceptance criteria, either for the terms that were already in there or for newly proposed ones. HECoS does have a proposed set of criteria, which has already been applied in the development of the current draft. The criteria for the first cut were, in short:
  1. is the term in JACS3?
  2. is there evidence of use of the term in HESA data returns?
  3. is the term’s definition and scope sufficiently clear and comprehensive to allow classification?
  4. is the term reliably distinguishable from other terms?
The first criterion comes out of a recognition that JACS has imposed a structure and created its own reality over the years. That’s a good thing, and worth preserving for time series analysis reasons alone. The second criterion addresses an issue that has bedevilled JACS for a while: many terms were sound in theory, but barely or never used in practice. This creates confusion and often makes coding unreliable: what good is a term if it groups one degree programme in one institution? For that reason, we looked at whether a term covers at least two degree programmes in at least two institutions in HESA student data returns. The third criterion has to do with the way some JACS terms were defined: some were incomplete (e.g. “history by topic” without specifying what that topic was) or not sufficiently well defined to determine what was in or out. The final criterion of distinguishability is related to that: we examined the HESA returns for consistency of coding. If the spread of similar degree programmes over several terms indicated that people were struggling to distinguish between terms, we rearranged the terms so that they follow the groupings that were evident in the data as closely as possible. We’ve also started to test any such changes with sorting exercises to ensure that people can indeed distinguish between four related terms.

A commonly administered change process

Just like JACS evolved over the years, so will HECoS. The difference is that we are proposing to regularise the change and allow it to follow a predictable path. The main mechanism for that would be a registry for new terms. The diagram outlines how a new subject term can be entered for consideration for inclusion, and then discovered by others.

newTermProcess

The proposed criteria for accepting a new term into HECoS proper are similar to the ones used for the first draft: a term has to be demonstrably in use, or fill a need, and be distinguishable by non-specialists. In each case, though, the HECoS governance body, which is designed to represent the whole sector, will have the ultimate say on which terms will be accepted or retired, and how often these changes will happen.
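To make the flat-list-plus-groupings design described above concrete, here is a minimal sketch in Python. The two term codes, the “sociology group” and the URI pattern come from the examples above; the data structures and function names are illustrative assumptions, not an official HECoS format.

# Minimal sketch: a flat subject vocabulary with groupings kept separate.
# The codes and labels are the real examples from the post.
TERMS = {
    "100505": "sociology",
    "101016": "sociolinguistics",
}

# Groupings live outside the vocabulary itself, so different organisations
# can maintain different groupings over the same flat list of terms.
# Grouping nodes cannot be used to classify a programme or module.
NAVIGATION_GROUPS = {
    "sociology group": {"100505", "101016"},
}

def term_uri(code: str) -> str:
    """Return the URI form of a term code, as used in the release candidate."""
    return f"http://hecos.hediip.ac.uk/terms/{code}"

print(term_uri("101016"))  # -> http://hecos.hediip.ac.uk/terms/101016

The point of keeping NAVIGATION_GROUPS separate from TERMS is exactly the one made above: any number of groupings can coexist side by side over the same flat vocabulary.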

A simpler sourcing maturity assessment approach

Knowing how to procure your IT services, software and hardware is a vital function in any organisation. Assessing an organisation’s maturity in this area can be complex, which is why SURF developed a simpler approach.

There are a number of perspectives to take on IT and its place in an organisation, but for further and higher education institutions, the procurement or sourcing of services – in the widest sense of the word ‘services’ – may be among the most important ones. With the ongoing move to cloud provisioning, determining where a particular service is going to come from and how it is managed is crucial.

Relationship Management: Be transparent and sincere

Following on from my previous post (Relationship Management: Communicate, communicate, communicate), which was based on the Compendium of Good Practice in Relationship Management in Higher and Further Education written by myself and Lou McGill, this post will focus on culture change. We’ve already stated the importance of communication, which is the glue that binds the various stakeholders together. In this post, we’ll look at the role of the institution and its management in relationship management with regard to culture change.

“The project, as a change management initiative, has contributed to the University [of Southampton's] understanding of its institutional context. Opening up our data silos is more political and cultural than technical, and these domains are starting to change. There is little concrete evidence of the fruits of the change yet, but the change process has begun… We have been able to make extensive preparation for change, and there is commitment within the University to continue with it.” (Moore, I. and Paull, A. (2012). JISC Relationship Management Programme – Impact Analysis: Strands 2 and 3. Not publicly available.)

Taking an institution-wide approach to relationship management presents opportunities to identify where existing cultural approaches and practices may be ineffective. Sometimes the introduction of a new software system can highlight areas where cultural change needs to occur. It can show where current procedures inhibit agility, or where collaboration and innovation initiatives are not working. Introducing new software often acts as a catalyst for change in policies, practice and culture, whilst improving access to data can encourage the organisational culture to be more innovative and transparent. Changing an organisation’s culture is not without its problems:

“For context we would note that the staff and student population of an average university is equivalent to that of a small town (and the largest universities to small cities). Planning for change on this scale is not easy.” (Moore, I. and Paull, A. (2012). JISC Relationship Management Programme – Impact Analysis: Strands 2 and 3. Not publicly available.)

Cultural change comes with a myriad of challenges and is probably one of the hardest aspects of relationship management to address. For example:

  • staff may view changes in processes and the introduction of new software systems as threats to their working practices; for example, at Loughborough University, some staff who considered their own processes to be fit for purpose were concerned about proposed changes
  • there may be concerns around budget reductions
  • staff turnover may result from the changes

Champions can help drive change. At the University of Nottingham, for example, senior management is encouraged to champion good practice for placements, with the placement co-ordinator acting as the central conduit for relationships and communication. Senior management buy-in or sponsorship can help to raise the importance of relationship management within the institution, but it must be sincere, otherwise an institution’s organisational structure will remain a barrier no matter what improvements are suggested:

“The process of change needs to be managed with care to ensure that all stakeholders are positively engaged, especially those who have the power to implement the change (primary stakeholders), and those who have influence over opinion within the organization. Hence it is essential to carry out a full stakeholder analysis. As with any change management, when it comes to implementing the change it is important to identify champions in each of the stakeholder groups, coupled with clear and regular communication.” (Davis, H., Howard, Y., and Prince, R. (2012). Ninjas and Dragons. University of Southampton)

Consultation with a wide range of departments and stakeholders can also help to identify new champions. For example, new enthusiasts at the University of Nottingham were instrumental in spreading the word about placements and sources of expertise. As a result, existing good practice (for example from the School of Veterinary Medicine) has now been incorporated into the placements process and at least five academic schools in the University have expressed interest in using ePortfolios to support placements or work-based activity.

The co-creation aspects of the service design approach can help to improve staff buy-in, because it empowers staff to take ownership of any process improvements with a good chance of long-term impact. Taking this approach and talking to people on their own terms may also win over ‘difficult’ institutional characters, thereby enabling ‘change by stealth’. Sometimes, it is necessary to establish new organisational structures to facilitate change and create new staff roles to reflect changing priorities. Communication is vital for promoting an understanding of what people are doing and why.

Change must be managed carefully to ensure that all stakeholders are engaged, especially those who have power or influence in the institution. For example, rather than imposing wholesale change across the whole institution, the University of Nottingham has taken a ‘hub and spoke’ approach in which new developments are conceived centrally and delivered locally. The primary focus is on the spokes rather than the hub; it is the spokes that start to establish change across the institution. Similarly, encouraging staff to make bite-sized changes that do not take them away from day-to-day operations can reduce resentment of any new methods of working.

Changing the mindset of staff can have a huge impact, even if significant changes to processes are still to be made. For example, instead of just providing advice and guidance to students thinking of leaving, staff at the University of Derby now proactively reach out to students who wish to withdraw. This helps the student, who may not be able to articulate their reasons for withdrawal and who may just need additional support. It also provides the institution with useful feedback for making further improvements.

How to approach culture change

  • Establish champions to drive through changes
  • Senior management buy-in or sponsorship must be sincere
  • Talk to people on their own terms
  • Communicate, communicate, communicate
  • Use co-creation to encourage staff to take ownership of process improvements
  • Aim for small-scale rather than large-scale changes


Doing analytics with open source linked data tools

Like most places, the University of Bolton keeps its data in many stores. That’s inevitable with multiple systems, but it makes getting a complete picture of courses and students difficult. We test an approach that promises to integrate all this data, and some more, quickly and cheaply.

Integrating a load of data in a specialised tool or data warehouse is not new, and many institutions have been using such tools for a while. What Bolton is trying in its JISC-sponsored course data project is to see whether such a warehouse can be built out of Linked Data components. Using such tools promises three major advantages over existing data warehouse technology:

It expects data to be messy, and it expects it to change. As a consequence, adding new data sources, or coping with changes in data sources, or generating new reports or queries should not be a big deal. There are no schemas to break, so no major re-engineering required.

It is built on the same technology as the emergent web of data, which means that increasing numbers of datasets – particularly from the UK government – should be easy to throw into the mix to answer bigger questions, and public excerpts from Bolton’s data should be easy to contribute back.

It is standards based. At every step, from extracting, transforming and loading the data to querying, analysing and visualising it, there’s a choice of open and closed source tools. If one turns out not to be up to the job, we should be able to slot another in.

We did, though, spend a day kicking the tyres and making some initial choices. Since the project is just to pilot a Linked Enterprise Data (LED) approach, we’ve limited ourselves to evaluating only open source tools. We know there are plenty of good closed source options in all of the following areas, but we’re going to test the whole approach before committing to licence fees.

Data sources

Before we can mash, query and visualise, we need to do some data extraction from the sources, and we’ve come down on two tools for that: Google Refine and D2RQ. They do slightly different jobs.

Refine is Google’s power tool for anyone who has to deal with malformed data, or who just wants to transform or excerpt it from one format to another. It takes in CSV or output from a range of APIs, and puts it in table form. In that table form, you can perform a wide range of transformations on the data, and then export it in a range of formats. The RDF plug-in from DERI Galway allows you to specify exactly how the RDF – the linked data format at the heart of the approach – should look when exported.

What Refine doesn’t really do (yet?) is transform data automatically, as a piece of middleware. All your operations are saved as a script that can be re-applied, but it won’t re-apply the operations entirely automagically. D2RQ does do that, and works more like middleware.

Although I’ve known D2RQ for a couple of years, it still looks like magic to me: you download it, unzip it, and tell it where your common or garden relational database is, and what username and password it can use to get in. It’ll go off, inspect the contents of the database, and come back with a mapping of the contents to RDF. Then start the server that comes with it, and the relational database can be browsed and queried like any other Linked Data source.

Since practically all relevant data in Bolton are in a range of relational databases, we’re expecting to use D2R to create RDF data dumps that will be imported into the data warehouse via a script. For a quick start, though, we’ve already made some transforms with Refine. We might also use scripts such as Oxford’s XCRI XML to RDF transform.
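As a sketch of what those scripted dumps might look like, the following Python wrapper calls D2RQ’s dump-rdf tool once per mapping file and writes N-Triples dumps ready for loading. The paths, the one-mapping-per-database layout and the loading step are illustrative assumptions; check the flags against your D2RQ version.

import subprocess
from pathlib import Path

# Assumed locations; adjust to the actual install and project layout.
D2RQ_HOME = Path("/opt/d2rq")     # where D2RQ was unzipped
MAPPING_DIR = Path("mappings")    # one D2RQ mapping file per database
DUMP_DIR = Path("dumps")

DUMP_DIR.mkdir(exist_ok=True)
for mapping in MAPPING_DIR.glob("*.ttl"):
    out_file = DUMP_DIR / (mapping.stem + ".nt")
    # dump-rdf ships with D2RQ; -m names the mapping file,
    # -f the serialisation and -o the output file.
    subprocess.run(
        [str(D2RQ_HOME / "dump-rdf"),
         "-m", str(mapping),
         "-f", "N-TRIPLE",
         "-o", str(out_file)],
        check=True,
    )
    # Loading each dump into the warehouse's triplestore would follow
    # here, e.g. via the store's bulk loader or HTTP endpoint.

Run from cron or a similar scheduler, a script along these lines would give the timed extract-and-load process the post describes.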

Storage, querying and visualisation

We expected to pick different tools for each of these functions, but ended up choosing one that does it all, after a fashion. Callimachus is designed specifically for rapid development of LED applications, and the standard download includes a version of the Sesame triplestore (or RDF database) for storage. Other triplestores can also be used with Callimachus, but Sesame was on the list anyway, so we’ll see how far that takes us.

Callimachus itself is more of a web application on top that allows quick visualisations of data excerpts, be they straight records of one dataset or a collection of data about one thing from multiple sets. The queries that power the Callimachus visualisations have limitations compared to the full power of SPARQL, the linked data query language, but are good enough to knock up some pages quickly. For the more involved visualisations, Callimachus’s SPARQL 1.1 implementation allows the results of a query to be put out as common or garden JSON, for which many different tools exist.
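For a flavour of that “JSON out” route, the sketch below queries a SPARQL endpoint with the SPARQLWrapper Python library and walks the JSON results. The endpoint URL and the query are illustrative assumptions, not the project’s actual setup.

from SPARQLWrapper import SPARQLWrapper, JSON

# Assumed endpoint URL; a Callimachus/Sesame installation would expose
# something similar.
sparql = SPARQLWrapper("http://localhost:8080/sparql")
sparql.setQuery("""
    SELECT ?course ?title
    WHERE {
        ?course <http://purl.org/dc/terms/title> ?title .
    }
    LIMIT 10
""")
sparql.setReturnFormat(JSON)

# The standard SPARQL 1.1 JSON results layout: bindings keyed by variable.
results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["course"]["value"], "-", binding["title"]["value"])

Because the results are plain JSON, anything from a charting library to a spreadsheet import can sit on the other end.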

Next steps

We’ve made some templates already that pull together course information from a variety of sources, on which I’ll report later. While that’s going on, the main other task will be to set up the processes of extracting data from the relational databases using D2R, and then loading it into Callimachus using timed scripts.

Approaches to building interoperability and their pros and cons

System A needs to talk to System B. Standards are the ideal to achieve that, but pragmatics often dictate otherwise. Let’s have a look at what approaches there are, and their pros and cons.

When I looked at the general area of interoperability a while ago, I observed that useful technology becomes ubiquitous and predictable enough over time for the interoperability problem to go away. The route to such commodification is largely down to which party – vendors, customers, domain representatives – is most powerful and what their interests are. That describes the process very nicely, but doesn’t help solve the problem of connecting stuff now.

So I thought I’d try to list what the choices are, and what their main pros and cons are:

A priori, global
Also known as de jure standardisation. Experts, user representatives and possibly vendor representatives get together to codify all or part of a service interface between systems that are emerging or don’t exist yet; this can concern the syntax, semantics or transport of data. Intended to facilitate the building of innovative systems.
Pros:

  • Has the potential to save a lot of money and time in systems development
  • Facilitates easy, cheap integration
  • Facilitates structured management of network over time

Cons:

  • Viability depends on the business model of all relevant vendors
  • Fairly unlikely to fit either actually available data or integration needs very well

A priori, local
i.e. some type of Service Oriented Architecture (SOA). Local experts design an architecture that codifies syntax, semantics and operations into services. Usually built into agents that connect to each other via an enterprise service bus (ESB).
Pros:

  • Can be tuned for locally available data and to meet local needs
  • Facilitates structured management of network over time
  • Speeds up changes in the network (relative to ad hoc, local)

Cons:

  • Requires major and continuous governance effort
  • Requires upfront investment
  • Integration of a new system still takes time and effort

Ad hoc, local
Custom integration of whatever is on an institution’s network by the institution’s experts in order to solve a pressing problem. Usually built on top of existing systems using whichever technology is to hand.
Pros:

  • Solves the problem owner’s problem fastest in the here and now
  • Results accurately reflect the data that is actually there, and the solutions that are really needed

Cons:

  • Non-transferable beyond local network
  • Needs to be redone every time something changes on the local network (considerable friction and cost for new integrations)
  • Can create hard to manage complexity

Ad hoc, global
Custom integration between two separate systems, done by one or both vendors. Usually built as a separate feature or piece of software on top of an existing system.
Pros:

  • Fast point-to-point integration
  • Reasonable to expect upgrades for future changes

Cons:

  • Depends on business relations between vendors
  • Increases vendor lock-in
  • Can create hard to manage complexity locally
  • May not meet all needs, particularly cross-system BI

Post hoc, global
Also known as standardisation, consortium style. Service provider and consumer vendors get together to codify a whole service interface between existing systems: syntax, semantics and transport. The resulting specs usually get built into systems.
Pros:

  • Facilitates easy, cheap integration
  • Facilitates structured management of network over time

Cons:

  • Takes a long time to start, and is slow to adapt
  • Depends on business model of all relevant vendors
  • Liable to fit either available data or integration needs poorly

Clearly, no approach offers instant nirvana, but it does make me wonder whether there are ways of combining approaches such that we can connect short term gain with long term goals. I suspect that if we could close-couple what we learn from ad hoc, local integration solutions to the design of post hoc, global solutions, we could improve both approaches.

Let me know if I missed anything!

Online Coursework Management Evaluation

The University of Exeter has developed an entirely online end-to-end coursework management system which is the subject of the Online Coursework Management Evaluation (OCME) project funded by JISC as part of the Assessment and Feedback programme Strand B.

This system sees the integration of Moodle and Turnitin within the university’s Exeter Learning Environment (ELE). Assignments are submitted through the ELE, assigned an originality score by Turnitin, and then made available for marking through GradeMark (a commercial online marking system within Turnitin) or MS Word markup. Feedback is returned to students either via uploaded forms or bespoke feedback forms, and is made available for viewing by both individual students and the personal tutor assigned to support them. Initially deployed through a small 2011 pilot project funded by HEFCE, the system is now available institution-wide, although for practical reasons this evaluation project will concentrate on working with smaller groups across various disciplines.

Exeter’s Moodle support is provided by the University of London Computer Centre, who are developing the interface between Moodle and Turnitin. There is strong internal support for the system, which will be maintained and further developed well beyond the lifetime of this one year project. What the OCME project will provide is a series of reports and briefing papers exploring the pedagogic, technological and institutional aspects of transforming practice, plus guidelines for future implementers and for those considering introducing such transformative technologies within their own institutions. The experiences and lessons learned from this project should be of value across the sector.

Evaluating the Benefits of Electronic Assessment Management

Examining the embedding of electronic assessment management (EAM) within both administrative and teaching and learning practice is the main focus of the Evaluating the Benefits of Electronic Assessment Management (EBEAM) project running at the University of Huddersfield as part of the JISC Assessment and Feedback programme Strand B. This 18 month project will look at how Turnitin, incorporating GradeMark and eRater, addresses student, staff and institutional requirements for timely, individualised and focused feedback; reduced staff workloads and increased reflection on practice; and cost-effective, scalable and sustainable innovation.

The dual focus on administrative and pedagogic aspects is crucial for real uptake of any new technology or process. By providing a supportive administrative and technological infrastructure, institutions can enable academic staff to fully realise the benefits of innovative systems and practice, and provide a significantly enhanced learning environment for students. The dynamic interplay of these factors is vividly illustrated in the poster the project submitted for the programme kick-off meeting. The impact on student satisfaction, achievement and retention rates already apparent at Huddersfield reflects the success of such an approach.

Like the Evaluation of Assessment Diaries and GradeMark at the University of Glamorgan project, EBEAM is grounded in previous evaluation work investigating the benefits of Turnitin on staff and students.  As with other projects, the decision to adopt existing technologies incorporated through the institutional VLE (in this case, Blackboard) is a pragmatic choice, adopting known and proven technology rather than expending time and resources in developing yet more tools to do the same things.  Being able to pick up such tools as needed greatly increases institutional agility, and provides ready access to existing user groups and a wealth of shared practice.

EBEAM project staff also have a keen awareness of the need for meaningful and effective staff development to enable teaching staff to make full use of new technologies and achieve the integration of new approaches within their teaching practice, a theme covered in several posts on their excellent project blog.  The project will produce a wide range of development materials, including practically-focused toolkits, webinars and screencasts, which will be available through the project site and the JISC Design Studio.  In addition, they’re looking at ways of fully exploiting the extensive amount of data generated by these EAM systems to further enhance teaching and learning support as well as engaging administrative departments in discussions on topics such as data warehousing and change management.

The EBEAM project should provide an excellent study of the benefits of e-assessment and of methods of integration that take a holistic approach to institutions and stakeholders. I’m very much looking forward to seeing the outcomes of their work.

Business Adopts Archi Modelling Tool

Many technologies and tools in use in universities and colleges are not developed for educational settings. In the classroom particularly, teachers have become skilled at applying new technologies such as Twitter to educational tasks. But technology also plays a crucial role behind the scenes in any educational organisation in supporting and managing learning, and like classroom tools these technologies are not always developed with education in mind. So it is refreshing to find an example of an application developed for UK Higher and Further education being adopted by the commercial sector.

Archi is an open source ArchiMate modelling tool developed as part of JISC’s Flexible Service Delivery programme to help educational institutions take their first steps in enterprise architecture modelling. ArchiMate is a modelling language hosted by the Open Group, who describe it as “a common language for describing the construction and operation of business processes, organizational structures, information flows, IT systems, and technical infrastructure”. Archi enforces all the rules of ArchiMate so that the only relationships that can be established are those allowed by the language.
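As an illustration of the kind of rule enforcement described above, here is a toy validity check in Python. The element and relationship names are a tiny made-up excerpt for the sketch, not the real ArchiMate relationship matrix that Archi implements.

# Toy excerpt of an "allowed relationships" table; the real ArchiMate
# matrix is much larger and is what Archi actually enforces.
ALLOWED = {
    ("BusinessActor", "BusinessRole"): {"assignment", "association"},
    ("BusinessRole", "BusinessProcess"): {"assignment", "association"},
}

def can_connect(source: str, target: str, relationship: str) -> bool:
    """Return True if this relationship is permitted between these types."""
    return relationship in ALLOWED.get((source, target), set())

# In the editor, a disallowed connection is refused with an explanation.
print(can_connect("BusinessActor", "BusinessRole", "assignment"))   # True
print(can_connect("BusinessActor", "BusinessProcess", "flow"))      # False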

Since the release of version 1.0 in June 2010, Archi has built up a large user base and now gets in excess of 1,000 downloads per month. Of course, universities and colleges are not the only organisations that need a better understanding of their internal business processes, so we spoke to Phil Beauvoir, Archi developer at JISC CETIS, about the tool and why it has a growing number of users in the commercial world.

Christina Smart (CS): Can you start by giving us a bit of background about Archi and why was it developed?

Phil Beauvoir (PB): In the summer of 2009, Adam Cooper asked whether I was interested in developing an ArchiMate modelling tool. Some of the original JISC Flexible Service Delivery projects had started to look at their institutional enterprise architectures and wanted to start modelling. Some projects had invested in proprietary tools, such as BiZZdesign’s Architect, and it was felt that it would be a good idea to provide an open source alternative. Alex Hawker (the FSD programme manager) decided to invest six months of funding to develop a proof-of-concept tool for modelling with the ArchiMate language. The tool would be aimed at the beginner, be open source and cross-platform, and have limited functionality. I started development on Archi in earnest in January 2010 and by April had the first alpha version, 0.7, ready. Version 1.0 was released in June 2010, and it grew from there.

CS: How would you describe Archi?
PB: The web site describes Archi as: “A free, open source, cross platform, desktop application that allows you to create and draw models using the ArchiMate language”. Users who can’t afford proprietary software would otherwise use standard drawing tools such as OmniGraffle or Visio for modelling. Archi is positioned somewhere between those drawing tools and a tool like BiZZdesign’s Architect. It doesn’t have all the functionality and enterprise features of the BiZZdesign tool, but it has more than just plain drawing tools. Archi also has hints, help and user assistance technology built into it, so when you’re drawing elements there are certain ArchiMate rules about which connections you can make; if you try to make a connection that’s not allowed, you get an explanation of why not. So for the beginner it is a great way to start understanding ArchiMate. We keep the explanations simple because we aim to make things easier for users who are beginners in ArchiMate. As the main developer I try to keep Archi simple, because there’s always a danger that you keep adding on features until the tool becomes unusable. I try to steer a course between usability and features.

Archi screenshot

Another aspect of Archi is the way it supports the modelling conversation. Modelling is not done in isolation; it’s about capturing a conversation between key stakeholders in an organisation. Archi allows you to sketch a model and take notes in a Sketch View before you add the ArchiMate enterprise modelling rules. A lot of people use the Sketch View. It enables a capture of a conversation, the “soft modelling” stage before undertaking “hard modelling”.

CS: How many people are using it within the Flexible Service Delivery programme?
PB: I’m not sure; I know the King’s College, Staffordshire and Liverpool John Moores projects were using it. Some of the FSD projects tended to use both Architect and Archi: if they already had a licence for BiZZdesign Architect, their main architect would carry on using it, whereas other “satellite” users in the institution would use Archi.

CS: Archi has a growing number of users outside education, who are they and how did they discover Archi?
PB: Well the first version was released in June 2010, and people in the FSD programme were using it. Then in July 2010 I got an email from a large Fortune 500 insurance company in the US, saying they really liked the tool and would consider sponsoring Archi if we implemented a new feature. I implemented the feature anyway and we’ve built up the relationship with them since then. I know that this company has in the region of 100 enterprise architects and they’ve rolled Archi out as their standard enterprise architecture modelling tool.

I am also aware of other commercial companies using it, but how did they discover it? Well, I think it’s been viral. A lot of businesses spend a lot of money advertising and pushing products, but the alternative strategy is pull, where customers come to you. Archi is of the pull variety: because there is a need out there, we haven’t had to do very much marketing; people seem to have found Archi on their own. Also, TOGAF (The Open Group Architecture Framework), developed by the Open Group, is becoming very popular, and I guess Archi is useful for people adopting TOGAF.

In 2010, BiZZdesign were, I think, concerned about Archi being a competitor in the modelling tool space. Now, however, they’re even considering offering training days on Archi, because Archi has become the de facto free enterprise modelling tool. Archi will never be a competitor to BiZZdesign’s Architect: they have lots of developers and there’s only me working on Archi, so it would be nuts to try to compete. So we will focus on the aspects of Archi that make it unique (the learning aspects, the focus on beginners and the ease of use) and clearly forge a path between the two sets of tools.

Many people will start with Archi and then upgrade to BiZZdesign’s Architect, so we’re working on that upgrade path now.

CS: Why do you think it is so popular with business users?
PB: I’m end-user driven: for me, Archi is about the experience of the end users, ensuring that the experience is first class and that it “just works”. It’s popular with business users firstly because it’s free, secondly because it works on all platforms, and thirdly because it’s aimed at those taking their first steps with ArchiMate.

CS: What is the immediate future for Archi?
PB: We’re seeking sponsorship deals and other models of sustainability, because obviously JISC can’t go on supporting it forever. One model of sustainability is to get Archi adopted by something like the Eclipse Foundation. But you have to be careful that development continues in those foundations, because there is a risk of a project becoming a software graveyard if you don’t have committers who are prepared to give their time. There is a vendor who has expressed an interest in collaborating with us to make sure that Archi has a future.

Lots of software companies now have service business models, where you provide the tool for free but charge for services on top of it. The Archi tool will always be free, and anyone could package it up and sell it. I know they’re doing that in China, because I’ve had emails from people doing it: they’ve translated it and are selling it, and that’s OK because that’s what the licence model allows.

In terms of development, we’re adding some new functionality. The concept of a Business Model Canvas, where you sketch out new business models, is becoming popular. The canvas is essentially a nine-box grid to which you add various key partners, stakeholders and so on. We’re adding a canvas construction kit to Archi, so people can design their own canvas for new business models. The canvas construction kit is aimed at the high-level discussions people have when they start modelling their organisations.

CS: You’ve developed a number of successful applications for the education sector over the years, including, Colloquia, Reload and ReCourse, how do you feel the long term future for Archi compares with those?
PB: Colloquia was the first tool I developed, back in 1998, and I don’t think it’s really used anymore. But Colloquia was more a proof of concept, demonstrating that you could create a learning environment around the conversational model, one which supported learning in a different way from the VLEs that were emerging at the time. Its longevity has been as a forerunner of social networking and of the concept of the Personal Learning Environment.

Reload was a set of tools for doing content packaging and SCORM. They’re not meant for teachers, but they’re still being used.

The ReCourse Learning Design tool was developed for a very niche audience: people developing scripted learning designs.

I think the long term future for Archi is better than those, partly because there’s a very large active community using it, and partly because it can be used by all enterprises and isn’t just a specific tool for the education sector. I think Archi has an exciting future.

User feedback
Phil has received some very positive feedback about Archi via email from JISC projects as well as those working in the commercial world.

JISC projects
“The feeling I get from Archi is that it’s helping me to create shapes, link and position them rather than jumping around dictating how I can work with it. And the models look much nicer too… I think Archi will allow people to investigate EA modelling cost free to see whether it works for them, something that’s not possible at the moment.”

“So why is Archi significant? It is an open source tool funded by JISC based on the ArchiMate language that achieves enough of the potential of a tool like BiZZdesign Architect to make it a good choice for relatively small enterprises, like the University of Bolton, to develop their modelling capacity without a significant software outlay.” Stephen Powell from the Co-educate project (JISC Curriculum Design Programme).

Commercial
“I’m new to EA world, but Archi 1.1 makes me fill like at home! So easy to use and so exciting…”

“Version 1.3 looks great! We are rolling Archi out to all our architects next week. The ones who have tried it so far all love it.”

Find Out More
If this interview has whetted your appetite, more information about Archi, and the newly released version 2.0 is available at http://archi.cetis.org.uk. For those in the north, there will be an opportunity to see Archi demonstrated at the forthcoming 2nd ArchiMate Modelling Bash being held in St Andrews on the 1st and 2nd November.

The cloud is for the boring

Members of the Strategic Technologies Group of the JISC’s FSD programme met at King’s Anatomy Theatre to, ahem, dissect the options for shared services and the cloud in HE.

The STG’s programme included updates on members’ projects, a preview of the synthesis of the Flexible Service Delivery programme of which the STG is a part, and a preview of the University Modernisation Fund programme that will start later in the year.

The main event, though, was a series of parallel discussions on business problems where shared services or cloud solutions could make a difference. The one I was at considered a case from the CUMULUS project: how to extend, rather than replace, a Student Record System in a modular way.

View from the King's anatomy theatre up to the clouds

In the event, a lot of the discussion revolved around which services could profitably be shared in some fashion. When the group looked at what is already being run on shared infrastructure and what has proven very difficult, the pattern was actually very simple: the more predictable, uniform, mature, well understood and inessential to the central business of research and education a service is, the better it suits sharing. The more variable, historically grown, institution-specific and bound up with the real or perceived mission of the institution or parts thereof, the worse.

Going round the table to sort the soporific cloudy sheep from the exciting, disputed, in-house goats, we came up with the following lists:

Cloud:

  • Email
  • Travel expenses
  • HR
  • Finance
  • Student network services
  • Telephone services
  • File storage
  • Infrastructure as a Service

In house:

  • Course and curriculum management (including modules etc)
  • Admissions process
  • Research processes

This ought not to be a surprise, of course: the point of shared services – whether in the cloud or anywhere else – is economies of scale. That means that the service needs to be the same everywhere, doesn’t change much or at all, doesn’t give the users a competitive advantage and has well understood and predictable interfaces.