Adam Cooper – Cetis Blogs


On the Question of Validity in Learning Analytics (17 October 2014)

The question of whether or not something works is a basic one to ask when investing time and money in changing practice and in using new technologies. For learning analytics, the moral dimension adds a certain imperative, although there is much that we do by tradition in teaching and learning in spite of questions about efficacy.

I believe the move to large-scale adoption of learning analytics, with the attendant rise in institution-level decisions, should motivate us to spend some time thinking about how concepts such as validity and reliability apply in this practical setting. The motivation is twofold: large-scale adoption has "upped the stakes", and non-experts are now involved in decision-making. This article is a brief look at some of the issues with where we are now, and at some of the potential pitfalls going forward.

There is, of course, a great deal of literature on the topics of validity (and reliability), and various disciplines have their own conceptualisations, sometimes with multiple kinds of validity. The Wikipedia disambiguation page for "validity" illustrates the variety, and the disambiguation page for "validation" adds further to it.

For the purpose of this article I would like to avoid choosing one of these technical definitions, because it is important to preserve some variety; the Oxford English Dictionary definition will be assumed: "the quality of being logically or factually sound; soundness or cogency". Before looking at some issues, it might be helpful to first clarify the distinction between "reliability" and "validity" in statistical parlance (diagram below).

Distinguishing between reliability and validity in statistics.
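To make the statistical distinction concrete, here is a minimal numeric sketch of my own (synthetic data, not from the original post): reliability is about consistency across repeated measurements, while validity is about whether the measurements actually hit the true value.

```python
import numpy as np

rng = np.random.default_rng(0)
true_score = 70.0  # the quantity we are trying to measure

# Reliable but not valid: repeated measurements agree closely with each
# other, yet are systematically offset from the true value (biased).
reliable_not_valid = rng.normal(loc=80.0, scale=1.0, size=100)

# Valid but not reliable: centred on the true value, but widely scattered.
valid_not_reliable = rng.normal(loc=70.0, scale=10.0, size=100)

for name, scores in [("reliable, not valid", reliable_not_valid),
                     ("valid, not reliable", valid_not_reliable)]:
    bias = scores.mean() - true_score   # low bias ~ validity
    spread = scores.std(ddof=1)         # low spread ~ reliability
    print(f"{name}: bias = {bias:+.1f}, spread = {spread:.1f}")
```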

Issues

Technical terminology may mislead

The distinction between reliability and validity in statistics leads us straight to the issue that terms may be used with very specific technical meanings that are not appreciated by a community of non-experts, who might be making decisions about what kind of learning analytics approach to adopt. This is particularly likely where terms with everyday meanings are used, but even when technical terms have no everyday counterparts, non-expert users will often employ them without recognising their misunderstanding.

Getting validity (and reliability) universally on the agenda

Taking a fairly hard-edged view of "validation", as applied to predictive models, a good start would be to see it universally adopted, following established best practice in statistics and data mining. The educational data mining research community is very hot on this topic, but the wider community of contributors to learning analytics scholarship is not always so focused. More than this, it should be on the agenda of the non-researcher to ask questions about the results and the method, and to understand whether "85% correct classification judged by stratified 10-fold cross-validation" is an appropriate answer or not.
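For readers who have not met that phrase, the following sketch (my own illustration, using scikit-learn and synthetic data; the "activity features" and completion labels are placeholders) shows roughly what such a claim involves: the quoted accuracy is an average over folds of data that were held out from model fitting, never a score on the training data itself.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Synthetic stand-in for activity features and a completion label,
# with roughly 80% of students completing.
X, y = make_classification(n_samples=500, n_features=10,
                           weights=[0.2, 0.8], random_state=42)

model = LogisticRegression(max_iter=1000)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)
scores = cross_val_score(model, X, y, cv=cv, scoring="accuracy")

# Each fold preserves the completion/non-completion ratio ("stratified"),
# and the model is always scored on data it was not fitted to.
print(f"mean accuracy over 10 stratified folds: {scores.mean():.1%}")
```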

Predictive accuracy is not enough

When predictive models are being described, it is common to hear statements such as "our model predicted student completion with 75% accuracy." Even assuming that this accuracy was obtained using best-practice methods, it glosses over two kinds of fact that we should seek in any prediction (or, more generally, "classification"), but which are too often neglected:

  • How does that figure compare to a random selection? If 80% completed, the prediction is little better than picking coloured balls from a bag: guessing at random in proportion to the base rate would get 0.8×0.8 + 0.2×0.2 = 68% of predictions correct. The "kappa statistic" gives a measure of performance that takes account of improved predictive performance compared to chance, but it doesn't have such an intuitive feel (see the sketch after this list).
  • Of the incorrect predictions, how many were false positives and how many were false negatives? How much we value making each kind of mistake will depend on social values and what we do with the prediction. What is a sensible burden of proof when death is the penalty?
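As a hedged, numerical illustration of both points (my own figures, chosen to match the 75%/80% example above rather than taken from any real study), the following sketch computes the confusion matrix and Cohen's kappa for a classifier that is "75% accurate" against an 80% completion rate:

```python
from sklearn.metrics import cohen_kappa_score, confusion_matrix

# Illustrative labels: 1 = completed, 0 = did not complete.
# 80 of 100 students completed; the predictions are 75% accurate overall.
y_true = [1] * 80 + [0] * 20
y_pred = [1] * 70 + [0] * 10 + [1] * 15 + [0] * 5

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
accuracy = (tp + tn) / len(y_true)
kappa = cohen_kappa_score(y_true, y_pred)

print(f"accuracy        = {accuracy:.0%}")   # 75%, sounds respectable
print(f"false positives = {fp}")             # predicted completion, didn't complete
print(f"false negatives = {fn}")             # predicted non-completion, completed
print(f"Cohen's kappa   = {kappa:.2f}")      # ~0.14: barely better than chance
```

Whether the 15 false positives or the 10 false negatives matter more is precisely the value judgement the second bullet calls for, and no single accuracy figure can answer it.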

Widening the conception of validity beyond the technical

One of the issues faced in learning analytics is that the paradigm and language of statistics and data mining could dominate the conceptualisation of validity. The focus on experiment and objective research that is present in most of the technical uses of “validity” should have a counterpart in the situation of learning analytics in practice.

This counterpart has an epistemological flavour, in part, and requires us to ask whether a community of practice would recognise something as being a fact. It includes the idea that professional practice utilising learning analytics would have to recognise an analytics-based statement as being relevant. An extreme example of the significance of social context to what is considered valid (fact) is the difference in beliefs between religious and non-religious communities.

For learning analytics, it is entirely possible to have some kind of prediction that is highly statistically-significant and scores highly in all objective measures of performance but is still irrelevant to practice, or which produces predictions that it would be unethical to use (or share with the subjects), etc.

Did it really work?

So, let's say all of the above challenges have been met. Did the adoption of a given learning analytics process or tool make a desirable difference?

This is a tough one. The difficulty in answering such questions in an educational setting is considerable, and has led to the recent development of new research methodologies such as Design-based Research, which is gaining adoption in educational technology circles (see, for example, Anderson and Shattuck in Educational Researcher vol. 41, no. 1).

It is not realistic to expect robust evidence in all cases, but in the absence of robust evidence we also need to be aware of proxies such as “based on the system used by [celebrated university X]”.

We also need to be aware of the pitfall of the quest for simplicity in determining whether something worked; an excessive focus on objective benefits neglects much of value. Benefits may be intangible, indirect, found other than where expected, or out of line with political or business rhetoric. As the well-worn aphorism has it, "not everything that counts can be counted, and not everything that can be counted counts." It does not follow, for example, that improved attainment is either a necessary or a sufficient guide to whether a learning analytics system is a sound (valid) proposition.

Does it generalise? (external validity)

As we try to move from locally-contextualised research and pilots towards broadly adoptable patterns, it becomes essential to know the extent to which an idea will translate. In particular, we should be interested to know which of the attributes of the original context are significant, in order to estimate its transferability to other contexts.

This thought opens up a number of possibilities:

  • It will sometimes be useful to make separate statements about validity or fitness for purpose for a method and for the statistical models it might produce, e.g. is the predictive model transferable, or the method by which it was discovered? (The sketch after this list illustrates the distinction.)
  • It may be that learning analytics initiatives that are founded on some theorisation about cause and effect, and which somehow test that theorisation, are more easily judged in other contexts.
  • As the saying goes, "one swallow does not a summer make" (Aristotle, but still in use!), so we should gather evidence (assess validity and share the assessment) as an initially-successful initiative migrates to other establishments and as time passes.
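To make the first point concrete, here is a minimal sketch of my own (synthetic data and scikit-learn; nothing here comes from a real institution): a predictive model that performs well in its original context can degrade sharply in a new context where the features relate differently to the outcome, even though the method used to build and validate it transfers perfectly well.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)

def make_cohort(n, effect_of_feature_0):
    """Synthetic cohort: the outcome depends on the features, but the
    direction of the first feature's effect differs between contexts."""
    X = rng.normal(size=(n, 3))
    signal = effect_of_feature_0 * X[:, 0] + X[:, 1]
    y = (signal + rng.normal(scale=1.0, size=n) > 0).astype(int)
    return X, y

# Original context (e.g. the pilot institution) and a new context in which
# the first feature relates to the outcome in the opposite way.
X_a, y_a = make_cohort(500, effect_of_feature_0=2.0)
X_b, y_b = make_cohort(500, effect_of_feature_0=-2.0)

model = LogisticRegression().fit(X_a, y_a)
print(f"accuracy in original context: {accuracy_score(y_a, model.predict(X_a)):.2f}")
print(f"accuracy in new context:      {accuracy_score(y_b, model.predict(X_b)):.2f}")
```

The caricature is extreme, but the point stands: the transferable asset may be the validation method rather than the fitted model.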

Transparency is desirable but challenging

The above points have been leading us in the direction of the need to share data about the effect of learning analytics initiatives. The reality of answering questions about what is effective is non-trivial, and the conclusions are likely to be hedged with multiple provisos, open to doubt, requiring revision, etc.

To some extent, the practices of good scholarship can address this issue. How easy is this for vendors of solutions (by this I mean other than general purpose analytics tools)? It certainly implies a humility not often attributed to the sales person.

Even within the current conventions of scholarship we face the difficulty that the data used in a study of effectiveness is rarely available for others to analyse, whether to ask different questions, to make different assumptions, or to perform meta-analysis. This is the realm of reproducible research (see, for example, reproducibleresearch.net), which is subject to numerous challenges at all levels, from ethics and business sensitivities down to the practicalities of knowing what someone else's data really means and the additional effort required to make data available to others. The practice of reproducible research is still generally a challenge, but these issues take on an extra dimension when we are considering "live" learning analytics initiatives in use at scale, in educational establishments competing for funds and reputation.

To address this issue will require some careful thought to imagine solutions that side-step the immovable challenges.

Conclusion… so what?

In conclusion, I suggest that we (a wide community including research, innovation, and adoption) should engage in a discourse, in the context of learning analytics, around:

  • What do we mean by validity?
  • How can we practically assess validity, particularly in ways that are comparable?
  • How should we communicate these assessments to be meaningful for, and perceived as relevant by, non-experts, and how should we develop a wider literacy to this end?

This post is a personal view, incomplete and lacking academic rigour, but my bottom line is that learning analytics undertaken without validity being accounted for would be ethically questionable, and I think we are not yet where we need to get to… what do you think?


Target image is “Reliability and validity” by Nevit Dilmen. Licensed under Creative Commons Attribution-Share Alike 3.0 via Wikimedia Commons – http://commons.wikimedia.org/wiki/File:Reliability_and_validity.svg

This post was first published on the Learning Analytics Community Exchange website, www.laceproject.eu.

Learning Analytics Watchdog – a job description of the future for effective transparency? (24 September 2014)

We should all be worried when data about us is used but the purpose for which it is used, and the methods employed, are opaque. Credit ratings and car insurance are long-standing examples we have got used to, and for which the general principles are widely known. Importantly, we believe that there is sufficient market-place competition that, within the limits of the data available to the providers, the recipes used are broadly fair.

Within both educational establishments and work-place settings, an entirely different situation applies. There is no equivalent of that competition, and our expectations of what constitutes ethical (and legal) practice are different. Our expectations of what the data should be used for, and by whom, differ. The range of data that could be used, and the diversity of methods that could be harnessed, is so enormous that it is tempting not to think about the possibilities, and to hide one's head in the sand.

One of the ideas proposed to address this situation is transparency, i.e. that we, as the subjects of analytics, can look and see how we are affected, and, as the objects of analytics, can look and see how data about us is being used. Transparency could be applied at different points, and make visible information about:

  • the data used, including data obtained from elsewhere,
  • who has access to the data, as raw data or derived to produce some kind of metric,
  • to whom the data is disclosed/transferred,
  • the statistical and data mining methods employed,
  • the results of validation tests, both at a technical level and at the level of the education/training interventions,
  • what decisions are taken that affect me.

Frankly, even speaking as someone with some technical knowledge of databases, statistics and data mining, it would make my head hurt to make sense of all that in a real-world organisation! It would also be highly inefficient for everyone to have to do this. The end result would be little, if any, difference in analytics practice.

I believe we should consider transparency as not only being about the freedom to access information, but as including an ability to utilise it. Maybe "transparency" is the wrong word, and I am risking an attempt at redefinition. Maybe "openness for inspection" would be better: not just open, but open for inspection. The problem with stopping at making information available in principle, without also considering use, applies to some open data initiatives, for example where public sector spending data is released; the rhetoric from my own (UK) government about transparency has troubled me for quite some time.

It could be argued that the first challenge is to get any kind of openness at all, and that the tendency towards black-box learning analytics should first be countered. My argument is that this could well be doomed to failure unless there is a bridge from the data and technicalities to the subjects of analytics.

I hope that the reason for the title of this article is now obvious. I should also add that the idea emerged in the Q&A following Viktor Mayer-Schönberger’s keynote at the recent i-KNOW conference.

One option would be to have Learning Analytics Watchdogs: independent people with the expertise to inspect the way learning analytics is being conducted, to champion the interests of those affected, both learners and employees, and to challenge the providers of learning analytics as necessary. In the short term, this will make it harder to roll out learning analytics, but in the long term it will, I believe, pay off:

  • Non-transparency will ultimately lead to a breakdown of trust, with the risk of public odium or being forced to take down whole systems.
  • A watchdog would force implementers to gain more evidence of validity, avoiding analytics that is damaging to learners and organisations. Bad decisions hurt everyone.
  • Attempts to avoid being savaged by the watchdog would promote more collaborative design processes, involving more stakeholders, leading to solutions that are better tuned to need.

 

Watchdog image is CC-BY-SA Gary Schwitzer, via Wikimedia Commons.

This post was first published on the Learning Analytics Community Exchange website, www.laceproject.eu.

 

UK Government Crosses the Rubicon with Open Document Formats (28 July 2014)

Last week (July 22nd 2014), the UK Government announced the open document formats to be used by government: PDF/A, HTML, and ODF. This is the second tranche of open standards that have been adopted following open consultation, detailed work by technical panels, and recommendation by the Open Standards Board. The first tranche, which I wrote about in October 2013, was rather prosaic in dealing with HTTP, URL, Unicode, and UTF-8, and these do not really affect people outside government, whether citizens or suppliers. Document formats – both for viewing documents and 2-way exchanges – are an entirely different matter, and particularly with ODF, I have a real sense of government crossing the Rubicon of open standards.

This is a move that is likely to affect what we all do with documents in five years' time, and to affect procurement of software and services inside and outside government. It will take some time for this policy, which is described in the policy paper "Sharing or collaborating with government documents", to work its way through, and the transition will require care and support, but the signs are that this has been well thought through and that the Government Digital Service (GDS) has both the plans and the expertise to see through the transition. They are not, for example, naive about the practicalities of achieving interoperability across different pieces of software, and GDS is publicly committed to the idea of citizens and businesses having choice in the software they use (the ODF move, for example, is not a "LibreOffice by the back door" tactic).

Microsoft, naturally enough, have been critical, as reported in Computer Weekly, but an article in CIO online magazine (strapline "Informing the UK's business technology leaders") is quite neutral in tone, and I hope that this is more reflective of the wider IT community. Actually, I perceive that Microsoft has been improving its ODF support for several years, and I doubt that this announcement will have much business impact for them; the writing has been on the wall for a while and there is a sales opportunity for product updates that help government departments meet their obligations while continuing to use MS Office products. And yet, things will not be the same… and we might even get practically-useful levels of interoperability between LibreOffice and MS Office.

 


Two members of Cetis have contributed to the process that informed this policy: Wilbert Kraan is a member of the Technical Standards Panel, and I am a member of the Open Standards Board.

Do Higher Education Institutions Need a Learning Analytics Strategy? (27 June 2014)

The LACE Workshop, "Developing a Learning Analytics Strategy for a Higher Education Institution", took place on June 17th 2014, with over 35 participants exploring the issues and considering the question of what such a strategy would look like.

Approaching Strategy as a Business Model

The approach taken was to use an adapted version of the Business Model Canvas – see the workshop home page for more information – to attempt to frame a strategic response to Learning Analytics (LA). The use of this approach, in which the Canvas was used flexibly rather than rigidly, was predicated on the idea that most of the factors inherent in a successful business model are also important for the success of a LA initiative, and that pitching such an initiative to senior management would be more successful if these factors had been considered. Considering all of these factors at the same time, albeit in a shallow fashion, should be a good check of how realistic the approach is. The factors, which are further teased apart in the Canvas, are:

  • The stakeholders ("interested parties" might be a better term, or "the people for whom value is to be created").
  • What value (not necessarily financial) would be reaped by these stakeholders.
  • How the stakeholders will be related to, engaged with, etc.
  • Which stakeholders are prepared to pay, and how.
  • The human, physical, and other resources required, and indicative costs.
  • The activities to be undertaken, and indicative costs.
  • Key partners and suppliers.

The Business Model Canvas approach appears, at face value, to be a sensible way of approaching the question of what a LA strategy might look like, but it presumes a subset of approaches to strategy. The workshop could be viewed as a thought experiment about the nature of this subset.

A separate web-page contains the results of the group work, Canvas templates with post-it notes attached.

Different Kinds of Strategy

Three differing approaches to a LA strategy seemed to emerge in discussions:

  1. A major cross-functional programme, “LA Everywhere”.
  2. LA in the service of particular institution-level strategic objectives, "Targeted LA".
  3. A strategy based around putting enabling factors in place, “Latent LA”.

The majority of the workshop groups ended up following the second approach when using the Canvas, although it should be noted that budget and human resource limitations were assumed.

One aspect that emerged clearly in discussion, feedback, and comment was that even targeted LA initiatives have a complex set of inter-related, sometimes conflicting, stakeholders and values. Attempts to accommodate these relationships lead to a rapidly inflating challenge and push discussion in the direction of "LA Everywhere". LA Everywhere implies a radical re-orientation of the institution around Learning Analytics and is unlikely to be feasible in most Higher Education establishments, even assuming substantial financial commitments.

The Latent LA approach can be seen as a means of addressing this complexity, but also as a strategy driven by a need to learn more before targeting institution-level strategic objectives. Latent LA may not be recognised by some observers as a LA strategy per se but it is a strategic response to the emergence of the concept of learning analytics and the potential benefits it offers.

The enabling factors could include aspects such as:

  • data and statistical literacy;
  • amending policies and practices around information management, ownership, and governance to encompass LA-relevant systems and LA uses;
  • ethical and value-based principles;
  • fact-finding and feasibility assessment;
  • changing culture towards greater use of data as evidence (in an open-minded way, with limitations understood);
  • developing a set of policies, principles, and repeatable patterns for the identification, prioritisation, and design of LA initiatives.

Issues with the Business Model Approach

Although the template based on the Business Model Canvas provided a framework in which to consider Learning Analytics strategy, there were a number of practical and conceptual limitations to this application. The practical issues were principally that time was rather limited (slightly over an hour) and that the activity was not being undertaken in a particular organisational context. This limited the extent to which conclusions could be drawn, as opposed to ideas being explored.

The conceptual limitations for considering LA strategy include:

  • The Canvas did not lend itself to an exploration of inter-relationships and dependencies, and the differing kinds of relationship (this point is not about linking stakeholders to value proposition, etc, which is possible using colour coding, for example).
  • One of the key questions that is more important in education than in a normal business context is: what effect will this have on teaching/educational practice? The Canvas is not designed to map out effect on practice, and how that relates to values or educational theory.
  • A business model approach is not a good fit to a strategic response that is akin to Latent LA.

Do HE Institutions Need a LA Strategy?

This is a question that will ultimately be answered by history, but it may be more appropriate to ask whether a HE institution can avoid having a LA strategy. The answer is probably that avoidance is not an option; it will become progressively harder for leaders of HE institutions to appear to be doing nothing about LA when jostling for status with their peers.

In other words, whether or not there is a need, it seems inevitable that LA strategies will be produced. The tentative conclusion I draw from the workshop is that a careful blend of Latent and Targeted LA will be the best approach to a strategy that delivers benefit, with the balance between Latent and Targeted varying between institutions. In this model, Latent LA lays the foundations for long-term success, while some shorter-term "results", and something identifiable arising from Targeted LA, are a political necessity, both internally to the institution and externally.

Web links to a selection of resources related to the question of learning analytics strategy may be found on the workshop home page.


This post was first published on the Learning Analytics Community Exchange website, www.laceproject.eu.

Open Learning Analytics – progress towards the dream (17 April 2014)

In 2011, a number of prominent figures in learning analytics and educational data mining published a concept paper on the subject of Open Learning Analytics (PDF), which they described as a "proposal to design, implement and evaluate an open platform to integrate heterogeneous learning analytics techniques." This has the feel of a funding proposal: a grand vision of an idealised future state. I was, therefore, a little wary of the possibility that the recent Open Learning Analytics Summit ("OLA Summit") would find it hard to get any traction, given the absence of a large pot of money. The summit was, however, rather interesting.

The OLA Summit, which is described in a SoLAR press release, immediately followed the Learning Analytics and Knowledge Conference and was attended by three members of the LACE project. A particular area of shared special interest between LACE and the OLA Summit is in open standards (interoperability) and data sharing.

One of the factors that contributed to the success of the event was the combined force of SoLAR, the Society for Learning Analytics Research, with the Apereo Foundation, which is an umbrella organisation for a number of open source software projects. Apereo has recently started a Learning Analytics Initiative, which has quite open-ended aims: “accelerate the operationalization of Learning Analytics software and frameworks, support the validation of analytics pilots across institutions, and to work together so as to avoid duplication”. This kind of soft-edged approach is appropriate for the current state of learning analytics; while institutions are still finding their way, a more hard-edged aim, such as building the learning analytics platform to rule the world, would be forced to anticipate rather than select requirements.

The combination of people from the SoLAR and Apereo communities, and an eclectic group of "others", provided a balance of perspectives; it is rare to find deep knowledge about both education and enterprise-grade IT in the same person. I think the extent to which the OLA Summit helped to integrate people from these communities is one of its key, if intangible, outcomes. This provides a (metaphorical) platform for future action. In the meantime, the various discussion groups intend to produce a number of relatively small-scale outputs that further add to this platform, in a very bottom-up approach.

There is certainly a long way to go, and a widening of participation will be necessary, but a start has been made on developing a collaborative network from which various architectures, and conceptual and concrete assets will, I hope, emerge.


This post was first published on the Learning Analytics Community Exchange website, www.laceproject.eu.

More Data Doesn't Always Lead to Better Choices – Lessons for Analytics Initiatives (4 April 2014)

An article appeared in the Times Higher Education online magazine recently (April 3, 2014) under the heading "More data can lead to poor student choices, Hefce [Higher Education Funding Council for England] learns". The article was not about learning analytics, but about the data provided to prospective students with the aim of supporting their choice of Higher Education provider (HEp). The data is accessible via the Unistats web site, and includes various statistics on the cost of living, fees, student satisfaction, teaching hours, and employment prospects. In principle, this sounds like a good idea; I believe students are genuinely interested in these aspects, and the government and funding council see the provision of this information as a means of driving performance up and costs down. So, although this is not about learning analytics, there are several features in common: the stakeholders involved, the idea of choice informed by statistics, and the idea of shifting a cost-benefit balance for identified measures of interest.

For the rest of this article, which I originally published as “More Data Can Lead to Poor Student Choices”, please visit the LACE Project website.

Open Access Research (31 March 2014)

Last week was a significant one for UK academics and those interested in accessing scholarship; the funding councils announced a new policy mandating open access for the post-2014 research evaluation exercises. In the same week, Cetis added its name to the list of members of the Open Policy Network (strap-line: "ensuring open access to publicly funded resources"). Thinking back only 5 years, the change in policy is not something I could imagine would have happened by now, and I think it is a credit to the people who have pushed this through in the face of resistance from vested interests, and to the people in Jisc who have played a part in making this possible.

For me, it is now time to stop providing reviews for non-open-access journals.

This is not the end of the story, however… open access to papers alone falls short of what I think we need for many disciplines, and certainly where we operate at the intersection of education and technology. Yes, I want access to data and source code. This is still too radical for many institutions today, for sure, but it will happen and, based on the speed with which Open Access has moved from being a hippy rant to funding council policy, I think we'll have it sooner than many expect. Now is the time to make the change, and I was very pleased to hear the idea being posited for a pilot by people involved with the Journal of Learning Analytics, which is already OA. (NB: this is not agreed policy of the JLA.) On the other hand, the proceedings of the Learning Analytics and Knowledge conference are not yet OA. Is the subject matter only of interest to people working in organisations that subscribe to the ACM digital library? No… but it will doubtless take a few years for things to change. Come on SoLAR, anticipate the change!

Learning Analytics Interoperability – The Big Picture in Brief (28 March 2014)

Learning Analytics is now moving from being a research interest to a wider community who seek to apply it in practice. As this happens, the challenge of efficiently and reliably moving data between systems becomes of vital practical importance. System interoperability can reduce this challenge in principle, but deciding where to drill down into the details will be easier with a view of the "big picture".

Part of my contribution to the Learning Analytics Community Exchange (LACE) project is a short briefing on the topic of learning analytics and interoperability (PDF, 890k). This introductory briefing, which is aimed at non-technical readers who are concerned with developing plans for sustainable practical learning analytics, describes some of the motivations for better interoperability and outlines the range of situations in which standards or other technical specifications can help to realise these benefits.

In the briefing, we expand on benefits such as:

  • efficiency and timeliness,
  • independence from disruption as software components change,
  • adaptability of IT architectures to evolving needs,
  • innovation and market growth,
  • durability of data and archival,
  • data aggregation, and
  • data sharing.

Whereas the focus of attention in learning analytics is often on data collected during learner activity, the briefing paper looks at the wider system landscape within which interoperability might contribute to practical learning analytics initiatives, including interoperability of models, methods, and analytical results.
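The briefing does not prescribe any particular specification, but to give a concrete flavour of what interoperable activity data can look like, here is a minimal sketch of an xAPI (Experience API)-style statement assembled in Python. The learner, course, and identifier values are purely illustrative placeholders, and a real deployment would send such statements to a Learning Record Store rather than print them.

```python
import json

# An xAPI-style "actor / verb / object" statement describing a learning
# event. All identifiers below are illustrative, not real endpoints.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.org",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.org/courses/analytics-101",
        "definition": {"name": {"en-US": "Analytics 101"}},
    },
}

# Serialised like this, the same record can be read by any tool that
# understands the specification, which is the point of interoperability.
print(json.dumps(statement, indent=2))
```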

The briefing paper is available from: http://laceproject.eu/publications/briefing-01.pdf (PDF, 890k).

LACE is a project funded by the European Commission to support the sharing of knowledge, and the creation of new knowledge through discourse. This post was first published on the LACE website.

Interoperability Incubation and Pre-standardisation Activity – A View on Desirable Qualities (21 February 2014)

There is an important process that should feed into the development of good standards (that are used in practice), and this process is currently in need of repair and reformation. The key idea behind this is that good standards to support educational technology, to take our area of particular interest, are not created on a blank sheet of paper by an elite but emerge from practice, collaborative design, experimentation, selective appropriation of web standards, … etc. Good standards documents are underpinned by a thoughtful analysis of these aspects such that what emerges is useful, usable, and used. The phrase "pre-standardisation and interoperability incubation forum" is an attempt to capture the character of such a process. Indeed, some industry partners may prefer to see a collaboration to incubate interoperability as the real thing, with the formal standardisation politics as an optional, and sometimes problematic, add-on. It is our belief that all except the suppliers with a dominant market share stand to benefit from better interoperability – i.e. common means to share common data – and that there is a great deal of latent value that could be unlocked by better pre-standardisation activity and interoperability incubation.

Some recent changes to the pre-standardisation landscape indicate that this process, which is not assumed to exist within a single entity, is in need of repair and reformation. Some of these changes, and the problems they present, are described in recent Cetis staff posts by Simon Grant, "Educational Technology Standardization in Europe", and by Lorna Campbell, "CEN Learning Technologies Workshop Online Consultation Meeting". The gist of these descriptions is that what we thought was a usefully open-access pre-standardisation forum is no more. This does not mean that "repair and reformation" means we should re-create what has been lost, rather that the loss has tipped the balance in favour of taking action. What emerges may, quite rationally, be rather different in form from what went before.

This post makes public a discussion of the background and some statements about what I consider the desirable qualities of a pre-standardisation and interoperability incubation forum, and draws extensively on the ideas and insights of colleagues in Cetis and in the wider interoperability community.

Download the document as: PDF, 190kB, or Open Document Format (ODT), 55kB.

Cabinet Office Consults on Open Standards for Government – URI Patterns and Document Formats (7 February 2014)

Feedback is invited on three proposals by Feb 24th (and 26th). The proposals relate to the following challenges (which apply to UK government use, not the whole of the public sector or the devolved governments):

  • URI patterns for identifiers. These will be for resolvable URIs to identify things and codes within data published by government.
  • Viewing government documents. This covers access by citizens, businesses and government officials from diverse devices.
  • Sharing or collaborating with government documents. This extends the requirements of the previous proposal to cases where the documents must be editable.

The proposals will be further developed based on feedback, and with due diligence on the proposed open standards, and will be presented for approval by the Open Standards Board later in the year (the next meeting is March 25th 2014).

Comments may be added to the proposals.

