We should all be worried when data about us is used but the purpose for which it is used and the methods employed are opaque. Credit ratings and car insurance are long-standing examples we have got used to, and for which the general principles are widely known. Importantly, we believe that there is sufficient market-place competition that, within the limits of the data available to the providers, the recipes used are broadly fair.
Within both educational establishments and work-place settings, an entirely different situation applies. There is no equivalent of market competition, and our expectations of what constitutes ethical (and legal) practice are different. Our expectations of what the data should be used for, and by whom, also differ. The range of data that could be used, and the diversity of methods that could be harnessed, are so enormous that it is tempting not to think about the possibilities, and to hide one’s head in the sand.
One of the ideas proposed to address this situation is transparency, i.e. that we, as the subjects of analytics, can look and see how we are affected, and, as the objects of analytics, can look and see how data about us is being used. Transparency could be applied at different points, making visible information about:
- the data used, including data obtained from elsewhere,
- who has access to the data, as raw data or derived to produce some kind of metric,
- to whom the data is disclosed/transferred,
- the statistical and data mining methods employed,
- the results of validation tests, both at a technical level and at the level of the education/training interventions,
- what decisions are taken that affect us.
Frankly, even speaking as someone with some technical knowledge of databases, statistics, and data mining, it would make my head hurt to make sense of all that in a real-world organisation! It would also be highly inefficient for everyone to have to do this. The end result would be little, if any, change in analytics practice.
I believe we should consider transparency not only as the freedom to access information, but as including the ability to make use of it. Maybe “transparency” is the wrong word, and I risk attempting a redefinition. Maybe “openness for inspection” would be better: not just open, but open for inspection. The problem with stopping at making information available in principle, without also considering its use, applies to some open data initiatives, for example where public sector spending data is released; the rhetoric from my own (UK) government about transparency has troubled me for quite some time.
It could be argued that the first challenge is to get any kind of openness at all, that the tendency towards black-box learning analytics should first be countered. My argument is that this could well be doomed to failure unless there is a bridge from the data and technicalities to the subjects of analytics.
I hope that the reason for the title of this article is now obvious. I should also add that the idea emerged in the Q&A following Viktor Mayer-Schönberger’s keynote at the recent i-KNOW conference.
One option would be to have Learning Analytics Watchdogs: independent people with the expertise to inspect the way learning analytics is being conducted, to champion the interests of those affected, both learners and employees, and to challenge the providers of learning analytics as necessary. In the short term, this will make it harder to roll out learning analytics, but in the long term it will, I believe, pay off:
- Non-transparency will ultimately lead to a breakdown of trust, with the risk of public odium or being forced to take down whole systems.
- A watchdog would force implementers to gain more evidence of validity, avoiding analytics that is damaging to learners and organisations. Bad decisions hurt everyone.
- Attempts to avoid being savaged by the watchdog would promote more collaborative design processes, involving more stakeholders, leading to solutions that are better tuned to need.
Watchdog image is CC-BY-SA Gary Schwitzer, via Wikimedia Commons.
This post was first published on the Learning Analytics Community Exchange website, www.laceproject.eu.