For? Against? Not bovvered? Don’t understand the question?
The term learning analytics is certainly trending in all the right ways on all the horizon scans. As with many “new” terms there are still some misconceptions about what it actually is, or perhaps more accurately what it actually encompasses. For example, whilst talking with colleagues from the SURF Foundation earlier this week, they mentioned the “issues around using data to improve student retention” session at the CETIS conference. SURF have just funded a learning analytics programme of work which closely matches many of the examples and issues shared and discussed there. They were quite surprised that the session hadn’t been called “learning analytics”. Student retention is indeed a part of learning analytics, but not the only part.
However, back to my original question and the prompt for it. I’ve just caught up with the presentation Gardner Campbell gave to the LAK12 MOOC last week, titled “Here I Stand”, in which he presents a very compelling argument against some of the trends which are beginning to emerge in the field of learning analytics.
Gardner is concerned that there is a danger that the more reductive models of analytics may actually force us backwards in our models of teaching and learning. He draws an analogy between M-theory – in particular Stephen Hawking’s description of there being not one M-theory but a “family of theories” – and how knowledge and learning actually occur. He is concerned that current learning analytics systems are based too much on “the math” and don’t actually show the human side of learning and the bigger picture of human interaction and knowledge transfer. As he pointed out, “student success is not the same as success as a student”.
Some of the rubrics we might be tempted to use (and in some cases already are using) to build learning analytics systems reduce the educational experience to a simplistic management model. Typically such systems look for signs pointing to failure, and not for the key moments of success in learning. What we should be working towards are systems that are adaptive, allow for reflection and can learn themselves.
This did make me think of the presentation at FOTE11 from IBM about their learning analytics system, which certainly scared the life out of me and many others I’ve spoken to. It also raised a lot of questions from the audience (and the Twitter backchannel) about the educational value of the experience of failure. At the same time I was reflecting on the whole terminology issue again. Common understandings – why are they so difficult in education? When learning design was the “in thing”, I think it was John Casey who pointed out that what we were actually talking about most of the time was “teaching design”. Are we in danger of the same thing happening here, with the learning side of learning analytics being hijacked by narrower – or, to be fairer, more tightly defined – management- and accountability-driven analytics?
To try and mitigate this we need to ensure that all key stakeholders are starting to ask (and answer) the questions Gardner raised in his presentation. What are the really useful “analytics” which can help me as a learner, teacher, administrator, etc.? Which systems provide that data just now? How can/do these stakeholders access and share the data in meaningful ways? How can we improve and build on these systems in ways which take into account the complexity of learning? Or, as Gardner said, how can we start framing systems and questions around wisdom? But before we can do any of that we need to make sure that our stakeholders are informed enough to take a stand, and not just have to accept whatever system they are given.
At CETIS we are about to embark on an analytics landscape study, which we are calling an Analytics Reconnoitre. We are going to look at the field of learning analytics from a holistic perspective, review recent work and (hopefully) produce some pragmatic briefings on the who, where, why, what and when of learning analytics, pointing to useful resources and real-world examples. This will build on and complement work already funded by JISC, such as the Relationship Management Programme, the Business Intelligence Infokit and the Activity Data Programme synthesis. We’ll also be looking to emerging communities of practice, both here in the UK and internationally, to join up on thinking and future developments. Hopefully this work will contribute to the growing body of knowledge and experience in the field of learning analytics, as well as raising some key questions (and hopefully some answers) around its many facets.
Hi Sheila
A timely post, which made me think of Emily Collins’ recent WonkHE call for universities to remember they have people who are good at doing research but just don’t approach things like student engagement in that way. Perhaps we need something like…
Dear talented researchers,
There is a really interesting, continually emerging research situation very close to home.
Whether participative or illustrative, your methods and findings could make a real difference to the experience and outcomes of the learning community that surrounds you.
The good news is that there’s probably a network of people in your institution willing to work with you to design and build research instruments.
A label of “Learning Analytics” may ring some alarm bells and probably calls to mind the Einstein quote “Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted” but that’s why this project needs you!
Thanks Mark, and yes that sounds like a much better way to approach staff – and we should do the same with students too.
S
I hoped the wording of the message would work for both staff and students, as I’ve found a wealth of research talent in the student community and some great opportunities for learning through collaboration.
I think your post highlights two issues:
The terminology issue points at the question of audience: is it for managers, teachers or learners? I’m sure we’ll settle on a good term for the application of analytical methods to learning in general, but it’ll always be crucial to bear in mind who the audience is.
The question of reductionism is more fundamental: I think analytics has to be reductive to work. A map that faithfully reproduces the full complexity of reality is useless. A map must simplify so that people can find the interesting and rich parts they’re looking for.
In that sense, I think there might be a tendency to over-estimate what analytics can do; if there are no clear quantifiable dependent variables, then the techniques we refer to by the name of ‘analytics’ are the wrong tool for the problem at hand. All we can hope for is that analytics points at that problem.
Of course it does, and I should have realised that you of all people wouldn’t forget students :-)
Hi Wilbert
Thanks for your comments.
I can see that there does need to be some reductive process involved; however, I think what I’m trying to say (and what Gardner conveys far more eloquently) is that there isn’t just one map – the directions you need for driving to your destination aren’t the ones you need when walking. Or indeed you don’t have to take the most direct route – an indirect one might be much more valuable in terms of an educational experience.
Sheila
Thanks for this post – a really useful contribution to the “analytics” debate. I always find myself agreeing with Gardner Campbell on this as with other issues.
Thanks David, it’s hard to disagree with that voice sometimes :-) And again I would recommend everyone take 45 minutes to listen to the session.
s
Hi Sheila, I missed Gardner Campbell’s online talk at the time, but was inspired to go and watch the recording after reading an enthusiastic blog account.
He picked up on some of the themes we have been working on at The Open University’s SocialLearn project as we try to develop social learning analytics that can support people to learn together online – not just in the classes and cohorts of formal education, but in the wider networks, communities and affinity groups that are important when we learn outside educational institutions.
I’ve just finished work on a review of the academic literature around analytics in education. As Wilbert commented, there are several audiences for analytics, and these audiences are using them for different purposes. In addition, the literature shows three main driving forces: technological (how can we extract value from ‘big data’ in the context of education?), educational (how can we use the understandings generated within the learning sciences, together with the big datasets we now have available, to support learning and teaching?) and political (how can we use analytics to improve our nation’s educational standards?). I think two-dimensional approaches to analytics emerged as an early response to the first of these drivers – and Gardner’s talk pushes us towards the second of these drivers and more learning-based approaches.
The review’s now online as a Technical Report – you might find it useful as you are carrying out your Analytics Reconnoitre http://kmi.open.ac.uk/publications/techreport/kmi-12-01
Thanks Rebecca – your project is definitely on our list for the Reconnoitre, and either I or one of my colleagues will no doubt be in touch about it. I’m also hoping to get to LAK12 to catch up on other developments too.
Sheila
One map? Or many? I think it is the actions precipitated by the data that are the issue…
Years ago I remember talking to the team at Alton College, they’d surfaced student target data, attendance data and such like in a manner that allowed the student services folk to make interventions and support students before things got too bad…
The same data might be used by someone else to cull problem students.
It’s not what you’ve got, it’s what you do with it that counts… how do we support a moral framework around decision making?
Hi Sheila, when I follow the suggested link “Here I Stand”, although I have a wikispaces account, the space is restricted…
“Invite-Only Wiki
This wiki has restricted membership. Contact the wiki organizers for details.”
Any suggestions for getting access?
Thanks, Stephen.
Hi Rob
Totally agree, and that’s kind of what I was getting at. We need to ensure people get the chance to be involved in the process of gathering and using data, not just be presented with “stuff” others think is useful. If we only measure part of the picture we could end up in a ‘teaching to the data’ trap, like the teaching-to-the-test trap. I think initially we just have to let people join the conversation.
S
Hi Stephen
That’s odd – I thought everything was open to anyone. Here is the main website for the course http://lak12.mooc.ca/ – maybe navigating to the course recordings that way will work – it did for me.
Sheila
I’m arriving at the party late, but others have found your post and Gardner Campbell’s session very useful. I particularly liked the way Gardner turns the problem on its head, highlighting things like using analytics to surface interventions when someone is about to learn, rather than when they are about to fail.
The follow-up session by Simon Buckingham Shum at the OU dovetails nicely with Gardner’s, outlining the idea of an Open Learning Analytics ecosystem where the community can deposit and reuse analytics data, which should help researchers explore ways to surface things like positive interventions http://lak12.wikispaces.com/Recordings
You can also see parallels to this with the recent Value and benefits of text mining report http://www.jisc.ac.uk/news/stories/2012/03/textmining.aspx
Martin
Hi Martin
Glad you found it useful, and thanks for the further links. Rebecca has also linked to work Simon is involved in too.
Sheila