LAK12 Useful links and resources

There has been a huge amount of activity at this year’s LAK conference. I’m still cogitating about the issues raised and will post my reflections over the next few days. In the meantime, however, there were a number of really interesting tools and resources presented, which are available from this Diigo site George Siemens has set up.

Doug Clow has been doing a splendid (and quite awe-inspiring) job of live blogging and has summary links of resources and his posts here. Myles Danson has also done some useful live blog posts from sessions too. We also have some really useful twitter activity summaries from Tony Hirst and Martin Hawksey.

*Update - Audrey Watters’ review of the conference.

And just in case you missed them :-) below is a timeline view of my collected tweets and a few pictures from the past few days.

LAK12 Pre conference workshop quick overview

I’ve had a very informative and stimulating day at the pre-conference workshops for the LAK12 conference. This is just a very quick post with links to some great summaries and resources that people have contributed.

*Learning Analytics and Ethics live blog summary from Doug Clow (thanks, Doug, you truly are a conference reporting machine!)

*Learning Analytics and Linked Data collective Google doc – various contributors.

There has also been quite a bit of twitter activity, and Tony Hirst was quick off the mark to visualise the connections. Martin Hawksey has also produced an alternative visualisation based on the twitter archive I set up last week; and here’s another summary view from Tweetlevel.

I’ll hopefully do some more considered posts myself during the week. Based on today’s sessions this is shaping up to be a great conference.

Learning Analytics, where do you stand?

For? Against? Not bovvered? Don’t understand the question?

The term learning analytics is certainly trending in all the right ways on all the horizon scans. As with many “new” terms, there are still some misconceptions about what it actually is, or perhaps more accurately what it actually encompasses. For example, whilst talking with colleagues from the SURF Foundation earlier this week, they mentioned the “issues around using data to improve student retention” session at the CETIS conference. SURF have just funded a learning analytics programme of work which closely matches many of the examples and issues shared and discussed there. They were quite surprised that the session hadn’t been called “learning analytics”. Student retention is indeed a part of learning analytics, but not the only part.

However, back to my original question and the prompt for it. I’ve just caught up with the presentation Gardner Campbell gave to the LAK12 MOOC last week titled “Here I Stand”, in which he presents a very compelling argument against some of the trends which are beginning to emerge in the field of learning analytics.

Gardner is concerned that there is a danger that the more reductive models of analytics may actually force us backwards in our models of teaching and learning. He draws an analogy between M-theory – in particular Stephen Hawking’s description of there being not one M-theory but a “family of theories” – and how knowledge and learning actually occur. He is concerned that current learning analytics systems are based too much on “the math” and don’t actually show the human side of learning and the bigger picture of human interaction and knowledge transfer. As he pointed out, “student success is not the same as success as a student”.

Some of the rubrics we might be tempted to use (and in some cases already are using) to build learning analytics systems reduce the educational experience to a simplistic management model. Typically, systems are looking for signs pointing to failure, and not for the key moments of success in learning. What we should be working towards are systems that are adaptive, allow for reflection and can learn themselves.

This did make me think of the presentation at FOFE11 from IBM about their learning analytics system, which certainly scared the life out of me and many others I’ve spoken to. It also raised a lot of questions from the audience (and the twitter backchannel) about the educational value of the experience of failure. At the same time I was reflecting on the whole terminology issue again. Common understandings – why are they so difficult in education? When learning design was the “in thing”, I think it was John Casey who pointed out that what we were actually talking about most of the time was actually “teaching design”. Are we in danger of the same thing happening here, with the learning side of learning analytics being hijacked by narrower, or, to be fairer, more tightly defined management- and accountability-driven analytics?

To try and mitigate this, we need to ensure that all key stakeholders are starting to ask (and answer) the questions Gardner raised in his presentation. What are the really useful “analytics” which can help me as a learner, teacher, administrator, etc? Which systems provide that data just now? How can/do these stakeholders access and share the data in meaningful ways? How can we improve and build on these systems in ways which take into account the complexity of learning? Or, as Gardner said, how can we start framing systems and questions around wisdom? But before we can do any of that, we need to make sure that our stakeholders are informed enough to take a stand, and not just have to accept whatever system they are given.

At CETIS we are about to embark on an analytics landscape study, which we are calling an Analytics Reconnoitre. We are going to look at the field of learning analytics from a holistic perspective, review recent work and (hopefully) produce some pragmatic briefings on the who, where, why, what and when of learning analytics, pointing to useful resources and real-world examples. This will build on and complement work already funded by JISC, such as the Relationship Management Programme, the Business Intelligence Infokit and the Activity Data Programme synthesis. We’ll also be looking to emerging communities of practice, both here in the UK and internationally, to join up on thinking and future developments. Hopefully this work will contribute to the growing body of knowledge and experience in the field of learning analytics, as well as raising some key questions (and hopefully some answers) around its many facets.