Sheila MacNeill » analytics – Cetis blog

Small is beautiful: an antidote to big data #altc2013
(19 September 2013)

Over the past year Cetis has been spending quite a bit of time exploring the context and potential of analytics within the education sector. The Cetis analytics series is our ongoing contribution to the debate. As part of our investigations we undertook a survey of UK institutions to try to get a baseline of where institutions are "at" in terms of analytics (see this post for more information).

One of the issues around analytics at the moment is ownership and responsibility. Just who in your institution is responsible for learning analytics, for example – someone in your VLE/learning technology team, the stats team, someone in IS? We’re not sure either, so we tried to hit as many areas and mailing lists as possible to get feedback. Unfortunately we didn’t get a huge response, so we can’t draw anything conclusive from it apart from the fact that there is something happening – it’s just not quite clear what or where. However, the data provides a valuable starting point/potential baseline, which Adam Cooper has written up. Adam’s post gives more information, including links to his report and the actual data.

What does seem to be clear is that despite the hype of big data, at the institutional level small data is indeed beautiful and useful. Last week at ALT-C 2013, Stephen Powell led a workshop around this theme. During the session we used the case studies from the Cetis Analytics Series and the results of the survey to stimulate discussion around data and analytics in education. There is undoubtedly still lots of interest in analytics (particularly learning analytics) within the learning technology community, as our very busy session demonstrated; however, the discussion highlighted key concerns including:

  • An overall uncertainty about what might emerge
  • Are small-scale initiatives more achievable than large-scale institutional ones?
  • Ethics – including concerns about the purposes to which analytics might be put, in the hands of managers who may use it in an unknowing way
  • Where is the data? Who can access it? And what can they do with it?

You can also access the slides from the workshop via slideshare.

LASI-UK: a Twitter summary
(8 July 2013)

The LASI UK event held last Friday (5 July) brought over 50 people from across the UK to Edinburgh to join in the international learning analytics-fest accompanying the face-to-face Learning Analytics Summer Institute being held at Stanford University.

I’m still trying to process all the great presentations and discussions, but to give a flavour of the day I’ve pulled together some of the tweets from the #lasiuk back channel as a summary. Martin Hawksey also live-blogged the morning and afternoon sessions.

I’d also like to take this opportunity to give a public thank you to Naomi Jeffery and Hannah Jones from the OU Scotland for all their hard work in organising and ensuring the smooth running of the day.

IT departments – the institutional fall guy for MOOCs?
(16 May 2013)
The Fall Guy
(Image from IMDB http://www.imdb.com/media/rm782014464/tt0081859)

As Martin Weller pointed out earlier this week, there is a growing number of MOOC metaphors being created. As I’ve been following the tweets from today’s "#moocapalooza" (a hashtag I think was invented by David Kernohan), a.k.a. the Open and Online Learning: Making the Most of MOOCs and other Models conference, I think I need to add the Fall Guy to Martin’s list, particularly after reading this tweet.

I’m going to try not to say too much in this post, and I apologise for taking this tweet at face value and outwith its original context, but… isn’t this just another one of those MOOC myths that twist the reality of what happens within institutions to suit the "education is broken, we must build something else" mindset? As Martin Hawksey and Lorna Campbell both said in response to David’s tweet, it’s not the systems that are the problem.

I’m going to stick my neck out (not too far) and say that every technology you need to run a MOOC is available within every university. I’ve not seen anything in my adventures in MOOC-land that has made me think "oh wow, wish we could have one of those back in the non-MOOC world". There are VLEs, blogs and wikis aplenty. And IT departments do a sterling job in keeping these running for all that "non-MOOC stuff" that universities do. You know, the dull and boring things you need to do "traditional" teaching.

Yesterday, during a webinar on analytics and assessment and feedback, Rachel Forsyth (MMU) shared some of their learning system analytics data. Since the beginning of this year they’ve had over 8 million hits on their mobile interface, which allows students to access key information like assessment marks, timetables and reading lists. At key points in the year they have over 100,000 assignments being submitted electronically. I suspect many institutions are working at this scale. So I don’t think it’s a question of IT departments not being up to delivering MOOCs; it’s more that they have quite a lot to do already, and adding potentially hundreds of thousands more users is not something that can be undertaken lightly, or without cost implications.

Investing in internal IT resources isn’t seen as a key part of MOOC development strategy. Why would it be, when Coursera et al. have been able to get money to build systems? In many ways using an external platform like FutureLearn is a very sensible option: it means that experiments with MOOCs can take place without putting additional strain on existing resources. We all know, or should by now, that there’s no such thing as a free MOOC, and that includes the infrastructure they sit within. So let’s not let another myth develop that the HE sector doesn’t have the technology or the ability to deliver MOOCs. It does – it’s just that it’s already working at capacity delivering its day-to-day business.

Analytics in UK Further and Higher Education Survey
(13 May 2013)

Over the past few months, we at Cetis have been involved in a number of analytics-related activities, most notably our Analytics Series of papers and case studies. Although we know there are pockets of really exciting developments here in the UK, we are keen to find out more about what is actually happening in our universities and colleges. In order to give us (and the community) a more accurate insight, we are launching our Analytics in UK Further and Higher Education survey. From teaching and learning to the library to registry and business intelligence, we need to hear from you!

The survey is quite short (12 questions) and has been designed to allow us to undertake a "lite" benchmark of activity in the UK sector. We’d really appreciate it if you could take 10 minutes or so to give us your feedback. The survey will stay open until 16 June. Once we have all the data we will of course publish the results, and we will be sharing our initial analysis at a session at this year’s ALT-C.

The survey can be accessed here; please feel free to pass the link on to any relevant colleagues.

Learning Analytics for Assessment and Feedback Webinar, 15 May
(13 May 2013)

**Update 16 May**
Link to session recording

Later this week I’ll be chairing a (free) webinar on Learning Analytics for Assessment and Feedback, featuring work from three projects in the current Jisc Assessment and Feedback Programme. I’m really looking forward to hearing first-hand about the different approaches being developed across the programme.

“The concept of learning analytics is gaining traction in education as an approach to using learner data to gain insights into different trends and patterns but also to inform timely and appropriate support interventions. This webinar will explore a number of different approaches to integrating learning analytics into the context of assessment and feedback design; from overall assessment patterns and VLE usage in an institution, to creating student facing workshops, to developing principles for dashboards.”

The presentations will feature current thinking and approaches from teams from the following projects:
*TRAFFIC, Manchester Metropolitan University
*EBEAM, University of Huddersfield
*iTeam, University of Hertfordshire

The webinar takes place Wednesday 15 May at 1pm (UK time) and is free to attend. A recording will also be available after the session. You can register by following this link.

Deconstructing my (dis)engagement with MOOCs part 2
(16 April 2013)

Following on from my earlier post, I’ve attempted to use the classifiers outlined in the #lak13 paper on disengagement in MOOCs in the context of my own experiences. Obviously I’ve modified things a bit, as what I’m doing is more of a self-reflection on my personal context – so I’ve made the labels past tense. I’m also doing a presentation next week at the University of Southampton on the learner perspective of MOOCs, and thought that these classifications would be a good way to talk about my experiences.

Firstly, here are the MOOCs I’ve signed up for over the years (the ? marks the years when I was aware of, but not active in, MOOCs):

MOOCs I've took!

Now, with the course engagement labels:

My MOOC engagement with labels

And finally, aligned to trajectory labels:

My MOOC participation using trajectory labels

A big caveat: not completing, disengaging and dropping out does not mean I didn’t learn from the experience and context of each course.

More to come next week including the full presentation.

Deconstructing my own (dis)engagement with MOOCs
(15 April 2013)

No educational technology conference at the moment is complete without a bit of MOOC-ery, and #lak13 was no exception. However, the "Deconstructing disengagement: analyzing learner sub-populations in massive open online courses" paper was a move on from the familiar territory of broad-brush big numbers towards a more nuanced view of some of the emerging patterns of learners across three Stanford-based Coursera courses.

The authors have created:

"a simple, scalable, and informative classification method that identifies a small number of longitudinal engagement trajectories in MOOCs. Learners are classified based on their patterns of interaction with video lectures and assessments, the primary features of most MOOCs to date…"

"…the classifier consistently identifies four prototypical trajectories of engagement."

As I listened to the authors present the paper I couldn’t help but reflect on my own recent MOOC experience. Their classifier labels (auditing, completing, sampling, disengaging) made a lot of sense to me. At times I have been in all four “states” of auditing, completing, disengaging and sampling.

The study investigated typical Coursera courses, which mainly follow the talking-head video, quiz, discussion forum and final assignment format, and suggested that using the framework to identify sub-populations of learners would allow more customisation of courses and (hopefully) more engagement and, I guess, ultimately completion.
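To make the classification approach more concrete, here is a minimal sketch (emphatically not the authors’ code) of clustering weekly engagement trajectories with k-means; the state encoding, example data and cluster count are assumptions based on the paper’s description:

```python
# Illustrative sketch of trajectory clustering in the spirit of the paper:
# encode each learner's weekly engagement as a number, then cluster the
# resulting trajectories. The encoding below is an assumption, not the
# authors' actual scheme.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical weekly states: 0 = absent, 1 = watched videos only,
# 2 = watched videos and attempted the assessment.
learners = np.array([
    [2, 2, 2, 2, 2, 2],  # looks like "completing"
    [1, 1, 1, 1, 1, 1],  # looks like "auditing"
    [2, 2, 1, 0, 0, 0],  # looks like "disengaging"
    [1, 0, 0, 0, 0, 0],  # looks like "sampling"
    # ... one row per learner, one column per course week
])

# Four clusters, matching the four prototypical trajectories reported.
model = KMeans(n_clusters=4, n_init=10, random_state=0).fit(learners)
for trajectory, label in zip(learners, model.labels_):
    print(trajectory, "-> cluster", label)
```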

I did find it interesting that they identified that completing learners were most active on forums, something that contradicts my (limited) experience. I’ve signed up for a number of the science-y type Coursera courses and have sampled and disengaged. Compare that to the recent #edcmooc, which again was run through Coursera but didn’t use the talking head-quiz-forum design. Although I didn’t really engage with the discussion forums (I tried, but they just "don’t do it for me"), I did feel very engaged with the content, the activities and my peers, and I completed the course.

I’ve spoken to a number of fellow MOOC-ers recently, and they’re not that keen on the discussion forums either. Of course, it’s highly likely that the people I speak to are like me and probably interact more on their blogs and Twitter than in discussion forums. Maybe it’s an arts/science thing? Shorter discussions? I don’t really know, but at scale I find any discussion forum challenging, time-consuming and, to be completely honest, a bit of a waste of time.

The other finding to emerge from the study was that the completing and auditing (those who just watch the videos and don’t necessarily contribute to forums or submit assignments) sub-populations have the best experiences of the courses. Again drawing on my own experiences, I can see why this could be the case. Despite dropping out of courses, the videos I’ve watched have all been "good" in the sense that they were of a high technical quality and the content was very clear. So I’ve watched and thought "oh, I didn’t know that / oh, so that’s what that means / oh, that’s what I need to do". The latter is usually the point at which I disengage, as there is something far more pressing I need to do :-) But I have to say that the experience of actually completing MOOCs (I’m now at 3 for that) was far richer. Partly that was down to the interaction with my peers on each occasion, and the cMOOC ethos of each course design.

That said, I do think the auditing, completing, disengaging, sampling labels are a very useful addition to the discourse and understanding of what is actually going on within the differing populations of learners in MOOCs.

A more detailed article on the research is available here.

Learning analytics – a bridge to the middle space? #lak13
(14 April 2013)

It’s not quite a chicken-and-egg situation, but there is always a tension between technology and pedagogy. A common concern is that technology is used in education "just because it can be" and not because it has a sound pedagogical impact. Abelardo Pardo’s keynote at the recent #lak13 conference described how learning analytics could potentially sit in the middle space between technology and teaching.

Learning analytics could provide additional bridges between each community to help make real improvements to teaching and learning. Analytical tools can provide data-driven insights into how people interact with systems, activities and each other, and how they learn; but in turn we need the expertise of teachers to help developers/data scientists frame questions, develop potential data collection points and contextualise findings. Abelardo’s personal story about his own engagement with both pedagogy and analytics was a powerful example of this. The bridge analogy really resonated with me and many of the other delegates. I’ve often described a large part of my job as being a bridge between technology and teaching, and I hope it is.

On the final day of the conference there was a healthy debate around what the focus of the LAK conference and community should be. On the one hand, learning analytics is a relatively new discipline: it is trying hard to establish its research credentials, and so needs to be active in producing "serious" research papers. On the other, if it really wants to live up to its own hypothesis and gain traction with practitioners/institutions, then it needs not only to provide insights but also accessible, scalable tools and methodologies. The "science bit" of some of the LAK research papers was quite challenging to put into a real-world context, even for an enthusiastic data amateur such as myself.

However, we do need valid research to underpin the discipline and also to validate any claims that are being made. Extending action research projects, an approach encompassed by a number of papers, could provide one solution. I’m a strong believer in action research in education: it seems a natural fit with how most teachers actually work, and it can also provide real opportunities for students to be involved in the process. (As an aside, like last year I did get the feeling that what was being discussed was actually teaching analytics, not learning analytics – i.e. it was still about teacher intervention and understanding, and what could be done to students.)

Part of what we have been trying to do at CETIS with our Analytics Series is to provide a bridge into this whole area. The set of case studies I’ve been working on is specifically aimed at illustrating applications of analytics in a variety of real-world contexts. But they are not the kind of papers that would be accepted by (or submitted to) the LAK conference. One suggestion my colleague Martin Hawksey came up with during the final day of the conference was the idea of a more "relaxed" stream/session.

Perhaps something along the lines of the lightning presentations we used at both the UK SoLAR Flare meeting and the recent CETIS conference: this could provide a bridge between the research focus of the conference and actual practice, and give an opportunity to quickly share some of the exciting work that many people are doing but, for a variety of reasons, aren’t writing research papers about. Maybe that would bring a bit more of an experimentation/what’s-actually-happening-now/fun element to the proceedings.

If you want to catch up on conference proceedings, I’d thoroughly recommend reading some of the excellent live blogs from Doug Clow, Sharon Slade and Myles Danson, which Doug has rather handily collated here. 

I’ll also be following up with a couple more posts in the next few days based on some of the really exciting work I saw presented at the conference.

Acting on Assessment Analytics – new case study
(10 April 2013)

Despite the hype around it, getting started with learning analytics can be a challenge for most everyday lecturers. What can you actually do with data once you get it? As more "everyday" systems (in particular online assessment tools) are able to provide data and/or customised reports, it is getting easier to start applying and using analytics approaches in teaching and learning.

The next case study in our Analytics series focuses on the work of Dr Cath Ellis and colleagues at the University of Huddersfield. It illustrates how they are acting on the data from their e-submission system, not only to enhance and refine their feedback to students, but also to help improve their approaches to assessment and overall curriculum design.  
 
At the analytics session at #cetis13, Ranjit Sidhu pointed out that local data can be much more interesting and useful than big data. This certainly rings true for teaching and learning. Using very local data, Cath and her colleagues are developing a workshop approach to sharing generic assessment data with students in a controlled and emotionally secure environment. The case study also highlights issues around data-handling skills and the need for more evidence of successful interventions through using analytics.

You can access the full case study here.

We are always looking for potential case studies to add to our collection, so if you are doing some learning analytics-related work and would be willing to share your experiences in this way, then please get in touch.

Avoiding getting caught in the data slick: thoughts from the Analytics and Institutional Capabilities session, #cetis13
(18 March 2013)

Data, data everywhere – but what do we actually do with it? Do we need "big" data in education? What is it we are trying to find out? What is our ROI, at both institutional and national levels? Just some of the questions that were raised at the Analytics and Institutional Capabilities session at #cetis13 last week.

Is data our new oil? asked Martin Hawksey in his introduction to the session. And if, as many seem to think, it is, do we really have the capabilities to "refine" it properly? How can we ensure that we aren’t putting the equivalent of petrol into a diesel engine? How can we ensure that institutions (and individuals) don’t end up getting trapped in a dangerous slick of data? Are we ensuring that everyone (staff and students) is developing the data literacy skills they need to use, and ultimately understand, the visualisations we can produce from data?

Bird in an oil slick

Ranjit Sidhu (Statistics into Decisions) gave an equally inspiring and terrifying presentation around the hype of big data. He pointed out that in education "local data", not "big data", is really where we should be focusing our attention, particularly in relation to our core business of attracting students. In relation to national-level data, he also questioned the ROI of some "quite big" national data collection activities such as the KIS. Judging from the embarrassingly low figures he showed us for traffic to the UniStats site, it would appear there isn’t much. We may have caused a mini spike in the hits for one day in March :-)

However, there are people who are starting to ask the right questions and use their data in meaningful ways. A series of lightning talks highlighted a cross-section of approaches to using institutional data, followed by three inspiring talks from Jean Mutton (University of Derby), Mark Stubbs (MMU) and Simon Buckingham Shum (OU). Jean outlined the work she and her team have been doing at Derby on enhancing the student experience (more information is available through our new case study); Mark then gave a review of the work MMU have been doing around deeper exploration of NSS returns data and their VLE data. Both Jean and Mark commented that their work started without them actually realising they were "doing analytics". Mark’s analytics cycle diagram was really useful in illustrating their approach.

Screenshot of analytics cycle

Simon, on the other hand, very much knew that he was "doing analytics", and gave an overview of some of the learning analytics work currently being undertaken at the OU, including a quick look at some areas where FutureLearn could potentially be heading.

Throughout all the presentations, the key motivator has been, and continues to be, framing and then developing the "right" questions to get the most out of data collection activity and analysis.

More information, including links to the slides from the presentations, is available on the CETIS website.

Cetis Analytics Series Volume 2: Engaging with Analytics
(13 March 2013)

Our first set of papers around analytics in education has been published, and with nearly 17,000 downloads, it would seem that there is an appetite for resources around this topic. We are now moving onto the next phase of our exploration of analytics, and accompanying this will be a range of outputs including some more briefing papers and case studies. Volume 1 took a high-level view of the domain; volume 2 will take a much more user-centred view, including a number of short case studies sharing the experiences of a range of early adopters who are exploring the potential of taking a more analytics-based approach.

The first case study features Jean Mutton, Student Experience Project Manager at the University of Derby. Jean shares how her journey into the world of analytics started, and how and where she, and the colleagues she has been working with across the university, see the potential for analytics to have an impact on improving the student experience.

University of Derby, student engagement factors

The case study is available to download here.

We have a number of other case studies identified, which we’ll be publishing over the coming months; however, we are always looking for more examples. So if you are working with analytics and have some time to chat with us, we’d love to hear from you and share your experiences in this way too. Just leave a comment or email me (s.macneill@strath.ac.uk).

What can I do with my educational data? (#lak13)
(6 March 2013)

Following on from yesterday’s post, another "thought bomb" that has been running around my brain is something far closer to the core of Audrey’s "who owns your educational data?" presentation. Audrey was advocating the need for student-owned personal data lockers (see screenshot below). This idea also chimes with the work of the Tin Can API project and, closer to home in the UK, the MiData project. The latter is concerned more with generic data around utility and mobile phone usage than educational data, but the data locker concept is key there too.

Screen shot of Personal Education Data Locker (Audrey Watters)

As you will know, dear reader, I have turned into something of a MOOC-aholic of late. I am becoming increasingly interested in how I can make sense of my data and network connections in and across the courses I’m participating in and, of course, how I can access and use the data I’m creating in and across these "open" courses.

I’m currently not a very active member of the current LAK13 learning analytics MOOC, but the first activity for the course is, I hope, going to help me frame some of the issues I’ve been thinking about in relation to my educational data and, in turn, my personal learning analytics.

Using the framework for the first assignment/task for LAK13, this is what I am going to try and do.

1. What do you want to do/understand better/solve?

I want to compare what data about my learning activity I can access across three different MOOCs and the online spaces I have interacted in on each, and see if I can identify any potentially meaningful patterns or networks which would help me reflect on and better understand my learning experiences. I also want to explore how/if learning analytics approaches could help me in terms of contributing to my personal learning environment (PLE) in relation to MOOCs, and if it is possible to illustrate the different "success" measures from each course provider in a coherent way.

2. Defining the context: what is it that you want to solve or do? Who are the people that are involved? What are social implications? Cultural?

I want to see how/if I can aggregate my data from several MOOCs in a coherent open space and see what learning analytics approaches can be of help to a learner in terms of contextualising their educational experiences across a range of platforms.

This is mainly an experiment using myself and my data. I’m hoping that it might start to raise issues from the learner’s perspective which could have implications for course design, access to data, and thoughts around student-created and owned eportfolios and/or data lockers.

3. Brainstorm ideas/challenges around your problem/opportunity. How could you solve it? What are the most important variables?

I’ve already done some initial brainstorming around using SNA techniques to visualise networks and connections in the Cloudworks site, which the OLDS MOOC uses. Tony Hirst has (as ever) pointed the way to some further exploration, and I’ll be following up on Martin Hawksey’s recent post about discussion group data collection.

I’m not entirely sure about the most important variables just now, but one challenge I see is actually finding myself/my data in a potentially huge data set and finding useful ways to contextualise me using those data sets.

4. Explore potential data sources. Will you have problems accessing the data? What is the shape of the data (reasonably clean? or a mess of log files that span different systems and will require time and effort to clean/integrate?) Will the data be sufficient in scope to address the problem/opportunity that you are investigating?

The main issue I see just now is going to be collecting data, but I believe there is some data that I can access about each MOOC. The MOOCs I have in mind are primarily #edcmooc (Coursera) and #oldsmooc (OU). One seems to be far more open in terms of potential data access points than the other.

There will be some cleaning of data required, but I’m hoping I can "stand on the shoulders of giants" and re-use some Google Spreadsheet goodness from Martin.

I’m fairly confident that there will be enough data for me to at least understand the challenges around letting learners try to make more sense of their data; a rough sketch of the kind of aggregation I have in mind follows below.
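A minimal sketch of that aggregation, assuming each platform can give me a CSV export of my activity; the file names and column layout are assumptions, and each platform’s export would need its own mapping in practice:

```python
# Sketch: merge hypothetical activity exports from two MOOCs into one
# personal timeline, then count weekly activity per course.
import pandas as pd

edc = pd.read_csv("edcmooc_activity.csv", parse_dates=["timestamp"])
olds = pd.read_csv("oldsmooc_activity.csv", parse_dates=["timestamp"])
edc["course"] = "#edcmooc"
olds["course"] = "#oldsmooc"

timeline = pd.concat([edc, olds], ignore_index=True).sort_values("timestamp")
timeline["week"] = timeline["timestamp"].dt.to_period("W")

# Weekly activity counts per course: a first, crude personal dashboard.
print(timeline.groupby(["course", "week"]).size())
```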

5. Consider the aspects of the problem/opportunity that are beyond the scope of analytics. How will your analytics model respond to these analytics blind spots?

This project is far wider than just analytics, as it will hopefully help me make some more sense of the potential for analytics to help me, as a learner, make sense of and share my learning experiences in one place that I choose. Already I see Coursera, for example, trying to model my interactions on their courses into a space they have designed – and I don’t really like that.

I’m thinking much more about personal aggregation points/sources than the creation of an actual data locker. However, it may be that some existing eportfolio systems could provide the basis for that.

As ever I’d welcome any feedback/suggestions.

Whose data is it anyway?
(5 March 2013)

I’ve just caught up with the recent #etmooc webinar featuring Audrey Watters, titled "Who owns your education data?". At the start of her talk Audrey said she wanted to plant some "thought bombs" for participants. I’m not sure this post is particularly explosive, but her talk has prompted me to try and share some thoughts which have been mulling around my brain for a while now.

Audrey’s talk centred around personal data, and asked some very pertinent questions in relation to educational data, as well as the more general "data giveaway" we are all a part of when we all too quickly sign terms and conditions for various services. Like most people, I’ve never actually read all the terms and conditions of anything I’ve signed up for online.

Over the last year or so, I’ve been increasingly thinking about data and analytics (not just learning analytics) in education in general. And I keep coming back to the fundamental questions Audrey raises in the presentation around the who, what, why, where, when and how of data collection, access and (re)use. Audrey focuses on the issue from the individual point of view, and I won’t try to repeat her presentation – I would recommend you take half an hour to listen to it. One thought bomb that is ticking in my head is about data collection and use at the institutional level.

As more and more systems offer analytics packages, and in particular learning analytics solutions, are we sure that at an institutional level we can get the data from these systems when we want it, and in a format we want, rather than just being given data reports and/or dashboards? At this relatively early stage for learning analytics, are institutions in danger of unwittingly giving away their data to companies whose solutions suit today’s needs, without thinking about future requirements for access to and use of data? There is a recognised skills shortage of data scientists (not just in education), so at the moment it is often easier to buy an off-the-shelf solution. As we all become more data aware and (hopefully) data literate, our demands for access to data, and our abilities to do something useful with it, should develop too.

This is an issue John Campbell (Purdue University) raised in his presentation at the Surfnet Conference last November. We had several conversations about the potential for turning some of the terms and conditions for data on their head by having some (community-created and shared) clause which system vendors would have to agree to. Something along the lines of: "if we use your tool, we have the right to request all data being collected for return to the institution on a timely basis, in a format of our choice". I can see a clause like that being useful at a personal level too.

Wherever we sit, we need to keep asking the fundamental questions around the who, what, why, where, when and how of our data collection systems, policies and strategies in order to negotiate appropriate access.

Prototyping my Cloudworks profile page
(12 February 2013)

Week 5 in #oldsmooc has been all about prototyping. Now, I’ve not quite got to the stage of having a design to prototype, so I’ve gone back to some of my earlier thoughts around the potential for Cloudworks to be more useful to learners and to show alternative views of community, content and activities. I really think that Cloudworks has potential as a kind of portfolio/personal working space, particularly for MOOCs.

As I’ve already said, Cloudworks doesn’t have a hierarchical structure; it’s been designed to be more social and flexible, so its navigation is somewhat tricky, particularly if you are using it over a longer time frame than, say, a one- or two-day workshop. It relies on you as a user to tag and favourite clouds and cloudscapes, but even then, when you’re involved in something like a MOOC, that doesn’t really help you navigate your way around the site. However, Cloudworks does have an open API, and as I’ve demonstrated you can relatively easily produce a mind map view of your clouds, which makes it a bit easier to see your "stuff". And Tony Hirst has shown how, using the API, you can start to use visualisation techniques to show network views of various kinds.
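To give a flavour of what the API makes possible, here is a minimal sketch that fetches a user’s clouds and prints a simple outline that could feed a mind map view; the endpoint path, parameter names and response shape are my assumptions for illustration, not documented calls:

```python
# Sketch: pull a user's clouds from the Cloudworks API and print an outline
# grouped by cloudscape. Endpoint, auth and JSON shape are assumed here.
import requests

API_BASE = "http://cloudworks.ac.uk/api"  # assumed base URL
API_KEY = "YOUR_API_KEY"                  # assumed auth mechanism

def fetch_user_clouds(user_id):
    resp = requests.get(
        f"{API_BASE}/users/{user_id}/clouds.json",  # assumed endpoint
        params={"api_key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

def print_outline(clouds):
    # Group clouds under their cloudscapes to approximate a mind map.
    for cloud in clouds:
        print(cloud.get("title", "untitled"))
        for scape in cloud.get("cloudscapes", []):
            print("  -", scape.get("title", "untitled"))

print_outline(fetch_user_clouds(user_id=12345))  # hypothetical user id
```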

In a previous post I created a very rough sketch of how some of Tony’s ideas could be incorporated into a user’s profile page.

Potential Cloudworks Profile page

As part of the prototyping activity I decided to think a bit more about this and use Balsamiq (one of the tools recommended to us this week) to rough out some ideas in a bit more detail.

The main ideas I had were around redesigning the profile page so it was a bit more useful. Notifications would be really useful, so you could clearly see if anything had been added to any of your clouds or clouds you follow – a bit like Facebook. Also, one thing that does annoy me is the order of the list of my clouds and cloudscapes: it’s alphabetical, but what I really want at the top of the list is either my most recently created or most active cloud.

In the screenshot below you can see I have an extra click and scroll to get to my most recent cloud via the clouds list. What I tend to do is a bit of circumnavigation via my oldsmooc cloudscape, and hope I have added my clouds to it.

Screen shot of my cloud and cloudscape lists

I think the profile page could be redesigned to make better use of the space (perhaps losing the cloud stream, because I’m not sure it is really useful as it stands) and to have some more useful/usable views of my activity. The three main areas I thought we could start grouping are clouds, cloudscapes (which are already included) and a community dimension, so you can start to see who you are connecting with.

My first attempt:

screen shot of my first Cloudworks mock up

On reflection, though, tabs were not a great idea – and to be honest they were in the tutorial, so that’s probably why I used them :-)

But then I had another go and came up with something slightly different. Here is a video where I explain my thinking a bit more.

cloudworks profile page prototype take 2 from Sheila MacNeill on Vimeo.

Some initial comments from fellow #oldsmooc-ers included:

and you can see more comments in my cloud for the week as well as take 1 of the video.

This all needs a bit more thought – particularly around what is actually feasible in terms of performance and creating "live" visualisations, and indeed around what would actually be most useful. I’ve already been in conversation with Juliette Culver, the original developer of Cloudworks, about some of the more straightforward potential changes, like the re-ordering of cloud lists. I do think that with a bit more development along these lines, Cloudworks could become a very important part of a personal learning environment/portfolio.

Ghosts in the machine? #edcmooc
(8 February 2013)

Following on from last week’s post on the #edcmooc, the course itself has turned to explore the notion of MOOCs in the context of utopian/dystopian views of technology and education. The questions I raised in that post are still running through my mind; however, they were at a much more holistic than personal level.

This week, I’ve been really trying to think about things from my student (or learner) point of view. Are MOOCs really changing the way I engage with formal education systems? On the one hand, yes: they are allowing me (and thousands of others) to get a taste of courses from well-established institutions. At a very surface level, who doesn’t want to say they’ve studied at MIT/Stanford/Edinburgh? As I said last week, there’s no fee, so there’s less pressure in one sense to explore new areas, and if a course doesn’t suit you there’s no issue in dropping out – well, not for the student at this stage anyway. Perhaps in the future, through various analytical methods, serial drop-outs will be recognised by "the system" and not be allowed to join courses, or will have to start paying to be allowed in.

But on the other hand, is what I’m actually doing really different from what I did at school, when I was an undergraduate, or when I was a student on "traditional" online distance courses? Well no, not really. I’m reading selected papers and articles, watching videos, contributing to discussion forums – nothing I’ve not done before, and nothing presented to me in a way that I’ve not seen before. The "go to class" button on the Coursera site does make me giggle tho’, as it’s just soo American, and every time I see it I hear a disembodied American voice. But I digress.

The element of peer review for the final assignment for #edcmooc is something I’ve not done as a student, but it’s not a new concept to me. Despite more information on the site and from the team this week, I’m still not sure how this will actually work, and whether I’ll get my certificate of completion for just posting something online, or if there is a minimum number of reviews I need to get. Like many other fellow students, the final assessment is something we have been concerned about from day 1, which seemed to come as a surprise to some of the course team. During the end-of-week-1 Google hangout the team did try to reassure people, but surely they must have expected that we were going to look at week 5 and the "final assessment" almost before anything else? Students are very pragmatic: if there’s an assessment, we want to know the where, when, what, why, who and how as soon as possible. That’s how we’ve been trained (and I use that word very deliberately). Like thousands of others, my whole education career from primary school onwards centred around final grades and exams – so I want to know as much as I can, so I know what to do to pass and get that certificate.

That overriding response to any kind of assessment can very easily drown out the other, softer (but just as worthy) reasons for participation, and override the potential of social media to connect and share on an unprecedented level.

As I’ve been reading and watching more dystopian than utopian material, and observing the general MOOC debate taking another turn with the pulling of the Georgia Tech course, I’ve been thinking a lot about the whole experimental nature of MOOCs. We are all just part of a huge experiment right now, students and course teams alike. But we’re not putting very many new elements into the mix, and our pre-determined behaviours are driving our activity. We are, in a sense, all just ghosts in the machine. When we do try to do something different, participation can drop dramatically. I know that I, and lots of my fellow students on #oldsmooc, have struggled to actually complete project-based activities.

The community element of MOOCs can be fascinating, and the use of social network analysis can help to give some insights into activity, patterns of behaviour and connections. But with so many people on a course, is it really possible to make and sustain meaningful connections? From a selfish point of view, having my blog picked up by the #edcmooc news feed has greatly increased my readership and, more importantly, I’m getting comments, which is more meaningful to me than hits. I’ve tried to read other posts too, but in the first week it was really difficult to keep up, so I’ve fallen back on a very pragmatic, reciprocal approach. With so much going on you need strategies to cope, and there is quite a bit of activity around developing a MOOC survival kit which has come from fellow students.

As the course develops, the initial euphoria and social web activity may well be slowing down. Looking at the Twitter activity, it does look like it is on a downward trend.

#edcmooc Twitter activity diagram
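For anyone wanting to reproduce this kind of trend view from a hashtag archive (for example one collected with Martin Hawksey’s TAGS spreadsheet and exported to CSV), here is a minimal sketch; the file and column names are assumptions, and a real export would need mapping first:

```python
# Sketch: plot daily tweet volume for a hashtag from a CSV archive.
import pandas as pd
import matplotlib.pyplot as plt

tweets = pd.read_csv("edcmooc_tweets.csv", parse_dates=["created_at"])
daily = tweets.set_index("created_at").resample("D").size()

daily.plot(title="#edcmooc tweets per day")
plt.ylabel("tweets")
plt.tight_layout()
plt.savefig("edcmooc_trend.png")
```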

Monitoring this level of activity is still a challenge for course teams and students alike. This morning my colleague Martin Hawksey and I were talking about this, and speculating that maybe there are valuable lessons we in the education sector can learn from the commercial sector about managing "massive" online campaigns. Martin has also done a huge amount of work aggregating data, and I’d recommend looking at his blogs. This post is a good starting point.

Listening to the Google hangout session run by the #edcmooc team, they again seemed to have underestimated the time-sink reality of having 41,000 students on a course. Despite being upfront about not being everywhere, the temptation to look must be overwhelming. This was also echoed in the first couple of weeks of #oldsmooc. Interestingly, this week there are teaching assistants and students from the MSc course actively involved in the #edcmooc.

I’ve also been having a play with the data from the Facebook group. I’ve had a bit of interaction there, but not a lot. So despite it being a huge group, I don’t get the impression that, apart from people posting links to blogs for the news feed, there is a lot of activity or connection – which seems to be reflected in the graphs created from the data.

#edc Facebook group friends connections


This is a view based on friends connections. NB it was very difficult for a data novice like me to get any meaningful view of this group, but I hope that this gives the impression of the massive number of people and relative lack of connections.

There are a few more connections which can be drawn from the interactions data, and my colleague David Sherlock managed to create a view where some clusters are emerging – but with such a huge group it is difficult to read that much into the visualisation, apart from the fact that there are lots of nodes (people).

#edcmooc Facebook group interactions
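For anyone curious how such a view might be put together, here is a minimal sketch using NetworkX; the input format is an assumption, and the real Facebook export needed considerable cleaning before it looked anything like this:

```python
# Sketch: build an interaction graph from (member, member) pairs exported
# from a Facebook group, then take a first crude look at its structure.
import networkx as nx

interactions = [
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
    ("dave", "erin"), ("erin", "frank"),
    # ... thousands more pairs in a group this size
]

G = nx.Graph()
G.add_edges_from(interactions)

# Connected components give a rough sense of clustering: expect a giant
# component plus many barely-connected nodes in a large, quiet group.
components = sorted(nx.connected_components(G), key=len, reverse=True)
print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
print("largest component:", len(components[0]), "members")
```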


I don’t think any of this is unique to #edcmooc. We’re all just learning how to design, run and participate at this level. Technology is allowing us to connect and share at a scale unimaginable even 10 years ago – if we have access to it. (NB: there was a very interesting comment on my blog about us all being digital slaves.)

Despite the potential affordances of access at scale, it seems to me we are increasingly just perpetuating an existing system if we don’t take more time to understand the context and consequences of our online connections and communities. I don’t need to connect with 40,000 people, but I do want to understand more about how and why I could (and do). That would be a really new element to add to any course, not just MOOCs (and not something that’s just left to a course specifically about analytics). Unless that happens, my primary driver will be that "completion certificate". In this instance, and many others, to get that I don’t really need to make use of the course community. So I’m just perpetuating an existing system where I know how to play the game, even if its appearance is somewhat disguised.

Quick review of the Larnaca Learning Design Declaration
(8 January 2013)

Late last month the Larnaca Declaration on Learning Design was published. Being "that time of year", I didn’t get round to blogging about it at the time. However, as it’s the new year and the OLDS MOOC is starting this week, I thought it would be timely to have a quick review of the declaration.

The wordle gives a flavour of the emphasis of the text.

Wordle of Larnaca Declaration on Learning Design

First off, it’s actually more of a descriptive paper on the development of research into learning design than a set of statements declaring intent or a call for action. As such, it is quite a substantial document. Setting the context and sharing the outcomes of over 10 years’ worth of research is very useful, and for anyone interested in this area I would say it is definitely worth taking the time to read. Even for an "old hand" like me, it was useful to recap some of the background and core concepts. It states:

“This paper describes how ongoing work to develop a descriptive language for teaching and learning activities (often including the use of technology) is changing the way educators think about planning and facilitating educational activities. The ultimate goal of Learning Design is to convey great teaching ideas among educators in order to improve student learning.”

One of my main areas of involvement with learning design has been around interoperability and the sharing of designs. Although the IMS Learning Design specification offered great promise of technical interoperability, there were a number of barriers to implementing the full potential of the specification, and indeed expectations of what the spec actually did were somewhat over-inflated – something I reflected on way back in 2009. However, the sharing of design practice and of designs themselves has developed, and this is something we at CETIS have tried to promote and move forward through our work in the JISC Design for Learning Programme (in particular with our mapping of designs report), the JISC Curriculum Design and Delivery Programmes, and our Design Bashes: 2009, 2010, 2011. I was very pleased to see the Design Bashes included in the timeline of developments in the paper.

James Dalziel and the LAMS team have continually shown how designs can be easily built, run, shared and adapted. However, having one language or notation system is still a goal in the field. During the past few years, though, much of the work has concentrated on understanding the design process and on helping teachers find effective tools (online and offline) to develop new(er) approaches to teaching practice and share those with the wider community. Viewpoints, LDSE and the OULDI projects are all good examples of this work.

The declaration uses the analogy of the development of musical notation to explain the need for, and aspirations of, a design language which can be used to share and reproduce ideas – or in this case, lessons. Whilst still a conceptual idea, this is maybe one of the closest analogies with universal understanding. Developing such a notation system is still a challenge, as the paper highlights.

The declaration also introduces a Learning Design Conceptual Map which tries to “capture the broader education landscape and how it relates to the core concepts of Learning Design“.

Learning Design Conceptual Map

These concepts include pedagogic neutrality, pedagogic approaches/theories and methodologies, the teaching lifecycle, granularity of designs, guidance and sharing. The paper puts these core concepts forward as the foundations of a framework for learning design which, combined with the conceptual map and actual practice, provides a "new synthesis for the field of learning design" and future developments.

Components of the field of Learning Design

So what next? The link between learning analytics and learning design was highlighted at the recent UK SoLAR Flare meeting. Will having more data about interactions/networks help develop design processes and ultimately improve the learning experience for students? What about the link with OERs? Content always needs context, and using OERs effectively intrinsically means having effective learning designs, so maybe now is a good time for the OER community to engage more with the learning design community.

The Declaration is a very useful summary of where the Learning Design community is to date, but what is always needed is more time for practising teachers to engage with these ideas, to allow them to start engaging with the research community and the tools and methodologies it has been developing. The Declaration alone cannot do this, but it might act as a stimulus for existing and future developments. I’d also be up for running another Design Bash if there is enough interest – let me know in the comments if you are interested.

The OLDS MOOC is a another great opportunity for future development too and I’m looking forward to engaging with it over the next few weeks.

Some other useful resources:
*Learning Design Network Facebook page
*PDF version of the Declaration
*CETIS resources on curriculum and learning design
*JISC Design Studio

Institutional Readiness for Analytics – practice and policy
(20 December 2012)

So far in our Analytics Series we have been setting out the background, history and context of analytics in education at fairly broad and high levels. Developing policy and getting strategic buy-in is critical for any successful project (analytics-based or not), so we have tried to highlight issues which will be of use to senior management in terms of the broader context and value of analytics approaches.

Simon Buckingham Shum at the OU (a key figure in the world of learning analytics) has also just produced a Learning Analytics Policy Brief for the UNESCO Institute for Information Technologies in Education. Focusing specifically on learning analytics, Simon’s paper highlights a number of key issues around "the limits of computational modelling, the ethics of analytics, and the educational paradigms that learning analytics promote". It is another welcome addition to the growing literature on learning analytics, and a useful complementary resource to the CETIS series. I would recommend it to anyone interested in this area.

Moving from policy to practicalities is the focus of our next paper, Institutional Readiness for Analytics. Written by Stephen Powell (with a little bit of input from me), this paper drills down from policy-level decisions to the more pragmatic issues faced by staff in institutions who want to start making some sense of their data through analytics-based techniques. It presents two short case studies (from the University of Bolton and the Open University) outlining the different approaches each institution has taken to make more sense of the data they have access to, and how that can begin to have an impact on key decisions around teaching, learning and administrative processes.

The OU is probably slightly “ahead of the game” in terms of data collection and provisioning and so their case study focuses more on staff development issues through their Data Wrangler Project, whereas the University of Bolton case study looks more at how they are approaching data provisioning issues. As the paper states, although the two approaches are very different “they should be considered as interrelated with each informing the work of the other in a process of experimentation leading to the development of practices and techniques that meet the needs of the organisation.”

As ever if you have thoughts or any experiences of using analytics approaches in your institution, we’d love to hear from you in the comments.

The paper is available for download here, and the other papers in the series are available here.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/12/20/institutional-readiness-for-analytics-practice-and-policy/feed/ 0
UK SoLAR meeting feedback http://blogs.cetis.org.uk/sheilamacneill/2012/12/06/uk-solar-meeting-feedback/ http://blogs.cetis.org.uk/sheilamacneill/2012/12/06/uk-solar-meeting-feedback/#comments Thu, 06 Dec 2012 10:33:28 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1946 Last month, in collaboration with colleagues at the OU, we co-hosted the inaugural UK SoLAR Flare. A number of blogs, pictures and videos of the day are available on the SoLAR website.

This was the first meeting in the UK focusing on learning analytics and as such we had quite a broad cross section of attendees. We’ve issued a small survey to get some feedback from delegates, and many thanks to all the attendees who completed it. We had 20 responses in total and you can access collated results of the survey from here.

Overall, 100% of respondents found the day either very useful or useful, which is always a good sign and bodes well for the beginnings of a new community of practice and future meetings.

The need for staff development and a range of new skills is increasingly being identified as critical for successful analytics projects, and is an underlying theme of our current Analytics Series. The role of the Data Scientist is increasingly recognised as a key one, both in the “real” world and in academia. So what roles did our attendees have? Well, we did have one data scientist, but perhaps not that surprisingly the most common role was that of Learning Technologist, with 5 people. The full results were as follows:

*Learning technologist – 5
*Manager – 3
*Lecturer – 3
*Developer – 3
*Researcher – 3
*Data scientist – 1
*Other – 2

(“Other” answers: “director/agile manager”, “sort of learning technologist but also training”.)

So, a fair spread of roles, which again bodes well for the development of teams with the skills needed to deliver successful analytics projects.

We also asked attendees to share the main idea that they took away from the day. Below is a selection of responses.

“That people are in the early stages of discussion.”

“Learning analytics needs to reach out end-users”

“The overall idea was how many people are in the same position and that the field is in a very experimental stage. This improves the motivation to be experimental.”

“more a better understanding of the current status than a particular idea. But if I had to chose one idea it is the importance of engaging students in the process.”

“Early thoughts on how learning analytics could be used in the development of teaching staff.”

“That HE is on the cusp of something very exciting and possibly very enlightening regarding understanding the way students learn. BUT the institution as a whole needs to be commited to the process, and that meaningful analysis of the mass of potential data that is ‘out there’, is going to be critical. There is also the very important issues of ethics and who is going to do what with the data………I could go on, and on, and on…….”

Suggestions for further meetings included:

“It would be great to involve more academic teaching staff and students in future meetings.”

“I think bringing together the different stakeholders (technologists, teachers, students, data scientists, statisticians) is a great feature for this group. It is easy to break into silos and forget the real end-user. Having more students involved would be great.”

“An international project exchange. Have, say, 10 – 15 lightning talks. Then organise a poster session with posters corresponding to the lightning talks. People whose interest was drawn by one project or another will have the chance to follow up on that project for further information. Also maybe an expert panel (with people that have experience with putting learning analytics into educational practice) that can answer questions sent in beforehand by people wanting to set up a learning analytics project/activity. This can also be done Virtually”

“Would really welcome the opportunity to have a ‘hands on’ session possibly focussing upon the various dashboards that are out there.”

You can access the full results at the SoLAR website.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/12/06/uk-solar-meeting-feedback/feed/ 0
Analytics for Understanding Research http://blogs.cetis.org.uk/sheilamacneill/2012/12/04/analytics-for-understanding-research/ http://blogs.cetis.org.uk/sheilamacneill/2012/12/04/analytics-for-understanding-research/#comments Tue, 04 Dec 2012 11:07:46 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1941 After a bit of exploration of the history, meanings and definitions of analytics from Adam Cooper, today our Analytics Series continues with the Analytics for Understanding Research paper (by Mark Van Harmelen).

Research and research management are key concerns for Higher Education, and indeed the wider economy. The sector needs to ensure it is developing, managing, and sharing research capacity, capabilities, reputation and impact as effectively and efficiently as possible.

The use of analytics platforms has the potential to impact all aspects of research practice: from individual researchers sharing and measuring their performance, to institutional management and planning of research projects, to funders making decisions about funding areas.

The “Analytics for Understanding Research” paper focuses on analytics as applied to “the process of research, to research results and to the measurement of research.” The paper highlights exemplar systems, metrics and analytic techniques backed by evidence in academic research, the challenges in using them and future directions for research. It points to the need for the support and development of high quality, timely data for researchers to experiment with in terms of measuring and sharing their reputation and impact, and the wider adoption of platforms which utilise publicly available (and funded) data to inform and justify research investment.

Some key risks involved in the use of analytics to understand research highlighted in the paper are:

*Use of bibliometric indicators as the sole measure of research impact or over-reliance on metrics without any understanding of the context and nature of the research.
*Lack of understanding of analytics, and of the advantages and disadvantages of different indicators, on the part of the users of those indicators. Managers and decision makers may lack the background needed to interpret existing analytics sensitively.
*The suitability of target-based assessment based on analytics is unproven; the paper tentatively recommends a wider assessment approach (set out in most detail on page 29).
*There is a danger of one or a few vendors supplying systems that impose a particular view of analytics on research management data.

However it also points to some key opportunities including:

*Access to high-quality timely analytics may enable professionals to gauge their short-term performance, and use experimentation to discover new and novel ways to boost their impact.
*Adoption of CERIF-based CRIS across UK HE institutions and research institutes, with automatic retrieval of public data by UK Research Councils, may help motivate increases in public funding of scientific and other scholarly activity, which is vitally important to the UK economy and national economic growth.
*Training in the advantages, limitations and applicability of analytics may assist its lay users – including researchers, research managers, and those responsible for policy and direction in institutions and beyond – in using analytics effectively.

As ever, if you have any thoughts or experiences you’d like to share, please do so in the comments.

The paper is available to download here.

The papers published to date in the series are all available here.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/12/04/analytics-for-understanding-research/feed/ 0
Legal, Risk and Ethical Aspects of Analytics in Education http://blogs.cetis.org.uk/sheilamacneill/2012/11/27/legal-risk-and-ethical-aspects-of-analytics-in-education/ http://blogs.cetis.org.uk/sheilamacneill/2012/11/27/legal-risk-and-ethical-aspects-of-analytics-in-education/#comments Tue, 27 Nov 2012 09:17:24 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1936 After some initial feedback on the CETIS Analytics Series, we’ve had a wee re-think of our publication schedule, and today we launch “Legal, Risk and Ethical Aspects of Analytics in Education”, written by David Kay (Sero Consulting), Naomi Korn and Professor Charles Oppenheim.

As all researchers are only too well aware, any practice involving data collection and reuse has inherent ethical and legal implications of which institutions must be cognisant. Most institutions have guidelines and policies in place for the collection and use of research data. However, the gathering of usage data, primarily from internal systems, is an area where it is far less commonplace for institutions to have legal and ethical guidelines. As with many developments in technology, the law has not kept pace.

The “Legal, Risk and Ethical Aspects of Analytics in Higher Education” paper provides a concise overview of legal and ethical concerns in relation to analytics in education. It outlines a number of legal factors which impinge on analytics for education, in particular:

* Data Protection
* Confidentiality & Consent
* Freedom of Information
* Intellectual Property Rights
* Licensing for Reuse.

The paper also recommends a set of common principles which have universal application.

*Clarity; open definition of purpose, scope and boundaries, even if that is broad and in some respects open-ended,

*Comfort & care; consideration for both the interests and the feelings of the data subject and vigilance regarding exceptional cases,

*Choice & consent; informed individual opportunity to opt-out or opt-in,

*Consequence & complaint; recognition that there may be unforeseen consequences and therefore provision of mechanisms for redress.

Being aware of the legal and ethical implications of any activity requiring data collection is fundamental before undertaking any form of data analysis activity, and we hope this paper will be of use in helping inform and develop practice. As ever, if you have any comments/ examples please use the comments section to share them with us.

The paper is available to download here.

The papers published so far in the series are:

*Analytics, What is Changing and Why does it Matter?
*Analytics for the Whole Institution; Balancing Strategy and Tactics
*Analytics for Learning and Teaching

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/11/27/legal-risk-and-ethical-aspects-of-analytics-in-education/feed/ 2
Analytics for Teaching and Learning http://blogs.cetis.org.uk/sheilamacneill/2012/11/23/analytics-for-teaching-and-learning/ http://blogs.cetis.org.uk/sheilamacneill/2012/11/23/analytics-for-teaching-and-learning/#comments Fri, 23 Nov 2012 09:17:48 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1928 It’s all been about learning analytics for me this week. Following the SoLAR UK meeting on Monday, I’m delighted to announce that the next paper in the CETIS Analytics Series, “Analytics for Teaching and Learning”, launches today.

Building on from “Analytics for the Whole Institution, balancing strategy and tactics“, this paper (written by Mark Van Harmelen and David Workman) takes a more in-depth look at issues specifically related to applying analytics in teaching and learning.

The Analytics for Teaching and Learning paper examines:

” the use of analytics in education with a bias towards providing information that may help decision makers in thinking about analytics in their institutions. Our focus is pragmatic in providing a guide for this purpose: we concentrate on illustrating uses of analytics in education and on the process of adoption, including a short guide to risks associated with analytics.”

Learning analytics is an emerging field of research and holds many promises of improving engagement and learning. I’ve been following developments with interest and, I hope, a healthy level of scepticism and optimism. A number of VLEs (or LMSs if you’re in North America) now ship with built-in analytics features, aka dashboards. However, as I pointed out in the “Analytics, what is changing and why does it matter?” paper, there really isn’t a “magic analytics” button which will suddenly create instantly engaged students and better results. Effective use and sense-making of any data requires careful consideration. You need to think very carefully about the question(s) you want the data to help you answer, and then ensure that results are shared with staff and students in ways that allow them to gain “actionable insights”. Inevitably, the more data you gather, the more questions you will ask. As Adam summarised in his “how to do analytics right” post, a simple start can be best. This view was echoed in discussions during the SoLAR meeting on Monday.
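
To make “start simple” concrete, here is a minimal sketch of the kind of small, question-driven indicator a team might begin with: flagging students whose VLE activity falls well below that of their cohort. It is written in Python purely for illustration; the student names, figures and threshold are all invented, and this is emphatically not a recipe from the paper.

    from statistics import median

    # Hypothetical weekly VLE login counts per student (invented figures).
    weekly_logins = {"student_a": 14, "student_b": 6, "student_c": 9, "student_d": 1}

    cohort_median = median(weekly_logins.values())

    # Arbitrary, purely illustrative cut-off: flag anyone below a quarter of the median.
    threshold = 0.25 * cohort_median

    flagged = sorted(s for s, n in weekly_logins.items() if n < threshold)
    print(f"cohort median: {cohort_median}; flagged for follow-up: {flagged}")

Even a toy indicator like this surfaces the questions raised above: why logins rather than a richer measure, why that threshold, and who sees (and acts on) the flag?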

Starting at a small scale, developing teams, sharing data in meaningful ways, and developing staff and student skills and literacies are all crucial to successful analytics projects. The need for people who combine data handling and interpretation skills with, in an educational setting, pedagogic understanding is becoming more apparent. As the paper points out,

“There are a variety of success factors for analytics adoption. Many of them are more human and organisational in nature than technical. Leadership and organisational culture and skills matter a lot.”

Again if you have any thoughts/experiences to share, please feel free to leave a comment here.

The paper can be downloaded from here.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/11/23/analytics-for-teaching-and-learning/feed/ 1
Quick links from SoLAR Flare meeting http://blogs.cetis.org.uk/sheilamacneill/2012/11/20/quick-links-from-solar-flare-meeting/ http://blogs.cetis.org.uk/sheilamacneill/2012/11/20/quick-links-from-solar-flare-meeting/#comments Tue, 20 Nov 2012 08:52:08 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1906 So we lit the UK SoLAR Flare in Milton Keynes yesterday, and I think it is going to burn brightly for some time. This post is just a quick round up of some links to discussions/blogs/tweets and pics produced over the day.

Overviews of the presentations and discussions were captured in live blogs by Myles Danson (JISC Programme Manager for our Analytics Series) and Doug (master of the live blog) Clow of the OU. Great overviews of the day – thanks guys!

And of course we have some twitter analytics, thanks to our very own Martin Hawksey’s TAGS archive for #FlareUK and the obligatory network diagram of the twitter stream (click the image to see a larger, interactive version).

#FlareUK hashtag user community network
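
For anyone wondering how such a diagram is put together, the sketch below shows the general idea: treat each account as a node and each @mention as a directed edge. This is not Martin’s actual code, and the archive rows are invented stand-ins for a TAGS spreadsheet export.

    import re
    import networkx as nx

    # Invented rows standing in for a TAGS archive export.
    tweets = [
        {"from_user": "sheilmcn", "text": "Great discussion at #FlareUK with @mhawksey"},
        {"from_user": "mhawksey", "text": "@sheilmcn the #FlareUK archive is now up"},
    ]

    G = nx.DiGraph()
    for tweet in tweets:
        author = tweet["from_user"].lower()
        for mention in re.findall(r"@(\w+)", tweet["text"]):
            G.add_edge(author, mention.lower())  # edge: author -> mentioned account

    # Degree gives a quick view of the most-connected accounts in the stream.
    print(sorted(G.degree, key=lambda pair: pair[1], reverse=True))

From a graph like this, layout and interactivity are what turn the raw edge list into a picture like the one above.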

Slides from the morning presentations and subsequent group discussions are available from the SoLAR website, and videos of the morning presentations will be available there soon too.

As a taster of the day – here’s a little video of what went on.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/11/20/quick-links-from-solar-flare-meeting/feed/ 0
Analytics for the Whole Institution; Balancing Strategy and Tactics http://blogs.cetis.org.uk/sheilamacneill/2012/11/19/analytics-for-the-whole-institution-balancing-strategy-and-tactics/ http://blogs.cetis.org.uk/sheilamacneill/2012/11/19/analytics-for-the-whole-institution-balancing-strategy-and-tactics/#comments Mon, 19 Nov 2012 07:49:09 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1892 Following on from last week’s introductory and overview briefing paper, Analytics, what is changing and why does it matter?, this week we start to publish the rest of our series, beginning with “Analytics for the Whole Institution; Balancing Strategy and Tactics” (by David Kay and Mark van Harmelen)

Institutional data collection and analysis is not new: most Higher Education Institutions and Further Education Colleges routinely collect data for a range of purposes, and many are using Business Intelligence (BI) as part of their IT infrastructure.

This paper takes an in-depth look at some of the issues which “pose questions about how business intelligence and the science of analytics should be put to use in customer facing enterprises”.

The focus is not on specific technologies, rather on how best to act upon the potential of analytics and new ways of thinking about collecting, sharing and reusing data to enable high value gains in terms of business objectives across an organisation.

There are a number of additional considerations when trying to align BI solutions with some of the newer approaches now available for applying analytics across an organisation. For example, it is not uncommon for there to be a disconnect between gathering data from centrally managed systems and from specific teaching and learning systems such as VLEs. So at a strategic level, decisions need to be taken about overall data management, sharing and re-use, e.g. which systems hold the most useful/valuable data? What formats is it available in? Who has access to the data, and how can it be used to develop actionable insights? To paraphrase from a presentation I gave with my colleague Adam Cooper last week, “how data ready and capable is your organisation?”, both in terms of people and systems.

As well as data considerations, policies (both internal and external) need to be developed for the ethical use of data, and for developing staff and the wider organisational culture towards data-informed practices. Of course, part of the answer to these issues lies in the sharing and development of practice through organisations such as JISC. The paper highlights a number of examples of JISC-funded projects.

Although the paper concentrates mainly on HEIs, many of the same considerations are relevant to Further Education colleges. Again, we see this paper as a step in widening participation and identifying areas for further work.

At an overview level the paper aims to:

*Characterise the educational data ecosystem, taking account of both institutional and individual needs
*Recognise the range of stakeholders and actors – institutions, services (including shared above-campus and contracted out), agencies, vendors
*Balance strategic policy approaches with tactical advances
*Highlight data that may or may not be collected
*Identify opportunities, issues and concerns arising

As ever we’d welcome feedback on any of the issues raised in the paper, and sharing of any experiences and thoughts in the comments.

The paper is available to download from here.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/11/19/analytics-for-the-whole-institution-balancing-strategy-and-tactics/feed/ 1
Analytics, what is changing and why does it matter? http://blogs.cetis.org.uk/sheilamacneill/2012/11/14/analytics-what-is-changing-and-why-does-it-matter/ http://blogs.cetis.org.uk/sheilamacneill/2012/11/14/analytics-what-is-changing-and-why-does-it-matter/#comments Wed, 14 Nov 2012 07:00:24 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1859 A couple of tricky questions in that title, but hopefully some answers are provided in a series of papers we are launching today.

The CETIS Analytics Series consists of 11 papers, written by a range of our staff (and some commissioned pieces) looking at a range of topics relevant to Analytics in education. The series is intended to provide a broad landscape of the history, context, issues and technologies of Analytics in post 16 education, and in particular the UK context.

As the diagram below illustrates, the series covers four main areas: “big issues”, which consists of in-depth reports on issues relating to the whole institution, including ethical and legal aspects, learning and teaching, and research management; “history and context”, which looks at the history and development of analytics more generally; “practice”, which looks at some of the issues around implementing analytics, particularly in HE institutions; and “technology”, which reviews a number of technologies and tools available just now.

The Cetis Analytics Series Graphic
(click graphic to see larger image)

The series provides a background, critique and pointers to current and future developments to help managers and early adopters develop their thinking and practice around the use of analytics. As Adam Cooper highlights:

“Analytics is the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data.”

We hope that the papers will help people not only to identify actionable insights, but also to develop the processes and, more importantly, the staff/student skills and literacies needed to produce measurable impacts across the range of activities undertaken in educational organisations such as universities and colleges. As Nate Silver demonstrated in the recent US election, it’s not just having the data, it’s being able to make sense of it and communicate findings effectively that makes the difference.

Given that this is a rapidly developing field, it is impossible to cover everything, but we hope that the papers will provide a solid basis for discussion and pointers for further work. Of course, as well as the papers, we continue to report on our work and thoughts around data and analytics. For example, over the past month or so, Sharon Perry has been summarising a number of significant outputs and findings from the JISC Relationship Management Programme on her blog. Next week we co-host the inaugural UK SoLAR Flare with colleagues from the OU (UK), which will provide another opportunity to help identify key areas for further research and collaboration.

We’ll be publishing the papers between now and early January, and each will have an accompanying blog post providing a bit more context and the opportunity for feedback and discussion. Below is a list of titles with the week of publication.

* Analytics for the Whole Institution; Balancing Strategy and Tactics (19th November)
* Analytics for Learning and Teaching (22 November)
* Analytics for Understanding Research (22 November)
* What is Analytics? Definition and Essential Characteristics (4 December)
* Legal, Risk and Ethical Aspects of Analytics in Higher Education (4 December)
* A Framework of Characteristics for Analytics (18 December)
* Institutional Readiness for Analytics (19 December)
* A Brief History of Analytics (8 January)
* The Implications of Analytics for Teaching Practice in Higher Education (8 January)
* Infrastructure and Tools for Analytics (15 January)

Today we start with a briefing paper which provides an overview and sets the context for the series. You can download the paper from the link below.

*Analytics, what is changing and why does it matter? briefing paper.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/11/14/analytics-what-is-changing-and-why-does-it-matter/feed/ 1
Lighting the UK SoLAR Flare http://blogs.cetis.org.uk/sheilamacneill/2012/09/18/lighting-the-uk-solar-flare/ http://blogs.cetis.org.uk/sheilamacneill/2012/09/18/lighting-the-uk-solar-flare/#comments Tue, 18 Sep 2012 05:30:37 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1765 I’m delighted to announce that CETIS and the OU are co-sponsoring the first UK SoLAR Flare meeting on Monday 19th November in Milton Keynes.

This is the first UK gathering dedicated to the field of Learning Analytics, held under the auspices of SoLAR (Society for Learning Analytics Research).

Part of SoLAR’s mission is to improve the quality of dialogue within and across the many stakeholders impacted by Learning Analytics. Flare events are “a series of regional practitioner-focused events to facilitate the exchange of information, case studies, ideas, and early stage research.”

We are therefore inviting technology specialists, researchers, educators, ICT purchasing decision-makers, senior leaders, business intelligence analysts, policy makers, funders, students, and companies to join us in Milton Keynes for this inaugural event.

We’ve designed the day to maximise social learning with plenty of opportunity to meet with peers and explore collaboration possibilities, the chance to hear — and share — lightning updates on what’s happening and the opportunity to shape future Flares.

So if you’re involved in any aspect of analytics and want to share your work, or would just like to find out more, join us in Milton Keynes. The event is free to attend, but places are limited, so book quickly.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/09/18/lighting-the-uk-solar-flare/feed/ 1
Big data, learning analytics, a crack team from the OU . . . and me http://blogs.cetis.org.uk/sheilamacneill/2012/09/14/big-data-learning-analytics-a-crack-team-from-the-ou-and-me/ http://blogs.cetis.org.uk/sheilamacneill/2012/09/14/big-data-learning-analytics-a-crack-team-from-the-ou-and-me/#comments Fri, 14 Sep 2012 07:57:36 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1758 Yesterday I was part of a panel in the Big Data and Learning Analytics Symposium at ALT-C. Simon Buckingham Shum, Rebecca Ferguson, Naomi Jeffery, Kevin Mayles and Richard Nurse – the “crack team” from the OU – gave a really useful overview of the range of work they are all undertaking at the OU. Simon’s blog has details of the session and our introductory slides.

We were pleasantly surprised by the number of delegates who came to the session, given we were scheduled at the same time as yesterday’s invited speakers Professor Mark Stubbs and Sarah Porter. The level of discussion and interest indicated the growing realisation of the potential and the challenges of analytics across the education sector.

As ever it is hard to report effectively on a discussion session however a few issues which seemed to resonate with everyone in the room were:

*the danger of recommendation systems reducing and not extending choice 
*data driven v data deterministic decision making
*the difference between measuring success and success in learning – they are not the same
*the danger of “seduction by stats” by senior management
*the need for the development of new skill sets and roles within institutions, based on data science but with the ability to communicate with all staff to help question the data.
*the increased need for development of statistical literacy for all staff and students 
*the potential for learning analytics in terms of expanding the flipped classroom model allowing teachers and students more time for sense making and actually thinking about the teaching and learning process.

Many of these issues will be covered in a series of papers we will be releasing next month as part of our Reconnoitre work. And the discussions will be continued at a SoLAR meeting in November which we are co-hosting with the OU (more details on that in the next few days).

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/09/14/big-data-learning-analytics-a-crack-team-from-the-ou-and-me/feed/ 3
Confronting Big Data and Learning Analytics @ #altc2012 http://blogs.cetis.org.uk/sheilamacneill/2012/09/07/confronting-big-data-and-learning-analytics-altc2012/ http://blogs.cetis.org.uk/sheilamacneill/2012/09/07/confronting-big-data-and-learning-analytics-altc2012/#comments Fri, 07 Sep 2012 08:10:18 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1740 Next Thursday morning I’m participating in the Big Data and Learning Analytics symposium at ALT-C 2012 with colleagues from the OU: Simon Buckingham Shum, Rebecca Ferguson, Naomi Jeffery, Kevin Mayles and Richard Nurse.

The session will start with a brief overview from me of the analytics reconnoitre that CETIS is currently undertaking for JISC, followed by short overviews from different parts of the OU of a number of analytics projects and initiatives being undertaken there. We hope the session will:

“air some of the critical arguments around the limits of decontextualised data and automated analytics, which often appear reductionist in nature, failing to illuminate higher order learning. There are complex ethical issues around data fusion, and it is not clear to what extent learners are empowered, in contrast to being merely the objects of tracking technology. Educators may also find themselves at the receiving end of a new battery of institutional ‘performance indicators’ that do not reflect what they consider to be authentic learning and teaching.”

We’re really keen to have an open discussion with delegates and to engage with their views and experiences in relation to big data and learning analytics. So come and join us bright and early (well, 9am) on Thursday. If you can’t make the session but have some views or experiences, please feel free to leave comments here and I’ll do my best to raise them at the session and in my write-up of it.

More information about the session is available here.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/09/07/confronting-big-data-and-learning-analytics-altc2012/feed/ 0
Some thoughts on web analytics using our work on analytics http://blogs.cetis.org.uk/sheilamacneill/2012/06/07/some-thoughts-on-web-analytics-uisng-our-work-on-analytics/ http://blogs.cetis.org.uk/sheilamacneill/2012/06/07/some-thoughts-on-web-analytics-uisng-our-work-on-analytics/#comments Thu, 07 Jun 2012 12:05:44 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1602 As I’ve mentioned before, CETIS are in the middle of a piece of work for JISC around analytics in education (our Analytics Reconnoitre project). You may have noticed a number of blog posts from myself and colleagues on various aspects of analytics in education. We think this is a “hot topic” – but is it? Can our own analytics help us gauge interest?

CETIS, like many others, is increasingly using Google Analytics to monitor traffic on our website. We are also lucky to have, in Martin Hawksey, a resident Google Spreadsheet genius. Since Martin came to work with us, we have been looking at ways we can use some of his “stuff” to help develop our communications work and gain more of an understanding of how people interact with our overall web presence.

As part of the recent CETIS OER visualisation project, Martin explored ways of tracking social sharing of resources. Using this technique, Martin has adapted one of his spreadsheets so that it not only takes in Google Analytics data from our CETIS blog posts, but also combines it with the number of shares a post is getting from these social sharing sites: Buzz, Reddit, StumbleUpon, Digg, Pinterest, Delicious, Google+, Facebook and Twitter. By adding the RSS feed from our Analytics topic area, we get a table like this, which combines the visit and comment information with the number of shares a post gets on each of the sharing sites.

social sharing stats for JISC CETIS analytics topic feed

(NB Martin’s blog is not hosted on our CETIS server, so we can’t automagically pull in his page view info in this way, which is why there is a 0 value in the page view column for his posts – but I think we can safely say that he gets quite a few page views.)
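
For the curious, the essence of the mash-up is a join on the post URL: page views from one source, per-network share counts from another, merged into a single table. A minimal sketch (plain Python standing in for the spreadsheet formulas, with invented URLs and figures) might look like this:

    import csv

    # Invented figures; in the real spreadsheet these come from Google Analytics
    # and from the sharing sites' own count services.
    page_views = {
        "http://example.org/post-1": 342,
        "http://example.org/post-2": 128,
    }
    share_counts = {
        "http://example.org/post-1": {"twitter": 57, "linkedin": 9, "delicious": 3},
        "http://example.org/post-2": {"twitter": 21, "linkedin": 4, "delicious": 0},
    }
    networks = ["twitter", "linkedin", "delicious"]

    # Join the two sources on URL and write one row per post.
    with open("post_metrics.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["url", "page_views", *networks])
        for url, views in page_views.items():
            counts = share_counts.get(url, {})
            writer.writerow([url, views, *(counts.get(n, 0) for n in networks)])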

From this table it is apparent that Twitter is the main way our posts are being shared. LinkedIn comes in second, with Delicious and Google+ also generating a few “shares”; the others get virtually no traffic. We already knew that Twitter is a key amplification tool for us, and again Martin’s magic has allowed us to create a view of the top click-throughs from Twitter on our blog posts.

JISC CETIS Top twitter distributers

We could maybe be accused of playing the system, as you can see a number of our top re-tweeters are staff members – but if we can’t promote our own stuff, then we can hardly expect anyone else to!

But I digress; back to the main point. We can now get an overview of traffic on a particular topic area and see not only the number of visits and comments it is getting but also where else it is being shared. We can then start to make comparisons across topic areas.

This is useful on a number of levels beyond basic web stats. Firstly, it gives us another view on how our audience shares and values our posts. I think we can say that if someone bookmarks a post, they place some value on it. I would hesitate to quantify what that value is, but increasingly we are being asked about ROI, so it is something we need to consider. Similarly with re-tweets: if something is re-tweeted, then people want to share that resource and feel that it is of value to their Twitter network. I don’t see a lot of bot retweets in my network. It also allows us to share and evaluate more information, not only internally but also with our funders and (through posts like this) our community.

It also raises some wider questions about resource sharing and web analytics in general. Martin raised this issue last year with this post, which sparked this reply from me. The questions I raised there are still on my mind, and increasingly, as I explore this more in the context of CETIS, I think I am beginning to see more evidence of the habits and practice of our community.

Twitter is a useful dissemination channel and increasingly a key way for peer sharing of information. Use of other social sharing sites would appear to be much lower, though I was surprised to see relatively high numbers for LinkedIn. This might be down to the “professional” nature of LinkedIn – or the fact that I am an unashamed social media tart and repost all my blog posts on LinkedIn too :-) We also have sharing buttons on the bottom of our posts, with very obvious buttons for Twitter, LinkedIn and Facebook.

As for the other social sharing sites, is their low showing just a question of people’s own work practices and digital literacies? Are these spaces seen as more private? Or is it that people still don’t really use them that much – did the Delicious debacle affect our trust in such sites? Should we encourage more sharing by having more obvious buttons for the other sites listed in the table? And, more importantly, should JISC and its funded services and projects be looking to these sites for more measures of impact and engagement? Martin’s work illustrates how you can relatively easily combine data from different sources, and now that there are some templates available there really isn’t a huge time cost to adapting them – but are they gathering the relevant data? Do we need to actively encourage more use of social sharing sites? I’d be really interested to hear any thoughts or experiences others have of any of these issues.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/06/07/some-thoughts-on-web-analytics-uisng-our-work-on-analytics/feed/ 6
Some useful resources around learning analytics http://blogs.cetis.org.uk/sheilamacneill/2012/05/14/some-useful-resources-around-learning-analytics/ http://blogs.cetis.org.uk/sheilamacneill/2012/05/14/some-useful-resources-around-learning-analytics/#comments Mon, 14 May 2012 09:10:41 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1540 As I’ve mentioned before, and as highlighted in a number of recent posts by my colleagues Adam Cooper and Martin Hawksey, CETIS is undertaking some work around analytics in education which we are calling our Analytics Reconnoitre.

In addition to my recent posts from the LAK12 conference, I thought it would be useful to highlight the growing number of resources that our colleagues in Educause have been producing around learning analytics. A series of briefing papers and webinars is available covering a range of issues in the domain. For those of you not so familiar with the area, a good starting point is the “Analytics in Education: Establishing a Common Language” paper, which gives a very clear outline of a range of terms being used in the domain and how they relate to teaching and learning.

For those of you who want to delve a bit deeper, the resource page also links to the excellent “The State of Learning Analytics in 2012: A Review and Future Challenges” report by Rebecca Ferguson, from the OU’s KMI, which gives a comprehensive overview of the domain.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/05/14/some-useful-resources-around-learning-analytics/feed/ 1
5 things from LAK12 http://blogs.cetis.org.uk/sheilamacneill/2012/05/09/5-things-from-lak12/ http://blogs.cetis.org.uk/sheilamacneill/2012/05/09/5-things-from-lak12/#comments Wed, 09 May 2012 11:19:24 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1526

@sheilmcn I’d love to know the 5 most interesting things you learned from #lak12 today:)

— Brandon Muramatsu (@bmuramatsu) May 2, 2012

Following that challenge, I’m going to try and summarise my experiences and reflections on the recent LAK12 conference in the five areas that seemed to resonate with me over the four days of the conference (including the pre-conference workshop day): research, vendors, assessment, ethics and students.

Research
Learning Analytics is a newly emerging research domain. This was only the second LAK conference, and to an extent its focus was on trying to establish and benchmark the domain. Abelardo has summarised this aspect of the conference far better than I could. Although I went with an open mind and didn’t have set expectations, I was struck by the research focus of the papers and the lack of large(r) scale implementations. Perhaps this is due to the ‘buzzy-ness’ of the term learning analytics just now (more on that in the vendor section of this post) – and it is not meant in any way as a criticism of the conference or the quality of the papers, both of which were excellent. On reflection I think that the pre-conference workshops gave more opportunity for discussion than the traditional paper presentation with short Q&A format which the conference followed; perhaps for LAK13 a mix of presentation formats might be included. With any domain which hopes to impact on teaching and learning there are difficulties bridging the research and practice divide, and personally I find workshops give more opportunity for discussion. That said, I did see a lot of interesting presentations which did have potential, including a reintroduction to SNAPP, which Lori Lockyer and Shane Dawson presented at the Learning Analytics meets Learning Design workshop; a number of very interesting presentations from the OU on various aspects of their work in researching and now applying analytics; the Mirror project, an EU-funded work-based learning project which includes a range of digital, physical and emotional analytics; and the GLASS system presented by Derek Leony, Carlos III, Madrid, to name just a few.

George Siemens presented his vision(s) for the domain in his keynote (this was the first keynote I have seen where the presenter’s ideas were shared openly during the presentation – such a great example of openness in practice). There was also an informative panel session around the differences and potential synergies with the Educational Data Mining community. SoLAR (the Society for Learning Analytics Research) is planning a series of events to continue these discussions and the scoping of the domain, and we at CETIS will be involved in helping with a UK event later this year.

Vendors
There were lots of vendors around. I didn’t get any impression of a hard sell, but every educational tool, be it LMS/VLE/CMS, now has a very large, shiny new analytics badge on it – even if what is being offered is actually the same as before, just with parts re-labelled. I’m not sure how much (or any) of the forward-thinking research that was presented will filter down into large-scale tools, but I guess that’s an answer in itself to the need for research in this area: so that we in the education community can be informed and ask challenging questions of the vendors and the systems they present. I was impressed with a (new to me) system called Canvas Analytics which colleagues from the community college sector in Washington State briefly showed me. It seems to allow flexibility and customisation of features and UI, is cloud based and so has a more distributed architecture, has CC licensing built in, and offers a crowd-sourced feature request facility.

With so many potential sources of data it is crucial that systems are flexible and can pull and push data to a variety of end points. This allows users – both at the institutional back end and the UI end – flexibility over what they use. CETIS has been supporting JISC to explore notions of flexible provision through a number of programmes, including DVLE.

Lori Lockyer made a timely reflection on the development of learning design, drawing parallels with learning analytics. This made me immediately think of the slight misnomer of learning design, which in many cases was actually more about teaching design. With learning analytics there are similar parallels, but what also crossed my mind on more than one occasion was the notion of marketing analytics as a key driver in this space. This was probably more noticeable due to the North American slant of the conference, but I was once again struck by the differences in approaches to the marketing of students in North America and the UK. Universities and colleges in the US have relatively huge marketing budgets compared to us; they need to get students into their classes and keep them there. Having a system, or integrated systems, which manage retention numbers – the more business intelligence end of the analytics spectrum, if you like – could gain traction far more quickly than systems exploring the much harder to quantify effective learning analytics. Could this lead us into a similar situation as with VLEs/LMSs, where there was a perceived need to have one (“everyone else has got one”) and vendors sold the sector something which kind of looked like it did the job? Given my comments earlier about flexibility and the pervasiveness of web services, I hope not, but some dark thoughts did cross my mind and I was drawn back to Gardner Campbell’s presentation questioning some of the narrow definitions of learning analytics.

Assessment
It’s still the bottom line and the key driver for most educational systems, and in turn for analytics about those systems. Improving assessment numbers gets senior management attention. The Signals project at Purdue is one of the leading lights in the domain of learning analytics, and John Campbell and the team there have done, and continue to do, an excellent job of gathering data (mainly from their LMS) and feeding it back to students in ways that do have an impact. But, going back to Gardner Campbell’s presentation, learning analytics as a research domain is not just about assessment. So I was heartened to see lots of references to the potential for analytics to be used to measure competencies, which I think could have real potential for students, as it might help to contextualise existing and newly developing competencies and allow some more flexible approaches to the recognition of competencies. More opportunities to explore the context of learning and not just sell the content? Relating back to the role of vendors, I was reminded of how content-driven the North American system is. Vendors are increasingly offering competitive alternatives for elective courses with accreditation, as well as OERs (and of course collecting the data). In terms of wider systems, I’m sure that an end-to-end analytics system with content and assessment all bundled in is not far off being offered, if it isn’t already.

Ethics
Data and ethics: collect one and ignore the other at your peril! My first workshop was run by Sharon Slade and Finella Gaphin from the OU and, I have to say, I think it was a great start to the whole week (not just because we got to play snakes and ladders), as ethics and our approaches to them underlie all the activity in this area. Most attention just now is focused on issues of privacy, but there are a host of other issues including:
*power – who gets to decide what is done with the data?
*rights – does everyone have the same rights to use data? Who can mine data for other purposes?
*ownership – do students own their data, and what are the consequences of opt-outs?
*responsibility – is there shared responsibility between institutions and students?

Doug Clow live blogged the workshop if you want more detailed information, and it is hoped that a basis for a code of conduct can be developed from the session.

Students
Last, but certainly not least, students. The student voice was at times deafening by its silence. At several points during the conference, particularly during the panel session on Building Organisational Capacity with Linda Baer and Dan Norris, I felt a growing concern about things being done “to” and not “with” students. Linda and Dan are conducting some insightful research into organisational capacity building and have already interviewed many (North American) institutions and vendors, but there was very little mention of students. If learning analytics is really going to impact on learning and help transform pedagogical approaches, then shouldn’t we be talking to the students? What really works for them? Are they aware of what data is being collected about them? Are they willing to let more data from informal sources, e.g. Facebook, 4square etc, be used in the context of learning analytics? Are they aware of their data exhaust? As well as these issues, Simon Buckingham Shum made the very pertinent point that if students were given access to their data, would they actually be able to do anything with it?

And if we are collecting data about students, shouldn’t we also be collecting similar data about teaching staff?

I don’t want to add yet another literacy to the seemingly never-ending list, but this does tie in with the wider context of digital literacy development. Sense-making of data and visualisations is key if learning analytics is to gain traction in practice, and it’s not just students who are falling short – it’s probably all of us. I saw lots of “pretty pictures” in terms of network visualisations, potential dashboard views, etc over the week – but did I really understand them? Do I have the necessary skills to properly decode and make sense of them? Sometimes, but not all the time. I think visualisations should come with a big question mark symbol attached or overlaid – they should always raise questions. At the moment I don’t think enough people have the skills to be able to confidently question them.

Overall it was a very thought-provoking week, with too much to include in one post, but if you have a chance take a look at Katy Borner’s keynote, Visual Analytics in Support of Education, one of my highlights.

So, thanks to all the organisers for creating such a great atmosphere for sharing and learning. I’m looking forward to LAK13, to seeing what advances will be made in the coming year, and to whether a European location will bring a different slant to the conference.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/05/09/5-things-from-lak12/feed/ 8
LAK12 Useful links and resources http://blogs.cetis.org.uk/sheilamacneill/2012/05/03/lak12-useful-links-and-resources/ http://blogs.cetis.org.uk/sheilamacneill/2012/05/03/lak12-useful-links-and-resources/#comments Thu, 03 May 2012 14:30:43 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1514 There has been a huge amount of activity at this year’s LAK conference. I’m still cogitating about the issues raised and will post my reflections over the next few days. However, in the meantime, there were a number of really interesting tools and resources presented, which are available from this Diigo site George Siemens has set up.

Doug Clow has been doing a splendid (and quite awe-inspiring) job of live blogging, and has summary links to resources and his posts here. Myles Danson has also done some useful live blog posts from sessions. We also have some really useful twitter activity summaries from Tony Hirst and Martin Hawksey.

*Update – Audrey Watters’ review of the conference.

And just in case you missed them :-) below is a timeline view of my collected tweets and a few pictures from the past few days.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/05/03/lak12-useful-links-and-resources/feed/ 0
LAK12 Pre conference workshop quick overview http://blogs.cetis.org.uk/sheilamacneill/2012/04/30/lak12-pre-conference-workshop-quick-overview/ http://blogs.cetis.org.uk/sheilamacneill/2012/04/30/lak12-pre-conference-workshop-quick-overview/#comments Mon, 30 Apr 2012 00:58:31 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1498 I’ve had a very informative and stimulating day at the preconference workshops for the LAK12 conference. This is just a very quick post with links to some great summaries and resources that people have contributed.

*Learning Analytics and Ethics live blog summary from Doug Clow (thanks, Doug, you truly are a conference reporting machine!)

*Learning Analytics and Linked Data collective google doc – various contributors.

There has also been quite a bit of twitter activity, and Tony Hirst was quick off the mark to visualise the connections. Martin Hawksey has also produced an alternative visualisation based on the twitter archive I set up last week; and here’s another summary view from Tweetlevel.

I’ll hopefully do some more considered posts myself during the week. Based on today’s sessions this is shaping up to be a great conference.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/04/30/lak12-pre-conference-workshop-quick-overview/feed/ 2
Learning Analytics, where do you stand? http://blogs.cetis.org.uk/sheilamacneill/2012/03/09/learning-analytics-where-do-you-stand/ http://blogs.cetis.org.uk/sheilamacneill/2012/03/09/learning-analytics-where-do-you-stand/#comments Fri, 09 Mar 2012 09:19:03 +0000 http://blogs.cetis.org.uk/sheilamacneill/?p=1406 For? Against? Not bovvered? Don’t understand the question?

The term learning analytics is certainly trending in all the right ways on all the horizon scans. As with many “new” terms, there are still some misconceptions about what it actually is, or perhaps more accurately what it actually encompasses. For example, whilst talking with colleagues from the SURF Foundation earlier this week, they mentioned the “issues around using data to improve student retention” session at the CETIS conference. SURF have just funded a learning analytics programme of work which closely matches many of the examples and issues shared and discussed there. They were quite surprised that the session hadn’t been called “learning analytics”. Student retention is indeed a part of learning analytics, but not the only part.

However, back to my original question and the prompt for it. I’ve just caught up with the presentation Gardner Campbell gave to the LAK12 MOOC last week, titled “Here I Stand”, in which he presents a very compelling argument against some of the trends beginning to emerge in the field of learning analytics.

Gardner is concerned that there is a danger that the more reductive models of analytics may actually force us backwards in our models of teaching and learning. He draws an analogy between M-theory – in particular Stephen Hawking’s description of there being not one M-theory but a “family of theories” – and how knowledge and learning actually occur. He is concerned that current learning analytics systems are based too much on “the math” and don’t actually show the human side of learning and the bigger picture of human interaction and knowledge transfer. As he pointed out, “student success is not the same as success as a student”.

Some of the rubrics we might be tempted to use (and in some cases already are using) to build learning analytics systems reduce the educational experience to a simplistic management model. Typically, systems look for signs pointing to failure, not for the key moments of success in learning. What we should be working towards are systems that are adaptive, allow for reflection, and can learn themselves.

This did make me think of the presentation at FOFE11 from IBM about their learning analytics system, which certainly scared the life out of me and many others I’ve spoken to. It also raised a lot of questions from the audience (and the twitter backchannel) about the educational value of the experience of failure. At the same time, I was reflecting on the whole terminology issue again. Common understandings – why are they so difficult in education? When learning design was the “in thing”, I think it was John Casey who pointed out that what we were actually talking about most of the time was “teaching design”. Are we in danger of the same thing happening here, with the learning side of learning analytics being hi-jacked by narrower or, perhaps to be fairer, more tightly defined management- and accountability-driven analytics?

To try and mitigate this, we need to ensure that all key stakeholders are starting to ask (and answer) the questions Gardner raised in his presentation. What are the really useful “analytics” which can help me as a learner, teacher, administrator, etc? Which systems provide that data just now? How can/do these stakeholders access and share the data in meaningful ways? How can we improve and build on these systems in ways which take into account the complexity of learning? Or, as Gardner said, how can we start framing systems and questions around wisdom? But before we can do any of that, we need to make sure that our stakeholders are informed enough to take a stand, and not just have to accept whatever system they are given.

At CETIS we are about to embark on an analytics landscape study, which we are calling an Analytics Reconnoitre. We are going to look at the field of learning analytics from a holistic perspective, review recent work and (hopefully) produce some pragmatic briefings on the who, where, why, what and when of learning analytics, pointing to useful resources and real-world examples. This will build on and complement work already funded by JISC, such as the Relationship Management Programme, the Business Intelligence Infokit and the Activity Data Programme synthesis. We’ll also be looking to emerging communities of practice, both here in the UK and internationally, to join up thinking and future developments. Hopefully this work will contribute to the growing body of knowledge and experience in the field of learning analytics, as well as raising some key questions (and hopefully some answers) around its many facets.

]]>
http://blogs.cetis.org.uk/sheilamacneill/2012/03/09/learning-analytics-where-do-you-stand/feed/ 22