Sheila MacNeill » Learning Analytics – Cetis blog

Avoiding getting caught in the data slick: thoughts from the Analytics and Institutional Capabilities session, #cetis13
http://blogs.cetis.org.uk/sheilamacneill/2013/03/18/avoiding-the-getting-caught-in-the-data-slick-thoughts-from-analtyics-and-institutional-capabilities-session-cetis13/ (Mon, 18 Mar 2013)

Data, data everywhere, but what do we actually do with it? Do we need "big" data in education? What is it we are trying to find out? What is our ROI at both institutional and national levels? Just some of the questions that were raised at the Analytics and Institutional Capabilities session at #cetis13 last week.

Is data our new oil? asked Martin Hawksey in his introduction to the session. And if, as many seem to think, it is, do we really have the capabilities to "refine" it properly? How can we ensure that we aren't putting the equivalent of petrol into a diesel engine? How can we ensure that institutions (and individuals) don't end up getting trapped in a dangerous slick of data? Are we ensuring that everyone (staff and students) is developing the data literacy skills they need to use and ultimately understand the visualisations we can produce from data?

Bird in an oil slick

Ranjit Sidhu (Statistics into Decisions) gave an equally inspiring and terrifying presentation around the hype of big data. He pointed out that in education "local data", and not "big data", is really where we should be focusing our attention, particularly in relation to our core business of attracting students. In relation to national level data, he also questioned the ROI on some "quite big" national data collection activities such as the KIS. Judging from the embarrassingly low figures he showed us for traffic to the UniStats site, the return looks questionable. We may have caused a mini spike in the hits for one day in March :-)

However, there are people who are starting to ask the right questions and use their data in ways that are meaningful. A series of lightning talks highlighted a cross section of approaches to using institutional data, followed by three inspiring talks from Jean Mutton (University of Derby), Mark Stubbs (MMU) and Simon Buckingham Shum (OU). Jean outlined the work she and her team have been doing at Derby on enhancing the student experience (more information on this is available through our new case study); Mark then gave a review of the work they have been doing around deeper exploration of NSS returns data and their VLE data. Both Jean and Mark commented that their work started without them actually realising they were "doing analytics". Mark's analytics cycle diagram was really useful in illustrating their approach.

Screen shot of analytics cycle

Simon, on the other hand, of course very much knew that he was "doing analytics" and gave an overview of some of the learning analytics work currently being undertaken at the OU, including a quick look at some areas in which FutureLearn could potentially be heading.

Throughout all the presentations the key motivator has been, and continues to be, framing and then developing the "right" questions to get the most out of data collection activity and analysis.

More information, including links to the slides from the presentations, is available on the CETIS website.

Cetis Analytics Series Volume 2: Engaging with Analytics
http://blogs.cetis.org.uk/sheilamacneill/2013/03/13/cetis-analytics-series-volume-2-engaging-with-analytics/ (Wed, 13 Mar 2013)

Our first set of papers around analytics in education has been published, and with nearly 17,000 downloads, it would seem that there is an appetite for resources around this topic. We are now moving on to the next phase of our exploration of analytics, and accompanying this will be a range of outputs including some more briefing papers and case studies. Volume 1 took a high level view of the domain; volume 2 will take a much more user centred view, including a number of short case studies sharing the experiences of a range of early adopters who are exploring the potential of taking a more analytics based approach.

The first case study features Jean Mutton, Student Experience Project Manager at the University of Derby. Jean shares with us how her journey into the world of analytics started, and how and where she and the colleagues she has been working with across the university see the potential for analytics to have an impact on improving the student experience.

University of Derby, student engagement factors

The case study is available to download here.

We have a number of other case studies identified which we'll be publishing over the coming months; however, we are always looking for more examples. So if you are working with analytics and have some time to chat with us, we'd love to hear from you and share your experiences in this way too. Just leave a comment or email me (s.macneill@strath.ac.uk).

What can I do with my educational data? (#lak13)
http://blogs.cetis.org.uk/sheilamacneill/2013/03/06/what-can-i-do-with-my-educational-data-lak13/ (Wed, 06 Mar 2013)

Following on from yesterday's post, another "thought bomb" that has been running around my brain is something far closer to the core of Audrey's "who owns your educational data?" presentation. Audrey was advocating the need for student owned personal data lockers (see screen shot below). This idea also chimes with the work of the Tin Can API project and, closer to home in the UK, the MiData project. The latter is concerned more with generic data around utility and mobile phone usage than with educational data, but the data locker concept is key there too.

Screen shot of Personal Education Data Locker (Audrey Watters)

As you will know dear reader, I have turned into something of a MOOC-aholic of late. I am becoming increasingly interested in how I can make sense of my data, network connections in and across the courses I’m participating in and, of course, how I can access and use the data I’m creating in and across these “open” courses.

I’m currently not a very active member of the LAK13 learning analytics MOOC, but the first activity for the course is, I hope, going to help me frame some of the issues I’ve been thinking about in relation to my educational data and in turn my personal learning analytics.

Using the framework for the first assignment/task for LAK13, this is what I am going to try and do.

1. What do you want to do/understand better/solve?

I want to compare what data about my learning activity I can access across three different MOOC courses and the online spaces I have interacted in on each, and see if I can identify any potentially meaningful patterns or networks which would help me reflect on and better understand my learning experiences. I also want to explore how/if learning analytics approaches could help me in terms of contributing to my personal learning environment (PLE) in relation to MOOCs, and whether it is possible to illustrate the different “success” measures from each course provider in a coherent way.

2. Defining the context: what is it that you want to solve or do? Who are the people that are involved? What are social implications? Cultural?

I want to see how/if I can aggregate my data from several MOOCs in a coherent open space and see what learning analytics approaches can be of help to a learner in terms of contextualising their educational experiences across a range of platforms.

This is mainly an experiment using myself and my data. I’m hoping that it might start to raise issues from the learner’s perspective which could have implications for course design, access to data, and thoughts around student created and owned eportfolios and/or data lockers.

3. Brainstorm ideas/challenges around your problem/opportunity. How could you solve it? What are the most important variables?

I’ve already done some initial brainstorming around using SNA techniques to visualise networks and connections in the Cloudworks site which the OLDS MOOC uses. Tony Hirst has (as ever) pointed the way to some further exploration. And I’ll be following up on Martin Hawksey’s recent post about discussion group data collection.
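To give a flavour of the kind of SNA approach I mean, here is a minimal Python sketch. It assumes a hypothetical export of interactions as a simple source/target CSV – the file name, column names and username are all made up for illustration, not anything Cloudworks or Tony’s tools actually produce:

```python
# Minimal sketch: build an interaction network from a hypothetical CSV export
# ("interactions.csv" with "source,target" rows) and pull out my ego network.
import csv
import networkx as nx
import matplotlib.pyplot as plt

G = nx.DiGraph()
with open("interactions.csv", newline="") as f:
    for row in csv.DictReader(f):
        G.add_edge(row["source"], row["target"])

# The ego network around one learner: who I am directly connected to.
me = "sheilamacneill"  # illustrative username, not a real export value
ego = nx.ego_graph(G.to_undirected(), me, radius=1)

nx.draw_networkx(ego, node_size=300, font_size=8)
plt.axis("off")
plt.savefig("my_network.png", dpi=150)
```

Something like this wouldn’t answer the “most important variables” question, but it would at least start to show where one learner sits in the wider network.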

I’m not entirely sure about the most important variables just now, but one challenge I see is actually finding myself/my data in a potentially huge data set and finding useful ways to contextualise me using those data sets.

4. Explore potential data sources. Will you have problems accessing the data? What is the shape of the data (reasonably clean? or a mess of log files that span different systems and will require time and effort to clean/integrate?) Will the data be sufficient in scope to address the problem/opportunity that you are investigating?

The main issue I see just now is going to be collecting data, but I believe there is some data that I can access about each MOOC. The MOOCs I have in mind are primarily #edc (Coursera) and #oldsmooc (OU). One seems to be far more open in terms of potential data access points than the other.

There will be some cleaning of data required but I’m hoping I can “stand on the shoulders of giants” and re-use some google spreadsheet goodness from Martin.

I’m fairly confident that there will be enough data for me to at least understand the challenges around letting learners try to make more sense of their data.
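To make that a bit more concrete, here is a minimal sketch of the kind of aggregation and cleaning I have in mind, assuming (purely for illustration) that each platform’s activity can be exported as a CSV with a timestamp column – the file names and column names below are invented, not real export formats:

```python
# Minimal sketch: pull activity exports from two MOOC platforms into one
# tidy timeline, then count weekly activity per course.
import pandas as pd

edc = pd.read_csv("edcmooc_forum_posts.csv", parse_dates=["posted_at"])
olds = pd.read_csv("oldsmooc_cloud_activity.csv", parse_dates=["created"])

activity = pd.concat([
    pd.DataFrame({"when": edc["posted_at"], "what": "forum post", "course": "#edcmooc"}),
    pd.DataFrame({"when": olds["created"], "what": "cloud activity", "course": "#oldsmooc"}),
]).sort_values("when")

# A first, crude view of where my effort went week by week.
weekly = (activity.set_index("when")
                  .groupby("course")
                  .resample("W")
                  .size())
print(weekly)
```

Even a crude weekly count per course would be a start for spotting patterns across the two platforms.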

5. Consider the aspects of the problem/opportunity that are beyond the scope of analytics. How will your analytics model respond to these analytics blind spots?

This project is far wider than just analytics, as it will hopefully help me make more sense of the potential for analytics to help me, as a learner, make sense of and share my learning experiences in one place that I choose. Already I see Coursera, for example, trying to model my interactions on their courses into a space they have designed – and I don’t really like that.

I’m thinking much more about personal aggregation points/sources than the creation of an actual data locker. However it may be that some existing eportfolio systems could provide the basis for that.

As ever I’d welcome any feedback/suggestions.

Prototyping my Cloudworks profile page
http://blogs.cetis.org.uk/sheilamacneill/2013/02/12/prototyping-my-cloudworks-profile-page/ (Tue, 12 Feb 2013)

Week 5 in #oldsmooc has been all about prototyping. Now I’ve not quite got to the stage of having a design to prototype, so I’ve gone back to some of my earlier thoughts around the potential for Cloudworks to be more useful to learners and show alternative views of community, content and activities. I really think that Cloudworks has potential as a kind of portfolio/personal working space, particularly for MOOCs.

As I’ve already said, Cloudworks doesn’t have a hierarchical structure; it’s been designed to be more social and flexible, so its navigation is somewhat tricky, particularly if you are using it over a longer time frame than, say, a one or two day workshop. It relies on you as a user to tag and favourite clouds and cloudscapes, but even then, when you’re involved in something like a MOOC, that doesn’t really help you navigate your way around the site. However Cloudworks does have an open API, and as I’ve demonstrated you can relatively easily produce a mind map view of your clouds which makes it a bit easier to see your “stuff”. And Tony Hirst has shown how, using the API, you can start to use visualisation techniques to show network views of various kinds.
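To give a feel for what “relatively easily” might look like, here is a hedged sketch of pulling a user’s clouds over the API. The base URL, endpoint path and response fields are assumptions for illustration only – check the actual Cloudworks API documentation rather than trusting this:

```python
# Hedged sketch: list a user's clouds via the Cloudworks API.
# Endpoint path, parameters and response shape are assumed, not verified.
import requests

BASE = "http://cloudworks.ac.uk/api"   # assumed base URL
USER_ID = 1234                         # hypothetical user id

resp = requests.get(f"{BASE}/users/{USER_ID}/clouds.json", timeout=30)
resp.raise_for_status()

for cloud in resp.json():              # assumed: a list of cloud records
    # Each record is assumed to carry a title and a last-activity timestamp.
    print(cloud.get("title"), cloud.get("updated_at"))
```

From a list like that it is a short step to feeding the data into the kind of mind map or network views mentioned above.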

In a previous post I created a very rough sketch of how some of Tony’s ideas could be incorporated in to a user’s profile page.

Potential Cloudworks Profile page

As part of the prototyping activity I decided to think a bit more about this and use Balsamiq (one of the tools recommended to us this week) to rough out some ideas in a bit more detail.

The main ideas I had were around redesigning the profile page so it was a bit more useful. Notifications would be really useful so you could clearly see if anything had been added to any of your clouds or clouds you follow – a bit like Facebook. Also one thing that does annoy me is the order of the list of my clouds and cloudscapes – it’s alphabetical. But what I really want at the top of the list is either my most recently created or most active cloud.
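As a trivial sketch of what that re-ordering could look like behind the scenes – the cloud records here are made-up stand-ins for whatever the Cloudworks API actually returns, not real data:

```python
# Minimal sketch: order clouds by most recent activity rather than alphabetically.
from datetime import datetime

clouds = [
    {"title": "Week 1 activity", "last_activity": datetime(2013, 1, 14)},
    {"title": "Prototyping",     "last_activity": datetime(2013, 2, 12)},
    {"title": "Context",         "last_activity": datetime(2013, 1, 18)},
]

# Most recently active cloud first, so it sits at the top of the profile list.
for cloud in sorted(clouds, key=lambda c: c["last_activity"], reverse=True):
    print(cloud["last_activity"].date(), cloud["title"])
```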

In the screenshot below you can see I have an extra click and scroll to get to my most recent cloud via the clouds list. What I tend to do is a bit of circumnavigation via my oldsmooc cloudscape and hope I have added my clouds to it.

Screen shot of my cloud and cloudscape lists

I think the profile page could be redesigned to make use of the space a bit more (perhaps lose the cloud stream, because I’m not sure if that is really useful as it stands), and have some more useful/usable views of my activity. The three main areas I thought we could start grouping things around are clouds and cloudscapes (which are already included), plus a community dimension so you can start to see who you are connecting with.

My first attempt:

Screen shot of my first Cloudworks mock up

But on reflection – tabs are not a great idea, and to be honest they were in the tutorial, so that’s probably why I used them :-)

But then I had another go and came up with something slightly different. Here is a video where I explain my thinking a bit more.

cloudworks profile page prototype take 2 from Sheila MacNeill on Vimeo.

Some initial comments came in from fellow #oldsmooc-ers, and you can see more comments in my cloud for the week, as well as take 1 of the video.

This all needs a bit more thought – particularly around what is actually feasible in terms of performance and creating “live” visualisations, and indeed about what would actually be most useful. I’ve already been in conversation with Juliette Culver, the original developer of Cloudworks, about some of the more straightforward potential changes, like the re-ordering of cloud lists. I do think that with a bit more development along these lines Cloudworks could become a very important part of a personal learning environment/portfolio.

Ghosts in the machine? #edcmooc
http://blogs.cetis.org.uk/sheilamacneill/2013/02/08/ghosts-in-the-machine-edcmooc/ (Fri, 08 Feb 2013)

Following on from last week’s post on the #edcmooc, the course itself has turned to explore the notion of MOOCs in the context of utopian/dystopian views of technology and education. The questions I raised in that post are still running through my mind. However they were at a much more holistic level than a personal one.

This week, I’ve been really trying to think about things from my student (or learner) point of view. Are MOOCs really changing the way I engage with formal education systems? On the one hand yes, as they are allowing me (and thousands of others) to get a taste of courses from well established institutions. At a very surface level who doesn’t want to say they’ve studied at MIT/Stanford/Edinburgh? As I said last week, there’s no fee so less pressure in one sense to explore new areas and if they don’t suit you, there’s no issue in dropping out – well not for the student at this stage anyway. Perhaps in the future, through various analytical methods, serial drop outs will be recognised by “the system” and not be allowed to join courses, or have to start paying to be allowed in.

But on the other hand, is what I’m actually doing really different from what I did at school, when I was an undergraduate, or as a student on “traditional” online distance courses? Well no, not really. I’m reading selected papers and articles, watching videos, contributing to discussion forums – nothing I’ve not done before, or presented to me in a way that I’ve not seen before. The “go to class” button on the Coursera site does make me giggle though, as it’s just soo American, and every time I see it I hear a disembodied American voice. But I digress.

The element of peer review for the final assignment for #edcmooc is something I’ve not done as a student, but it’s not a new concept to me. Despite more information on the site and from the team this week, I’m still not sure how this will actually work, and whether I’ll get my certificate of completion for just posting something online or whether there is a minimum number of reviews I need to get. Like many other fellow students, the final assessment is something we have been concerned about from day 1, which seemed to come as a surprise to some of the course team. During the end of week 1 Google hangout, the team did try to reassure people, but surely they must have expected that we were going to go and look at week 5 and the “final assessment” almost before anything else? Students are very pragmatic: if there’s an assessment we want to know the where, when, what, why, who and how as soon as possible. That’s how we’ve been trained (and I use that word very deliberately). Like thousands of others, my whole education career from primary school onwards centred around final grades and exams – so I want to know as much as I can, so I know what to do, so I can pass and get that certificate.

That overriding response to any kind of assessment can very easily override the other, softer (but just as worthy) reasons for participation, and the potential of social media to connect and share on an unprecedented level.

As I’ve been reading and watching more dystopian than utopian material, and observing the general MOOC debate taking another turn with the pulling of the Georgia Tech course, I’ve been thinking a lot about the whole experimental nature of MOOCs. We are all just part of a huge experiment just now, students and course teams alike. But we’re not putting very many new elements into the mix, and our pre-determined behaviours are driving our activity. We are in a sense all just ghosts in the machine. When we do try to do something different, participation can drop dramatically. I know that I, and lots of my fellow students on #oldsmooc, have struggled to actually complete project based activities.

The community element of MOOCs can be fascinating, and the use of social network analysis can help to give some insights into activity, patterns of behaviour and connections. But with so many people on a course, is it really possible to make and sustain meaningful connections? From a selfish point of view, having my blog picked up by the #edcmooc news feed has greatly increased my readership and, more importantly, I’m getting comments, which is more meaningful to me than hits. I’ve tried to read other posts too, but in the first week it was really difficult to keep up, so I’ve fallen back to a very pragmatic, reciprocal approach. But with so much going on you need to have strategies to cope, and there is quite a bit of activity around developing a MOOC survival kit which has come from fellow students.

As the course develops, the initial euphoria and social web activity may well be slowing down. Looking at the Twitter activity, it does look like it is on a downward trend.

#edcmooc Twitter activity diagram
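For anyone wanting to check a trend like this themselves, a minimal sketch along these lines would do it, assuming the archive can be exported as a CSV with a timestamp column (the file and column names are assumptions based on a typical TAGS-style export, not the actual archive format):

```python
# Minimal sketch: chart daily tweet volume from a hashtag archive CSV.
import pandas as pd
import matplotlib.pyplot as plt

tweets = pd.read_csv("edcmooc_archive.csv", parse_dates=["created_at"])
daily = tweets.set_index("created_at").resample("D").size()

daily.plot(kind="bar", figsize=(10, 3), title="#edcmooc tweets per day")
plt.tight_layout()
plt.savefig("edcmooc_daily_tweets.png")
```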

Monitoring this level of activity is still a challenge for the course team and students alike. This morning my colleague Martin Hawksey and I were talking about this, and speculating that maybe there are valuable lessons we in the education sector can learn from the commercial sector about managing “massive” online campaigns. Martin has also done a huge amount of work aggregating data and I’d recommend looking at his blogs. This post is a good starting point.

Listening to the Google hangout session run by the #edcmooc team, they again seemed to have underestimated the time-sink reality of having 41,000 students on a course. Despite being upfront about not being everywhere, the temptation to look must be overwhelming. This was also echoed in the first couple of weeks of #oldsmooc. Interestingly, this week there are teaching assistants and students from the MSc course actively involved in the #edcmooc.

I’ve also been having a play with the data from the Facebook group. I’ve had a bit of interaction there, but not a lot. So despite it being a huge group, I don’t get the impression that, apart from posting links to blogs for the newsfeed, there is a lot of activity or connection – which seems to be reflected in the graphs created from the data.

#edc Facebook group friends connections

This is a view based on friends connections. NB it was very difficult for a data novice like me to get any meaningful view of this group, but I hope that this gives the impression of the massive number of people and relative lack of connections.

There are a few more connections which can be drawn from the interactions data, and my colleague David Sherlock managed to create a view where some clusters are emerging – but with such a huge group it is difficult to read that much into the visualisation, apart from the fact that there are lots of nodes (people).

#edcmooc Facebook group interactions
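For completeness, here is a hedged sketch of the kind of clustering view David produced – not his actual method, just one plausible way of doing it with networkx, assuming a hypothetical edge-list export of the group’s friend connections (file and column names invented):

```python
# Hedged sketch: look for clusters in an exported friends-connections edge list.
import csv
import networkx as nx
from networkx.algorithms import community

G = nx.Graph()
with open("fb_friend_edges.csv", newline="") as f:
    for row in csv.DictReader(f):
        G.add_edge(row["person_a"], row["person_b"])

clusters = community.greedy_modularity_communities(G)
print(f"{G.number_of_nodes()} people, {G.number_of_edges()} connections")
for i, c in enumerate(sorted(clusters, key=len, reverse=True)[:5], start=1):
    print(f"cluster {i}: {len(c)} members")
```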


I don’t think any of this is unique to #edcmooc. We’re all just learning how to design/run and participate at this level. Technology is allowing us to connect and share at a scale unimaginable even 10 years ago, if we have access to it. NB there was a very interesting comment on my blog about us all being digital slaves.

Despite the potential affordances of access at scale, it seems to me we are increasingly just perpetuating an existing system if we don’t take more time to understand the context and consequences of our online connections and communities. I don’t need to connect with 40,000 people, but I do want to understand more about how and why I could/do. That would be a really new element to add to any course, not just MOOCs (and not something that’s just left to a course specifically about analytics). Unless that happens my primary driver will be that “completion certificate”. In this instance, and many others, to get that I don’t really need to make use of the course community. So I’m just perpetuating an existing system where I know how to play the game, even if its appearance is somewhat disguised.

#oldsmooc week 2 – Context and personal learning spaces
http://blogs.cetis.org.uk/sheilamacneill/2013/01/18/oldmooc-week-2-context-and-personal-learning-spaces/ (Fri, 18 Jan 2013)

Well, I have survived week 1 of #oldsmooc and collected my first online badge for doing so – #awesome. My last post ended with a few musings about networks and visualisation.

I’m also now wondering if a network diagram of a cloudscape (showing the interconnectedness between clouds, cloudscapes and people) would be helpful? Both in terms of not only visualising and conceptualising networks but also in starting to make more explicit links between people, activities and networks. Maybe the mindmap view is too linear? Think I need to speak to @psychemedia and @mhawksey . . .

I’ve been really pleased that Tony Hirst has taken up my musings and has been creating some wonderful visualisations of clouds, cloudscapes and followers. So I should prefix this post by saying that this has somewhat distracted me from the main course activities over the past few days. However I want to use this post to share some of my thoughts on these experiments in relation to the context of my learning journey and the potential for Cloudworks to help me (and others) contextualise learning, activities and networks, and become a powerful personal learning space/environment.

Cloudworks seems to be a bit like Marmite – you either love it or hate it. I have to admit I have a bit of a soft spot for it, mainly because I have had a professional interest in its development. (I also prefer Vegemite but am partial to Marmite now and again.) I’ve also used it before this course and have seen how it can be useful. In some ways it’s kind of like Twitter: you have to use it to see the point of using it. I’ve also fully encouraged the development of its API and its open source version, Cloud Engine.

A short bit of context might be useful here too. Cloudworks was originally envisaged as a kind of “Flickr for learning designs”, a social repository if you like. However, as it developed and was used, it actually evolved more into an aggregation space for ideas, meetings and conferences. The social element has always been central. Of course, making something social, with tagging, favouriting etc, does mean that navigation isn’t traditional and is more “exploratory” for the user. This is the first time (that I know of anyway) it has actually been used as part of a “formal” course.

As part of #oldsmooc, we (the learners) are being encouraged to use Cloudworks for sharing our learning and activities. As I’m doing a bit more on the course, I’m creating clouds, adding them to my own #oldsmooc and other cloudscapes, and increasingly favouriting and following others’ clouds/cloudscapes. I’m starting to find that the concept of having one place where my activity is logged and I am able to link to other spaces where I create content (such as this blog) is becoming increasingly attractive. I can see how it could really help me get a sense of my learning journey as I progress through the course, and the things that are useful/of interest to me. In other words, it’s showing potential to be my personal aggregation point, and a very useful (if not key) part of my personal learning environment. But the UI as it stands is still a bit clunky. Which is where the whole visualisation thing started.

Now that Tony has illustrated how it is possible to visualise the connections between people, content and activities, what I think would be really useful would be to incorporate these visualisations into a newly designed profile page. Nick Frear has already done an alpha test to show these can be embedded into Cloudworks.

A move from this:

My Cloudworks profile page

To something kind of like this:

Potential Cloudworks Profile page

Excuse the very crude graphic cut and paste but I hope you get the idea. There’s lots of space there to move things around and make it much more user friendly and useful.

Ideally, when I (or any other user) logged into our profile page, our favourite spaces and people could easily be seen, and we could have various options to see and explore other network views of people, our content and our activities. Could these network views start to give learners a sense of Dave Cormier’s rhizomatic learning, and potentially a greater level of control and confidence in exploring the chaotic space which any MOOC creates?

The social “stuff” and connections are all there in Cloudworks; it just needs a bit of re-jigging. If the UI could be redesigned to incorporate these ideas, then I for one would be very tempted to use Cloudworks for any other (c)MOOC I signed up for. I also need to think a lot more about how to articulate this more clearly and succinctly, but I’d be really interested in other views.

Quick review of the Larnaca Learning Design Declaration
http://blogs.cetis.org.uk/sheilamacneill/2013/01/08/quick-review-of-the-larnaca-learning-design-declaration/ (Tue, 08 Jan 2013)

Late last month the Larnaca Declaration on Learning Design was published. Being “that time of year” I didn’t get round to blogging about it at the time. However, as it’s the new year and the OLDS MOOC is starting this week, I thought it would be timely to have a quick review of the declaration.

The wordle gives a flavour of the emphasis of the text.

Wordle of Larnaca Declaration on Learning Design

First off, it’s actually more of a descriptive paper on the development of research into learning design, rather than a set of statements declaring intent or a call for action. As such, it is quite a substantial document. Setting the context and sharing the outcomes of over 10 years worth of research is very useful and for anyone interested in this area I would say it is definitely worth taking the time to read it. And even for an “old hand” like me it was useful to recap on some of the background and core concepts. It states:

“This paper describes how ongoing work to develop a descriptive language for teaching and learning activities (often including the use of technology) is changing the way educators think about planning and facilitating educational activities. The ultimate goal of Learning Design is to convey great teaching ideas among educators in order to improve student learning.”

One of my main areas of involvement with learning design has been around interoperability and the sharing of designs. Although the IMS Learning Design specification offered great promise of technical interoperability, there were a number of barriers to implementing the full potential of the specification, and indeed expectations of what the spec actually did were somewhat over-inflated – something I reflected on way back in 2009. However, sharing of design practice and of designs themselves has developed, and this is something we at CETIS have tried to promote and move forward through our work in the JISC Design for Learning Programme, in particular with our mapping of designs report, the JISC Curriculum Design and Delivery Programmes and our Design Bashes: 2009, 2010, 2011. I was very pleased to see the Design Bashes included in the timeline of developments in the paper.

James Dalziel and the LAMS team have continually shown how designs can be easily built, run, shared and adapted. However, having one language or notation system is still a goal in the field. During the past few years, though, much of the work has concentrated on understanding the design process and how to help teachers find effective tools (online and offline) to develop new(er) approaches to teaching practice, and share those with the wider community. Viewpoints, LDSE and the OULDI projects are all good examples of this work.

The declaration uses the analogy of the development of musical notation to explain the need for, and aspirations of, a design language which can be used to share and reproduce ideas, or in this case lessons. Whilst still a conceptual idea, this may be one of the closest analogies with universal understanding. Developing such a notation system is still a challenge, as the paper highlights.

The declaration also introduces a Learning Design Conceptual Map which tries to “capture the broader education landscape and how it relates to the core concepts of Learning Design“.

Learning Design Conceptual Map

These concepts include pedagogic neutrality, pedagogic approaches/theories and methodologies, teaching lifecycle, granularity of designs, guidance and sharing. The paper puts forward these core concepts as providing the foundations of a framework for learning design which, combined with the conceptual map and actual practice, provides a “new synthesis for the field of learning design” and future developments.

Components of the field of Learning Design

So what next? The link between learning analytics and learning design was highlighted at the recent UK SoLAR Flare meeting. Will having more data about interactions/networks be able to help develop design processes and ultimately improve the learning experience for students? What about the link with OERs? Content always needs context, and using OERs effectively intrinsically means having effective learning designs, so maybe now is a good time for the OER community to engage more with the learning design community.

The Declaration is a very useful summary of where the Learning Design community is to date, but what is always needed is more time for practising teachers to engage with these ideas, to allow them to start engaging with the research community and the tools and methodologies which it has been developing. The Declaration alone cannot do this, but it might act as a stimulus for existing and future developments. I’d also be up for running another Design Bash if there is enough interest – let me know in the comments if you are interested.

The OLDS MOOC is another great opportunity for future development too, and I’m looking forward to engaging with it over the next few weeks.

Some other useful resources
*Learning Design Network Facebook page
*PDF version of the Declaration
*CETIS resources on curriculum and learning design
*JISC Design Studio

Analytics and #moocmooc
http://blogs.cetis.org.uk/sheilamacneill/2012/08/21/analytics-and-moocmooc/ (Tue, 21 Aug 2012)

This is my final post on my experiences of the #moocmooc course that ran last week, and I want to share a few of my reflections on the role of analytics (and in this case learning analytics), primarily from my experiences as a learner on the course. I should point out that I have no idea about the role of analytics from the course team’s point of view, but I am presuming that they have the baseline basics of enrolment numbers and login stats from the Canvas LMS. In this instance, though, there were no obvious learner analytics available from the system. So, as a learner, in such an open course where you interact in a number of online spaces, how do you get a sense of your own engagement and participation?

There are some obvious measures, like monitoring your own contributions to discussion forums. But to be honest, do we really have the time to do that? I for one am quite good at ignoring any little nagging voices in my head saying “you haven’t posted to the discussion forum today” :-) A little automation would probably go a long way there. However, a lot of actual course activity didn’t take place within the “formal” learning environment; instead it happened in other spaces such as Twitter, Storify, Google Docs, YouTube, blogs etc. Apart from being constantly online, my phone bleeping every now and again notifying me of retweets, how did I know what was happening and how did that help with engagement and motivation?

I am fortunate, mainly due to my colleague Martin Hawksey, to have a few analytics tricks that I was able to utilise, which gave me a bit of an insight into my own activity and that of the whole class.

One of Martin’s most useful items in his bag of tricks is his hashtag twitter archive. By using his template, you can create an archive in Google Docs which stores tweets and, through a bit of social network analysis magic, also gives an overview of activity – top tweeters, time analysis etc. It’s hard to get the whole sheet into a screen grab, but hopefully the one below gives you an idea. Follow the link and click on the “dashboard” tab to see more details.

Dashboard from #moocmooc twitter archive
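To give a feel for the kind of summary the dashboard provides, here is a minimal sketch that reproduces the “top tweeters” and “time analysis” views from a CSV export – the file and column names are assumptions based on a typical hashtag archive export, not Martin’s actual implementation:

```python
# Minimal sketch: top tweeters and tweets-per-hour from an archive CSV export.
import pandas as pd

tweets = pd.read_csv("moocmooc_archive.csv", parse_dates=["created_at"])

print("Top tweeters:")
print(tweets["from_user"].value_counts().head(10))

print("\nTweets by hour of day:")
print(tweets["created_at"].dt.hour.value_counts().sort_index())
```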

From this archive you can also use another one of Martin’s templates to create a visualisation of the interactions of this #hashtag network.

Which always looks impressive, and does give you a sense of the “massive” part of a MOOC, but is quite hard to actually make any real sense of ;-)

However, Martin is not one to rest on his SNA/data science laurels, and his latest addition, a searchable twitter archive, I feel was much more useful from a learner’s (and actually an instructor’s) perspective.

Again it has time/level of tweets information, this time clearly presented at the top of the sheet. You can search by keyword and/or twitter handle. A really useful way to find those tweets you forgot to favourite! Again here is a screenshot just as a taster, but try it out to get the full sense of it.

#moocmooc searchable twitter archive
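And a similarly hedged sketch of the searchable-archive idea – again the column names are assumed, and this only illustrates the filtering, it isn’t Martin’s template:

```python
# Minimal sketch: filter an assumed archive CSV by keyword and/or handle.
import pandas as pd

tweets = pd.read_csv("moocmooc_archive.csv", parse_dates=["created_at"])

def search(frame, keyword=None, handle=None):
    hits = frame
    if keyword:
        hits = hits[hits["text"].str.contains(keyword, case=False, na=False)]
    if handle:
        hits = hits[hits["from_user"].str.lower() == handle.lower()]
    return hits[["created_at", "from_user", "text"]]

print(search(tweets, keyword="survival kit"))   # tweets mentioning a phrase
print(search(tweets, handle="mhawksey"))        # tweets from one account
```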

Also, from an instructor/course design point of view, both of these templates let you see time patterns emerging, which could be very useful for a number of reasons – not least managing your own time and knowing when to interact to connect with the largest number of learners.

Another related point about timing relates to the use of free services such as Storify. Despite us all being “self directed and motivated”, it’s highly likely that if an assignment is due in at 6pm, then at 5.50 the service is going to be pretty overloaded. Now this might not be a problem, but it could be, and so it is worth bearing in mind when designing courses and suggesting submission times and guidance for students.

I also made a concerted effort to blog each day about my experiences, and I was able to use another one of Martin’s templates – social sharing – to track the sharing of my blog posts on various sites. I don’t have a huge blog readership, but I was pleased to see that I was getting a few more people reading my posts. But what was really encouraging (as any blogger knows) was the fact that I was getting comments. I know I don’t need any software to let me know that, and in terms of engagement and participation, getting comments is really motivating. What is nice about this template is that it stores the comments and the number of other shares (and where they are), allowing you to get more of an idea of where and how your community are sharing resources. I could see my new #moocmooc community were engaging with my engagement – warm, cosy feelings all round!

So, through some easy to set up and share templates, I’ve been able to get a bit more of an insight into my activity, engagement and participation. MOOCs can be overwhelming, chaotic, disconcerting, and give learners many anxieties about being unconnected in the vast swirl of connectedness. A few analytics can help ease some of these anxieties, or at least give another set of tools to help make sense of, catch up on, and reflect on what is happening.

For more thoughts on my experiences of the week you can read my other posts.

*Day 1 To MOOC or not to MOOC?
*Day 2 Places where learning takes place
*Day 3 Massive Participation but no-one to talk to
*Day 4 Moocmooc day 4
*Day 5 Designing a MOOC – moocmooc day 5

Some useful resources around learning analytics
http://blogs.cetis.org.uk/sheilamacneill/2012/05/14/some-useful-resources-around-learning-analytics/ (Mon, 14 May 2012)

As I’ve mentioned before, and as highlighted in a number of recent posts by my colleagues Adam Cooper and Martin Hawksey, CETIS is undertaking some work around analytics in education which we are calling our Analytics Reconnoitre.

In addition to my recent posts from the LAK12 conference, I thought it would be useful to highlight the growing number of resources that our colleagues at Educause have been producing around learning analytics. A series of briefing papers and webinars is available covering a range of issues across the domain. For those of you not so familiar with the area, a good starting point is the “Analytics in Education: Establishing a Common Language” paper, which gives a very clear outline of a range of terms being used in the domain and how they relate to teaching and learning.

For those of you who want to delve a bit deeper the resource page also links to the excellent “The State of Learning Analytics in 2012: A Review and Future Challenges” report by Rebecca Ferguson, from the OU’s KMI, which gives a comprehensive overview of the domain.

5 things from LAK12
http://blogs.cetis.org.uk/sheilamacneill/2012/05/09/5-things-from-lak12/ (Wed, 09 May 2012)

@sheilmcn I’d love to know the 5 most interesting things you learned from #lak12 today:)

— Brandon Muramatsu (@bmuramatsu) May 2, 2012

Following that challenge, I’m going to try and summarise my experiences and reflections on the recent LAK12 conference in the five areas that seemed to resonate with me over the 4 days of the conference (including the pre-conference workshop day), which are: research, vendors, assessment, ethics and students.

Research
Learning Analytics is a newly emerging research domain. This was only the second LAK conference, and to an extent the focus of the conference was on trying to establish and benchmark the domain. Abelardo has summarised this aspect of the conference far better than I could. Although I went to the conference with an open mind, and didn’t have set expectations, I was struck by the research focus of the papers and the lack of large(r) scale implementations. Perhaps this is due to the ‘buzzy-ness’ of the term learning analytics just now (more on that in the vendor section of this post) – and is not meant in any way as a criticism of the conference or the quality of the papers, both of which were excellent. On reflection I think that the pre-conference workshops gave more of an opportunity for discussion than the traditional paper presentation with short Q&A format which the conference followed. Perhaps for LAK13 a mix of presentation formats might be included. With any domain which hopes to impact on teaching and learning there are difficulties bridging the research and practice divide, and personally I find workshops give more opportunity for discussion. That said, I did see a lot of interesting presentations which did have potential, including a reintroduction to SNAPP, which Lori Lockyer and Shane Dawson presented at the Learning Analytics meets Learning Design workshop; a number of very interesting presentations from the OU on various aspects of their work in researching and now applying analytics; the Mirror project, an EU funded work based learning project which includes a range of digital, physical and emotional analytics; and the GLASS system presented by Derek Leony (Carlos III, Madrid), to name just a few.

George Siemens presented his vision(s) for the domain in his keynote (this was the first keynote I have seen where the presenter’s ideas were shared openly during the presentation – such a great example of openness in practice). There was also an informative panel session around the differences and potential synergies with the Educational Data Mining community. SoLAR (the Society for Learning Analytics Research) is planning a series of events to continue these discussions and the scoping of the domain, and we at CETIS will be involved in helping with a UK event later this year.

Vendors
There were lots of vendors around. I didn’t get any impression of any kind of hard sell, but every educational tool, be it LMS/VLE/CMS, now has a very large, shiny new analytics badge on it – even if what is being offered is actually the same as before, just with parts re-labelled. I’m not sure how much (or any) of the forward thinking research that was presented will filter down into large scale tools, but I guess that’s an answer in itself for the need for research in this area – so we in the education community can be informed and ask questions challenging the vendors and the systems they present. I was impressed with a (new to me) system called Canvas analytics which colleagues from the community college sector in Washington State briefly showed me. It seems to allow flexibility and customisation of features and UI, is cloud based and so has a more distributed architecture, has CC licensing built in, and offers a crowd sourced feature request facility.

With so many potential sources of data it is crucial that systems are flexible and can pull and push data out to a variety of end points. This allows users – both at the institutional back end and the UI end – flexibility over what they use. CETIS have been supporting JISC to explore notions of flexible provision through a number of programmes including DVLE.

Lori Lockyer made a timely reflection on the development of learning design, drawing parallels with learning analytics. This made me immediately think of the slight misnomer of learning design, which in many cases was actually more about teaching design. With learning analytics there are similar parallels, but what also crossed my mind on more than one occasion was the notion of marketing analytics as a key driver in this space. This was probably more noticeable due to the North American slant of the conference. But I was once again struck by the differences in approaches to marketing to students in North America and the UK. Universities and colleges in the US have relatively huge marketing budgets compared to us; they need to get students into their classes and keep them there. Having a system, or integrated systems, which manage retention numbers – the more business intelligence end of the analytics spectrum, if you like – could gain traction far more quickly than ones that are exploring the much harder to qualify effective learning analytics. Could this lead us into a similar situation to VLEs/LMSs, where there was a perceived need to have one (“everyone else has got one”) and vendors sold the sector something which kind of looked like it did the job? Given my comments earlier about the flexibility and pervasiveness of web services, I hope not, but some dark thoughts did cross my mind and I was drawn back to Gardner Campbell’s presentation questioning some of the narrow definitions of learning analytics.

Assessment
It’s still the bottom line, and the key driver for most educational systems, and in turn for analytics about those systems. Improving assessment numbers gets senior management attention. The Signals project at Purdue is one of the leading lights in the domain of learning analytics, and John Campbell and the team there have done, and continue to do, an excellent job of gathering data (mainly from their LMS) and feeding it back to students in ways that do have an impact. But again, going back to Gardner Campbell’s presentation, learning analytics as a research domain is not just about assessment. So I was heartened to see lots of references to the potential for analytics to be used in terms of measuring competencies, which I think could have potential for students as it might help to contextualise existing and newly developing competencies, and allow some more flexible approaches to the recognition of competencies to be developed. More opportunities to explore the context of learning and not just sell the content? Again, relating back to the role of vendors, I was reminded of how content driven the North American system is. Vendors are increasingly offering competitive alternatives for elective courses with accreditation, as well as OERs (and of course collecting the data). In terms of wider systems, I’m sure that an end to end analytics system with content and assessment all bundled in is not that far off being offered, if it isn’t already.

Ethics
Data and ethics: collect one and ignore the other at your peril! My first workshop was one run by Sharon Slade and Finella Gaphin from the OU and, I have to say, I think it was a great start to the whole week (not just because we got to play snakes and ladders), as ethics and our approaches to them underpin all the activity in this area. Most attention just now is focusing on issues of privacy, but there are a host of other issues including:
*power – who gets to decide what is done with the data?
*rights – does everyone have the same rights to use data? Who can mine data for other purposes?
*ownership – do students own their data? What are the consequences of opt outs?
*responsibility – is there shared responsibility between institutions and students?

Doug Clow live blogged the workshop if you want more detailed information, and it is hoped that a basis for a code of conduct can be developed from the session.

Students
Last, but certainly not least, students. The silence of the student voice was at times deafening. At several points during the conference, particularly during the panel session on Building Organisational Capacity by Linda Baer and Dan Norris, I felt a growing concern about things being done “to” and not “with” students. Linda and Dan are conducting some insightful research into organisational capacity building and have already interviewed many (North American) institutions and vendors, but there was very little mention of students. If learning analytics are going to really impact on learning and help transform pedagogical approaches, then shouldn’t we be talking to the students about them? What really works for them? Are they aware of what data is being collected about them? Are they willing to let more data from informal sources, e.g. Facebook, 4square etc, be used in the context of learning analytics? Are they aware of their data exhaust? As well as these issues, Simon Buckingham Shum made the very pertinent point that if students were given access to their data, would they actually be able to do anything with it?

And also, if we are collecting data about students, shouldn’t we also be collecting similar data about teaching staff?

I don’t want to add yet another literacy to the seemingly never-ending list, but this does tie in with the wider context of digital literacy development. Sense making of data and visualisations is key if learning analytics is to gain traction in practice, and it’s not just students who are falling short – it’s probably all of us. I saw lots of “pretty pictures” in terms of network visualisations, potential dashboard views, etc over the week – but did I really understand them? Do I have the necessary skills to properly decode and make sense of them? Sometimes, but not all the time. I think visualisations should come with a big question mark symbol attached or overlaid – they should always raise questions. At the moment I don’t think enough people have the skills to be able to confidently question them.

Overall it was a very thought provoking week, with too much to include in one post, but if you have a chance take a look at Katy Borner’s keynote, Visual Analytics in Support of Education, one of my highlights.

So, thanks to all the organisers for creating such a great atmosphere for sharing and learning. I’m looking forward to LAK13, to what advances will be made in the coming year, and to whether a European location will bring a different slant to the conference.

LAK12 Useful links and resources
http://blogs.cetis.org.uk/sheilamacneill/2012/05/03/lak12-useful-links-and-resources/ (Thu, 03 May 2012)

There has been a huge amount of activity at this year’s LAK conference. I’m still cogitating about the issues raised and will post my reflections over the next few days. However, in the meantime there were a number of really interesting tools and resources which were presented and which are available from this Diigo site George Siemens has set up.

Doug Clow has been doing a splendid (and quite awe-inspiring) job of live blogging and has summary links to resources and his posts here. Myles Danson has also done some useful live blog posts from sessions too. We also have some really useful twitter activity summaries from Tony Hirst and Martin Hawksey.

*Update – Audrey Watters’ review of the conference.

And just in case you missed them :-) below is a timeline view of my collected tweets and a few pictures from the past few days.

LAK12 Pre conference workshop quick overview
http://blogs.cetis.org.uk/sheilamacneill/2012/04/30/lak12-pre-conference-workshop-quick-overview/ (Mon, 30 Apr 2012)

I’ve had a very informative and stimulating day at the pre-conference workshops for the LAK12 conference. This is just a very quick post with links to some great summaries and resources that people have contributed.

*Learning Analytics and Ethics live blog summary from Doug Clow (thanks Doug, you truly are a conference reporting machine!)

*Learning Analytics and Linked Data collective google doc – various contributors.

There has also been quite a bit of twitter activity, and Tony Hirst was quick off the mark to visualise the connections. Martin Hawksey has also produced an alternative visualisation based on the twitter archive I set up last week; and here’s another summary view from Tweetlevel.

I’ll hopefully do some more considered posts myself during the week. Based on today’s sessions this is shaping up to be a great conference.

Learning Analytics, where do you stand?
http://blogs.cetis.org.uk/sheilamacneill/2012/03/09/learning-analytics-where-do-you-stand/ (Fri, 09 Mar 2012)

For? Against? Not bovvered? Don’t understand the question?

The term learning analytics is certainly trending in all the right ways on all the horizon scans. As with many “new” terms, there are still some misconceptions about what it actually is, or perhaps more accurately what it actually encompasses. For example, whilst talking with colleagues from the SURF Foundation earlier this week, they mentioned the “issues around using data to improve student retention” session at the CETIS conference. SURF have just funded a learning analytics programme of work which closely matches many of the examples and issues shared and discussed there. They were quite surprised that the session hadn’t been called “learning analytics”. Student retention is indeed a part of learning analytics, but not the only part.

However, back to my original question and the prompt for it. I’ve just caught up with the presentation Gardner Campbell gave to the LAK12 MOOC last week, titled “Here I Stand”, in which he presents a very compelling argument against some of the trends which are beginning to emerge in the field of learning analytics.

Gardner is concerned that there is a danger that the more reductive models of analytics may actually force us backwards in our models of teaching and learning. He draws an analogy between M theory – in particular Stephen Hawking’s description of there not being one M theory but a “family of theories” – and how knowledge and learning actually occur. He is concerned that current learning analytics systems are based too much on “the math” and don’t actually show the human side of learning and the bigger picture of human interaction and knowledge transfer. As he pointed out, “student success is not the same as success as a student”.

Some of the rubrics we might be tempted to use (and in some cases already are using) to build learning analytics systems reduce the educational experience to a simplistic management model. Typically, systems are looking for signs pointing to failure, and not for the key moments of success in learning. What we should be working towards are system(s) that are adaptive, allow for reflection and can learn themselves.

This did make me think of the presentation at FOFE11 from IBM about their learning analytics system, which certainly scared the life out of me and many others I’ve spoken to. It also raised a lot of questions from the audience (and the twitter backchannel) about the educational value of the experience of failure. At the same time I was reflecting on the whole terminology issue again. Common understandings – why are they so difficult in education? When learning design was the “in thing”, I think it was John Casey who pointed out that what we were actually talking about most of the time was actually “teaching design”. Are we in danger of the same thing happening here, with the learning side of learning analytics being hi-jacked by narrower, or perhaps to be fairer, more tightly defined management and accountability driven analytics?

To try and mitigate this we need to ensure that all key stakeholders are starting to ask (and answer) the questions Gardner raised in his presentation. What are the really useful “analytics” which can help me as a learner, teacher, administrator, etc? Which systems provide that data just now? How can/do these stakeholders access and share the data in meaningful ways? How can we improve and build on these systems in ways which take into account the complexity of learning? Or, as Gardner said, how can we start framing systems and questions around wisdom? But before we can do any of that, we need to make sure that our stakeholders are informed enough to take a stand, and not just have to accept whatever system they are given.

At CETIS we are about to embark on an analytics landscape study, which we are calling an Analytics Reconnoitre. We are going to look at the field of learning analytics from a holistic perspective, review recent work and (hopefully) produce some pragmatic briefings on the who, where, why, what and when of learning analytics, and point to useful resources and real world examples. This will build on and complement work already funded by JISC, such as the Relationship Management Programme, the Business Intelligence Infokit and the Activity Data Programme synthesis. We’ll also be looking to emerging communities of practice, both here in the UK and internationally, to join up on thinking and future developments. Hopefully this work will contribute to the growing body of knowledge and experience in the field of learning analytics, as well as raising some key questions (and hopefully some answers) around its many facets.
