Quick review of the Larnaca Learning Design Declaration

Late last month the Larnaca Declaration on Learning Design was published. Being “that time of year” I didn’t get round to blogging about it at the time. However, as it’s the new year and as the OLDS MOOC is starting this week, I thought it would be timely to have a quick review of the declaration.

The Wordle below gives a flavour of the emphasis of the text.

Wordle of Larnaca Declaration on Learning Design
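
If you fancy reproducing something similar, here is a minimal sketch using the third-party wordcloud Python package; the input file name is hypothetical, and any plain-text extract of the Declaration would do.

```python
from wordcloud import WordCloud

# Hypothetical plain-text extract of the Declaration.
with open("larnaca_declaration.txt") as f:
    text = f.read()

# Common English stop words are stripped by default,
# so the domain terms dominate the image.
wc = WordCloud(width=800, height=400, background_color="white")
wc.generate(text)
wc.to_file("larnaca_wordle.png")
```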

First off, it’s actually more of a descriptive paper on the development of research into learning design, rather than a set of statements declaring intent or a call for action. As such, it is quite a substantial document. Setting the context and sharing the outcomes of over 10 years’ worth of research is very useful, and for anyone interested in this area I would say it is definitely worth taking the time to read it. And even for an “old hand” like me it was useful to recap on some of the background and core concepts. It states:

“This paper describes how ongoing work to develop a descriptive language for teaching and learning activities (often including the use of technology) is changing the way educators think about planning and facilitating educational activities. The ultimate goal of Learning Design is to convey great teaching ideas among educators in order to improve student learning.”

One of my main areas of involvement with learning design has been around interoperability and the sharing of designs. Although the IMS Learning Design specification offered great promise of technical interoperability, there were a number of barriers to implementing the full potential of the specification, and indeed expectations of what the spec actually did were somewhat over-inflated, something I reflected on way back in 2009. However, sharing of design practice, and of designs themselves, has developed, and this is something we at CETIS have tried to promote and move forward through our work in the JISC Design for Learning Programme (in particular with our mapping of designs report), the JISC Curriculum Design and Delivery Programmes, and our Design Bashes: 2009, 2010, 2011. I was very pleased to see the Design Bashes included in the timeline of developments in the paper.

James Dalziel and the LAMS team have continually shown how designs can be easily built, run, shared and adapted. However, having one common language or notation system is still a goal for the field. During the past few years, though, much of the work has concentrated on understanding the design process and on helping teachers find effective tools (online and offline) to develop new(er) approaches to teaching practice, and to share those with the wider community. The Viewpoints, LDSE and OULDI projects are all good examples of this work.

The declaration uses the analogy of the development of musical notation to explain the need for, and aspirations of, a design language which can be used to share and reproduce ideas, or in this case lessons. Whilst still a conceptual idea, this may be one of the closest analogies to a universally understood notation. Developing such a notation system is still a challenge, as the paper highlights.

The declaration also introduces a Learning Design Conceptual Map which tries to “capture the broader education landscape and how it relates to the core concepts of Learning Design”.

Learning Design Conceptual Map

These concepts include pedagogic neutrality, pedagogic approaches/theories and methodologies, the teaching lifecycle, granularity of designs, guidance and sharing. The paper puts forward these core concepts as providing the foundations of a framework for learning design which, combined with the conceptual map and actual practice, provides a “new synthesis for the field of learning design” and future developments.

Components of the field of Learning Design

So what next? The link between learning analytics and learning design was highlighted at the recent UK SoLAR Flare meeting. Will having more data about interactions/networks help develop design processes and ultimately improve the learning experience for students? What about the link with OERs? Content always needs context, and using OERs effectively intrinsically means having effective learning designs, so maybe now is a good time for the OER community to engage more with the learning design community.

The Declaration is a very useful summary of where the Learning Design community is to date, but what is always needed is more time for practising teachers to explore these ideas and start engaging with the research community and the tools and methodologies it has been developing. The Declaration alone cannot do this, but it might act as a stimulus for existing and future developments. I’d also be up for running another Design Bash if there is enough interest – let me know in the comments if you are interested.

The OLDS MOOC is another great opportunity for future development too, and I’m looking forward to engaging with it over the next few weeks.

Some other useful resources:
*Learning Design Network Facebook page
*PDF version of the Declaration
*CETIS resources on curriculum and learning design
*JISC Design Studio

Institutional Readiness for Analytics – practice and policy

So far in our Analytics Series we have been setting out the background, history and context of analytics in education at fairly broad and high levels. Developing policy and getting strategic buy-in is critical for any successful project (analytics based or not), so we have tried to highlight issues which will be of use to senior management in terms of the broader context and value of analytics approaches.

Simon Buckingham Shum at the OU (a key figure in the world of learning analytics) has also just produced a Learning Analytics Policy Brief for the UNESCO Institute for Information Technologies in Education. Focussing specifically on learning analytics, Simon’s paper highlights a number of key issues around “the limits of computational modelling, the ethics of analytics, and the educational paradigms that learning analytics promote”. It is another welcome addition to the growing literature on learning analytics and a useful complementary resource to the CETIS series; I would recommend it to anyone interested in this area.

Moving from policy to practicalities is the focus of our next paper, Institutional Readiness for Analytics. Written by Stephen Powell (with a little bit of input from me), this paper drills down from policy level decisions to the more pragmatic issues faced by staff in institutions who want to start to make some sense of their data through analytics based techniques. It presents two short case studies (from the University of Bolton and the Open University) outlining the different approaches each institution has taken to try to make more sense of the data they have access to, and how that can begin to make an impact on key decisions around teaching, learning and administrative processes.

The OU is probably slightly “ahead of the game” in terms of data collection and provisioning and so their case study focuses more on staff development issues through their Data Wrangler Project, whereas the University of Bolton case study looks more at how they are approaching data provisioning issues. As the paper states, although the two approaches are very different “they should be considered as interrelated with each informing the work of the other in a process of experimentation leading to the development of practices and techniques that meet the needs of the organisation.”

As ever if you have thoughts or any experiences of using analytics approaches in your institution, we’d love to hear from you in the comments.

The paper is available for download here, and the other papers in the series are available here.

Exploring Digital Futures

One of the most enjoyable parts of the programme support side of my job is that I get to find out about a lot of really innovative work taking place across a diverse range of UK universities. On the flip side, I do sometimes yearn to be part of the development of projects instead of always being on the outside looking in once plans have been made and funding secured. I also often wonder whether anything I write about in my blog actually makes any difference or is useful to the wider community.

So I was delighted yesterday to spend the afternoon at Edinburgh Napier University at an internal seminar exploring their digital future and technological ambitions. I was even more delighted a couple of weeks ago when Keith Smyth contacted me about attending the event and said that the series of blog posts I wrote with my Strathclyde colleague Bill Johnston on the Digital University had been really useful and timely for Napier in terms of starting to think about how to develop their approach to a digital strategy.

Yesterday’s seminar was an opportunity for staff from across the institution to come together and share their experiences and views on what their real needs and aspirations are in terms of the future (digital) shape of the university. Napier are already involved in a number of innovative projects internally, and are committed to open practice, particularly in regard to their work in learning technology. For example, their 3E Framework for effective use of technology in teaching and learning is available under a CC licence and is being used/adapted by over 20 institutions worldwide, who have all agreed to share their adaptations. A great example of how open practice can not only improve internal working practices but also help community knowledge grow in an open, shareable way too. The framework is also linked to a resource bank, with examples of the framework in action, which again is openly available.

As at many institutions, podcasting is a growing trend: their College2Uni podcasts, originally designed to help students transition from college to university, are now being used for wider community driven information sharing initiatives. Plans for an open access journal are also well underway.

But what/where next? What should the long, medium and short term goals for the institution be? Participants were asked to consider “what will today’s ten year olds expect when they come to university in 2020?” Delegates were divided into six groups to set short term (i.e. achievable within a year) as well as longer term aspirational goals. The six themes were:

*Developing digital literacies
*Digital equivalence
*Digitally enhanced education
*Digital communication and outreach
*Digital scholarship
*Digital infrastructure and integration

Again, another wee ego boost was seeing how the matrix Bill and I developed provided a framework for the discussions and planning of the workshop.

MacNeill/Johnston conceptual matrix (revised, October 2012)

It was also a good opportunity for me to highlight work from a number of JISC programmes including Developing Digital Literacies, Assessment and Feedback, and Curriculum Design and Delivery and the growing number of resources from all these programmes which are available from the Design Studio.

There was genuine enthusiasm from all the delegates, and a number of suggestions for easily achievable short term goals, including single sign-on for all university accounts, more co-ordinated and easily accessible communication channels (for staff and students), experimenting with the layout of lecture spaces, and developing a more coherent strategy for mobile devices. Longer term goals were generally centred on ubiquitous access to information, continuous development of staff and student skills (including supporting open practices), ways to differentiate Napier, and how to take advantage of the affordances of the all-pervasive MOOCs and indeed the changing landscape of HE. Content may be more plentiful in 2020, but not everyone has the skills to take an MIT/Stanford/Everyotherbignameuniversity open course without support, and there are a lot of skills which we know employers are looking for which aren’t supported through these large scale distance models of education. The need for new spaces (both digital and physical) for experimentation and play for both staff and students was highlighted as a key way to support innovation. You can get a flavour of the discussion by searching the #digiednap archive.

The next steps for Napier are the forming of a working group to take forward the most popular ideas from the session (there was a bit of the old “dotmocracy”, with delegates voting for their preferred short term ideas) and work on more strategic developments over the coming year. I am really looking forward to working with colleagues at Napier as a critical friend to these developments, and to being part of a project from the outset and seeing first hand how it develops.

UK SoLAR meeting feedback

Last month, in collaboration with colleagues at the OU, we co-hosted the inaugural UK SoLAR Flare. A number of blogs, pictures and videos of the day are available on the SoLAR website.

This was the first meeting in the UK focusing on learning analytics and as such we had quite a broad cross section of attendees. We’ve issued a small survey to get some feedback from delegates, and many thanks to all the attendees who completed it. We had 20 responses in total and you can access collated results of the survey from here.

Overall, 100% of respondents found the day either very useful or useful, which is always a good sign, and bodes well for the beginnings of a new community of practice and future meetings.

The need for staff development and a range of new skills is increasingly being identified as necessary for successful analytics projects, and is an underlying theme of our current Analytics Series. The role of the Data Scientist is increasingly recognised as a key one, both in the “real” world and in academia. So what roles did our attendees have? Well, we did have one data scientist, but perhaps not that surprisingly the most common role was that of Learning Technologist, with 5 people. The full results were as follows:

*Learning technologist: 5
*Manager: 3
*Lecturer: 3
*Developer: 3
*Researcher: 3
*Data scientist: 1
*Other: 2 (other answers: “director/agile manager”, “sort of learning technologist but also training”)

So a fair spread of roles, which again bodes well for the development of teams with the skills needed to develop successful analytics projects.

We also asked attendees to share the main idea that they took away from the day. Below is a selection of responses.

“That people are in the early stages of discussion.”

“Learning analytics needs to reach out end-users”

“The overall idea was how many people are in the same position and that the field is in a very experimental stage. This improves the motivation to be experimental.”

“more a better understanding of the current status than a particular idea. But if I had to chose one idea it is the importance of engaging students in the process.”

“Early thoughts on how learning analytics could be used in the development of teaching staff.”

“That HE is on the cusp of something very exciting and possibly very enlightening regarding understanding the way students learn. BUT the institution as a whole needs to be commited to the process, and that meaningful analysis of the mass of potential data that is ‘out there’, is going to be critical. There is also the very important issues of ethics and who is going to do what with the data………I could go on, and on, and on…….”

Suggestions for further meetings included:

“It would be great to involve more academic teaching staff and students in future meetings.”

“I think bringing together the different stakeholders (technologists, teachers, students, data scientists, statisticians) is a great feature for this group. It is easy to break into silos and forget the real end-user. Having more students involved would be great.”

“An international project exchange. Have, say, 10 – 15 lightning talks. Then organise a poster session with posters corresponding to the lightning talks. People whose interest was drawn by one project or another will have the chance to follow up on that project for further information. Also maybe an expert panel (with people that have experience with putting learning analytics into educational practice) that can answer questions sent in beforehand by people wanting to set up a learning analytics project/activity. This can also be done Virtually”

“Would really welcome the opportunity to have a ‘hands on’ session possibly focussing upon the various dashboards that are out there.”

You can access the full results at the SoLAR website.

Analytics for Understanding Research

After a bit of exploration of the history, meanings and definitions of analytics from Adam Cooper, today our Analytics Series continues with the Analytics for Understanding Research paper (by Mark Van Harmelen).

Research and research management are key concerns for Higher Education, and indeed the wider economy. The sector needs to ensure it is developing, managing, and sharing research capacity, capabilities, reputation and impact as effectively and efficiently as possible.

The use of analytics platforms has the potential to impact all aspects of research practice: from the individual researcher sharing and measuring their performance, to institutional management and planning of research projects, to funders making decisions about funding areas.

The “Analytics for Understanding Research” paper focuses on analytics as applied to “the process of research, to research results and to the measurement of research.” The paper highlights exemplar systems, metrics and analytic techniques backed by evidence in academic research, the challenges in using them and future directions for research. It points to the need for the support and development of high quality, timely data for researchers to experiment with in terms of measuring and sharing their reputation and impact, and the wider adoption of platforms which utilise publicly available (and funded) data to inform and justify research investment.

Some key risks in the use of analytics to understand research, as highlighted in the paper, are:

*Use of bibliometric indicators as the sole measure of research impact, or over-reliance on metrics without any understanding of the context and nature of the research (see the sketch after this list for how reductive a single indicator can be).
*Lack of understanding of analytics and advantages and disadvantages of different indicators on the part of users of those indicators. Managers and decision makers may lack the background needed to interpret existing analytics sensitively.
*The suitability of target-based assessment based on analytics is unproven; the paper tentatively recommends a wider assessment approach (in most detail on page 29).
*There is a danger of one or a few vendors supplying systems that impose a particular view of analytics on research management data.
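
To make the first risk in the list above concrete, here is a minimal sketch of the h-index, probably the best-known bibliometric indicator: an author has index h if h of their papers have at least h citations each. Note how two very different citation profiles can produce similar scores, which is exactly why a single number should never be the sole measure of impact.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Two quite different citation profiles, similar h-indices:
print(h_index([10, 8, 5, 4, 3]))   # steady profile -> 4
print(h_index([90, 70, 3, 2, 1]))  # two blockbusters -> 3
```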

However, it also points to some key opportunities, including:

*Access to high-quality timely analytics may enable professionals to gauge their short-term performance, and use experimentation to discover new and novel ways to boost their impact.
*Adoption of CERIF-based CRIS across UK HE institutions and research institutes, with automatic retrieval of public data by UK Research Councils may help motivate increases in public funding of scientific and other scholarly activity; vitally important to the UK economy and national economic growth.
*Training as to the advantages, limitations and applicability of analytics may assist in the effective use of analytics by its lay users, including researchers, research managers, and those responsible for policy and direction in institutions and beyond.

As ever, if you have any thoughts or experiences you’d like to share, please do so in the comments.

The paper is available to download here.

The papers published to date in the series are all available here.

Legal, Risk and Ethical Aspects of Analytics in Education

After some initial feedback on the CETIS Analytics Series, we’ve had a wee re-think of our publication schedule, and today we launch “Legal, Risk and Ethical Aspects of Analytics in Education”, written by David Kay (Sero Consulting), Naomi Korn and Professor Charles Oppenheim.

As all researchers are only too well aware, any practice involving data collection and reuse has inherent ethical and legal implications of which institutions must be cognisant. Most institutions have guidelines and policies in place for the collection and use of research data. However, the gathering of usage data, primarily from internal systems, is an area where it is less commonplace for institutions to have legal and ethical guidelines in place. As with many developments in technology, the law has not kept pace.

The “Legal, Risk and Ethical Aspects of Analytics in Higher Education” paper provides a concise overview of legal and ethical concerns in relation to analytics in education. It outlines a number of legal factors which impinge on analytics for education, in particular:

* Data Protection
* Confidentiality & Consent
* Freedom of Information
* Intellectual Property Rights
* Licensing for Reuse.

The paper also recommends a set of common principles which have universal application.

*Clarity; open definition of purpose, scope and boundaries, even if that is broad and in some respects open-ended,

*Comfort & care; consideration for both the interests and the feelings of the data subject and vigilance regarding exceptional cases,

*Choice & consent; informed individual opportunity to opt-out or opt-in,

*Consequence & complaint; recognition that there may be unforeseen consequences and therefore provision of mechanisms for redress.

Being aware of the legal and ethical implications of any activity requiring data collection is fundamental before undertaking any form of data analysis activity, and we hope this paper will be of use in helping inform and develop practice. As ever, if you have any comments/ examples please use the comments section to share them with us.

The paper is available to download here.

The papers published so far in the series are:

*Analytics, What is Changing and Why does it Matter?
*Analytics for the Whole Institution; Balancing Strategy and Tactics
*Analytics for Learning and Teaching

Analytics for Teaching and Learning

It’s all been about learning analytics for me this week. Following the SoLAR UK meeting on Monday, I’m delighted to announce that the next paper in the CETIS Analytics Series, “Analytics for Teaching and Learning”, launches today.

Building on “Analytics for the Whole Institution; Balancing Strategy and Tactics”, this paper (written by Mark Van Harmelen and David Workman) takes a more in-depth look at issues specifically related to applying analytics in teaching and learning.

The Analytics for Teaching and Learning paper examines:

“the use of analytics in education with a bias towards providing information that may help decision makers in thinking about analytics in their institutions. Our focus is pragmatic in providing a guide for this purpose: we concentrate on illustrating uses of analytics in education and on the process of adoption, including a short guide to risks associated with analytics.”

Learning analytics is an emerging field of research and holds many promises of improving engagement and learning. I’ve been following developments with interest and, I hope, a healthy level of scepticism and optimism. A number of VLEs (or LMSs if you’re in North America) are now shipping with built-in analytics features, aka dashboards. However, as I pointed out in the “Analytics, what is changing and why does it matter?” paper, there really isn’t a “magic analytics” button which will suddenly create instantly engaged students and better results. Effective use and sense-making of any data requires careful consideration. You need to think very carefully about the question(s) you want the data to help you answer, and then ensure that results are shared with staff and students in ways that allow them to gain “actionable insights”. Inevitably, the more data you gather, the more questions you will ask. As Adam summarised in his “how to do analytics right” post, a simple start can be best. This view was echoed at discussions during the SoLAR meeting on Monday.
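
As a deliberately tiny illustration of such a simple start, the sketch below asks one narrow question of a hypothetical export of VLE login events (the file and column names are assumptions, not any particular VLE’s schema): which students have not logged in over the last fortnight?

```python
import pandas as pd

# Hypothetical VLE login export; file and column names are assumptions.
logins = pd.read_csv("vle_logins.csv", parse_dates=["login_time"])

# One narrow question: who has not logged in during the last 14 days?
cutoff = logins["login_time"].max() - pd.Timedelta(days=14)
last_seen = logins.groupby("student_id")["login_time"].max()
quiet = last_seen[last_seen < cutoff]

# The result is a prompt for a human conversation, not a verdict.
print(quiet.sort_values())
```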

Starting at a small scale, developing teams, sharing data in meaningful ways, and developing staff/student skills and literacies are all crucial to successful analytics projects. The need for people with skills in data handling and interpretation and, within education, pedagogic understanding is becoming more apparent. As the paper points out,

“There are a variety of success factors for analytics adoption. Many of them are more human and organisational in nature than technical. Leadership and organisational culture and skills matter a lot.”

Again if you have any thoughts/experiences to share, please feel free to leave a comment here.

The paper can be downloaded from here.

JISC Curriculum Design Programme Synthesis report now available

For the past four years I’ve been part of the support team for the JISC Curriculum Design Programme, and it has been a fascinating journey for everyone involved and has provided the basis for many a blog post here.  The final synthesis report for the programme is now available from the Design Studio.

Making sense of the varied findings of 12 projects over nearly 4 years is no mean feat, but Helen Beetham (with support from the rest of the team, particularly Gill Ferrell and Marianne Sheppard, and a little bit from me) has done a fantastic job. The report reviews the four main areas of investigation: improving curriculum processes, reforming course information, enhancing design practice and transforming organisations.

The main conclusions are:

*More transparent processes with shared, accessible representations of the curriculum can support better stakeholder engagement in curriculum design.
*More efficient processes can save considerable administrative staff time, and may free up curriculum teams to focus on educational rather than administrative concerns.
*A focus on the design process rather than its outcomes allows both for lighter-weight approval events and a shorter review cycle with more opportunity for continuous enhancement.
*A single, trusted source of course information can be achieved through a centralised academic database, but similar benefits can be gained through enhancing the functions, interfaces and interoperability of existing systems.
*Trusted, relevant, timely information can support educational decision making by curriculum teams.
*Better managed course information also has benefits for students in terms of course/module selection, access to up-to-date information, and parity of experience.
*Better managed information allows institutions to analyse the performance of their course portfolio as well as meeting external reporting requirements.
*Curriculum design practices can be enhanced through face-to-face workshops with access to resources and guidance.
*Particularly effective resources include concise statements of educational principle with brief examples; and tools/resources for visualising the learning process, e.g. as a storyboard or timeline, or as a balance of learning/assessment activities.
*With better quality guidance and information available, curriculum teams can build credible benefit/business cases and respond more effectively to organisational priorities.
 
I would thoroughly recommend reading the full report to anyone who is involved in any kind of curriculum design activity.

The report does signify the end of the programme, but plans are in place to ensure that the lessons learnt continue to be shared with the wider community. A number of openly available resources from the programme will be released over the coming months, including an info-kit style resource looking at business processes and curriculum information, and a resource pack including a number of tools and techniques developed by the projects for course development.

The Design Studio itself continues to grow with inputs from the Assessment and Feedback and Developing Digital Literacies Programmes. 

Quick links from SoLAR Flare meeting

So we lit the UK SoLAR Flare in Milton Keynes yesterday, and I think it is going to burn brightly for some time. This post is just a quick round up of some links to discussions/blogs/tweets and pics produced over the day.

Overviews of the presentations and discussions were captured by some live blogging from Myles Danson (JISC Programme Manager for our Analytics Series) and Doug (master of the live blog) Clow of the OU.

Great overview of the day – thanks guys!

And of course we have some Twitter analytics, thanks to our very own Martin Hawksey’s TAGS archive for #FlareUK and the obligatory network diagram of the Twitter stream (click the image to see a larger, interactive version).

#FlareUK hashtag user community network
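
For the curious, a network like this can be rebuilt from the archive in a few lines of Python. The sketch below assumes a TAGS-style CSV export with from_user and text columns (the column names are an assumption) and uses pandas and networkx to build the @-mention network.

```python
import re
import pandas as pd
import networkx as nx

# Assumed: a TAGS-style CSV export of the #FlareUK stream, with
# 'from_user' and 'text' columns (column names are an assumption).
tweets = pd.read_csv("flareuk_archive.csv")

G = nx.DiGraph()
for _, row in tweets.iterrows():
    sender = str(row["from_user"]).lower()
    # Every @mention in the tweet text becomes a directed edge.
    for mention in re.findall(r"@(\w+)", str(row["text"])):
        G.add_edge(sender, mention.lower())

# The most-mentioned accounts sit at the centre of the diagram.
print(sorted(G.in_degree(), key=lambda x: x[1], reverse=True)[:10])
```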

Slides from the morning presentations and subsequent group discussions are available from the SoLAR website, and videos of the morning presentations will be available there soon too.

As a taster of the day, here’s a little video of what went on.

Analytics for the Whole Institution; Balancing Strategy and Tactics

Following on from last week’s introductory and overview briefing paper, “Analytics, What is Changing and Why does it Matter?”, this week we start to publish the rest of our series, beginning with “Analytics for the Whole Institution; Balancing Strategy and Tactics” (by David Kay and Mark van Harmelen).

Institutional data collection and analysis is not new, and most Higher Education Institutions and Further Education Colleges do routinely collect data for a range of purposes; many are already using Business Intelligence (BI) as part of their IT infrastructure.

This paper takes an in-depth look at some of the issues which “pose questions about how business intelligence and the science of analytics should be put to use in customer facing enterprises”.

The focus is not on specific technologies, rather on how best to act upon the potential of analytics and new ways of thinking about collecting, sharing and reusing data to enable high value gains in terms of business objectives across an organisation.

There are a number of additional considerations when trying to align BI solutions with some of the newer approaches now available for applying analytics across an organisation. For example, it is not uncommon for there to be a disconnect between data gathered from centrally managed systems and from specific teaching and learning systems such as VLEs. So at a strategic level, decisions need to be taken about overall data management, sharing and re-use, e.g. which systems hold the most useful/valuable data? What formats is it available in? Who has access to the data, and how can it be used to develop actionable insights? To paraphrase from a presentation I gave with my colleague Adam Cooper last week, “how data ready and capable is your organisation?”, both in terms of people and systems.
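
As a rough sketch of what bridging that disconnect can look like in practice (all file and column names below are hypothetical), the snippet joins a central student-record extract to aggregated VLE activity so that both views can be queried together.

```python
import pandas as pd

# Hypothetical extracts; file and column names are assumptions.
students = pd.read_csv("student_records.csv")  # student_id, programme, year
vle_log = pd.read_csv("vle_activity.csv")      # student_id, timestamp, action

# Collapse raw VLE events into one engagement figure per student...
activity = (vle_log.groupby("student_id").size()
            .reset_index(name="vle_events"))

# ...then join it onto the central record so both views sit together.
combined = students.merge(activity, on="student_id", how="left")
combined["vle_events"] = combined["vle_events"].fillna(0)

# A first cut at an actionable view: average VLE activity by programme.
print(combined.groupby("programme")["vle_events"].mean())
```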

As well as data considerations, policies (both internal and external) need to be developed around the ethical use of data, and also around developing staff and the wider organisational culture towards data-informed practices. Of course, part of the answer to these issues lies in the sharing and development of practice through organisations such as JISC. The paper highlights a number of examples of JISC funded projects.

Although the paper concentrates mainly on HEIs, many of the same considerations are relevant to Further Education colleges. Again, we see this paper as a step towards widening participation and identifying areas for further work.

At an overview level the paper aims to:

*Characterise the educational data ecosystem, taking account of both institutional and individual needs
*Recognise the range of stakeholders and actors – institutions, services (including shared above-campus and contracted out), agencies, vendors
*Balance strategic policy approaches with tactical advances
*Highlight data that may or may not be collected
*Identify opportunities, issues and concerns arising

As ever we’d welcome feedback on any of the issues raised in the paper, and sharing of any experiences and thoughts in the comments.

The paper is available to download from here.