We have just published the results of an informal survey undertaken by Cetis to:
- Assess the current state of analytics in UK FE/HE.
- Identify the challenges and barriers to using analytics.
The report is available from the Cetis publications site.
For the purpose of the survey, we defined our use of “analytics” to be the process of developing actionable insights through problem definition and the application of statistical models and analysis to existing and/or simulated future data. In practical terms, it involves trying to find out things about an organisation, its products, services, and operations, to help inform decisions about what to do next.
Various domains of decision-making are encompassed in this definition: the kinds of decision readily understood by a business person, whatever their line of business; questions of an essentially educational character; and decisions relating to the management of research. The line of questioning was inclusive of all three perspectives. The questions asked were:
1. Which education sector do you work in (or with)?
2. What is your role in your institution?
3. In your institution which department(s) are leading institutional analytics activities and services?
4. In your institution, how aware are staff about recent developments in analytics?
5. Do the following roles use the results of statistical analysis such as correlation or significance testing rather than simple reporting of data in charts or tables?
6. Which of the following sources are used to supply data for analytics activities?
7. Which of the following data collection and analytics technologies are in place in your institution?
8. Please name the supplier/product of the principal software in use (e.g. IBM Cognos, SPSS, Tableau, Excel)
9. Which of the following staff capabilities are in place in your institution?
10a. What are the drivers for taking analytics based approaches in your institution?
10b. What are the current barriers to using analytics in your institution?
The informal nature of the survey, coupled with the small number of responses, means that the resulting data cannot be assumed to represent the true state of affairs. Hence, no “analytics” has been done on the data, and the report is written as a stimulus both for discussion and for more thorough investigation into some of the areas where the survey responses hint at an issue.
If you have any reactions – surprise, agreement, contention, etc. – or evidence that helps to build a better picture of the state of analytics, please comment.
[edit: I should have mentioned the EDUCAUSE/ECAR 2012 survey of US HE for comparison – http://www.educause.edu/library/resources/2012-ecar-study-analytics-higher-education]