Comments on: On the Question of Validity in Learning Analytics
http://blogs.cetis.org.uk/adam/2014/10/17/on-the-question-of-validity-in-learning-analytics/
Cetis Blogs

By: Our pick of Cetis posts 2014 | Christina Smart (Wed, 07 Jan 2015 09:19:39 +0000)
http://blogs.cetis.org.uk/adam/2014/10/17/on-the-question-of-validity-in-learning-analytics/#comment-97692
[…] On the Question of Validity in Learning Analytics […]

By: Adam Cooper (Tue, 06 Jan 2015 10:26:25 +0000)
http://blogs.cetis.org.uk/adam/2014/10/17/on-the-question-of-validity-in-learning-analytics/#comment-97258
Since this is cross-posted from the LACE site, some comments and ping-backs are missing; I would like to draw attention to a “response” post from Timothy Harfield at:
http://timothyharfield.com/blog/2014/10/27/against-dumbing-down-a-response-to-adam-coopers-on-the-question-of-validity-in-learning-analytics/

By: “Worth Reading” is a hand-picked weekly collection of new and not-so-new articles, ideas, events and other items for busy professionals in higher education who prefer to spend their reading time wisely. | Acrobatiq (Thu, 30 Oct 2014 22:01:11 +0000)
http://blogs.cetis.org.uk/adam/2014/10/17/on-the-question-of-validity-in-learning-analytics/#comment-37748
[…] On the Question of Validity in Learning Analytics […]

By: Howard (Mon, 27 Oct 2014 18:01:49 +0000)
http://blogs.cetis.org.uk/adam/2014/10/17/on-the-question-of-validity-in-learning-analytics/#comment-33092
A recent book I’ve been reading, “Validity in Educational and Psychological Assessment” (Newton & Shaw, 2014), takes a historical view of test validity that may be relevant. Their take is that validity is now heading toward being folded into overall practice evaluation, which merges a research-based approach with technical and social/ethical issues. Evaluation here includes making arguments about the achievement of measurement objectives, decision-making objectives and impact objectives.

By: Adam Cooper (Tue, 21 Oct 2014 13:37:57 +0000)
http://blogs.cetis.org.uk/adam/2014/10/17/on-the-question-of-validity-in-learning-analytics/#comment-22391
Mike –
Thanks for the comment. My remark that “learning analytics undertaken without validity being accounted for would be ethically questionable” was a little bit of comment-baiting…

I certainly acknowledge that we need to allow experimentation and some degree of risk-taking, but I think we also need to take a nuanced view of what is effective, sometimes recognising that “hard facts” about effectiveness may be misleading, irrelevant, or in need of revision. Sometimes “moving the needle” may be too stringent a requirement, or an inappropriate means of addressing change in a complex system; sometimes the qualitative effects of an LA initiative may justify it.

So, a less provocative formulation would be: “learning analytics undertaken at scale without considering what kind of evidence is appropriate, and what evidence actually exists.”

Cheers, Adam

By: Mike Sharkey (Mon, 20 Oct 2014 18:41:39 +0000)
http://blogs.cetis.org.uk/adam/2014/10/17/on-the-question-of-validity-in-learning-analytics/#comment-21023
Good points here, Adam. We need to have more dialogue about measuring the impact of analytics: that is, did it move the needle? As someone who recently moved from being a practitioner to being a vendor, I’ve seen this from multiple sides. I wouldn’t say that analytics without validity is ethically questionable; I just think it’s REALLY hard to undertake a project that proves efficacy. It takes many months to do, and it involves a lot of setup to frame the test correctly. Since these tend to be big obstacles, you find folks relying on a measure like accuracy, or using the output from one small pilot, as their evidence. It’s not unethical. Just because something doesn’t stand up to a rigorous challenge doesn’t mean it’s bad; it just means you need to be extra careful.

On the topic of validity, I have run a fairly rigorous experiment in the past (http://bluecanarydata.com/evidence), and I’m launching another with a new client this week. If all goes well, and if the client allows, I’ll share the outcomes in 6 months (yes… it’s going to take that long to see whether we’ve successfully moved the needle).

Thanks,

Mike Sharkey
Blue Canary
http://www.bluecanarydata.com
