
MakingBetter

Skills and Strategies to Engage, Change and Grow Others



Question-storming

April 29, 2021 By MakingBetter


“Question-storming” is an exercise I tapped into back in the very early xAPI days. I hit a wall, one I suspect most of y’all come up against when you’re ready to start doing something meaningful with xAPI. I struggled to articulate questions that felt like they made xAPI worth doing, and I would get discouraged about it.

[Photo: a headshot of Aaron sporting a thick “fu-manchu” mustache.]
Speaking of question-able choices, what about my mustache circa 2016?

When I look at what analytics solutions do (not data visualization tools – I’m talking web analytics), the analytics are geared towards defined funnels and targeted actions that map to KPIs. That’s a little too simplistic for what I plan to infer from the datasets y’all and I will build, together, with xAPI.

xAPI can provide feedback that is more attuned to how people think and relate information. It requires a different perspective on which questions are worth answering with data than the way analytics works almost everywhere else in technology. It’s no wonder we feel frustrated when we first set out to design, engineer or architect beyond replicating the patterns established in eLearning or video.

Waxing Philosophical

All the learning environments we create online, all the social media we use, the apps we build, the laptop or tablet or phone you’re reading this on… it’s all built on a system that, at its very metaphorical base, is one of files and folders. One might think of hard drives as file cabinets: nice, neat and tidy. There are rights to knowing what’s in a folder vs. only knowing about a specific file. There are ways to reference information precisely. This is important and useful stuff for referencing things technically.

The thing is that everything I read and learned and ultimately synthesized through my master’s program at the University of Wisconsin informs me that humans organize information in relation to people, to places, to shared experiences. These relationships are not limited to strict hierarchies, like file and folder structures, the way so much of our technology is. A hypothesis I have is that after two decades of constant online living, we as people get hyperfocused on the box we’re in, when we need to be looking at all the things outside our box that we can play with.

Constants like time, for example, exist beyond the limits of your, and my, creativity (or lack of it). Time is a handy place to start question-storming because it’s a concept that everyone involved with making, or taking meaning from, the data can understand similarly.

Prioritize Research

At Elsevier, my product manager, Lauren, fleshed out a quarterly OKR that makes applied research and development a focus of our team’s work. Each quarter, we decided, we’ll come up with a question to answer with our data set that we don’t quite know how to answer. It’s a chance for us to improve our capacity and capability for learning analytics as a cross-functional product team that includes software and quality engineers, product managers, UX designers, commercial folks, content folks, and users in multiple roles.

It’s the investment in drinking our own homebrew that excites me most. We are all new at doing this. We get smarter about learning analytics when we establish trust among everyone who depends on the same set of data, even though that data matters to different people for different reasons.

Yes, I’m talking ‘bout Praxis.

Question-storming “How Long Does It Take…?”

So, with time as our ally, Team Apollo at Elsevier began question-storming “How long does it take to get through an activity?”

Seems like a really simple question, right? Give it a few seconds to get your own doubts going. Turn those into questions.

  • Did he mean “learning activity” or, like, a survey?
  • What does it mean to “get through” an activity, specifically?
    • All the way to the last frame of linear content?
    • Do you need to have seen all the content?
    • Must you have passed in order to be considered complete?

Now… reframe the main question to reflect all the permutations of these bulleted conditions on how to interpret it. That is the question-storming.
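
To make those permutations concrete, here’s a minimal sketch in Python of how each reading of “get through” changes the measurement. The statements below are simplified, made-up stand-ins (only the ADL verb IRIs are real), and this is a sketch of the idea, not Team Apollo’s actual implementation:

```python
from datetime import datetime

# Simplified stand-ins for xAPI statements: just a verb IRI and a timestamp.
# (Real statements also carry an actor, an object, a result, context, etc.)
statements = [
    {"verb": "http://adlnet.gov/expapi/verbs/initialized",
     "timestamp": "2021-04-01T14:00:00Z"},
    {"verb": "http://adlnet.gov/expapi/verbs/completed",
     "timestamp": "2021-04-01T14:22:30Z"},
    {"verb": "http://adlnet.gov/expapi/verbs/passed",
     "timestamp": "2021-04-01T14:25:10Z"},
]

def ts(stmt):
    # Swap the trailing "Z" for an offset so fromisoformat can parse it.
    return datetime.fromisoformat(stmt["timestamp"].replace("Z", "+00:00"))

def seconds_to_get_through(stmts, end_verb):
    """Seconds from 'initialized' to the first statement using end_verb.
    Each choice of end_verb is a different reading of 'get through'."""
    start = next(s for s in stmts if s["verb"].endswith("/initialized"))
    end = next(s for s in stmts if s["verb"] == end_verb)
    return (ts(end) - ts(start)).total_seconds()

# Reading 1: "got through" means reaching the end of the content.
print(seconds_to_get_through(
    statements, "http://adlnet.gov/expapi/verbs/completed"))  # 1350.0
# Reading 2: "got through" means passing the assessment.
print(seconds_to_get_through(
    statements, "http://adlnet.gov/expapi/verbs/passed"))  # 1510.0
```

Swap in a different end verb, or require both completed and passed, and the “same” question returns a different number.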

Next post, I’ll highlight what we produced to answer our research question(s), what statements we used… I’ll go full nerd. I might get into how the results set a research agenda for the year. Meanwhile, in the comments, maybe share your takes:

For now, what are other ways to interpret “How long does it take to get through an activity?”

Filed Under: Learning Analytics Tagged With: design, question-storming, strategy, xAPI

Metrics in Learning Analytics

January 13, 2021 By Aaron

Before life in the United States was rudely interrupted (to put it extremely mildly) last week, I teased the concept of metrics in learning analytics. Where dimensions frame a problem space, metrics are (literally) the things I look at to model the problem space. In other words, metrics are how I make sense of an analysis.


A problem space needs a model that describes the space (dimensions) and enables me to make sense of it, or “perform an analysis.”

More than counting.

My analysis is always going to be defined by my metrics. Often, I need to count or tally things, but COUNTS ARE NOT ANALYSIS. In presentations, I often talk about how an xAPI statement is an observation. A given metric will likely have an xAPI statement to support each observation (observations = metrics = xAPI statements).
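
To make “observations = metrics = xAPI statements” concrete, here’s a pared-down, hypothetical statement in Python, and the difference between tallying it and relating it to something else. The actor and object values are made up; only the verb IRI is a real ADL verb:

```python
# One observation: a pared-down xAPI statement (values are hypothetical).
observation = {
    "actor": {"mbox": "mailto:learner@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed"},
    "object": {"id": "https://example.com/activities/module-3"},
    "result": {"duration": "PT22M30S", "score": {"scaled": 0.85}},
}

observations = [observation]  # imagine thousands of these

# A count: how many completions? That's a tally, not an analysis.
completion_count = sum(
    1 for o in observations if o["verb"]["id"].endswith("/completed")
)

# A metric relates the observation to the problem space: for example,
# scaled score against time spent, compared across learners or activities.
minutes = 22.5  # parsed from the ISO 8601 duration "PT22M30S"
score_per_minute = observation["result"]["score"]["scaled"] / minutes
```

Counting completions tells me something happened; relating score to time spent starts telling me what it means.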

Here’s an example of how I generally approach learning and job-task analysis (relating the learning to the doing). While these dimensions of the learner, the learning activity, and the learning experience might make sense to a learning professional, it’s important to remember that these dimensions aren’t (and shouldn’t be) the only way to do learning analytics. The model is complex enough that if 80% of readers copy and paste it without further forethought, it will hopefully still point them in better directions than they were headed before reading (the whole “making better” thing).

“All models are wrong, but some are useful.” (George Box)

Anyone putting together a learning analytics strategy is, more or less, arbitrarily framing a box in which learning happens. Only through the pinholes of such a theoretical box can learning be observed. Then, only by putting those observations together can an analysis be performed.

[Figure: a simple model of learning analytics dimensions (Learner × Learning Activity × Learning Experience), rendered as pinholes in a theoretical closed box.]

So what should those pinholes be? It seems like a pretty straightforward question, but working with stakeholders to define metrics in learning analytics requires patience, planning and practice. I’ll share methods for getting to these metrics in the coming weeks.

Barring further interruptions, tomorrow I’ll review MakingBetter’s work over the past year. On Friday, expect some catch-up on what I have been up to, with Elsevier, ADL and IEEE. Lots. To. Share.

Filed Under: Learning Analytics Tagged With: dimensions, learning analytics, metrics, strategy, xAPI

Dimensions in Learning Analytics

January 5, 2021 By Aaron


Dimensions, in learning analytics, are the ways we describe the (theoretical) space in which we’ll analyze… learning. I think of a given xAPI Profile as the technical instructions I want tools, systems and services to calibrate to for the way I need to make sense of the data. Yesterday, I shared a demo of the forthcoming ADL xAPI Profile Server. Today, I step back to share a mental picture of how learning analytics can be framed.

In 2019, I collaborated with Judy Katz on some ideas for visualizing learning analytics strategy for the xAPI Camp @ Learning Solutions 2019. Judy and I each had our own take at the time on how we might frame the “space,” but with almost two years to reflect and multiple opportunities to put those ideas into practice, it’s abundantly clear that how we each labeled the dimensions mattered less than that we organized and measured things similarly.

Dimensions

A simple way to think about learning analytics is that when we want to analyze learning, we look for information about:

[Figure: a rather simple model of the dimensions of learning analytics.]
  • The Learner,
  • The Learning Activity, and
  • The Learning Experience

… and all of these analyzed over Time as a constant means of comparison.

Dimensions tend to stay fixed, once set. The trajectories along which we measure things will likely remain the same. Much investment of time and human capital gets built on this framing, so before anyone codes anything, the model should be treated as something that can be revised, or even tossed out, in favor of something better. The investment of time and effort in planning is minimal, no matter how long the process takes, compared to the cost of implementing the wrong learning analytics strategy.
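
If it helps to picture that frame as a structure, here’s one entirely hypothetical way to encode it in Python; the field names are mine, not drawn from any xAPI Profile or specification:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Observation:
    """One observation, positioned along each dimension of the model.
    Field names are illustrative, not from any specification."""
    learner: str              # the Learner dimension: who
    learning_activity: str    # the Learning Activity dimension: what they did
    learning_experience: str  # the Learning Experience dimension: the context
    timestamp: datetime       # Time: the constant axis of comparison
```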

Metrics

Along each of these dimensions, I’d identify metrics. Metrics help me understand the dimension of learning I’m analyzing. For example, I might break the Learner dimension down into the types of Learner, or user, anticipated by the learning experience. If I’m developing a learning solution for practicing nurses, the “Learner” likely includes the “nurse” but may also include other roles, like a Nurse Preceptor or a Nurse Educator. A dimension like “Learner” should account for every type of participant.
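
Continuing the sketch above, a first metric along the Learner dimension might be as simple as the learner’s role. The roles come straight from the nursing example; the lookup function is hypothetical:

```python
from collections import Counter

# Every type of participant the learning experience anticipates.
LEARNER_ROLES = {"nurse", "nurse preceptor", "nurse educator"}

def role_breakdown(observations, role_of):
    """Group observations by learner role, one metric along the Learner
    dimension. role_of is a hypothetical lookup from learner id to role."""
    counts = Counter(role_of(obs.learner) for obs in observations)
    # Account for every anticipated participant type, even when absent.
    for role in LEARNER_ROLES:
        counts.setdefault(role, 0)
    return counts
```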

Tomorrow, I’ll start diving into the ways I might break down those metrics more thoughtfully, related to the Learning Activity vs. the Learning Experience.

Download: xAPI-Camp-Learning-Solutions-2019-Learning-Analytics-Strategy

Filed Under: Learning Analytics Tagged With: dimensions, learning analytics, metrics, strategy, xAPI Profiles
