
Learning Analytics

Metrics in Learning Analytics

January 13, 2021 By Aaron

Before life in the United States was rudely interrupted (to put it extremely mildly) last week, I teased the concept of metrics in learning analytics. Where dimensions frame a problem space, metrics are (literally) the things I look at to model the problem space. In other words, metrics are how I make sense of an analysis.

[Image: let’s learn about metrics in learning analytics!]

Problem spaces need a model that describes the space (dimensions) and enables me to make sense of it, or “perform an analysis.”

More than counting.

My analysis is always going to be defined by my metrics. Often, I need to count or tally things, but COUNTS ARE NOT ANALYSIS. In presentations, I often talk about how an xAPI statement is an observation, and I explicitly tie each metric to the xAPI statement(s) that support its observations (observations = metrics = xAPI statements).
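
To make that equivalence concrete, here’s a minimal Python sketch of my own (not from the post), with illustrative actors, verbs, and activity IDs: a count merely tallies statements, while a metric interprets them.

```python
# A minimal sketch: each xAPI statement is treated as one observation,
# and a metric is a calculation over observations, not a raw count.
# Actor, verb, and object values below are illustrative.
statements = [
    {"actor": "mailto:pat@example.org",
     "verb": "http://adlnet.gov/expapi/verbs/attempted",
     "object": "https://example.org/activities/iv-insertion"},
    {"actor": "mailto:pat@example.org",
     "verb": "http://adlnet.gov/expapi/verbs/completed",
     "object": "https://example.org/activities/iv-insertion"},
    {"actor": "mailto:sam@example.org",
     "verb": "http://adlnet.gov/expapi/verbs/attempted",
     "object": "https://example.org/activities/iv-insertion"},
]

# A count just tallies observations...
attempts = sum(1 for s in statements if s["verb"].endswith("/attempted"))
completions = sum(1 for s in statements if s["verb"].endswith("/completed"))

# ...while a metric interprets them: completions per attempt is something
# you can actually reason about in an analysis.
completion_rate = completions / attempts if attempts else 0.0
print(f"{attempts} attempts, {completions} completions, {completion_rate:.0%} rate")
```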

Here’s an example of how I generally approach learning and job-task analysis (relating the learning to the doing). While these dimensions of the learner, the learning activity, and the learning experience might make sense to a learning professional, it’s important to remember that they aren’t (and shouldn’t be) the only way to do learning analytics. I’m expressing a model complex enough that even if 80% of readers copy and paste it without much forethought, it will hopefully point them in better directions than they were heading before reading (the whole “making better” thing).

“All models are wrong, but some are useful.” (George Box)

Anyone putting together a learning analytics strategy is, more or less, arbitrarily framing a box in which learning happens. Only through the pinholes of such a theoretical box can learning be observed. Then, only by putting those observations together can an analysis be performed.

[Image: simple model of learning analytics dimensions (Learner × Learning Activity × Learning Experience). Caption: Pinholes in a theoretical closed box.]

So what should those pinholes be? It seems like a pretty straightforward question, but working with stakeholders to define metrics in learning analytics requires patience, planning, and practice. I’ll share methods for getting to these metrics in the coming weeks.

Barring further interruptions, tomorrow I’ll review MakingBetter’s work over the past year. On Friday, expect some catch-up on what I’ve been up to with Elsevier, ADL, and IEEE. Lots. To. Share.

Filed Under: Learning Analytics Tagged With: dimensions, learning analytics, metrics, strategy, xAPI

Dimensions in Learning Analytics

January 5, 2021 By Aaron

[Image: simple model of learning analytics dimensions (Learner × Learning Activity × Learning Experience)]

Dimensions, in learning analytics, are the ways we describe the (theoretical) space in which we’ll analyze… learning. I think of a given xAPI Profile as the technical instructions I want tools, systems, and services to calibrate on, reflecting the way I need to make sense of the data. Yesterday, I shared a demo of the forthcoming ADL xAPI Profile Server. Today, I step back to share a mental picture of how learning analytics can be framed.

In 2019, I collaborated with Judy Katz on some ideas for visualizing learning analytics strategy for the xAPI Camp @ Learning Solutions 2019. Judy and I each had our own take at the time on how we might frame the “space,” but with almost two years to reflect and multiple opportunities to put those ideas into practice, it’s abundantly clear that how we each labeled the dimensions mattered less than the fact that we organized and measured things similarly.

Dimensions

A simple way to think about learning analytics is that when we want to analyze learning, we look for information about:

  • The Learner,
  • The Learning Activity, and
  • The Learning Experience

… and all of these analyzed over Time as a constant means of comparison.

Dimensions tend to stay fixed once set: the trajectories along which we measure things will likely remain the same. A lot of time and human capital gets invested on top of this framing, so before anyone codes anything, the model should be treated as something that can be revised, or even tossed out, in favor of something better. The time and effort spent planning is minimal, no matter how long the process takes, compared to the cost of implementing the wrong learning analytics strategy.

Metrics

Along each of these dimensions, I’d identify metrics. Metrics help me understand the dimension of learning I’m analyzing. For example, I might break a Learner dimension down by the type of learner, or user, anticipated by the learning experience. If I’m developing a learning solution for practicing nurses, the “Learner” likely includes the “nurse” but may also include other roles, like a Nurse Preceptor or a Nurse Educator. A dimension like “Learner” should account for every type of participant.
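
As a hedged illustration of slicing the Learner dimension by role (the nursing example above), here’s a small Python sketch; the role labels and the flattened observation records are assumptions made for the example:

```python
# Slicing one dimension (the Learner) into role-based metrics, so every
# type of participant is accounted for. Records are illustrative.
from collections import defaultdict

observations = [
    {"role": "nurse", "learner": "pat", "completed": True},
    {"role": "nurse", "learner": "sam", "completed": False},
    {"role": "nurse_preceptor", "learner": "lee", "completed": True},
    {"role": "nurse_educator", "learner": "ida", "completed": True},
]

# Group a completion metric by learner role.
by_role = defaultdict(lambda: {"total": 0, "completed": 0})
for obs in observations:
    bucket = by_role[obs["role"]]
    bucket["total"] += 1
    bucket["completed"] += int(obs["completed"])

for role, b in sorted(by_role.items()):
    print(f"{role}: {b['completed']}/{b['total']} completed")
```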

Tomorrow, I’ll start diving into the ways I might break down those metrics more thoughtfully, related to the Learning Activity vs. the Learning Experience.

Download: xAPI-Camp-Learning-Solutions-2019-Learning-Analytics-Strategy

Filed Under: Learning Analytics Tagged With: dimensions, learning analytics, metrics, strategy, xAPI Profiles

Level Up Learning Data with xAPI in 2021

January 4, 2021 By Aaron

You’re likely going to be reading up seriously on how to work with learning data sometime in 2021. This will be true regardless of whether you work in online K-12, Higher Education or professional training — anywhere in the world. If you will “own” xAPI know-how on your team (or maybe for the whole organization), I want to help you.

Megan and I have certainly been in your shoes.

So, I made a resolution for 2021: I’m going to try to blog daily again, like I did maybe 15 years ago about Flash For eLearning. It’s time for us all to level up our data game.

First, let’s start with the webinar above, where I recently demonstrated a new tool coming in Q2 from ADL: the xAPI Profile Server we produced for ADL with our friends from Veracity. In the video, I explain what it is, what problems it helps with, and how you’ll likely put it to work. Spoiler: it’s a tool to help you author and serve xAPI Profiles, doing a lot of the heavy lifting of complex rules thingies for you.
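
For a feel of what such a server authors and serves: an xAPI Profile is a JSON-LD document defined by the xAPI Profiles specification. Below is a heavily trimmed sketch expressed as a Python dict; the profile and verb IRIs are invented, and some required bookkeeping fields (e.g., conformsTo, versions, author) are omitted for brevity.

```python
import json

# A trimmed sketch of an xAPI Profile document (JSON-LD per the xAPI
# Profiles spec). IRIs below are illustrative, not real published profiles.
profile = {
    "@context": "https://w3id.org/xapi/profiles/context",
    "id": "https://example.org/profiles/nursing",  # hypothetical profile IRI
    "type": "Profile",
    "prefLabel": {"en": "Nursing Training Profile"},
    "definition": {"en": "Vocabulary and statement rules for nursing training data."},
    "concepts": [
        {
            # One reusable verb concept this profile would define.
            "id": "https://example.org/profiles/nursing/verbs/precepted",
            "type": "Verb",
            "prefLabel": {"en": "precepted"},
            "definition": {"en": "Supervised a learner performing a clinical task."},
        }
    ],
}

print(json.dumps(profile, indent=2))
```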

Next, I’ll blog about how to frame a problem space to apply learning analytics with dimensions.

Filed Under: Experience API, Learning Analytics Tagged With: demos, recordings, webinars, xAPI, xAPI Profile Server, xAPI Profiles

xAPI Camp Preview: dominKnow ONE

March 19, 2019 By Aaron

On March 26, 2019, I’m gathering the best examples of the tools that enable, and the professionals who define, xAPI’s best practices at xAPI Camp (number 14!), hosted by the eLearning Guild at the Learning Solutions conference in Orlando, FL. In the run-up to the event, I’m highlighting these industry colleagues and why I think they’re enabling and defining best practices. Today’s post focuses on what dominKnow is doing for savvy instructional designers with their ONE platform and its different authoring modes for different types of content.

If you have some questions about the learner, the learning experience or the efficacy of the learning program, there’s some impressive flexibility in dominKnow ONE I want to point out.

About that Competency Dashboard in RISC…

In my last post, I shared a competency reporting dashboard in RISC’s VTA. RISC’s dashboard (like many useful dashboards) relies on specific, well-formed data. Two companies offer such flexibility to customize what and how you track learners engaging with their authoring tools: Trivantis and dominKnow.

For, seriously, a little extra effort, generating custom statements that conform to profiles (or just follow best practices) offers so many more valuable insights. In what’s been demonstrated previously, much of the workflow to track to competencies or learning objectives is even automated, meaning that as a content author, you wouldn’t have to “code” so much as keep content organized… which you’d want to do anyway as just good instructional design. dominKnow ONE by default, and with no programming, provides a hierarchy and tracking of Course > Module > Learning Object, in which the “Learning Object” contains the content and interactions that, together, support meeting a particular learning objective or competency. When a learner satisfies the requirements to complete the learning object, the dashboard shows that the learner has met the competency requirement(s), as sketched below.
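
Here is that rollup logic as a Python sketch. To be clear, this is my own toy structure, not dominKnow’s actual data model: each learning object maps to one competency, and completing the object marks the competency met.

```python
# Toy Course > Module > Learning Object hierarchy; completing a learning
# object marks its mapped competency as met. Names are illustrative.
course = {
    "modules": [
        {"learning_objects": [
            {"id": "lo-1", "competency": "safe-lifting", "complete": True},
            {"id": "lo-2", "competency": "hazard-id", "complete": False},
        ]},
        {"learning_objects": [
            {"id": "lo-3", "competency": "lockout-tagout", "complete": True},
        ]},
    ]
}

met = {
    lo["competency"]
    for module in course["modules"]
    for lo in module["learning_objects"]
    if lo["complete"]
}
print("Competencies met:", sorted(met))  # what a dashboard would surface
```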

Taking it to the next level and optimizing an individual’s learning

You probably already know that xAPI allows us to track all sorts of information, including the aforementioned objectives/competencies. That data can also be leveraged by the content itself, though: a capability many rapid authoring tools just don’t offer. At next week’s xAPI Camp at Learning Solutions, dominKnow will demonstrate how you can track author-defined competencies in your content and then use the xAPI data you tracked to dynamically personalize the learner’s content. If a learner has already demonstrated a given competency, why should they be forced to demonstrate it again, and why should an author need to create unique learning content for every use case? Not only is it cool, it’s interoperable, and dominKnow will demonstrate that while doing this requires instructional design skills, no coding skills are required.
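
Here’s roughly what that personalization decision looks like, sketched in Python with an in-memory stand-in for the LRS query; all names are illustrative:

```python
# Use previously tracked competency data to skip content a learner has
# already demonstrated. In practice the set below would be derived from
# xAPI statements queried from an LRS; here it's simulated in memory.
already_demonstrated = {"safe-lifting"}

content_units = [
    {"id": "lo-1", "competency": "safe-lifting"},
    {"id": "lo-2", "competency": "hazard-id"},
]

# Deliver only the units whose competency the learner has not yet met.
personalized_path = [u for u in content_units
                     if u["competency"] not in already_demonstrated]
print([u["id"] for u in personalized_path])  # -> ['lo-2']
```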

Why is this good?

Out of the box, dominKnow makes it easy to organize content without locking you into its default structure, leverages that organization to personalize learning, and enables you to more easily tie content to your company’s objectives. That’s exactly the flexibility and ease I want when tracking and reporting with xAPI. For most content I need to work with, the 1:1 Learning Object-to-Competency relationship works, but if I have something more complex, I can take dominKnow ONE down the rabbit hole. As much of xAPI best practice requires tinkering, the more flexible the tools available to me, the more use I have for them.

Filed Under: Content Strategy, Experience API, Learning Analytics

xAPI Camp Preview: RISC VTA

March 12, 2019 By Aaron

On March 26, 2019, I’m gathering some of the best tools that enable, and the professionals who define, xAPI’s best practices. This will happen at the upcoming xAPI Camp (number 14!) hosted by the eLearning Guild at the Learning Solutions conference in Orlando, FL. In the run-up to the event, I aim to highlight who’s going to be there, and why I think they’re enabling and defining best practices. Today’s post focuses on the outstanding capabilities RISC has built into its Virtual Training Assistant (VTA) learning management system that enable flexible training approaches, like spaced learning.

When Knowing Matters

A question I get asked regularly is “when is xAPI really worth doing?” and my answer is always something like “xAPI is worth doing when the stakes are high and the cost of failure is too heavy.” When it comes to safety and health — when lives are at stake — that’s when a deep analysis with xAPI is worth the effort. RISC’s VTA is geared for high-stakes compliance needs. If your workforce is on a gas or oil pipeline and you care that they can do the job and not hurt themselves or others in the process, then knowing the capabilities of a given worker and the team(s) they’re on matters.

My funk soul brother Duncan Welder goes into detail on how RISC approaches tracking competencies in VTA here. It’s worth reading, and worth seeing the dashboard in action; I can tell a lot about what they’re doing with xAPI just from what’s shown there.

Why is this good?

Checklists, Competency Assessments, Questions, Pass/Fail activities: these are different types of xAPI activities VTA reports on. Using xAPI to also detail incident reporting makes it possible to juxtapose learning activities with real-world outcomes and demonstrate the correlations between learning performance and job performance. With that established, there is a lot of depth to explore, as RISC can keep getting more granular about progress toward specific competencies and incident trends over time.
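
That juxtaposition can start as simply as correlating a learning metric with an outcome metric. Here’s a toy Python sketch with invented per-team numbers (not RISC data); it uses statistics.correlation, available in Python 3.10+.

```python
# Correlate a learning metric with a job-performance outcome.
# All figures below are invented for illustration.
from statistics import correlation  # Python 3.10+

# Per-team training completion rate vs. incidents reported that quarter.
completion_rate = [0.95, 0.80, 0.60, 0.90, 0.70]
incidents       = [1,    3,    6,    2,    5]

r = correlation(completion_rate, incidents)
print(f"Pearson r = {r:.2f}")  # strongly negative here: more training, fewer incidents
```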

That’s exactly what I want to track and report out with xAPI. From what I’ve seen in the market, there are some really good LRSs providing solid reporting when there’s no assumption about what’s actually in the data. RISC, however, has a history of designing reports that tell specific stories about specific learning activities, like their PDF Annotator and Message Board features. What RISC can report may not be for everyone, but if you want these specific insights, no other vendor on the market can give them to you with this fidelity.

Andrew Downes, in his stunning blog series documenting just what exactly customers are tracking with Watershed, breaks down a high-level categorization of learning analytics distinguishing the learner, the learning experience, and the learning program. Downes notes that Watershed customers’ learning analytics are mostly about the learning program, with very little tracked about the learning experience or the learners themselves. In that light, it’s even more impressive to me that RISC is tracking competencies this way, because performance-to-competency is a dimension that sheds light not just on the learning program but on the learner and the learning experience, too.

How does this work with “spaced learning?”

When Duncan and Art Werkenthin present at xAPI Camp on March 26, they’re going to focus on spaced learning: the delivery of small, repetitive pieces of learning content with breaks between the learner’s sessions with the content. It’s a well-researched approach to content delivery that is particularly effective for long-term retention. There are a lot of technical things RISC is doing with xAPI, like relying on cmi5 to reliably handle the launch of xAPI-enabled learning content. VTA’s support for cmi5 reduces the variability of the learning experience and technically enables a spaced learning approach (among lots of other learning experiences). The upshot is that RISC’s implementation of cmi5 makes a lot of things possible wherever we need to know who the learner is and want to track learning activity back to an LMS.
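
Because cmi5 makes session launches reliable and attributable, the spacing itself becomes measurable from the data. A minimal Python sketch, assuming illustrative timestamps pulled from launch statements:

```python
# Measure "spacing" from tracked data: gaps between a learner's sessions
# with the same content, derived from statement timestamps (illustrative).
from datetime import datetime

session_times = [
    "2019-03-01T09:00:00", "2019-03-04T09:10:00", "2019-03-11T09:55:00",
]
stamps = [datetime.fromisoformat(t) for t in session_times]
gaps = [(b - a).days for a, b in zip(stamps, stamps[1:])]
print("Days between sessions:", gaps)  # -> [3, 7]
```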

And, with RISC able to track a spaced learning experience, they can look at the data on content delivered over time against performance outcomes (like incidents reported) and optimize the spacing relative to the type of content, the subject of the content, the complexity of the job performance, and so on. VTA gives you the tools to do this; all you need is content that uses xAPI in specific ways, so these reports are populated with useful, usable, appropriately formatted data.

For that, I can’t wait to share my next blog post on what dominKnow is doing that enables what RISC (and other tools) can report on.

Interested in xAPI visualizations but can’t make xAPI Camp? Duncan and Art have a session at Learning Solutions on Wednesday, March 27th, on getting measurable results with xAPI, titled Design with the End In Mind.

Filed Under: Experience API, Learning Analytics Tagged With: best practice, competencies, xAPI

Drink freely and Chat xAPI at LAK

March 9, 2017 By Megan

I’m very excited to be at LAK next week. So excited that I’m arranging (free) drinks Tuesday at 5:30! You can find us at Malone’s Social Lounge and Taphouse. See that? Social is built right into the name; you can’t help but chat xAPI with friends there. It’s just a 2-minute walk from the conference (details below).

Details

What: Drink freely with xAPI friends

When: Tuesday, March 14th, 5:30–6:30

Where: Malone’s, 525 Seymour Street, Vancouver, BC (on the corner of Seymour and Pender)

Who: You. And anyone you want to bring along.

Filed Under: Community, Learning Analytics

Learning Analytics Policy – Looking for Input!

January 31, 2017 By Megan

I’m heading to LAK 17 in Vancouver, BC, to join a great crew from many countries in a workshop on learning analytics policy (LAP). Since learning analytics policy is evolving everywhere, we’re looking for input from people everywhere. K-12, corporate, higher ed: all important.

There’s an opportunity to contribute a paper for this event, so if you’re really motivated, go ahead and fill out this survey.

Examples Please!

We’re researching this for the learning analytics community, and what we really need are more examples. Answers to any of the following questions would help establish the baseline of knowledge we’re putting together.

  • What learning analytics policies have you encountered?
  • How did they affect your implementation?
  • What workarounds have you planned, knowing certain policies would need to be accommodated?
  • How did a policy affect your outcome?

Narrative descriptions, case studies, or just links to the policy documents would all be helpful. Please send me an email at megan@makingbetter.us if you have something to contribute!

Filed Under: Learning Analytics, Uncategorized
