
MakingBetter

Strategy. Ecosystems. Meaning.


News

Metrics in Learning Analytics

January 13, 2021 By Aaron

Before life in the United States was rudely interrupted (to put it extremely mildly) last week, I teased the concept of metrics in learning analytics. Where dimensions frame a problem space, metrics are (literally) the things I look at to model the problem space. In other words, metrics are how I make sense of an analysis.

Let’s learn about metrics in learning analytics!

Problem spaces need a model that describes the space (its dimensions) and enables me to make sense of it, or “perform an analysis.”

More than counting.

My analysis is always going to be defined by my metrics. Often, I need to count or tally things, but COUNTS ARE NOT ANALYSIS. In presentations, I often talk about how an xAPI statement is an observation. I explicitly connect the two: a given metric will likely have xAPI statements supporting its observations (observations = metrics = xAPI statements).
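
To ground that, here’s a minimal sketch in TypeScript (with hypothetical IRIs and names) of a single observation expressed as an xAPI statement, plus a metric computed over many of them:

```ts
// A minimal sketch: one observation, expressed as an xAPI statement.
// The actor and activity IRIs are hypothetical; a real project would
// draw verbs and activity types from an xAPI Profile.
const observation = {
  actor: { mbox: "mailto:learner@example.com", name: "Example Learner" },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/completed",
    display: { "en-US": "completed" },
  },
  object: {
    id: "https://example.com/activities/safety-module-1",
    definition: { type: "http://adlnet.gov/expapi/activities/module" },
  },
  result: { score: { scaled: 0.85 }, completion: true },
  timestamp: new Date().toISOString(),
};

// A metric is computed over many such observations. The count is an
// input to the metric, not the analysis itself.
function completionRate(
  statements: Array<{ result?: { completion?: boolean } }>
): number {
  if (statements.length === 0) return 0;
  const completed = statements.filter((s) => s.result?.completion).length;
  return completed / statements.length;
}
```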

Here’s an example of how I generally approach learning and job-task analysis (relating the learning to doing). While these dimensions of the learner, the learning activity, and the learning experience might make sense to a learning professional, it’s important to remember that these dimensions aren’t (and shouldn’t be) the only way to do learning analytics. I’m expressing a complex enough model that if 80% of readers copy and paste without more forethought, it hopefully will point them in better directions than they were going before reading (the whole “making better” thing).

“All models are wrong, but some are useful.” (George Box)

Anyone putting together a learning analytics strategy is arbitrarily framing a box, more or less, in which learning happens. Only through the pinholes of such a theoretical box can learning be observed. Then, only by putting those observations together can an analysis be performed.

[Figure: pinholes in a theoretical closed box; a simple model of learning analytics dimensions: Learner × Learning Activity × Learning Experience]

So what should those pinholes be? This seems like a pretty straightforward question, but working with stakeholders to define metrics in learning analytics requires patience, planning and practice. I’ll share methods for getting to these metrics in the coming weeks.

Barring further interruptions, tomorrow I’ll review MakingBetter’s work over the past year. On Friday, expect some catch-up on what I have been up to, with Elsevier, ADL and IEEE. Lots. To. Share.

Filed Under: Learning Analytics Tagged With: dimensions, learning analytics, metrics, strategy, xAPI

Dimensions in Learning Analytics

January 5, 2021 By Aaron

[Figure: a simple model of learning analytics dimensions: Learner × Learning Activity × Learning Experience]

Dimensions, in learning analytics, are the ways we describe the (theoretical) space in which we’ll analyze… learning. I think of a given xAPI Profile as the technical instructions I want tools, systems and services to calibrate on, so the data can be made sense of the way I need. Yesterday, I shared a demo of the forthcoming ADL xAPI Profile Server. Today, I step back to share a mental picture of how learning analytics can be framed.

In 2019, I collaborated with Judy Katz on some ideas for visualizing learning analytics strategy for the xAPI Camp @ Learning Solutions 2019. Judy and I each had our own take at the time on how we might frame the “space,” but with almost two years to reflect and multiple opportunities to put those ideas into practice, it’s abundantly clear that how we each labeled the dimensions mattered less than how similarly we organized and measured.

Dimensions

A simple way to think about learning analytics is that when we want to analyze learning, we look for information about:

[Figure: A rather simple model of the dimensions of learning analytics: Learner × Learning Activity × Learning Experience]
  • The Learner,
  • The Learning Activity, and
  • The Learning Experience

… and all of these analyzed over Time as a constant means of comparison.

Dimensions tend to stay fixed, once set. The trajectories along which we measure things will likely remain the same. Much investment of time and human capital gets built on this framing, so before anyone codes anything, this model should be treated as something that can be revised, even tossed out, in favor of something better. The investment of time and effort in planning is minimal, no matter how long the process takes, compared to the cost of implementing the wrong learning analytics strategy.

Metrics

Along each of these dimensions, I’d identify metrics. Metrics help me understand the dimension of learning I’m analyzing. For example, I might break a Learner dimension down into the types of Learner, or user, anticipated by the learning experience. If I’m developing a learning solution for practicing nurses, the “Learner” likely includes the “nurse” but may also include other roles, like a Nurse Preceptor or a Nurse Educator. A dimension like “Learner” should account for every type of participant.
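
As one hedged sketch of what “accounting for every type of participant” might look like in data, roles could be tagged onto statements via a context extension. The extension IRI and role values below are hypothetical; in practice, an xAPI Profile would define them so every tool records roles the same way:

```ts
// A sketch of tagging observations with the learner's role, so the
// "Learner" dimension accounts for every type of participant. The
// extension IRI and role values are hypothetical.
type LearnerRole = "nurse" | "nurse-preceptor" | "nurse-educator";

const ROLE_EXTENSION = "https://example.com/extensions/learner-role"; // hypothetical

function withLearnerRole<T extends object>(statement: T, role: LearnerRole) {
  return {
    ...statement,
    context: { extensions: { [ROLE_EXTENSION]: role } },
  };
}

// Later, a metric along the Learner dimension can slice by role:
function byRole<T extends { context?: { extensions?: Record<string, unknown> } }>(
  statements: T[],
  role: LearnerRole
): T[] {
  return statements.filter((s) => s.context?.extensions?.[ROLE_EXTENSION] === role);
}
```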

Tomorrow, I’ll start diving into the ways I might break down those metrics more thoughtfully, related to the Learning Activity vs. the Learning Experience.

Download: xAPI-Camp-Learning-Solutions-2019-Learning-Analytics-Strategy

Filed Under: Learning Analytics Tagged With: dimensions, learning analytics, metrics, strategy, xAPI Profiles

Level Up Learning Data with xAPI in 2021

January 4, 2021 By Aaron

You’re likely going to be reading up seriously on how to work with learning data sometime in 2021. This will be true regardless of whether you work in online K-12, Higher Education or professional training — anywhere in the world. If you will “own” xAPI know-how on your team (or maybe for the whole organization), I want to help you.

Megan and I have certainly been in your shoes.

So, I made a resolution for 2021: I’m going to try to blog daily again, like I did maybe 15 years ago about Flash For eLearning. It’s time for us all to level up our data game.

First, let’s start with this webinar (above), where I recently demonstrated a new tool coming in Q2: the xAPI Profile Server we produced for ADL with our friends from Veracity. In the video, I explain what this is, what problems it’s going to help with and how you’ll likely put it to work. Spoiler: it’s a tool to help you author and serve xAPI Profiles, doing a lot of the heavy lifting of complex rules thingies for you.

Next, I’ll blog about how to frame a problem space to apply learning analytics with dimensions.

Filed Under: Experience API, Learning Analytics Tagged With: demos, recordings, webinars, xAPI, xAPI Profile Server, xAPI Profiles

The One-Year Countdown to an xAPI Profile Server

November 18, 2019 By Aaron

tl;dr: Wednesday, 20 November at 1pm EST, the Advanced Distributed Learning (ADL) Initiative will host a webinar on an xAPI Profile Publishing Tool & Server being developed for 2020. 

Since 2017, Megan & I have been pretty heads-down, laser-focused on doing what we can to harden our best practices to work with xAPI to further support ADL’s mission. We researched and documented conformance requirements for learning record stores (LRSs). We gathered best practices and worked with deep subject matter experts to develop a companion spec to xAPI (xAPI Profiles). We gathered the requirements for an xAPI Profile Server capable of publishing, validating and serving (via an API) xAPI Profiles. Now, thanks to ADL, we’re designing and developing a reference xAPI Profile Server with our friends (and former colleagues) at Veracity Technology Consultants!

How Will an xAPI Profile Help L&D Capabilities Scale?

Unless you’re completely removed from touching anything that looks like code, if you’re reading this you likely have some idea of how much non-value-added work goes into how L&D content and services are delivered today. Software integrations are difficult and almost always custom. Testing eLearning in LMSs (specifically SCORM or AICC) is manual and laborious. Reporting anything other than completions and scores from your LMS can seem impossible. Even if you’re managing content, you’re probably not managing what given learning content tracks, let alone what it reports to the LMS. This state of eLearning has stagnated for over 10 years, leaving the critical processes to author, publish, test, deliver and revise eLearning content rigid (and often fragile) everywhere.

xAPI Profiles make it possible to exercise some control over the quality and consistency of learning data generated by related learning activities. As specified today, xAPI Profiles directly address how verbs, activity types, attachment types and extensions can be governed. The xAPI Profile Server will provide a wizard-like interface to make authoring and publishing easier. It will support publishing of the JSON-LD xAPI Profile so the technical rules that should govern any subscribing learning experience can be managed together with one tool. Users will be able to validate their xAPI Profiles. The application will serve xAPI Profiles via an API, enabling workflow integrations with the means to revise and push changes in xAPI Profiles to subscribing systems and services.
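
To make that shape concrete, here’s a minimal sketch of the kind of JSON-LD document such a server would publish and validate. The profile and verb IRIs are invented for illustration; the overall structure follows the xAPI Profiles specification:

```ts
// A minimal sketch of an xAPI Profile as JSON-LD, the document an
// xAPI Profile Server publishes, validates, and serves via its API.
// The profile and verb IRIs are invented; the @context, versions,
// and concepts structure follows the xAPI Profiles specification.
const profile = {
  "@context": "https://w3id.org/xapi/profiles/context",
  id: "https://example.com/profiles/field-safety", // hypothetical IRI
  type: "Profile",
  conformsTo: "https://w3id.org/xapi/profiles#1.0",
  prefLabel: { en: "Field Safety Profile" },
  definition: { en: "Governs verbs and activity types for field safety training." },
  author: { type: "Organization", name: "Example Org" },
  versions: [
    {
      id: "https://example.com/profiles/field-safety/v1.0",
      generatedAtTime: "2019-11-18T00:00:00Z",
    },
  ],
  concepts: [
    {
      id: "https://example.com/profiles/field-safety/verbs/inspected", // hypothetical verb
      type: "Verb",
      inScheme: "https://example.com/profiles/field-safety/v1.0",
      prefLabel: { en: "inspected" },
      definition: { en: "The learner inspected a piece of equipment." },
    },
  ],
};
```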

On Wednesday, 20 November, Jason Haag of Veracity Technology Consultants and I will co-present on the work our teams are doing together. We’ll talk about some ins and outs of what you can do with xAPI Profiles, and share more details on what to expect from the xAPI Profile Server. Sign up for the webinar: https://attendee.gotowebinar.com/register/7745952068399329549

Filed Under: Experience API

xAPI Camp Preview: dominKnow ONE

March 19, 2019 By Aaron

On March 26, 2019, I’m gathering the best examples of the tools that enable, and the professionals who define, xAPI’s best practices at xAPI Camp (number 14!) hosted by the eLearning Guild at the Learning Solutions conference in Orlando, FL. In the run-up to the event, I’m highlighting these industry colleagues and why I think they’re enabling and defining best practices. Today’s post focuses on what dominKnow is doing for savvy instructional designers with their ONE platform, which offers different authoring modes for different types of content authoring.

If you have some questions about the learner, the learning experience or the efficacy of the learning program, there’s some impressive flexibility in dominKnow ONE I want to point out.

About that Competency Dashboard in RISC…

In my last post, I shared a competency reporting dashboard in RISC’s VTA. RISC’s dashboard (like many useful dashboards) relies on specific, well-formed data. Two companies offer such flexibility to customize what and how you track learners engaging with their authoring tools: Trivantis and dominKnow.

For, seriously, a little extra effort, generating custom statements that conform with profiles or just follow best practices offers so many more valuable insights. In what’s been demonstrated previously, much of the workflow to track to competencies or learning objectives is even automated, meaning as a content author, you wouldn’t have to “code” so much as keep content organized… which you’d want to do anyway as just good instructional design. dominKnow ONE by default, and with no programming, provides a hierarchy and tracking of Course > Module > Learning Object, in which the “Learning Object” contains the content and interactions that, together, would support meeting a particular learning objective or competency. When a learner satisfies the requirements to complete the learning object, the dashboard will show that learner has met the competency requirement(s).
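
Here’s a rough sketch of that roll-up logic, assuming the simple 1:1 Learning Object-to-competency mapping described above (the identifiers and helpers are hypothetical, not dominKnow’s internal API):

```ts
// A rough sketch of the roll-up described above, assuming a 1:1
// Learning Object to competency mapping. These names are hypothetical.
interface CompletionRecord {
  learnerId: string;
  learningObjectId: string;
  completed: boolean;
}

// Hypothetical mapping maintained by the content author.
const competencyForObject: Record<string, string> = {
  "lo-ppe-basics": "competency-ppe",
  "lo-lockout-tagout": "competency-loto",
};

function competenciesMet(records: CompletionRecord[], learnerId: string): Set<string> {
  const met = new Set<string>();
  for (const r of records) {
    const competency = competencyForObject[r.learningObjectId];
    if (r.learnerId === learnerId && r.completed && competency) {
      met.add(competency);
    }
  }
  return met;
}
```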

Taking it to the next level and optimizing an individual’s learning

You probably already know that xAPI allows us to track all sorts of information, including the aforementioned objectives/competencies. That data can also be leveraged by content, though — a capability many rapid authoring tools just don’t offer. At next week’s xAPI Camp at Learning Solutions, dominKnow will demonstrate how you can track author-defined competencies in your content and then use the xAPI data you tracked to dynamically personalize the learner’s content. If a learner has already demonstrated a given competency, why should they be forced to re-demonstrate it, or an author need to create unique learning content for each use case? Not only is it cool — it’s interoperable, and dominKnow will demonstrate that while doing this requires instructional design skills, no coding skills are required.
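
As a hedged sketch of what that personalization check could look like, content might query the LRS for a prior statement showing the learner already demonstrated the competency before presenting it again. The endpoint, credentials, and IRIs below are placeholders; the query parameters and version header come from the xAPI specification:

```ts
// A sketch of the personalization check: ask the LRS whether this
// learner already has a statement demonstrating the competency, and
// skip the content if so. Endpoint, credentials, and competency IRI
// are placeholders.
async function alreadyDemonstrated(
  learnerMbox: string,
  competencyIri: string
): Promise<boolean> {
  const params = new URLSearchParams({
    agent: JSON.stringify({ mbox: learnerMbox }),
    verb: "http://adlnet.gov/expapi/verbs/mastered",
    activity: competencyIri,
    limit: "1",
  });
  const res = await fetch(`https://lrs.example.com/xapi/statements?${params}`, {
    headers: {
      "X-Experience-API-Version": "1.0.3",
      Authorization: "Basic " + btoa("key:secret"), // placeholder credentials
    },
  });
  const body = await res.json();
  return body.statements.length > 0;
}
```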

Why is this good?

Out of the box, dominKnow makes it easy to organize content without locking you into that default structure, to leverage that organization to personalize learning, and to tie it more easily to your company’s objectives. That’s exactly the flexibility and ease I want when tracking and reporting out with xAPI. For most content I need to work with, the 1:1 Learning Object:Competency relationship works, but if I have something more complex, I can take dominKnow ONE down the rabbit hole. Because much of xAPI best practice requires tinkering, the more flexible the tools available to me, the more use I have for them.

Filed Under: Content Strategy, Experience API, Learning Analytics

xAPI Camp Preview: RISC VTA

March 12, 2019 By Aaron

On March 26, 2019, I’m gathering some of the best tools that enable, and the professionals who define, xAPI’s best practices. This will happen at the upcoming xAPI Camp (number 14!) hosted by the eLearning Guild at the Learning Solutions conference in Orlando, FL. In the run-up to the event, I aim to highlight who’s going to be there, and why I think they’re enabling and defining best practices. Today’s post focuses on the outstanding capabilities RISC has built into its Virtual Training Assistant (VTA) learning management system that enable flexible training approaches, like spaced learning.

When Knowing Matters

A question I get asked regularly is “when is xAPI really worth doing?” and my answer is always something like “xAPI is worth doing when the stakes are high and the cost of failure is too heavy.” When it comes to safety and health — when lives are at stake — that’s when a deep analysis with xAPI is worth the effort. RISC’s VTA is geared for high-stakes compliance needs. If your workforce is on a gas or oil pipeline and you care that they can do the job and not hurt themselves or others in the process, then knowing the capabilities of a given worker and the team(s) they’re on matters.

My funk soul brother Duncan Welder goes into detail on how RISC approaches tracking competencies in VTA here. It’s worth the read, if only to see the dashboard in action, and the dashboard alone tells me a lot about how they’re using xAPI.

Why is this good?

Checklists, Competency Assessments, Questions, Pass/Fail activities — these are different types of xAPI activities VTA reports on. Using xAPI to also detail incident reporting makes it possible to juxtapose the learning activities with real-world outcomes and demonstrate the correlations between learning performance and job performance. With that established, there is so much depth to get into, as RISC can keep getting more granular with progress toward specific competencies and incident trends over time.
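
As a simple illustration of that juxtaposition (not RISC’s actual implementation), a Pearson correlation between per-team training scores and incident counts is one way to quantify how learning performance relates to job performance:

```ts
// An illustration (not RISC's implementation): a Pearson correlation
// between per-team training scores and incident counts quantifies the
// learning-performance-to-job-performance relationship.
function pearson(xs: number[], ys: number[]): number {
  const n = xs.length;
  const mean = (a: number[]) => a.reduce((sum, v) => sum + v, 0) / a.length;
  const mx = mean(xs);
  const my = mean(ys);
  let num = 0, dx2 = 0, dy2 = 0;
  for (let i = 0; i < n; i++) {
    const dx = xs[i] - mx;
    const dy = ys[i] - my;
    num += dx * dy;
    dx2 += dx * dx;
    dy2 += dy * dy;
  }
  return num / Math.sqrt(dx2 * dy2);
}

// A strong negative value, e.g. pearson(teamScores, teamIncidents) near -1,
// would suggest better learning performance tracks with fewer incidents.
```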

That’s exactly what I want to track and report out with xAPI. From what I’ve seen in the market, there are some really good LRSs providing solid reporting on data when there’s no assumption about what’s actually in the data. RISC, however, has a history of designing reports that tell specific stories about specific learning activities, like their PDF Annotator and Message Board features. What RISC can report may not be for everyone, but if you want these specific insights, there’s no other vendor on the market able to give you them with this fidelity.

Andrew Downes, in his stunning blog series documenting just what exactly customers are tracking with Watershed, breaks down a high-level categorization of learning analytics distinguishing the learner, the learning experience and the learning program. Downes finds that Watershed customers’ learning analytics are mostly about the learning program, and very little of what they track concerns the learning experience or the learners themselves. In that light, it’s even more impressive to me that RISC is tracking competencies in this way, because performance-to-competency is a dimension that sheds light not just on the learning program but on the learner and the learning experience, too.

How does this work with “spaced learning?”

When Duncan and Art Werkenthin present at xAPI Camp on March 26, they’re going to focus on spaced learning — the delivery of small, repetitive pieces of learning content with breaks between the learner’s sessions with the content. It’s a well-researched approach to learning content delivery that is particularly effective for long-term retention. There are a lot of technical things RISC is doing with xAPI, like relying on cmi5 to reliably handle the launch of xAPI-enabled learning content. VTA’s support for cmi5 reduces the variability of the learning experience and technically enables a spaced learning approach (among the many other learning experiences it enables). The reality is that RISC’s implementation of cmi5 makes a lot of things possible wherever we need to know who the learner is and want to track learning activity to an LMS.
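
For a sense of what cmi5 standardizes at launch time, here’s a sketch of parsing the query parameters cmi5 defines on a launch URL. The parameter names come from the cmi5 specification; the parsing itself is illustrative:

```ts
// A sketch of what cmi5 standardizes at launch: the LMS opens content
// with query parameters identifying the learner, the session, and where
// statements go. Parameter names are from the cmi5 specification.
function parseCmi5Launch(launchUrl: string) {
  const q = new URL(launchUrl).searchParams;
  return {
    endpoint: q.get("endpoint"),         // LRS endpoint for statements
    fetchUrl: q.get("fetch"),            // URL to exchange for an auth token
    actor: JSON.parse(q.get("actor") ?? "{}"), // the learner's identity
    registration: q.get("registration"), // ties the session to an enrollment
    activityId: q.get("activityId"),     // the activity being launched
  };
}
```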

And, with RISC able to track a spaced learning experience, they can look at the data on content delivered over time against performance outcomes (like incidents reported) and optimize the spacing relative to the type of content, the subject of the content, the complexity of the job performance, and so on. VTA clearly gives you all the tools you need to do this. All you need is content that uses xAPI in specific ways, producing data formatted to deliver insights. After all, I’m highlighting what RISC can report, but those reports depend on useful, usable, appropriately formatted data to populate them.

For that, I can’t wait to share my next blog post on what dominKnow is doing that enables what RISC (and other tools) can report on.

Interested in xAPI visualizations but can’t make xAPI Camp? Duncan and Art have a session at Learning Solutions on Wednesday, March 27th on getting measurable results with xAPI, titled Design with the End In Mind.

Filed Under: Experience API, Learning Analytics Tagged With: best practice, competencies, xAPI

xAPI Camp DevLearn Presentations

November 2, 2018 By Megan

Here are links to all the presentations from xAPI Camp DevLearn. Enjoy!

  • Introduction by Megan Bowe
  • xAPI State of the State by Aaron Silvers
  • Immersive Learning by John Blackmon
  • Working Better, Together by Paul Schneider
  • Identifying Competency Gaps by Art Werkenthin and Duncan Welder
  • xAPI and the Evolving Learning Ecosystem by Patrick Selby
  • A Viable Model for Learning Analytics by Nick Washburn
  • Yet Analytics Case Study by Allie Tscheulin

Filed Under: Experience API

iFest and xAPI Profile Servers

September 7, 2018 By Megan

We’ve been hard at work the last few months identifying what an xAPI Profile Server should do beyond the basic requirements in the xAPI Profile Specification as part of a BAA with ADL. Last week we took the show on the road to iFest in Washington DC. We brought along a poster which explained why a profile server is necessary and how it will improve interoperability. That poster won a people’s choice award for best narrative!

Congratulations to our poster winners! The people have spoken! The Jefferson Institute won Best Poster Design and Data Interoperability Standards Consortium won Best Poster Narrative. @Jeffersoninst @DataInterop #iFEST2018 pic.twitter.com/bY9EIrxFY1

— ADL Initiative (@ADL_Initiative) August 28, 2018

(evidence^)

That’s why I’m writing this post, because we have never won a poster award. In fact, there are a lot of awards we haven’t won. And we’re okay with that. Since this one was a people’s choice award, we thought more people might like to see it. So, here it is…

 

Another post is coming with a summary of the profile server research. Big thanks to ADL for a great event and thanks to our designer, Jason, for a prize poster!

Filed Under: Experience API

