
MakingBetter

Strategy. Ecosystems. Meaning.



Metrics in Learning Analytics

January 13, 2021 By Aaron

Before life in the United States was rudely interrupted (to put it extremely mildly) last week, I teased the concept of metrics in learning analytics. Where dimensions frame a problem space, metrics are (literally) the things I look at to model that space. In other words, metrics are how I make sense of a problem space in an analysis.

Let’s learn about metrics in learning analytics!

Problem spaces need a model that describes the space (dimensions) and enables me to make sense of it, or “perform an analysis.”

More than counting.

My analysis is always going to be defined by my metrics. Often, I need to count or tally things, but COUNTS ARE NOT ANALYSIS. In presentations, I often describe an xAPI statement as an observation, and I explicitly connect the two: a given metric will likely have xAPI statements supporting its observations (xAPI statements = observations, and observations feed metrics).
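To make that concrete, here's a minimal sketch in Python using hypothetical statement data (the verb IDs are real ADL verbs; everything else is made up). Each statement is an observation, and the metric, a mean scaled score here, interprets the observations rather than merely counting them.

```python
# A minimal sketch, not production code: each xAPI statement is an
# observation; a metric interprets many observations. The statement
# data below is hypothetical but follows the xAPI statement structure.
statements = [
    {"actor": {"mbox": "mailto:pat@example.com"},
     "verb": {"id": "http://adlnet.gov/expapi/verbs/passed"},
     "object": {"id": "https://example.com/activities/lockout-tagout"},
     "result": {"score": {"scaled": 0.90}}},
    {"actor": {"mbox": "mailto:pat@example.com"},
     "verb": {"id": "http://adlnet.gov/expapi/verbs/failed"},
     "object": {"id": "https://example.com/activities/lockout-tagout"},
     "result": {"score": {"scaled": 0.50}}},
]

# A count ("2 attempts") is not analysis. A metric makes sense of the
# observations, e.g. the learner's mean scaled score on the activity.
scores = [s["result"]["score"]["scaled"] for s in statements]
print(f"attempts: {len(statements)}, "
      f"mean scaled score: {sum(scores) / len(scores):.2f}")
```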

Here’s an example of how I generally approach learning and job-task analysis (relating the learning to the doing). While these dimensions of the learner, the learning activity, and the learning experience might make sense to a learning professional, it’s important to remember that these dimensions aren’t (and shouldn’t be) the only way to do learning analytics. The model is complex enough that even if 80% of readers copy and paste it without more forethought, it will hopefully point them in better directions than they were headed before reading (the whole “making better” thing).

“All models are wrong, but some are useful.” (George E. P. Box)

Anyone putting together a learning analytics strategy is arbitrarily framing a box, more or less, in which learning happens. Only through the pinholes of such a theoretical box can learning be observed. Then, only by putting those observations together can an analysis be performed.

Simple model of learning analytics dimensions: learner by learning activity by learning experience
Pinholes in a theoretical closed box
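As a hedged illustration of that box-and-pinholes model (my sketch, not a prescribed xAPI pattern), here's how the three dimensions might bucket observations. Using context.contextActivities.grouping to identify the learning experience is an assumption for illustration only.

```python
from collections import defaultdict

# A hedged sketch of the pictured model: bucket each observation
# (xAPI statement) by learner, learning activity, and learning
# experience. Reading the experience from contextActivities.grouping
# is my assumption here, not a fixed rule. Data is hypothetical.
statements = [
    {"actor": {"mbox": "mailto:pat@example.com"},
     "object": {"id": "https://example.com/activities/lockout-tagout"},
     "context": {"contextActivities": {"grouping": [
         {"id": "https://example.com/experiences/safety-onboarding"}]}}},
]

def dimensions(stmt):
    learner = stmt["actor"]["mbox"]
    activity = stmt["object"]["id"]
    experience = stmt["context"]["contextActivities"]["grouping"][0]["id"]
    return (learner, activity, experience)

cells = defaultdict(list)
for s in statements:
    cells[dimensions(s)].append(s)

# Each cell of the learner x activity x experience "box" now holds
# the observations visible through that pinhole, ready for metrics.
```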

So what should those pinholes be? It seems like a pretty straightforward question, but working with stakeholders to define metrics in learning analytics requires patience, planning, and practice. I’ll share methods for getting to these metrics in the coming weeks.

Barring further interruptions, tomorrow I’ll review MakingBetter’s work over the past year. On Friday, expect some catch-up on what I have been up to, with Elsevier, ADL and IEEE. Lots. To. Share.

Filed Under: Learning Analytics Tagged With: dimensions, learning analytics, metrics, strategy, xAPI

Level Up Learning Data with xAPI in 2021

January 4, 2021 By Aaron

You’re likely going to be reading up seriously on how to work with learning data sometime in 2021. This will be true regardless of whether you work in online K-12, Higher Education or professional training — anywhere in the world. If you will “own” xAPI know-how on your team (or maybe for the whole organization), I want to help you.

Megan and I have certainly been in your shoes.

So, I made a resolution for 2021: I’m going to try to blog daily again, like I did maybe 15 years ago about Flash For eLearning. It’s time for us all to level up our data game.

Let’s start with this webinar (above), where I recently demonstrated a new tool coming in Q2 from ADL: the xAPI Profile Server we produced for ADL with our friends from Veracity. In the video, I explain what it is, what problems it will help with, and how you’ll likely put it to work. Spoiler: it’s a tool to help you author and serve xAPI Profiles, doing a lot of the heavy lifting of complex rules thingies for you.
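For orientation, here's a trimmed sketch of the shape of an xAPI Profile document (JSON-LD, per the xAPI Profiles specification) expressed as a Python structure. This is not the Profile Server's API, and every ID in it is a placeholder rather than a published profile.

```python
# A trimmed sketch of the *shape* of an xAPI Profile document per the
# xAPI Profiles spec -- not the Profile Server's API. All IDs below
# are placeholders, not a published profile.
profile = {
    "@context": "https://w3id.org/xapi/profiles/context",
    "id": "https://example.com/profiles/spaced-review",
    "type": "Profile",
    "conformsTo": "https://w3id.org/xapi/profiles#1.0",
    "prefLabel": {"en": "Spaced Review (example)"},
    "definition": {"en": "Vocabulary and statement rules for spaced practice."},
    "concepts": [
        # Shared vocabulary: a verb concept.
        {"id": "https://example.com/verbs/reviewed",
         "type": "Verb",
         "prefLabel": {"en": "reviewed"},
         "definition": {"en": "The learner revisited content seen before."}},
    ],
    "templates": [
        # A statement template: the "complex rules" part -- which verb
        # a statement must use and what data it must include.
        {"id": "https://example.com/templates/review-session",
         "type": "StatementTemplate",
         "verb": "https://example.com/verbs/reviewed",
         "rules": [{"location": "$.result.duration",
                    "presence": "included"}]},
    ],
}
```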

Next, I’ll blog about how to frame a problem space to apply learning analytics with dimensions.

Filed Under: Experience API, Learning Analytics Tagged With: demos, recordings, webinars, xAPI, xAPI Profile Server, xAPI Profiles

xAPI Camp Preview: RISC VTA

March 12, 2019 By Aaron

On March 26, 2019, I’m gathering some of the best tools that enable, and the professionals who define, xAPI’s best practices. This will happen at the upcoming xAPI Camp (number 14!) hosted by the eLearning Guild at the Learning Solutions conference in Orlando, FL. In the run-up to the event, I aim to highlight who’s going to be there, and why I think they’re enabling and defining best practices. Today’s post focuses on the outstanding capabilities RISC has built into its Virtual Training Assistant (VTA) learning management system that enable flexible training approaches, like spaced learning.

When Knowing Matters

A question I get asked regularly is “when is xAPI really worth doing?” and my answer is always something like “xAPI is worth doing when the stakes are high and the cost of failure is too heavy.” When it comes to safety and health — when lives are at stake — that’s when a deep analysis with xAPI is worth the effort. RISC’s VTA is geared for high-stakes compliance needs. If your workforce is on a gas or oil pipeline and you care that they can do the job and not hurt themselves or others in the process, then knowing the capabilities of a given worker and the team(s) they’re on matters.

My funk soul brother Duncan Welder goes into detail on how RISC approaches tracking competencies in VTA here. It’s worth the read, both to see the dashboard in action and because what’s shown there tells me a lot about how they’re using xAPI.

Why is this good?

Checklists, Competency Assessments, Questions, Pass/Fail activities — these are the different types of xAPI activities VTA reports on. Using xAPI to also detail incident reporting makes it possible to juxtapose the learning activities with real-world outcomes and demonstrate the correlations between learning performance and job performance. With this established, there’s a lot of depth to explore, as RISC can keep getting more granular about progress toward specific competencies and incident trends over time.
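As a sketch of that juxtaposition, here's one hedged way to line up a learning metric against an incident metric. The team names, scores, and incident rates are invented, and VTA's actual reports will certainly differ.

```python
# A hedged sketch: per-team mean competency assessment scores (from
# xAPI learning statements) set against incident rates (from xAPI
# incident-report statements). All data here is hypothetical.

# team -> (mean assessment score, incidents per 10k work hours)
teams = {
    "crew-a": (0.91, 0.8),
    "crew-b": (0.74, 2.1),
    "crew-c": (0.83, 1.3),
    "crew-d": (0.62, 3.0),
}

xs = [score for score, _ in teams.values()]
ys = [rate for _, rate in teams.values()]

# Pearson correlation, computed directly to keep the sketch
# dependency-free.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
sx = sum((x - mx) ** 2 for x in xs) ** 0.5
sy = sum((y - my) ** 2 for y in ys) ** 0.5
print(f"r = {cov / (sx * sy):.2f}")  # strongly negative on this data:
                                     # higher scores, fewer incidents
```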

That’s exactly what I want to track and report on with xAPI. From what I’ve seen in the market, there are some really good LRSs providing solid reporting when there’s no assumption about what’s actually in the data. RISC, however, has a history of designing reports that tell specific stories about specific learning activities, like their PDF Annotator and Message Board features. What RISC can report may not be for everyone, but if you want specific insights, no other vendor on the market delivers them with this fidelity.

Andrew Downes, in his stunning blog series documenting just what customers are tracking with Watershed, breaks down a high-level categorization of learning analytics, distinguishing the learner, the learning experience and the learning program. Downes finds that Watershed customers’ learning analytics are mostly about the learning program, with very little attention paid to the learning experience or the learners themselves. In that light, it’s even more impressive to me that RISC is tracking competencies in this way, because performance-to-competency is a dimension that sheds light not just on the learning program but on the learner and the learning experience, too.

How does this work with “spaced learning?”

When Duncan and Art Werkenthin present at xAPI Camp on March 26, they’re going to focus on spaced learning — the delivery of small, repeated pieces of learning content with breaks between the learner’s sessions with the content. It’s a researched approach to learning content delivery that is particularly effective for long-term retention. There are a lot of technical things RISC is doing with xAPI, like relying on cmi5 to reliably handle the launch of xAPI-enabled learning content. VTA’s support for cmi5 reduces the variability of the learning experience and technically enables a spaced learning approach (among lots of other learning experiences). The reality is that RISC’s implementation of cmi5 makes a lot of things possible whenever we need to know who the learner is and want to track learning activity back to an LMS.
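To show the kind of measurement spaced learning depends on, here's a minimal sketch that derives the gaps between a learner's sessions with the same content from statement timestamps. The timestamps are hypothetical, but the "timestamp" property is standard xAPI.

```python
from datetime import datetime

# A minimal sketch: derive the spacing between a learner's sessions
# with the same content from statement timestamps. The values below
# are hypothetical; in practice they come from each cmi5/xAPI
# statement's "timestamp" property.
session_timestamps = [
    "2019-03-01T14:00:00Z",
    "2019-03-03T09:30:00Z",
    "2019-03-08T16:15:00Z",
]

# fromisoformat() in older Pythons doesn't accept a trailing "Z",
# so normalize it to an explicit UTC offset first.
times = [datetime.fromisoformat(t.replace("Z", "+00:00"))
         for t in session_timestamps]
for earlier, later in zip(times, times[1:]):
    gap = later - earlier
    print(f"{gap.days} days {gap.seconds // 3600} hours between sessions")
```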

And with RISC able to track a spaced learning experience, they can look at the data on content delivered over time, set it against performance outcomes (like incidents reported), and optimize the spacing relative to the type of content, its subject, the complexity of the job performance, and so on. VTA clearly gives you all the tools you need to do this. All you need is content that uses xAPI in specific ways, so these reports are populated with data formatted to deliver insights.

For that, I can’t wait to share my next blog post, on what DominKnow is doing to enable the kinds of reporting RISC (and other tools) can deliver.

Interested in xAPI visualizations but can’t make xAPI Camp? Duncan and Art have a session at Learning Solutions on Wednesday, March 27th, on getting measurable results with xAPI, titled Design with the End In Mind.

Filed Under: Experience API, Learning Analytics Tagged With: best practice, competencies, xAPI

The 3rd Gear

August 5, 2015 By Aaron

Cruising down a highway through Montana in a '15 Ford Mustang

As xAPI shifts into 3rd gear, with an early majority comes the need for a consortium that will steward xAPI into perpetuity — a table where other industries can sit and work out the ways xAPI will meet their particular needs. I’m talking about HR systems, medical devices, folks who make beacons and sensors, manufacturing, energy companies, engineers, school administrators. There are other groups we aren’t talking with yet, and they’ll make their needs known. We’ll have certification tests, which will make it easy for folks buying software or hardware to see a stamp of approval — a neutral third party guaranteeing a product is 100% Grade A xAPI. With it, the industry that makes xAPI software and hardware will have to grow up. With it, new job titles, new practices, new competencies and new goals will be born out of the titles, practices, competencies and goals we already have, from wherever we are.

Megan and I had a great, long, long-overdue and well-earned vacation road-tripping through the mountains before and after xAPI Camp – Amazon. Reflecting on it now, it couldn’t have been a more apt way for us to vacation and get ready for what’s coming. xAPI is gaining highway speed. It’s gonna be surprising and exciting. We’re going to work harder than ever to keep pace with adoption, solidifying what we have and making it flexible for even more adoption. With every climb, we’re going to see something way more epic than before.

In Arapahoe, view from a windshield about to turn into a curve

So I say, “Welcome,” to the coming early majority. Welcome to xAPI.

Thank you to everyone who makes xAPI Camp happen. Our hosts. Our supporters and sponsors and partners. Our speakers! Brian Dusablon, one of the best Experience Ninjas around, working invisibly behind the scenes to make each event amazing for our participants. And to our participants: you are creating an active and vibrant community that is going to transform multiple industries. Through your activity and participation, you’re building something huge that’s purely driven by a shared goal: to improve ourselves, the places we work and the world we live in.

Let’s go, team. We all got work to do. We now have more horsepower and we’re gaining speed. Let’s keep our eyes on the road ahead and not just the mile marker we hit.

Put the pedal to the metal — and hang on. 🙂

Filed Under: DISC, Standards, Uncategorized Tagged With: adoption, consortium, xAPI

xAPI Shifting to 3rd Gear

August 3, 2015 By Aaron

Mark Oehlert kicking off xAPI Camp - Amazon

You might be surprised to find out we’re in 2nd gear. We shifted out of 1st gear at DevLearn last year. Between November 2014’s xAPI Hyperdrive and the xAPI track at DevLearn 2014, we hit our stride. We had people well outside of the ADL bubble talking about the real-world things they were using xAPI to accomplish.

What happened at xAPI Camp Amazon?

This wasn’t merely a great event. This was a game changer. Between the participants, the speakers and our partners, this event really couldn’t have turned out more epic than it did.

Let me talk a bit about our speakers. They are not messing around. All killer, no filler, high-octane Awesome. We got Sean Putman to come in from Detroit to talk about all he did with Altair’s software and xAPI. Ben Erlandsson is working on the boldest and most complex application for xAPI I’ve ever encountered — and showed how xAPI stands to help make the most impact of any social initiative to date. Myra Travin came from across town in Seattle to articulate the field of learning experience design in a way that really made sense. Kirsty Kitto and Aneesha Bakharia are rockstars in the research that’s driving the learning analytics field — and they’re pointing out ways in which we need to be using xAPI better. Bill McDonald spent 25 years helping run the Aviation Industry Computer-Based Training Committee, eventually overseeing the evolution of its own standard into one completely based on xAPI. Russell Duhon talked about what it takes to run xAPI in the enterprise… at scale. Duncan Welder talked about the ways in which RISC’s LMS is helping Big Energy improve compliance — not just getting a check in the compliance-training box — using xAPI for both content analytics and social. Their presentations are all archived here.

A breakout group at xAPI Camp - Amazon

These speakers were incredible people, many of whom are outside the xAPI Developer/Contributor/Adopter circles. Real folks making real decisions about real-world problems and using xAPI to help. They’re haulin’ ass. We had small emergent breakout groups and large breakout groups moderated by our partners Shelly Blake-Plock, Mike Hruska and Nick Washburn. Participants got into the weeds of learning architecture, project management and scaling the technology to respond to growing organizational demands in near real-time. Everything discussed was approachable in real-world, real-human language, but nothing was watered down. People took notes and tasked themselves with next actions. Attendees were solving real problems while they were there.

The format for the event worked, it really worked. Myra Travin saw so much in it that she wrote a post on just how well it worked. You can read it on LearnxAPI.

Why was this camp so significant?

No matter what Amazon does with xAPI, their embrace of it — even bringing the community to their house so we could learn together what xAPI enables and (frankly) what value xAPI holds for them — is huge. Just huge.

You need a company to reference when asked, “Who’s doing xAPI?”

Your answer now is “Amazon.” Done. Mic drop.

Because when we talk about the future of learning, usually our professionals from ISDs to CLOs immediately talk about recommendations… “like Amazon.”  Every conference that talks about elearning at all — K-12, Higher Ed, Community Colleges, Corporate, Gov, .mil — everyone talks about recommendations “like how Amazon does it.”

…And even Amazon is going xAPI.

You hold onto that for a moment. Let that thought just linger in the air a bit. Enjoy it. Savor it. This is the kind of moment — those of us who’ve helped make xAPI happen — this is the moment we’ve been waiting for. Given their engineering culture, given their fanatical attention to customer service, given their massive supply chain and the people power it takes to make all of those parts work in harmony, xAPI could not have a better stakeholder.

It marks the beginning of the end of early adoption for xAPI. Go back to the bell curve of Rogers’ Diffusion of Innovations theory: we’ve hit the point that takes us into the Early Majority.

This is the shift from 2nd to 3rd gear. There’s more to be done. It’s not just events, it’s not just fancy new adopters. There’s a post coming later this week on what has to happen next. (Hint: we need to formalize our work together to make sure what the early majority is doing keeps working and makes the space to iron out the wrinkles existing today.)

Filed Under: Experience API, xAPI Camp Tagged With: amazon, recap, xAPI, xapi camp
