
MakingBetter

Skills and Strategies to Engage, Change and Grow Others


xAPI Profiles

Talking ‘Bout Praxis

February 22, 2021 By Aaron

I mean, eventually, we are talking about practice; today, though…

Today, I begin sharing my praxis around learning analytics. That praxis includes designing and engineering feedback loops with xAPI.

2020 sucked, but there were silver linings. Commuting to the office came to a halt, and with it went about four hours’ worth of transition activities between home and office every day. Add to that the meetings that people just stopped scheduling. Consulting for Elsevier, where I manage their Learning Analytics strategy, the planning and architecture from 2019 got executed in 2020.

Customers were using a product and content I actually had some hand in. I spent 2020 preparing for what happens next: taking an MVP and delivering all the value envisioned for it.

Praxis?

I could write diatribes about learning under lockdown. Instead, this year I’ll do some heavy lifting to support the role praxis has in creating feedback loops. Praxis accelerates professional practice by encouraging constructive discourse around the mechanics, theory and criticism of how professionals perform their practice. Megan and I encouraged such praxis around learning analytics with xAPI Quarterly and xAPI Camps.

Since 2017, as I learned how Elsevier works from the inside, I figured out how to jettison some dependencies on legacy, siloed structures. In that same timespan, xAPI Profiles matured, as did xAPI.

I’ve shifted from the theoretical—specification creation—to the practical. With the work I’ve done for countless organizations in modernizing learning architecture, Elsevier is the first place I’ve worked where I’ve been able to see projects through to their fruition. Because of the lockdown, my life slowed down enough to reflect, to evaluate, to process and learn and, ya know, do things differently.

Thanks to standard tuition reimbursement for professional development at Elsevier, I finished 36 weeks of coursework for my Lean/Six Sigma Black Belt certification from Villanova in May 2020. Those extra four hours a day really came in handy for that effort, let. me. tell. you. I reacquainted myself with the statistics practices I first learned through my Math Education minor. Now, with actual business priorities attached to them, I had the context I needed to understand the real power behind math I’d known for years—the theory and the practice—and to put that knowledge to use.

We’re talking about praxis.

It’s probably important to consider some inherent conceits in my practice. I tend to think of the work I do as needing to hold up for 100 years and, still, be re-composable. Not necessarily because my work is so good… so much as digital works using xAPI just might be in use that long. Think of how enduring SCORM is as a technology. Look up the oldest eLearning you have in your LMS. Consider the 47 BILLION dollars spent around the globe last year on learning technology upgrades. NOW consider how much money will be spent every year on NEW learning technologies (which are increasingly going to be based on xAPI because… reasons…). And think of how much harder it will be for the next evolution in technology to supplant all this new infrastructure.

I can’t deliver anything related to xAPI that isn’t built to endure. Learning tools should fundamentally behave flexibly and nimbly, as if they were designed and engineered to be easy for professionals like us to keep working with. I know I’ve done quality work when the workflows that engineers, authors, administrators and learners deal with around learning technologies are more fluid and clear the obstacles to a rich and rewarding learning experience. My passions and expectations around quality in learning technology practice may not be entirely rational on their surface, and I can acknowledge that much 😉

Anyway, a result of such passions is that I was very happy to have had the opportunity to collaborate on some papers. Kirsty Kitto of UT-Sydney introduced me to John Whitmer of Schmidt Futures, and through several online chats over 2020, we wrote a position paper for the Society for Learning Analytics Research on Creating Data for Learning Analytics Ecosystems. Through those chats, John got me thinking a lot more clearly about the telemetry of learning data as something distinct from the analytics. Telemetry is about the engineering of the data — how it’s structured, where it goes, how you get it out into a report.
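To make “telemetry” a little more concrete, here’s a minimal sketch in Python of that engineering side for a single xAPI statement: the structure, where it goes, and how you’d pull it back out for a report. The LRS URL and credentials are placeholders, not a real service; the endpoints themselves are just the standard xAPI statements resource.

```python
import requests  # assumes the 'requests' library is installed

# The structure: a minimal xAPI statement (actor / verb / object).
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "https://example.com/activities/module-1",
               "definition": {"name": {"en-US": "Module 1"}}},
}

# Where it goes: POST to a Learning Record Store (placeholder URL and credentials).
LRS = "https://lrs.example.com/xapi"
AUTH = ("username", "password")
HEADERS = {"X-Experience-API-Version": "1.0.3"}

requests.post(f"{LRS}/statements", json=statement, auth=AUTH, headers=HEADERS)

# How you get it out into a report: query the LRS for matching statements.
report = requests.get(
    f"{LRS}/statements",
    params={"verb": "http://adlnet.gov/expapi/verbs/completed"},
    auth=AUTH,
    headers=HEADERS,
).json()
print(len(report.get("statements", [])), "completions recorded")
```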

Currently, practice is usually limited to what can be analyzed by collecting and counting things in the data produced by learning content. Hint: that’s just how you collect metrics.
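For illustration, “collecting and counting things” really is just a tally. A quick sketch, assuming you already have a batch of statements in hand (like the ones retrieved above):

```python
from collections import Counter

# 'statements' is assumed to be a list of xAPI statements already
# retrieved from an LRS (see the earlier sketch).
def count_by_verb(statements):
    """A bare metric: how many times each verb shows up."""
    return Counter(s["verb"]["id"] for s in statements)

# e.g. Counter({'.../completed': 42, '.../experienced': 17, ...})
# A count like this tells you what happened, not what decision to make.
# That gap is where a learning analytics strategy starts.
```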

It’s the ways you start to operate and make decisions, even something as simple as which data informs which decision and when, that start leading down the road to a learning analytics strategy. When that telemetry can be governed in one place, in ways that align with the business/learning/analytics strategy (e.g. xAPI Profiles), we can scale this work. With scale and impact in mind, after the election in November, John approached Megan and me with an opportunity to contribute a policy paper for the incoming Administration. Reflecting our thoughts on the potential to rapidly scale the production, collection and analysis of educational data based on the xAPI Profile Server, the Day One Project published our position on Improving Learning through Data Standards for Educational Technologies.

Wanna talk about praxis too?

xAPI Profiles standardization runs through IEEE, led by yours truly. We meet on the fourth Tuesday of every month at 3:30 Eastern. With a team that includes Will Hoyt of Yet Analytics as vice/co-chair, we’re going to have agendas and action items (homework) for participants with a wide range of technical insight and ability. If you’re new to xAPI Profiles but you know you’re going to need to put them to work, this is a good time to jump on board: we’re going to really start the work in March 2021, with the idea that we’ll wrap standardization by March 2023. Already, there are multiple efforts that will run concurrently.

IEEE 9274.2.1 identifies the JSON-LD xAPI Profile standard. The guidance one needs to implement the data strategy implied by a JSON-LD document must be understandable by, well, people. Not just academics or engineers, but like… anyone… who would actually have to make decisions based on a standard, knowing it kinda has to work more like how USB or lightbulbs work as standards than, say… well, SCORM. As a result, there will be a 9274.2.2 activity starting very soon to figure out what all has to go in that documentation.
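For anyone who hasn’t looked inside one: an xAPI Profile is itself a JSON-LD document. Below is a heavily trimmed, hypothetical sketch of the kind of structure 9274.2.1 standardizes, shown as a Python dict; the IRIs and the single concept are made up for illustration.

```python
# A heavily trimmed, hypothetical xAPI Profile document (JSON-LD),
# shown as a Python dict. Real profiles carry many more concepts,
# plus statement templates and patterns.
example_profile = {
    "@context": "https://w3id.org/xapi/profiles/context",
    "id": "https://example.org/profiles/nursing-ce",        # made-up IRI
    "type": "Profile",
    "prefLabel": {"en": "Example Nursing CE Profile"},
    "definition": {"en": "Illustrative profile for this post; not a real one."},
    "versions": [{"id": "https://example.org/profiles/nursing-ce/v1.0",
                  "generatedAtTime": "2021-02-22T00:00:00Z"}],
    "concepts": [
        {
            "id": "https://example.org/profiles/nursing-ce/verbs/precepted",
            "type": "Verb",
            "inScheme": "https://example.org/profiles/nursing-ce/v1.0",
            "prefLabel": {"en": "precepted"},
            "definition": {"en": "A preceptor observed and guided a learner."},
        }
    ],
}
```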

If you want to work with xAPI in a way that’s easy enough just by “reading the instructions,” get involved. Contact me to get notified and reminded about these (will-be) monthly discussions. 🙂

Next post? Maybe it’s time to look at a little bit of question-storming that’s going to result in some learning analytics strategy.

Filed Under: Learning Analytics Tagged With: elsevier, ieee, journals, learning analytics, praxis, publishing, research, solar, telemetry, xAPI Profile Server, xAPI Profiles

2020 was all About xAPI Profiles (for us)

January 19, 2021 By Aaron

Photo by Lisa Fotios on Pexels.com (stainless steel can with fire)

A few days ago, Megan and I passed a milestone, with now seven years of MakingBetter. 2020 was all about xAPI Profiles (for us). It’s hard for me to grok how much we got done in 2020 considering the trash fire of a year it was.

Like many of you, Megan and I had to seriously change how we’ve managed our work/life balance, household, workflow(s). In this post, I’ll highlight what we shipped last year through MakingBetter.

The ADL xAPI Profile Server

In 2018, Megan and I drafted requirements for an xAPI Profile Server. Two years later, in 2020, MakingBetter built it with ADL.

We teamed with our besties at Veracity, continuing a learning-standards team-up between Jono Poltrack and myself that spans almost two decades now, and keeping the running t-shirt jokes going between Megan and Tom Creighton. I digress.

To summarize, the ADL xAPI Profile Server is a revolutionary approach to governing data for xAPI-supporting applications. It’s an authoring tool for xAPI Profiles. It’s an RDF server, so it processes all the semantic connections on the back end, generating JSON-LD concepts while you author. It enables authoring tools and independent development environments to be dynamic and driven from the same authoritative document. Updating an xAPI Profile hosted on the server makes the changes available to anything subscribing to that profile. Imagine managing taxonomies and ontologies for learning data without necessarily re-authoring anything.
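To make the “subscribing” idea concrete, here’s a rough sketch of how an application might pull a hosted profile and read its Verb concepts. The server URL and route here are hypothetical, not the documented ADL API, but the shape of the profile document follows the xAPI Profiles spec.

```python
import requests

# Hypothetical profile server endpoint; the real ADL server's API may differ.
PROFILE_URL = "https://profiles.example.org/profile/nursing-ce"

def load_profile(url):
    """Fetch the current published version of a hosted xAPI Profile."""
    resp = requests.get(url, headers={"Accept": "application/json"})
    resp.raise_for_status()
    return resp.json()

def verbs_in(profile):
    """List the Verb concepts a subscribing tool could populate its menus from."""
    return [c["id"] for c in profile.get("concepts", []) if c.get("type") == "Verb"]

# Because every subscriber reads the same authoritative document,
# republishing the profile updates them all without re-authoring content.
profile = load_profile(PROFILE_URL)
print(verbs_in(profile))
```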

Megan provided the vision and technical requirements that drove this effort, and led the technical development. Crystal Kubitzky did cracking (been watching a lot of Great British Bake-off) UX research and design, making full use of the USDS design system. Eva Rendle took on the unenviable task of managing the herculean effort to design, develop and deliver this product in 13 months’ time.

As a result, it represents the best of what can happen when everyone on a project is oriented to the same goal. Collective focus, talent and hope drove this project in 2020. The love our team put into that effort shines through when you use it.

It launches in Q2 2021. Watch me introduce and demo some of the features of the ADL xAPI Profile Server here.

xAPI Profile Authoring

MakingBetter worked with Veracity to produce a significant xAPI Profile. For the long game on xAPI, we produced a profile with ADL to support the prototype development of the Total Learning Architecture (TLA). The profile expresses the TLA’s Master Object Model (MOM). In the coming weeks, expect posts that surface lessons learned about profile authoring and thoughts on the enterprise learning analytics strategy implied by this work.

For now, if you’re interested in reading more on the MOM and the TLA, read here.

More?

In the next post, I’ll get into what all I’m doing, with all the different hats I get to wear. After that, with that bit of needed meta established, I can get into blogging about the really real nitty-gritty of working with xAPI.

Filed Under: Open Source Tagged With: master object model, xAPI Profile Server, xAPI Profiles

Dimensions in Learning Analytics

January 5, 2021 By Aaron


Dimensions, in learning analytics, are the ways we describe the (theoretical) space in which we’ll analyze… learning. I think of a given xAPI Profile as the technical instructions I want tools, systems and services to calibrate on for the way I need to make sense of the data. Yesterday, I shared a demo of the forthcoming ADL xAPI Profile Server. Today, I step back to share a mental picture of how learning analytics can be framed.

In 2019, I collaborated with Judy Katz on some ideas for visualizing learning analytics strategy for the xAPI Camp @ Learning Solutions 2019. Judy and I each had our own take at the time on how we might frame the “space,” but with almost two years to reflect and multiple opportunities to put those ideas into practice, it’s abundantly clear that how we each labeled the dimensions mattered less than the fact that we organized and measured things similarly.

Dimensions

A simple way to think about learning analytics is that when we want to analyze learning, we look for information about:

A rather simple model of the dimensions of learning analytics:
  • The Learner,
  • The Learning Activity, and
  • The Learning Experience

… and all of these analyzed over Time as a constant means of comparison.
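One way to see how this framing maps onto xAPI data (this mapping is my sketch, not anything prescribed by the spec): each dimension lines up roughly with a piece of a statement, and Time comes along for free in the timestamp.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Observation:
    """One analyzable point in the Learner x Activity x Experience space."""
    learner: str         # who: the statement's actor
    activity: str        # what they did it with: the statement's object
    experience: str      # the surrounding experience: here, context/registration (my assumption)
    timestamp: datetime  # the constant axis of comparison

def to_observation(statement):
    """Rough mapping from an xAPI statement to the three dimensions plus Time."""
    return Observation(
        learner=statement["actor"].get("mbox", "unknown"),
        activity=statement["object"]["id"],
        experience=statement.get("context", {}).get("registration", "unknown"),
        timestamp=datetime.fromisoformat(statement["timestamp"].replace("Z", "+00:00")),
    )
```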

Dimensions tend to stay fixed, once set. The trajectories along which we measure things will likely remain the same. Much investment of time and human capital gets built on this framing, so before anyone codes anything, this model should be treated as something that can be revised, even tossed out, in favor of something better. The investment of time and effort in planning is minimal, no matter how long the process takes, compared to the cost of implementing the wrong learning analytics strategy.

Metrics

Along each of these dimensions, I’d identify metrics. Metrics help me understand the dimension of learning I’m analyzing. For example, I might break the Learner dimension down into the types of Learner, or user, anticipated by the learning experience. If I’m developing a learning solution for practicing nurses, the “Learner” likely includes the “nurse” but may also include other roles, like a Nurse Preceptor or a Nurse Educator. A dimension like “Learner” should account for every type of participant.
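Here’s a sketch of what a metric along the Learner dimension could look like in code. The role extension IRI is made up for illustration; in practice, how role gets encoded would be governed by your xAPI Profile.

```python
from collections import Counter

# Hypothetical: assume each statement carries the learner's role in a
# context extension governed by your xAPI Profile. This extension IRI
# is invented for illustration only.
ROLE_EXT = "https://example.org/profiles/nursing-ce/extensions/role"

def learners_by_role(statements):
    """Metric along the Learner dimension: how many events per role."""
    return Counter(
        s.get("context", {}).get("extensions", {}).get(ROLE_EXT, "unspecified")
        for s in statements
    )

# e.g. Counter({'nurse': 310, 'nurse-preceptor': 42, 'nurse-educator': 9})
```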

Tomorrow, I’ll start diving into the ways I might break down those metrics more thoughtfully, related to the Learning Activity vs. the Learning Experience.

Download: xAPI-Camp-Learning-Solutions-2019-Learning-Analytics-Strategy

Filed Under: Learning Analytics Tagged With: dimensions, learning analytics, metrics, strategy, xAPI Profiles

Level Up Learning Data with xAPI in 2021

January 4, 2021 By Aaron

You’re likely going to be reading up seriously on how to work with learning data sometime in 2021. This will be true regardless of whether you work in online K-12, Higher Education or professional training — anywhere in the world. If you will “own” xAPI know-how on your team (or maybe for the whole organization), I want to help you.

Megan and I have certainly been in your shoes.

So, I made a resolution for 2021: I’m going to try to blog daily again, like I did maybe 15 years ago about Flash For eLearning. It’s time for us all to level up our data game.

First, let’s start with this webinar (above), where I recently demonstrated a new tool coming in Q2 from ADL: the xAPI Profile Server we produced with our friends from Veracity. In the video, I explain what it is, what problems it’s going to help with, and how you’ll likely put it to work. Spoiler: it’s a tool to help you author and serve xAPI Profiles, doing a lot of the heavy lifting of complex rules thingies for you.

Next, I’ll blog about how to frame a problem space to apply learning analytics with dimensions.

Filed Under: Experience API, Learning Analytics Tagged With: demos, recordings, webinars, xAPI, xAPI Profile Server, xAPI Profiles
