
MakingBetter

Strategy. Ecosystems. Meaning.

Open Source

2020 was all About xAPI Profiles (for us)

January 19, 2021 By Aaron

Photo by Lisa Fotios on Pexels.com

A few days ago, Megan and I passed a milestone: seven years of MakingBetter. For us, 2020 was all about xAPI Profiles. It’s hard for me to grok how much we got done in 2020, considering the trash fire of a year it was.

Like many of you, Megan and I had to seriously change how we managed our work/life balance, household, and workflow(s). In this post, I’ll highlight what we shipped last year through MakingBetter.

The ADL xAPI Profile Server

In 2018, Megan and I drafted requirements for an xAPI Profile Server. Two years later, in 2020, MakingBetter built it with ADL.

We teamed with our besties at Veracity, continuing a learning-standards team-up between Jono Poltrack and myself that now spans almost two decades, and keeping the running t-shirt jokes going between Megan and Tom Creighton. I digress.

To summarize, the ADL xAPI Profile Server is a revolutionary approach to governing data for xAPI-supporting applications. It’s an authoring tool for xAPI Profiles. It’s an RDF server, so it processes all the semantic connections on the back end, generating JSON-LD concepts while you author. It enables authoring tools and independent development environments to be dynamic and driven from the same authoritative document. Updating an xAPI Profile hosted on the server makes changes available to anything subscribing to that profile. Imagine managing taxonomies and ontologies for learning data that don’t necessarily require re-authoring.
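To make the “generating JSON-LD concepts” idea concrete, here’s a minimal sketch of what a single Profile concept (a verb) might look like when serialized. The IRIs are hypothetical placeholders, and the exact shape is defined by the xAPI Profiles specification, not by this example:

```python
import json

# A minimal xAPI Profile concept (a Verb), sketched as a Python dict.
# The example.org IRIs below are hypothetical; real concepts use IRIs
# the profile author controls, and the canonical document shape comes
# from the xAPI Profiles specification.
concept = {
    "@context": "https://w3id.org/xapi/profiles/context",
    "id": "https://example.org/profiles/demo/verbs/completed",
    "type": "Verb",
    "inScheme": "https://example.org/profiles/demo/v1.0",
    "prefLabel": {"en": "completed"},
    "definition": {"en": "Indicates the actor finished the activity."},
}

# Anything subscribing to the profile would consume JSON like this.
print(json.dumps(concept, indent=2))
```

Because each concept carries its own IRI and scheme, a server can republish an updated version and subscribers pick up the change without re-authoring their own documents.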

Megan provided the vision and technical requirements that drove this effort, and led the technical development. Crystal Kubitzky did cracking (been watching a lot of Great British Bake-off) UX research and design, making full use of the USDS design system. Eva Rendle took on the unenviable task of managing the herculean effort to design, develop and deliver this product in 13 months’ time.

As a result, it represents the best of what can happen when everyone on a project is oriented to the same goal. Collective focus, talent, and hope drove this project in 2020. The love our team put into that effort shines through when you use it.

It launches in Q2 2021. Watch me introduce and demo some of the features of the ADL xAPI Profile Server here.

xAPI Profile Authoring

MakingBetter worked with Veracity to produce a significant xAPI Profile. For the long game on xAPI, we produced a profile with ADL to support the prototype development of the Total Learning Architecture (TLA). The profile expresses the TLA’s Master Object Model (MOM). In the coming weeks, expect posts surfacing lessons learned about profile authoring and thoughts on enterprise learning-analytics strategy informed by this work.

For now, if you’re interested in reading more on the MOM and the TLA, read here.

More?

In the next post, I’ll get into what all I’m doing with the different hats I get to wear. With that bit of needed meta established, I can then get into blogging about the really real nitty-gritty of working with xAPI.

Filed Under: Open Source Tagged With: master object model, xAPI Profile Server, xAPI Profiles

Filling in xAPI’s Most Important Gaps

September 11, 2017 By MakingBetter

Cat on a Box

2017 is a year that is really testing our capacities and limits. Politics, natural disasters, cybersecurity threats, budgets… even ADL has hit some road bumps. We formed the Data Interoperability Standards Consortium (DISC) a few years ago specifically to hedge against disruptions to ADL’s capacity for spec stewardship, so that the people who rely most on xAPI wouldn’t be held back the way the learning industry was with SCORM 2004 (after ADL stopped innovating on it… in 2004). Anyone who’s worked in or with a government for more than a few years knows to expect these hiccups, which stall progress for an indeterminate amount of time.

Industry and momentum, let alone organizational strategies, shouldn’t be subject to the whim of such hiccups — especially when the industry could address these on its own.

Working hand-in-hand with ADL over the last year, DISC and the xAPI Community hit some pretty major milestones that really stabilized xAPI, in terms of implementation, to enable more adoption, faster. Also with ADL, we identified a series of priorities to work on together over the next several years. ADL’s funding situation means those plans would have to be put on hold until we could work contractually with ADL — realistically, almost a full year of waiting to get started. As an advocate for its progress, I’d rather we work on these things in a different way, without relying on ADL, so that when ADL’s budget capacity is restored, they can advance things much further afield. As an industry that depends on xAPI, it’s in the xAPI Community’s best interests to keep things going.

Megan and I are sharing the priorities for open source development and documentation we worked out with ADL with the hope that the xAPI Community will seize this moment for the opportunity it presents: to chart its own path to growth and ubiquity.

  1. Stand up an xAPI Profile Server. With the xAPI Profiles specification released in June, the xAPI Community has finally addressed a huge gap in achieving semantic interoperability. However, without even a reference implementation of an xAPI Profile Server, the spec will never see its potential to scale xAPI across industries, and we will continue to have challenges of semantic interoperability across implementations. The development effort is minimal, as a lot of an xAPI Profile Server can leverage existing JSON-LD tools.
  2. Stand up a service to test for valid, well-structured xAPI Profiles. Right now the only way to create xAPI Profiles in JSON-LD format is to do so by hand, which is a laborious process prone to manual errors. In lieu of authoring tools, it’d be helpful to have a service that validated an xAPI Profile and highlighted where the errors are and what they are, so that the people producing xAPI Profiles could do so easily, and so those xAPI Profiles could be used by others with confidence. This likely requires a bit more development than the xAPI Profile Server requires.
  3. Stand up a tool to help people publish xAPI Profiles in proper JSON-LD format. What would be even better than producing xAPI Profiles by hand would be a web-based application with an interface that made it much easier for people to create valid xAPI Profiles, so that subject matter experts (or, more likely, data people working with subject matter experts) could generate profiles without needing to encode JSON-LD themselves. Not only does this require some development savvy — it would benefit from having a product manager and interaction designer so that this could be both usable and useful to many, thus encouraging more organizations to generate and share xAPI Profiles.
  4. Develop conformance requirements and a conformance test for xAPI Profiles and the xAPI Profile Server. Much like DISC facilitated with the xAPI Community in 2016, it’s likely that many LRS vendors (and probably authoring tool vendors as well) will incorporate an xAPI Profile Server and want to validate the data collected in their LRSs against valid xAPI Profiles. It would be wise to get ahead of inevitable interoperability challenges and develop (first) conformance requirements and (later) conformance tests that could ensure consistency in what we deem to be “conformant.” Related — if it’s a common goal by LRS vendors and stakeholders that LRSs will validate statements against profiles, that should be made explicit in the xAPI specification and additional LRS Conformance requirements must be developed, as well as additional unit tests in the LRS Conformance Test Suite.
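The validation service in item 2 could start with simple structural checks before any full JSON-LD processing. A hypothetical sketch — the required-key list here is illustrative, not the spec’s authoritative list:

```python
# Hypothetical structural check for a draft xAPI Profile document.
# The xAPI Profiles spec defines the authoritative required properties;
# this sketch only illustrates the "highlight what's missing" idea.
REQUIRED_PROFILE_KEYS = {"@context", "id", "type", "conformsTo", "prefLabel", "definition"}

def find_missing_keys(profile: dict) -> set:
    """Return required top-level keys absent from the profile document."""
    return REQUIRED_PROFILE_KEYS - profile.keys()

draft = {
    "@context": "https://w3id.org/xapi/profiles/context",
    "id": "https://example.org/profiles/demo",
    "type": "Profile",
}

# An author sees exactly which fields still need filling in.
print(sorted(find_missing_keys(draft)))  # → ['conformsTo', 'definition', 'prefLabel']
```

A real service would go further — resolving the JSON-LD context and checking each concept, template, and pattern — but even this level of feedback beats hand-debugging raw JSON.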

That’s a solid list, and the community could rally itself to take this on and commit to seeing it done in the next twelve months. The organizations willing to commit resources to develop these efforts to a defined set of outcomes should be embraced by the community.

The things listed above are needed now. The market can probably tolerate a year without these impeding xAPI adoption, so long as they become available and are implemented as organizations budget for FY19. There’s more for us to do in the coming twelve months, but in the general category of making things, these are the things with definitive outcomes that would help everyone.

So… who’s going to take on what in filling in these pretty important gaps?

Tim?

Filed Under: Experience API, Open Source

What is the plan for DISC, xAPI, and ADL?

August 3, 2016 By Aaron

Towards the end of last year, our big announcement was the formation of the not-for-profit consortium created to handle the governance and address the evolution of xAPI with/for ADL. xAPI, being open source and licensed as Apache 2.0, doesn’t require ADL’s permission for us to do this. That’s a feature, not a bug — we licensed xAPI this way so that it couldn’t be trapped in the US DoD the way SCORM was. All the same, ADL’s done an amazing job stewarding xAPI from its R&D roots to now, and to whatever degree we can work in concert with ADL (and the US DoD, by extension), it’s to our collective benefit to do so. Which is why we haven’t done a whole lot in terms of visible activity since the beginning of the year: we’ve worked out the details of what we’re going to take on with ADL so the connections between DISC and ADL can be very explicit, very concrete, and very visible. ADL’s announcement about our collaboration can be found here. Here’s a look at what we have planned over the next four years.

DISC’s Year-One Priority Outcomes

To ensure the interoperability of software and hardware that purports to be xAPI-conformant, DISC is revitalizing the xAPI Conformance working group and conducting a research effort to facilitate definition of the requirements for xAPI conformance. Concurrently, DISC will work with a stakeholder group to define requirements for software and hardware certification of xAPI conformance, and to make recommendations to ADL for a program that would confer certification on software and hardware with Learning Record Store (LRS) functionality.

DISC will develop and deploy a publicly available index of such certified LRS products for use by stakeholders involved in acquisitions.

The thing is, focusing on LRSs is just the start. To make implementations of xAPI interoperable, we can’t place the responsibility solely on LRSs. We also can’t focus solely on content, for a lot of reasons. One lesson from the work done to test conformance and certify content leveraging the Sharable Content Object Reference Model (SCORM) was that certifying work products, such as content, offered diminished return on the investment in capacity needed to efficiently certify them. After about two years of research and requirements gathering, we’ve determined that a program to certify professionals who create learning experiences leveraging xAPI, with practice-based evaluation of their work products, is an idea worth championing. So DISC will work with a stakeholder group to define requirements for such a professional Learning Record Provider certification program. We’ll explore the requirements for the contents of a body of knowledge, the competencies needed professionally, and the evaluation criteria. This will inform the requirements for a program that confers certification on professionals who demonstrate competence in creating learning record providers that work interoperably with xAPI-conformant LRSs.

And, just to make sure we’re all on the same page — this is all within year one, going into June(ish) 2017. Aggressive goals? Megan and I would say we’re assertive, but no less so than any other goal we’ve put out there (and met). We predict there are going to be some serious reasons why we need to get LRS certification ready for early 2017, in terms of catalysts that will drive market adoption of xAPI. The time is now, friends.

DISC’s Year 2-4 Activities

In our first year, DISC will establish a baseline for data interoperability of the xAPI specification itself. To strengthen that for the long-term, DISC is going to vet the impact of supporting JSON-LD on existing implementations of xAPI, and, should stakeholders choose to support JSON-LD, DISC will take on responsibilities to update any hardware, software and/or professional certification materials, processes, procedures and evaluation criteria to support that evolution. That seems really geeky, but it’s important that before we go bigger on professional certifications and larger services, we figure out what, if anything, we’re to do with JSON-LD as an industry. Three years ago, it wasn’t big enough for us to rally support to address it (argue about that all you want — it wasn’t). Now it clearly is, so we’re going to address it.

To strengthen data interoperability and to reinforce appropriate cybersecurity and information-assurance practices in applications of xAPI within the US DoD and other verticals, DISC intends to identify a profile of xAPI as an exemplar for the US DoD and FedRAMP, such that it serves as a model for other xAPI stakeholder groups to use in extending the xAPI specification to work with the various laws and policies with which xAPI may interact. In other words: FedRAMP describes what the US government needs in terms of security. Other nations — other organizations — may have more or different needs. So, by doing this, DISC would produce best-practice recommendations, with the intent of providing advice on data-privacy considerations around xAPI and on how personal data ownership may be defined to inform cybersecurity and information-assurance goals from situation-appropriate operational, business, and technical perspectives.

Help?

These are just the highlights. Megan and I had some incredible help organizing our scope and putting the mandate into actionable outcomes thanks to our friend and Project Czar extraordinaire, Wendy Wickham, as well as our Board of Directors.

Referencing our previous post, DISC activity is just the first topic we had to update you on. Next, we’ll talk about what we’ve been seeing drive xAPI’s adoption recently.

 

Filed Under: DISC, Open Source, Standards

What You Missed at TRYxAPI – ATD ICE

May 18, 2015 By Aaron

Riptide Software hosted a fantastic event on Saturday for the launch of TRYxAPI.com. The afternoon featured John Delano, Nick Washburn, and myself leading a critically needed dialogue about the Experience API: what matters to executive leadership, and to a field that is still largely making sense of major changes happening in learning & development.

Our discussions featured Trish Uhl, Russell Duhon, Duncan Welder, Phil and Nick Stephenson, Lizelle van den Berg, Ian Gibson, Damon Regan of ADL with participation by many more.

John kicked off the afternoon talking about lessons learned consulting up to senior leaders in manufacturing and tech. In his presentation, John added richer insight into the executive mindset.

“I know training is important. I just don’t know how valuable. So I’ll spend the minimum on it.”

John said this translates down to L&D as “There’s no budget and there’s no permission to do anything different.” How John proposed we counter this mindset is by looking for “performance opportunities” — looking for common learning models and having a plan for how to discuss and execute on different types of opportunities for L&D to make a business impact — starting with needs for information dissemination and skills development.

Common learning models #TryxAPI pic.twitter.com/borf90Je6n

— Aaron E. Silvers (@aaronesilvers) May 16, 2015

After much discussion over different learning models and ways to map performance in terms of outcomes, behaviors, systems, content and competency, we all participated in smaller discussions as we practiced defining the value propositions of each others’ projects. This was in service to leveraging Saltbox’s Learning Model Canvas.

After a break where we had an opportunity to use our drink tickets (thank you, Nick ;), I re-introduced the Experience API for not-so-technical practitioners and consultants who need to understand what it is, who it benefits and what challenges it addresses.

The real highlight of the day, though, was the new site put together by Nick and his team at Riptide Software: TRYxAPI.com. What Nick and the team have put together is a highly usable and useful way to understand not only what xAPI does for businesses — it also identifies the open-source tools freely available for people to literally try xAPI in their own organizations, with a blueprint for how to replicate those case studies. This customer-centric approach is evident in the case-study examples shared to model how other vendors and organizations can share their tools and their case studies.

Two of Riptide’s examples resonated really strongly. The discussion of what Riptide did for Gate Retail Onboard was a clear example of how a small xAPI prototype project demonstrated huge ROI: it linked improving the digital availability of performance support, and evaluating its use, to sales numbers — and then encouraged full-on adoption as revenues dramatically increased for the company. The second example was the work Riptide has done in concert with the US Army, improving its sharpshooting training while also improving its sharpshooting training facility.

Heatmap analysis of target training for realtime feedback to soldiers using xapi @RiptideLearning @RISC_Inc #tryxapi pic.twitter.com/eUxbSUgRqi

— W. Duncan Welder IV (@DuncanWIV) May 16, 2015

These case studies inspired a wealth of conversation not just around the shift for L&D with the opportunity to support major business impacts with explicit goals, but the nature of using xAPI to evaluate the very systems themselves, like what Sean Putman started with Altair Software, where the same information being evaluated to help folks improve their performance is being used to also improve the software they’re learning about.

Nick teased that another TRYxAPI event may happen in time for December’s I/ITSEC conference in Orlando. I’ll be there. I can’t wait.

 

Filed Under: Community, Experience API, Open Source, Uncategorized

xAPI Communities of Practice

January 8, 2015 By Aaron

Advanced Distributed Learning Communities of Practice

Last year, ADL began organizing Communities of Practice around the design considerations and implementation of xAPI in different contexts. There are many ways xAPI can be applied — how one designs and develops best really depends on the use case.

Yesterday, Ben Betts announced his leadership of a Badges Community of Practice, and I was both surprised and elated to see all the different groups coordinating with ADL.

The outcome of this work likely achieves goals similar to what Rustici Software demonstrates with “Recipes” — coming up with more common ways of implementing things that would be really challenging to put into the xAPI specification itself, but are the kinds of things we, as an industry, want everyone to be doing similarly.

This work is important as it informs xAPI’s current refinements as a specification. Its value is multiplied as the different ways in which these Communities of Practice use xAPI become explicit. The overlaps in vocabulary inform what we should maintain or reinforce going forward. The things that distinguish one practice from another will inform standardization and stewardship, pointing to areas where education around xAPI or evolution of the technology requires development.

The organizers of each of these communities of practice are great people who will appreciate use case scenarios, case studies and ideas. Since they’re volunteering their time, I speak from experience in saying they would also appreciate any help one can lend even with meeting minutes, reviewing community documents and staying organized. 🙂

Click here to learn about ADL’s Communities of Practice for xAPI

 

Filed Under: Community, Experience API, Open Source, Uncategorized

IEEE "State of the xAPI" Survey Results Released

June 2, 2014 By Aaron

With over 40 different vendors and 80 different commercial and open-source tools, I’m happy to release the results of the first State of the xAPI: Tools Survey on behalf of the IEEE xAPI Study Group.

This data has been collected from vendors voluntarily sharing information beyond simply stating they’ve adopted the spec. In this comma-separated-value (CSV) file, you’ll see which tools support the Experience API at Version 1.0 and/or Version 1.0.1 (over half of the tools support it), which should provide some insight about which tools on the list might work well with others. The survey results in this form are licensed CC0, a Creative Commons public-domain dedication. As long as you cite it, you’re encouraged to use this information.
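Since the results ship as CSV, slicing them takes only a few lines. A sketch with made-up column names and rows — the released file’s actual headers may differ:

```python
import csv
import io

# Made-up sample standing in for the released survey CSV; the real
# file's column names and rows may differ.
sample = """tool,vendor,supports_1_0,supports_1_0_1
Tool A,Vendor X,yes,yes
Tool B,Vendor Y,yes,no
Tool C,Vendor Z,no,yes
"""

rows = list(csv.DictReader(io.StringIO(sample)))

# Filter for tools reporting support for the current 1.0.1 release.
current = [r["tool"] for r in rows if r["supports_1_0_1"] == "yes"]
print(current)  # → ['Tool A', 'Tool C']
```

Swap `io.StringIO(sample)` for `open("state-of-xapi-tools.csv")` (a hypothetical filename) to run the same filter on the downloaded results.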

The survey ran for the last two months — from April through May, 2014.

In the next month, I’ll prepare an initial draft reporting on the results of the survey. I will do this in GitHub as a Markdown document to encourage others to help author (and be attributed for their work). This report will be licensed CC-BY, also meant for the public to freely cite as needed.

So what are you waiting for? Get the State of the xAPI: Tools Survey results today!

Filed Under: Experience API, Open Source, Standards, Uncategorized

How Open Source Can Reframe a Market

February 19, 2014 By Aaron


Update: This is cross-posted to the Learning Locker blog, too!

I don’t know how many open source project leaders think about the overall design of their project — including the community and the market in which the software or the *thing* they’re building will exist — but it’s something Megan & I worked through a lot during the rollout of the Experience API, and it’s something we and everyone involved with Learning Locker are consciously planning for.

Business Model Canvas Template
We often use tools, like a Business Model Canvas, to sketch out how value flows among all the people involved around a project.

To design a successful open source project, one must also design the market for it. I guess one could probably say the same about most efforts, though it’s proven especially true for me: I had several experiments before xAPI that failed. Those projects didn’t give much thought to business models, let alone how people might derive value (let alone generate value) through their participation.

Open Source is about altruism and the best in people coming together to create something that helps themselves (and others)… but to sustain such altruism, there has to be something that individuals find valuable (not necessarily money — but the delivery of something people can identify and want). The thing people come together to do has to also generate some kind of value.

It’s in this light I want to talk a little about Learning Locker.

History Lesson

Not to go all standards wonkish on you, but let’s take a very light look at the last fifteen years of history in the eLearning industry and look at what’s happened that provides a model for how Learning Locker meets a market need today.

In 2002, SCORM® Version 1.2 came out, and it remains the most widely adopted version of the spec, even though arguably millions of dollars and years of effort went into its successor, SCORM 2004. There are lots of reasons why SCORM 2004 never quite took off, and I won’t go into all of them in this blog post (lest I get mocked for writing so many words), but chief among the reasons SCORM 1.2 became so widely adopted, I would argue, was the presence of Moodle and what its availability on the market did for SCORM 1.2 adoption.

You see, kids, up until Moodle there really was no widely adopted free and open-source route to running eLearning content that used SCORM. To an outsider to our field, that sentence reads as something really niche, but the open source community around Moodle that worked to incorporate the run-time for SCORM 1.2 content did more to advance eLearning adoption than all the proprietary vendors of learning management systems could have done.

That’s because, for people around the world, Moodle — being open source and a fairly easy-to-use set of software — meant instant infrastructure, especially in academic, not-for-profit, non-government and government institutions, let alone workforce development concerns. That Moodle had SCORM 1.2 running meant people could develop content that worked with the spec… and as their needs and budgets grew, they could afford to upgrade their infrastructure beyond Moodle when they needed to.

In short, Moodle created more customers around the world for eLearning — content, authoring tools and systems.

Back to the Future

The stage is set again with Experience API. Right now the market is consolidated around a few main proprietary vendors of learning record store functionality, incorporated into multiple proprietary products and a handful of independent systems with their own learning record store functionality. The specification is moving pretty rapidly toward international standardization. But if you want something free and open source to use that’s enterprise ready? Well, while there are a few open source efforts popping up, the one that attracted our attention (and many others) is Learning Locker, and it aims to fill a market demand for Experience API that’s been unmet so far by the industry.

It will likely grow adoption of the spec itself… and grow the market for purchasing proprietary software as a result.

We don’t even know what kinds of proprietary tools this emerging market will need in the future, as what xAPI does is different from the needs SCORM meets. I do know this: while the appetite for open source solutions grows, eventually many of the adopters of such solutions will be able to strengthen their organizational capacity. They will develop the kind of resources that make more infrastructure investment worthwhile and that grows a new market for new proprietary solutions that cater to some interesting new needs.

The investment in Learning Locker is one of sweat equity. I’m most excited about the ability to catalyze a market in weaning off of technologies and products that aren’t really helping. Designing for value in open source is the kind of challenge I love: it focuses my attention on removing the barriers to adopting fresh and better-designed tools from a more diverse community. That it’s open source means we can watch contributors emerge as the next market leaders for learning technology, clearly helping others as well as themselves.

That’s a positive way to challenge the status quo and make lots of things better: create value. It’s not just for the business types — it’s for anyone who really has the chutzpah to make measurable change. Watch us do just that, and even better, join us.

Filed Under: Community, Experience API, Open Source, Standards, Uncategorized

Copyright 2017 MakingBetter