
Data-Driven Optimization: Deciphering Key Program Metrics

By: Shannon Howard
Published: December 6, 2023

How do you measure the health of your education programs?

We’ve talked a lot about tying learning objectives to business outcomes. That’s important.

But when it comes to actually measuring and improving the learning program itself, how do you do that?

That’s what we’ll cover in this blog. We’ll walk through three types of metrics to measure, how you’d use each, and how to analyze and interpret data to uncover areas for possible improvement.

With these insights in hand, you’ll be able to focus your limited time and resources on the content updates that matter most. 

3 Types of Data You Should Be Measuring

When it comes to assessing education program health, there are three main types of metrics you should be measuring and analyzing. 

1. Engagement

Engagement metrics capture the interactions learners have with your content and within your learning platform. For example: page views, bounces, enrollments, time on page, watch time, and so on.

These metrics can give you an understanding of how easily learners are finding content, as well as what they do once they find the right piece of content.

When it comes to engagement metrics, the research shows “measuring multiple factors of engagement is critical, as single-dimension definitions are missing the whole story” (Deng et al., 2020). So be sure to look at engagement metrics in relation to one another, not in isolation.
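To make this concrete, here’s a minimal sketch of how you might compute two engagement metrics side by side from a raw event log. The table layout (session IDs, pages, seconds on page) and the pandas approach are our own illustration, not any specific platform’s export format:

```python
import pandas as pd

# Hypothetical event log: one row per page view, with the session it
# belongs to and the seconds the learner spent on that page.
events = pd.DataFrame({
    "session_id":      ["s1", "s1", "s2", "s3", "s3", "s4"],
    "page":            ["catalog", "course-a", "catalog", "catalog", "course-b", "catalog"],
    "seconds_on_page": [40, 310, 5, 25, 600, 3],
})

# A "bounce" is a single-page session: count pages per session, then
# take the share of sessions with exactly one page view.
pages_per_session = events.groupby("session_id")["page"].count()
bounce_rate = (pages_per_session == 1).mean()

# Pair the bounce rate with a second signal (average time on page),
# since no single engagement metric tells the whole story.
avg_time = events.groupby("page")["seconds_on_page"].mean()

print(f"Bounce rate: {bounce_rate:.0%}")
print(avg_time)
```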

2. Completion

The next metric to measure is completion rate: the number of people who enrolled in a course or learning activity compared to the number who completed it, abandoned it, or are still actively in progress.

Completion metrics help you understand whether the content is relevant and helpful, and whether there are sections that could be improved to encourage completions. Drop-off rates can highlight where content is broken, or why the content as a whole is underperforming.

Outside of the content itself, completion rates might also indicate a need for reminders to continue learning. Learning science research found that “[lack of completion] might indicate a need for more intervention, i.e., automatic hints, virtual tutors, pushes, notifications, etc.” (Stamper et al., 2013). (Learn more about incentives to get your customers to complete training.)
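A completion breakdown can come straight out of an enrollment export. Here’s a minimal sketch, assuming a hypothetical table with one status per enrollment (the column and status names are ours for illustration):

```python
import pandas as pd

# Hypothetical enrollment export: one row per learner per course, with
# a status of "completed", "in_progress", or "abandoned".
enrollments = pd.DataFrame({
    "learner_id": [1, 2, 3, 4, 5, 6, 7, 8],
    "status": ["completed", "completed", "abandoned", "in_progress",
               "abandoned", "completed", "abandoned", "in_progress"],
})

# Share of enrollments in each state. The completion rate alone hides
# how many learners stalled versus walked away entirely.
breakdown = enrollments["status"].value_counts(normalize=True)
print(breakdown.round(2))
```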

3. Efficacy

Learning efficacy tells you whether your content is teaching what it’s supposed to teach. If your content doesn’t enable learners to pass related assessments, your content is failing to meet the established learning objectives.

When we’re creating education content, we want to validate that the time and effort a learner puts into a course or curriculum is well spent. We want them to be able to retain and apply the knowledge.

Assessing learning program efficacy can give you great insights into:

  • The quality of your content
  • The structure/order of your content
  • How well users are absorbing and retaining information within your formalized learning pathway
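One simple way to quantify efficacy is a pass rate per learning objective. Here’s a rough sketch, assuming assessment results tagged with the objective each question maps to (the objective names and table shape are hypothetical):

```python
import pandas as pd

# Hypothetical assessment results: one row per attempt, tagged with the
# learning objective the question maps to and whether it was passed.
attempts = pd.DataFrame({
    "learner_id": [1, 1, 2, 2, 3, 3],
    "objective":  ["setup", "reporting", "setup", "reporting", "setup", "reporting"],
    "passed":     [True, False, True, False, True, True],
})

# Pass rate per learning objective: a low rate flags content that
# isn't teaching what it's supposed to teach.
pass_rate = attempts.groupby("objective")["passed"].mean()
print(pass_rate)
```

In this toy data, every learner passes “setup” but most fail “reporting,” which would point you at the reporting content rather than the program as a whole.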

How to Analyze and Interpret Data About Program Health

Now that you’re familiar with the metrics to measure, let’s look at how we might actually analyze those metrics to mine insights about how our program is performing—and to identify areas for improvement.

Let’s look at a fictitious example:

Acme Company wants to build “Y Academy” to help freelancers and agencies learn how to use the product and build the industry expertise to train others on the platform.

The academy is focused on certification as an education initiative. Their target audience is customers who want to learn more about the platform and partners who can charge for consulting on the platform.

This is what one of their dashboards shows:

Let’s start with the first chart, on the bottom left, where we can see bounces and views. What do you notice? The bounce rate is pretty high. 

Now we can ask: Why are people bouncing (leaving the page quickly after viewing it)? A high bounce rate may tell us that the program isn’t adequately named or described. When learners come to the page, they don’t quite see what they expect, and they bounce as a result. 

Acme Company might consider looking at how they name and position the certification, or they might look at the learner experience after “discovering” the certification. Is it clear what the certification is and how to get started?

Now, let’s look at the second chart on completions. The completion rate is OK, but we see a pretty high abandonment rate.

(Wondering about the difference between “in progress” and “abandonment”? It’s natural for learning to take some time; interruptions at work and at home can keep us from finishing learning content in one sitting. But if content sits untouched for an extended period, we treat it as “abandoned.” Otherwise, the learner is considered “in progress.”)
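In practice, that distinction is just an inactivity cutoff. Here’s a minimal sketch, using a hypothetical 30-day threshold; the right window depends on your program, so this number is ours, not a standard:

```python
import pandas as pd

# Hypothetical rule: an incomplete enrollment counts as "abandoned"
# after 30 days of inactivity. The threshold is illustrative only.
ABANDON_AFTER_DAYS = 30

learners = pd.DataFrame({
    "learner_id": [1, 2, 3],
    "completed": [True, False, False],
    "last_activity": pd.to_datetime(["2023-11-20", "2023-12-01", "2023-09-15"]),
})

today = pd.Timestamp("2023-12-06")
inactive_days = (today - learners["last_activity"]).dt.days

# Completed learners stay "completed"; incomplete ones are split into
# "in_progress" or "abandoned" based on the inactivity window.
learners["state"] = "completed"
stalled = ~learners["completed"]
learners.loc[stalled & (inactive_days <= ABANDON_AFTER_DAYS), "state"] = "in_progress"
learners.loc[stalled & (inactive_days > ABANDON_AFTER_DAYS), "state"] = "abandoned"
print(learners[["learner_id", "state"]])
```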

Abandonment can be the result of a number of things:

  • The learner didn’t need all of the content to get “unstuck.”
  • The learner didn’t enjoy or value the content.
  • The learner forgot about the content or was distracted by other activities.

With so many options, how do you figure out which it is?

One thing we can do is look at the pass/fail rate: 

With the pass rate so high, it’s unlikely the problem is that the content isn’t valuable or useful.

But we can also look at drop-off rate throughout the certification as another data point:

If most learners are dropping off at the same module, that might indicate that the module isn’t relevant or engaging—a prime candidate for improvement.
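A drop-off table is easy to build from per-module start counts. Here’s a small sketch with made-up funnel numbers:

```python
import pandas as pd

# Hypothetical funnel: how many learners started each module, in order.
funnel = pd.DataFrame({
    "module": ["1. Intro", "2. Setup", "3. Reporting", "4. Exam"],
    "started": [500, 430, 180, 165],
})

# Drop-off between consecutive modules; a sharp spike points at the
# transition just before it as the likely problem.
funnel["drop_off_rate"] = 1 - funnel["started"] / funnel["started"].shift(1)
print(funnel)
```

In this example, drop-off spikes between modules 2 and 3 (about 58%), so the end of “Setup” or the start of “Reporting” would be the first place to investigate.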

Additionally, if you don’t have any nudges in place, such as email notifications or in-app reminders, it may be worth experimenting to see if those move in-progress learners to completion, and re-engage learners who abandoned the course.

See how each of these numbers is tied to one another—and how looking at them in relation to each other can help us understand the bigger picture of what’s working and what’s not?

Want to see someone walk through this in real time? Check out this recording of former Intellum Product Manager Radhika Solanki walking through three example dashboards to extract insights:

A Final Note on Cohort Analysis

In our example dashboard above (for Acme Company), you’ll notice that the certification serves multiple audiences: Customer Group A, Customer Group B, and Partner Group 1. 

In addition to looking at engagement, completion, and efficacy metrics as a whole for the certification, we can segment data by audience. How is Customer Group A interacting with the certification? What do engagement, completion, and pass/fail rates look like for them? This can help us understand where the certification may be meeting the needs of some audiences but not others—so we can define a plan of action to improve the content or learning experience for that group.
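Once each learner is tagged with their audience, segmenting is usually a one-line groupby. A minimal sketch using the group names from our example (the per-learner metrics rollup is hypothetical):

```python
import pandas as pd

# Hypothetical per-learner rollup, tagged with the audience segment.
learners = pd.DataFrame({
    "segment": ["Customer Group A", "Customer Group A", "Customer Group B",
                "Customer Group B", "Partner Group 1", "Partner Group 1"],
    "completed":   [True, False, True, True, False, False],
    "passed_exam": [True, False, True, True, False, False],
})

# The same metrics, segmented by audience: a gap between groups shows
# where the certification serves one cohort better than another.
by_segment = learners.groupby("segment")[["completed", "passed_exam"]].mean()
print(by_segment.round(2))
```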

About the Author

Shannon Howard
Director of Content & Customer Marketing
Shannon Howard is an experienced Customer Marketer who’s had the unique experience of building an LMS, implementing and managing learning management platforms, creating curriculum and education strategy, and marketing customer education. She loves to share Customer Education best practices from this blended perspective.