Blog Post

Learning Efficacy Explained: How Effective Is Your Workplace Education Program?

By:
Robyn Hazelton
Published:
April 21, 2023

Maybe you have an internal education team to create content for your learning destination, or maybe you hired freelance instructional designers. Maybe you have formalized learning pathways like certifications, or maybe you have a collection of informal resources like reference materials, blog posts, how-to videos, and community forums. No matter which way you slice it, those are educational resources. 

And building those education resources costs money. 

When you’re investing in anything as a business—whether it’s hiring additional staff or creating a certification from scratch—you need to know your investment is paying off. For education initiatives in particular, program owners tend to focus on learning outcomes as a measure of success without considering the business outcomes the program was built to achieve. But to prove ROI, you have to think about both. 

Learning outcomes are both easy to think about and easy to measure. The education space (particularly the customer education space) is flooded with metrics to help you track your content’s success. But to understand the connection between learning outcomes and business outcomes, we need to answer two fundamental questions: 

  1. What does it mean when we say a program is effective? 
  2. What’s the difference between efficacy and effectiveness?


Efficacy in Customer Education

When you think of measuring successful learning outcomes, you’re probably thinking of efficacy. Oxford Languages defines efficacy as “the ability to produce a desired or intended result.”

Efficacy is a fancy academic term, but it’s a useful one. Efficacy tells you whether your content is teaching what it’s supposed to teach. If your curriculum doesn’t enable students to pass the related assessment, you have a low-efficacy program. Perhaps you’re testing too early on information that’s taught later in the course, or your content itself is unclear or confusing. Assessing your learning program’s efficacy can give you great insights into:

  • The quality of your content
  • The structure/order of your content
  • How well users are absorbing and retaining information within your formalized learning pathway

If most learners can pass the quiz or assessment only after following the curriculum, then you have a high-efficacy program. If learners can pass an assessment without ever consuming your content, you have a low-efficacy assessment that tells you nothing about the quality of your content. If almost everyone is getting a particular question wrong, then you know either the question is written poorly, or your content isn’t properly teaching the material relevant to that question.

Luckily, efficacy issues are often easily identified through data visualization that’s attached to assessments. But measuring efficacy is only one piece of the effectiveness puzzle.

Learning Efficacy vs. Effectiveness

If efficacy tells you whether your content is teaching what it’s supposed to teach, effectiveness shows you how well that learning is working in the real world.

If I get diagnosed with high blood pressure, the doctor will give me some dietary advice and prescribe some medicine. Her treatment plan is focused on efficacy: will it lower my blood pressure? Sure it will … if I follow her advice.

The effectiveness of my doctor’s treatment plan hinges on multiple other factors: Did I follow her advice? Did I pick up the prescription? Did I take the medicine when I was supposed to? Did I change my diet—or did I keep eating hamburgers every day? 

The success of a workplace education program hinges on how well the curriculum explains a topic and how successful the program is at getting people to finish it. In a recent study, we found the No. 1 challenge education leaders face is learners abandoning content midway through. Both the medicine you didn’t take and the course an employee never finished are perfectly ineffective.

Learning Effectiveness From the Inside Out

I’ve heard the same story a thousand times: “We built an incredible learning destination, but no one’s signing up!” 

Just because you built it doesn’t mean they’ll come. Your groundbreaking and engaging platform (learning destination) isn’t effective if no one shows up. On the flip side, if they show up and have a poor experience due to subpar content or a poor learning environment, they won’t finish. 

Luckily, we can easily measure KPIs to diagnose issues that need correcting. Any education program owner worth their salt should have a solid understanding of the basic internal metrics that can be used to assess the health and effectiveness of a workplace learning program: 

Acquisition: How many people are starting the program? 

Engagement: How much content are they consuming inside the program? Which pieces are they going to most often? 

Completion: How many people are completing the program? What’s your drop-off rate? 

Efficacy: Can learners demonstrate proficiency? What are the assessment pass rates? 

Ratings: Even if learners passed the assessment, did they hate the curriculum? Did they personally feel like it helped them achieve their goals? 

For example, if you find that a large number of ideal students begin the program but fail to complete it, you can conclude that your acquisition efforts are working but your engagement is low. From there, you’ll need to diagnose what exactly is driving low engagement. Is it irrelevant content? Is it poor learning destination UX?
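To make the diagnosis concrete, the five pillars can be read as a funnel, with each stage’s conversion rate pointing at the weakest link. A minimal sketch, using entirely hypothetical learner counts (these stage names and numbers are illustrative, not drawn from any real platform or API):

```python
# Hypothetical learner counts at each stage of a program funnel.
funnel = {
    "acquired": 1000,   # started the program
    "engaged": 720,     # consumed at least one content piece
    "completed": 310,   # finished the program
    "passed": 270,      # passed the final assessment
}

# Walk adjacent stages and report conversion and drop-off rates.
stages = list(funnel)
for prev, curr in zip(stages, stages[1:]):
    rate = funnel[curr] / funnel[prev]
    print(f"{prev} -> {curr}: {rate:.0%} conversion, {1 - rate:.0%} drop-off")
```

In this made-up example, the engaged-to-completed step loses the most learners, which would point you toward content relevance or learning-experience problems rather than acquisition.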

Measuring Internal Metrics With Intellum

These five pillars can be easily accessed from within the Insights Module of the Intellum platform, and they provide an easy, at-a-glance way to assess the effectiveness of your learning program. Insights offers a strategic view into how your broader initiative and your individual content pieces are performing. 


The dashboards within Insights are full of valuable information like: 

  • Which content pieces are working the way they should
  • Which content pieces are creating bottlenecks (and could be scrapped)

And when viewed as a funnel, you can understand more about where you’re losing people or what the weakest link is in the chain.

These metrics are indispensable for understanding and improving your learning platform, but they still don’t fully encompass effectiveness. The real test of an effective program or initiative comes from OUTSIDE the platform. 

Learning Effectiveness From the Outside In

If I followed the doctor’s guidelines to a T, started eating healthy, and took all my medicine, but I didn’t get any better, then the program still wasn’t effective. The metric we were looking to hit was “lower blood pressure,” not “number of hamburgers eaten” or “number of pills swallowed.”  

To understand the effectiveness of your program, there is one simple question you have to ask yourself:

What did we build this course to do? 

And from there, what are the external metrics that will tell us if we’re achieving that goal? 

We don’t just want to ask if a learner base went from intermediate to advanced or failing to passing. We must also ask: What metrics should that knowledge acquisition drive? 

  • Did learners start implementing the content in the way you’d hoped? 
  • Did that implementation drive revenue or reduce support tickets or increase employee satisfaction or reduce time to onboard for new employees?  

Don’t start with the content you think people need and build up. Start from the big-picture question above, and build your program from the top down. If you can identify your ideal end state and work backwards, designing your program and defining success suddenly get a whole lot easier.  

If my initiative was to reduce customer support tickets by creating a robust customer education pathway, I should be asking: Did customers who completed the program submit significantly fewer tickets than those who didn’t complete the program? 
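A back-of-the-envelope version of that comparison just contrasts average ticket volume between the two cohorts. The numbers below are invented for illustration; in practice you’d pull real ticket counts, run a proper significance test, and account for selection bias (customers who finish a course may differ from those who don’t in other ways):

```python
from statistics import mean

# Hypothetical monthly support-ticket counts per customer.
completers = [1, 0, 2, 1, 0, 1, 2, 0]      # finished the education pathway
non_completers = [4, 3, 5, 2, 6, 3, 4, 5]  # never finished it

# Average tickets per cohort, and the gap between them.
diff = mean(non_completers) - mean(completers)
print(f"Completers average {mean(completers):.2f} tickets/month; "
      f"non-completers average {mean(non_completers):.2f} "
      f"(a gap of {diff:.2f} tickets per customer).")
```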

To measure a program’s effectiveness, we have to tie content consumption and learner engagement to the data outside the program. 

“What matters most to us is whether learners implement the things they’ve learned, integrating them into their practice, and ultimately impacting the core business metrics we’re striving for.” - Jaclyn Anku, Head of Education & Community at Gusto

Here are some great examples of external measures of effectiveness that can help you understand if your education platform is doing what you built it to do:

  • Onboarding: Increase employee retention, increase job satisfaction
  • Skills and enablement: Decrease time to skill acquisition, get employees better at their jobs faster
  • Customer support: Fewer support tickets, fewer CSM calls
  • Partner education: Increased product adoption, increased client adds 

Through this lens, we can finally understand if a program is truly effective.    

Connecting the Dots on Learning Effectiveness

Measuring internal effectiveness through metrics like efficacy, engagement, and completion is critical to building, designing, growing, and perfecting your education initiative. But an effective course is one that has the desired business impact. It’s one that does what you built it to do.  And if you’re doing it right, you shouldn’t be building anything until you understand the external metrics your learning destination is going to impact. 

If you’ve already built a course and are looking to revamp it, follow the same process. Some questions you might ask yourself include:

  • How do our existing course materials help learners achieve goals that align with our external goals? 
  • Which course materials have we created that don’t contribute to those goals at all? 
  • Are these modules driving sales among those learners who have completed them? Would adding additional modules help us sell more products?
  • Are learners achieving their desired outcomes by using our product? 

With Intellum, you can connect your internal Insights Module and learner data to any external metrics you may track through dozens of different CRM, business intelligence, and workforce management platforms.   

“Using the Insights feature, we can reference data-driven dashboards right from our account that tell us about the learner engagement of our programs. It’s also made it easier for our Business Intelligence teams to create the data piping needed to bring together disparate data sets and tell the impact story of the Academy.” - Jaclyn Anku, Head of Education & Community at Gusto 

Until you connect the dots to that external data, you could have a program with perfect efficacy, incredible completion rates, and a Net Promoter Score of 98, and it could still be an ineffective investment.

At the end of the day, an education initiative is a business initiative. And in an uncertain economic climate, it’s your job as an education leader to demonstrate the value of the work you do. If you’re not thinking about effectiveness in terms of ROI—whether that means driving revenue or decreasing support tickets—you’ll never make the business case to put more dollars toward your program.

But the best part is that every education initiative has a path to a positive ROI. All you have to do is ask yourself: What did we build this to do? And how can we prove that it’s doing it?

About the Author

Robyn Hazelton
Vice President of Marketing and Growth
Robyn is the VP of Marketing and Growth at Intellum and helps to ensure that every interaction an individual has with the brand is as awesome as possible. An experienced and trusted leader with a history of consistently impacting revenue, she's always talking about funnel management and biased toward action.
