
How to Measure Partner Training ROI

By: Brian Timberlake
Published: February 6, 2024

It’s imperative we measure and prove the value of our learning programs.

And while the ROI of customer training has been well-documented, what about the ROI of partner training?

To dive deeper into this topic, we spoke with April Trask, Senior Director of Partner Enablement at Cloudinary, and Lynn Godfrey, Lead Partner Training Program Manager at Iterable.

What Type of Partner Do You Educate?

One factor that can make proving the value of partner training difficult is the diverse types of partners an organization might work with. 

For example, Cloudinary works with a range of partners, from system integrators, digital agencies, consultancy firms, development shops, and value-added resellers (VARs) to strategic technology partners, original equipment manufacturers (OEMs), and managed service providers (MSPs).

The partner training team at Iterable trains and enables agency and technology partners (which they call Solution Partners) on the Iterable application.

Depending on the type of partner, you’ll have different goals, content, and measures of success. 

Partner Training Metrics to Measure

How you measure success depends on the type of partner education you offer. Do you offer 1:1 training? On-demand training content? Do you have a scalable partner training program delivered through a learning management system (LMS)?

What you measure also depends on the maturity of your program. As April shared, “Early on in your program, when you’re building the foundation, you aren’t always able to measure immediate business impact. Your metrics are often focused on development and touch: Are training and enablement materials getting shipped and reaching partners?” 

As your program grows in maturity, you're able to measure the business impact of your efforts. You want to have these business outcomes in mind from the beginning, but set the expectation that it will take time to prove the value of learning.

Use Leading and Lagging Indicators

Leading and lagging indicators can be helpful for measuring the performance of initiatives whose results take time to appear. 

Lagging indicators are the metrics you ultimately want to move the needle on. For partner training, these usually revolve around:

  • Increased revenue from partners
  • Increased partner retention
  • Faster onboarding for partners
  • Improved partner performance
  • Improved partner service

These metrics can take time to see. For example, for Cloudinary’s referral partners, it can take more than a quarter to correlate the impact of training and enablement on the business, depending on the length of the sales cycle. 

Leading indicators are faster and easier to measure. These metrics let you know if you’re heading in the right direction to reach your goals. If your goal is increased revenue from partners, what activities would let you know that’s on track to happen, or not? This might include the successful completion of onboarding or passing a partner certification.

The following are leading indicators Lynn and April look at to track partner education performance: 

  • Partner engagement with learning content (activity, visits, clicks)
  • Pass/fail rates for learning content
  • Partner certification completions 
  • Partner course completions
  • Partner satisfaction scores (PSAT)
  • Customer satisfaction scores (CSAT) for partner-owned accounts

Pro Tip: How do you benchmark satisfaction scores? Lynn told us, “We’re tracking CSATs for customers that are being onboarded by Iterable’s Professional Services team so we have a data point to compare partners’ performance against.”

What Makes Partner Training ROI Difficult to Measure?

You may be reading this thinking: That seems pretty straightforward. So why does it seem so hard, at times, to measure ROI?

April and Lynn shared several insights on this topic.

1. Partner needs can be harder to understand.

“Partners are often so much more diverse than customers—not just persona wise, but segmentations that cross-hatch personas,” April told us. 

This requires more effort in mapping the partner journey, and marketing education and enablement to different partner audiences. The complexity also means more metrics to measure—and more ways to slice and dice the data.

2. Partner motivation can be more complex.

Customers are often motivated to learn by intrinsic motivation (“I need to learn this so I can do my job.”).

Partners, on the other hand, often rely on extrinsic motivators such as distinction and incentives. As April said, “You’re never in the field alone. There is always going to be competition—other companies that will offer something similar to partners.” You have to work to keep them motivated to stay, learn, and partner with your company.

3. Partner education takes time to mature.

While this can also be true of customer education, it's especially true for partner education. If you have multiple partner types, it takes even more time to build a foundation for partner education and enablement. These efforts take time to establish, grow, and prove their value. 

Lynn offered this insight: “Measuring and showing the value of the partner training and certification program has been difficult to show since our program is relatively new. We’re operating on a lot of assumptions on the effectiveness of the training program and the likelihood of partners sharing more leads with us once they’re certified and able to implement Iterable on their own.”

Delayed business impact requires partner education leaders to effectively manage company expectations around performance and time to ROI. 

4. Disparate data sources make ROI calculations difficult.

Much like we see in customer education, data and reporting can be challenging. 

At Iterable, some data is stored in Iterable itself (e.g., use of the platform). Some is in their CRM and on customer contracts (e.g., contract expansion or downgrade). Still other data, such as whether a partner is connected to a customer (e.g., did a partner onboard this customer?), may be stored separately. Accessing and analyzing data from different sources takes time, cross-functional relationships, and a good amount of effort.
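To make the joining problem concrete, here is a minimal sketch of combining two exports to compare outcomes for trained vs. untrained partners. All data, field names, and the certification-vs-revenue comparison are hypothetical illustrations, not Iterable's actual schema or process:

```python
# Hypothetical LMS export: which partners completed certification
lms_records = [
    {"partner_id": "p1", "certified": True},
    {"partner_id": "p2", "certified": False},
    {"partner_id": "p3", "certified": True},
]

# Hypothetical CRM export: revenue attributed to each partner
crm_records = [
    {"partner_id": "p1", "revenue": 120_000},
    {"partner_id": "p2", "revenue": 40_000},
    {"partner_id": "p3", "revenue": 90_000},
]

def avg_revenue_by_certification(lms, crm):
    """Join the two sources on partner_id and average revenue per group."""
    cert_status = {r["partner_id"]: r["certified"] for r in lms}
    totals = {True: [], False: []}
    for row in crm:
        status = cert_status.get(row["partner_id"])
        if status is not None:  # skip partners missing from the LMS export
            totals[status].append(row["revenue"])
    return {
        status: (sum(vals) / len(vals) if vals else 0.0)
        for status, vals in totals.items()
    }

print(avg_revenue_by_certification(lms_records, crm_records))
# {True: 105000.0, False: 40000.0}
```

Even in this toy form, the join depends on a shared partner identifier existing in both systems, which is often the first cross-functional hurdle in practice.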

Who Should You Share Performance Metrics With?

Primary stakeholders for partner education tend to include partnership and revenue leadership teams. 

But performance shouldn’t just be shared at the team and executive level. April’s suggestion? Make performance metrics visible to everyone.

Cloudinary's partner enablement team has a comprehensive Salesforce dashboard that's visible to everyone, from SE team members to senior leaders in the company. 

In addition to making metrics visible, you’ll also want to proactively communicate these updates. April does this through quarterly business reviews (QBRs), as well as Town Hall presentations and Slack channel updates. Lynn’s team has a monthly Education Services internal newsletter where program metrics are shared out.

Proving the Value of Partner Education Is Challenging, Yet Essential

As we learned from April and Lynn, figuring out whether partner training is working involves looking at a range of metrics. How well partners engage with training, and whether they pass or complete courses, can be indicative of future performance. Tracking these early measures, setting appropriate expectations with leadership, and sharing results with everyone on the team plants the seeds for success. It's also important to remember that programs aren't static. Developing a feedback loop with partners will help you continuously improve your partner education and training programs.

About the Author

Brian Timberlake headshot
Brian Timberlake
General Manager
Brian Timberlake has been in sales for his entire career, including overseeing channel networks. He brings this expertise to Intellum customers with partner sales and education motions.