Even the best-designed training isn’t finished when the session ends.
At U of Digital, they treat every training as a prototype—an opportunity to learn what landed, what missed, and what needs to evolve. As Veronica Ripson, Head of Learning Experience, shared in our recent webinar:
“We always, always, always want to be learning. The magic happens when we find the human element—and feedback is a big part of that.”
So, how do you turn feedback into something actionable, instead of an afterthought?
Here’s how U of Digital incorporates learner feedback into every training program, and how you can apply the same approach.
Why Feedback Matters (and Why You Need More Than One Kind)
Too often, feedback in learning is reduced to a smile sheet or CSAT score. But U of Digital takes a broader view. Every session includes a dedicated observer who tracks both hard data and soft signals, like energy, participation, and group dynamics. Afterward, the team reviews all these inputs to fuel iteration.
Why? Because feedback isn’t just about knowing if a session went well.
It’s about uncovering:
- Whether the topic resonated
- Whether the delivery style was a fit
- Whether the learners walked away feeling more confident or more confused
The 3 Types of Feedback U of Digital Tracks
1. Topic-Level Feedback: Did the Content Hit the Mark?
After every session, the team analyzes:
- Which parts excited learners (e.g., questions asked, comments dropped in chat)
- Which parts felt slow or off-topic
- Which concepts sparked confusion
As Veronica put it: “Sometimes we test a new topic and it just doesn’t land. That’s helpful too. It’s not just about the wins—it’s about what we learn for next time.”
They also track patterns across sessions to see if certain questions or concepts come up repeatedly. That’s a sign the curriculum needs an update, or a new explainer module.
2. Group-Level Feedback: What Did This Cohort Respond To?
U of Digital doesn’t assume that what works for one audience will work for another. Instead, they track:
- Energy levels (Are learners chatty? Camera-on? Checked out?)
- Activity preferences (Do they light up at trivia, or prefer less competitive activities?)
- Tone tolerance (Do they want empathetic hand-holding or straight-shooting expertise?)
Veronica shared this example: “One client’s team was obsessed with trivia. Hyper-competitive, loved fake internet points. Another group felt like it was too competitive. Same format—completely different reactions.”
This insight shapes future sessions for that group and helps the team brief the next facilitator accordingly.
3. Expert-Level Feedback: Was It the Right Fit?
U of Digital’s expert network includes 300+ practitioners from across marketing and ad tech—but the right person still matters. They look at:
- Whether the expert’s background resonated with the group
- Whether the expert’s delivery style (authoritative, empathetic, pragmatic) matched the tone of the session
- Whether that style helped—or hindered—the emotional outcome the team was aiming for
“We don’t just ask ‘Was the expert knowledgeable?’,” Veronica shared. “We ask: ‘Did they make people feel the way we intended?’ That matters just as much.”
How to Build Feedback Into Your Learning Ops
You don’t need a 300-person expert network to apply this. Here’s how you can build similar feedback loops into your own training process:
Assign a Session Observer. Ask a team member to quietly track participation, energy, pacing, and content resonance—just like a director in the wings.
Use a Simple Debrief Template. Include prompts like:
- What excited the group?
- What didn’t land?
- What questions were repeatedly asked?
- What feedback did the facilitator share post-session?
Track Themes Over Time. Create a simple feedback log across sessions. Look for patterns, not just one-offs.
Adjust Based on Insight. If one group disengages during breakout rooms, don’t scrap them entirely; just skip them with that audience. Tune your approach to your learners.
Final Thought: Feedback Is a Gift. Use It.
When training is human, feedback isn’t just data; it’s dialogue. It’s how your program learns to grow alongside your audience.
Veronica said it best: “At the end of the day, our job is to bring out the human in everyone—SMEs, learners, even ourselves. Feedback helps us do that.”
So the next time you run a session, don’t just hit “end meeting.” Ask what worked, what didn’t, and what surprised you. Then do it better next time.
Because great training isn’t static—it’s iterative.