
In this episode of the Leading Learning Podcast, co-hosts Celisa Steele and Jeff Cobb wrap up their three-part look at reach, revenue, and impact by focusing on impact—often the least clearly defined and measured of the three pillars.
Celisa and Jeff explore how impact looks different depending on whose perspective you consider—that of learners, employers, and the learning business itself—and why measuring impact doesn’t have to be perfect to be useful. When learning businesses treat impact data as strategic intelligence, it can inform key decisions about what to offer, what to improve, and what to retire.
They also discuss how evidence of impact strengthens marketing, improves learning design, supports smarter portfolio decisions, and deepens business development conversations.
When reach, revenue, and impact reinforce one another, learning businesses are better positioned not just to grow but to thrive.
To tune in, listen below. To catch all future episodes, be sure to subscribe on Apple Podcasts, Spotify, or wherever you listen to podcasts.
Listen to the Show
Access the Transcript
Download a PDF transcript of this episode’s audio.
Read the Show Notes
Celisa Steele: [00:00:03] If you want to grow the reach, revenue, and impact of your learning business, you’re in the right place. I’m Celisa Steele.
Jeff Cobb: [00:00:10] I’m Jeff Cobb, and this is the Leading Learning Podcast.
Celisa Steele: [00:00:16] Learning businesses need to consistently perform in three areas: reach, revenue, and impact.
Jeff Cobb: [00:00:23] In the last two episodes, we’ve revisited reach and revenue as part of that larger interconnected system. In this episode, we turn to impact.
Celisa Steele: [00:00:32] Impact is what ultimately tells you whether your reach and revenue will be sustainable—it’s where your assumptions get tested.
Jeff Cobb: [00:00:40] Reach tells you who you’re getting in front of, revenue tells you what the market is willing to pay for, and impact tells you whether your work is making a meaningful difference—for learners, for their organizations, and for the field or industry you serve.
Celisa Steele: [00:00:55] Impact is often the least clearly defined of our triumvirate—reach, revenue, and impact. Learning businesses tend to value impact, but they can still struggle to articulate or measure it.
Jeff Cobb: [00:01:08] Today we want to unpack impact—what it means, whose perspective matters, and how it connects to reach and revenue.
Impact for Whom? Clarifying the Perspective
Celisa Steele: [00:01:17] When we talk about impact, there is an immediate question: impact for whom? Because there are a variety of perspectives on impact. Impact is not one thing.
Jeff Cobb: [00:01:29] We’re going to look at those multiple perspectives: the learner; the employers who are hiring those learners in the field or industry you’re serving; the learning business itself; and then, ultimately, that profession, field, or industry that you are serving.
Celisa Steele: [00:01:45] From the learner’s perspective, impact is going to tie into things like increased confidence or new knowledge and skills and the ability to apply the new knowledge and new skills on the job.
Jeff Cobb: [00:01:58] And it factors into career mobility. Is the learning that you’re delivering helping that learner to get a job, to get promoted, to expand the responsibilities that they have in their job? And is it reinforcing or helping to establish and then reinforce their professional identity?
Celisa Steele: [00:02:18] Impact, from the learner’s perspective, is going to very often be largely self-reported—but we think that’s entirely legitimate.
Jeff Cobb: [00:02:29] Definitely. Confidence and perceived usefulness matter. You want the learner to feel confident. You want them to feel the perceived usefulness from learning experiences. Those are things that will drive continuing engagement, which is so important to learning businesses.
Celisa Steele: [00:02:46] A big part of how you’re going to be able to get at the learner’s perspective on the impact will be through asking them to self-report on various measures around the learning experience. One of our go-to thinkers around measurement is Will Thalheimer. He has a great way of rethinking the smile sheet so that you’re trying to get at more than just “Did you like the food at the conference?” and get to “Are you actually applying what you’re learning?,” for example.
Jeff Cobb: [00:03:17] And this goes to that confidence question. Are you more confident in doing your job? Are you confident that you learned things that are going to help you do your job? We won’t go into detail around Will’s thinking here because we’ve talked to him a number of times, and we can point to some of those episodes for folks to take advantage of. But he does have an approach to evaluation, to feedback you get from learners that is much more structured and much more targeted than what you get through those typical smile sheets that you were referencing, Celisa. Another approach that we think has great value is the Brinkerhoff Success Case Method.
Celisa Steele: [00:03:54] Yes, and I got to speak with Rob Brinkerhoff, so we can link to that episode in the show notes for this episode. That’s where you look at who are the learners who are truly benefiting from what you offer. And then you begin to look at what contributes to that success so that, then, hopefully, you can deliver that same success to other learners.
Jeff Cobb: [00:04:15] Those are some great resources around that reporting by the learners and how you can capture that reporting effectively, whether it’s through a strong evaluation like the kind Will Thalheimer advocates and provides guidance for, or something like the Brinkerhoff Success Case Method. That’s the learner perspective on impact. We also want to take into account the employers’ perspective—the people who are giving jobs to those learners and are hoping that their performance is going to improve through learning experiences.
Celisa Steele: [00:04:47] Impact from the employers’ perspective very often takes into account the capability of the workforce. They’re asking: is this learning experience helping the workforce be more capable? Is it helping to reduce any errors that might happen on the job? Is it helping to improve the performance of employees?
Jeff Cobb: [00:05:09] This is a big one. We hear this a lot in the news today around employers, their perception of the readiness, the capability level of the workforce, particularly the entering workforce, the early-stage workforce coming into a profession. But, all along, how is your learning contributing to raising the bar at the employers where your learners are working? Is it helping to speed up onboarding? Is it helping to increase performance? Is it helping with retention? That’s a big one. We know, particularly if you buy into the generational perspective on workers and employees, younger generations in particular seem to highly value having education and training experiences as part of what they’re provided in their working environments.
Celisa Steele: [00:05:54] You want to be getting the employers’ perspective on the impact. Are you asking employers what changed? What are the managers noticing once the learners are back on the job? Is that something you’re actively trying to gather data around?
Jeff Cobb: [00:06:11] We find most organizations are not. It is something that needs to be instituted. It can certainly factor in if, for example, you’re trying to do business-to-business selling—selling directly to employers. You want some impact data to be able to help make those kinds of deals possible. But, even if you’re selling to individuals, if the employer can see that the learning has had an impact for an individual learner, it becomes much easier for that learner to make the budget arguments they often have to make to be able to participate in professional development and continuing education experiences.
Celisa Steele: [00:06:48] The third perspective on impact is the view from your learning business. What does impact look like for you?
Jeff Cobb: [00:06:56] A big one here is do people keep coming back? We talked about smile sheets (perhaps with a hint of disdain in our voices) earlier, but, if people feel like they are enjoying/benefiting from the experience, if you’re seeing that show up in those smile sheets, hopefully you’re seeing that the follow-through from that is that those people who gave you good evaluations show up again and again. And that’s probably because your learning is having some impact on those learners. They do feel like you’re helping them with their performance, with their advancement in their career, whatever their personal measure is of impact.
Celisa Steele: [00:07:36] If you’re getting those learners to come back, then any individual learner is spending more of their time, more of their money with your organization. That contributes to the lifetime value of learners to your learning business. It often too is going to result in referrals. If people are seeing the impact from a particular course or class, whatever the experience is, that word of mouth will help you find future learners, other learners to engage with.
Jeff Cobb: [00:08:06] Yes, that’s very, very powerful stuff in getting customers and building your brand. It can help too if you are doing more business-to-business-type selling of your education. It’s going to help with those institutional renewals, keeping those business customers coming back. And, of course, if you are able to demonstrate impact consistently—have your learners, have your employers feel that impact, have those learners returning over time—then you’re going to be much more likely to have that sustainable net revenue that learning businesses seek.
Celisa Steele: [00:08:40] Revenue, particularly if a product has strong positive net revenue, can signal something that’s having impact, and that’s likely why you’re able to bring in those revenue dollars. But revenue alone is not necessarily proof of impact, so you want to make sure that you are actively looking at ways to measure the impact that you’re having, documenting that so you can use it in marketing and business-to-business sales situations. You want to have that kind of data because it really helps you as you serve other learners.
Jeff Cobb: [00:09:15] We’ve talked about it before, and we will talk about it again: how revenue feeds into impact; impact feeds into revenue. It’s a virtuous cycle that exists among these three components of reach, revenue, and impact.
Celisa Steele: [00:09:27] The fourth and last perspective that we’ll touch on is the impact on the field, the profession, or the industry, whatever it is you target with your learning business.
Jeff Cobb: [00:09:41] That could include things like raising standards in the field or industry you serve. For us, this is a big part of leading learning. Are you leading the learning that’s happening in your field or profession and helping to raise the bar within that profession? It can contribute to shared language across the profession. A lot of language gets conveyed, transferred, shared through learning experiences (formal and informal) and can really have an influence on the direction of the field or industry you’re serving.
Celisa Steele: [00:10:13] That shared language can also feed into identity—and we touched on identity earlier. When you have a group of learners who really feel like they’re part of a community, that can help elevate the mission of whatever that profession, field, or industry is.
Jeff Cobb: [00:10:29] This is definitely a case where the whole can be a lot greater than the sum of the parts. When you’re having all of those incremental impacts on individual learners, that obviously adds up and amplifies over time, and you begin advancing the entire field, raising the bar in the entire field, as we were talking about at the beginning of this, through collective growth and knowledge and skills and even the identity across your profession or field.
The Myth: “Impact Is Too Hard to Measure”
Celisa Steele: [00:11:00] We’ve talked about four perspectives on impact. Next, we should talk about what we essentially see as a myth. The myth is that impact is too hard to measure.
Jeff Cobb: [00:11:14] Out of the gate, it feels harder than, say, reach metrics. You can tell how many people you have on your e-mail list or how many people enrolled in your courses. That’s pretty easy. It feels harder than revenue because you can run the numbers on your latest event and see how much you sold and how much it cost, and you’ve got a pretty good view into revenue.
Celisa Steele: [00:11:34] It can be a little bit harder than those reach and revenue metrics. It also can be hard as a learning business to feel like you can get direct access to impact data. Take, for example, corporate L&D, where they have that captive audience: they can tell employees, “Go through this training,” and then see on-the-job performance and track it relatively easily because they have access to the relevant managers and the relevant data. We understand that there is a legitimate difficulty for learning businesses in getting some of that access. But we don’t think that difficulty excuses inaction. You need to be looking at trying to get at some of that impact data. Because you still can.
Jeff Cobb: [00:12:24] Hard is not the same as impossible. This is one of those great places to remind everyone that you don’t need perfection. You need progress. You need to get some data that you can work with, making it a regular habit to do some form of longitudinal follow-up with your learners to get some idea of what impact particular learning experiences may have had on them. If you’re not getting input from employers, set up some mechanisms for periodically getting impact data from employers, using some of the methods that we referenced earlier, like the Brinkerhoff Success Case Method, to get qualitative feedback in a way that helps to inform your decisions going forward. There are methods that may not be as precise as having a spreadsheet with revenue and cost figures in it, but you can get at data that does point directionally to whether your learning is having impact or not.
Practical Ways to Think About Measuring Impact
Celisa Steele: [00:13:24] Let’s segue into some practical ways to think about measuring impact. We have mentioned some as we’ve been going along, but we can talk about a few ways. We said at the beginning that a lot of the learner perspective on impact is likely to come through self-reporting, and we think that is entirely legitimate. Part of what you want to try to get in those self-reports from learners is data around application. Are they taking anything that they learned and applying it in their daily, weekly, monthly job duties?
Jeff Cobb: [00:14:04] And what has the impact of that application been? You can ask the extent to which your learners feel like the learning that they are experiencing with you is helping them to advance in their careers. And this is qualitative. This is self-reporting again, but it’s very helpful to have a sense of whether it is helping them advance or not. And, from employers: are they considering participation in your educational activities, and any credentials that come out of those activities, in hiring? Are they considering those in promoting? You want feedback over time that that is in fact happening out in the field or industry that you’re serving.
Celisa Steele: [00:14:43] You want to be asking about application. You want to be asking about any changed behavior. You want to be asking about changes in confidence and also asking about the learner’s intent to continue learning with you—something like a net promoter score, whether or not you use that exact methodology. Will people come back? Will they continue to learn with you because they saw some impact?
Jeff Cobb: [00:15:08] You can do these through interviews. These are things that can also be factored into the evaluations that you’re running after learning experiences. We do simple things like ask people to rate the return on investment that they feel that they got out of a learning experience. Or you can ask them at the point of exiting an event, “Will you apply this?” or “How will you apply this in the coming weeks/months ahead?” to collect some data through standard forms that you’re probably already using for evaluations after a learning experience.
Celisa Steele: [00:15:42] We find that immediately after a learning experience, that type of surveying tends to happen fairly regularly and consistently across learning businesses. What is often not quite as consistently enshrined is more of a longitudinal approach. If you’re only asking immediately after an experience, you don’t necessarily have the ability to really see the impact because it may take a while for someone to begin to apply their new knowledge or skills, and you want to be able to see how that impact is accumulating over time. And so we would encourage you, if you’re not already doing this, to think about whether there are moments where you can naturally follow up again. Maybe you do it immediately after the event, maybe three months out, six months out, a year out, where you’re trying to, again, ask some of those questions and see what changes. Is there even greater confidence six months out? Is there greater behavior change? Any of those types of things that you were asking about, is it shifting over time?
Jeff Cobb: [00:16:47] A couple things here. You can prime for this: make learners aware that this follow-up is going to happen so that they’re somewhat conscious of what they’re doing with the learning experiences that you’ve provided them. The other thing is this is the type of thing that can be automated at this point. You can set up triggers in your e-mail system, possibly even within your learning management or communication system—whether it’s one month, three months, six months, whatever it is—to automatically send out the standard questions that you’re going to ask of learners. You’ve got to get that set up and get it running.
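[Editor's note: The trigger-based follow-up Jeff describes can be sketched in a few lines of code. This is a hypothetical illustration, not tied to any particular e-mail or learning management system; the intervals (one, three, and six months) come from the episode, and the function and variable names are our own.]

```python
from datetime import date, timedelta

# Follow-up intervals mentioned in the episode: roughly one, three, and
# six months out (approximated here as 30-day months).
FOLLOW_UP_INTERVALS = [timedelta(days=30), timedelta(days=90), timedelta(days=180)]

def schedule_follow_ups(completion_date: date) -> list[date]:
    """Return the dates on which to send longitudinal follow-up impact
    surveys after a learner completes a learning experience."""
    return [completion_date + interval for interval in FOLLOW_UP_INTERVALS]

# Example: a learner finishes a course on January 15, 2025.
sends = schedule_follow_ups(date(2025, 1, 15))
for send_date in sends:
    print(send_date.isoformat())
```

In practice, each scheduled date would trigger the same standard questions about application, behavior change, and confidence, so responses can be compared across waves.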
Celisa Steele: [00:17:21] You can automate it. You can also take a more manual/human approach and reach out and do more of a one-on-one communication. We know that Todd Slater at NIGP: The Institute for Public Procurement is doing some of this longitudinal assessment of learning. That assessment happens through e-mails and interactions coming directly from him. He feels like he gets a higher response rate sometimes because people see, “Oh, this is a particular individual really interested in my perspective.”
Jeff Cobb: [00:17:50] The fact is you should be doing both. You should be doing some more automated quantitative—or quantitative-ish—collection of data along with this much more manual qualitative approach. You can balance those out according to your resources and what’s practical for you to do. But, with all of it, you’re looking for those signals of application, of actual behavior change on the part of the learner, of changes in the confidence level of learners, and of their intent to continue engagement with you. Putting that plan in place and putting the mechanisms in place to follow through on the plan over time, you can collect significant impact data that, in the aggregate, is going to give you a strong picture of how much impact you are having on the individual learner; how much impact it might be having within the employers; how much impact it’s having for your learning business, which may be the most straightforward in some ways; and, while it may never be crystal clear, at least some indication of the impact you might be having on the broader field or industry you’re serving.
Celisa Steele: [00:18:56] In all of these lines of questioning around impact, part of what you’re getting is that deeper picture of the learner and what matters to them, what matters to their employers. All of this can be part of relationship-building with the individual learners, with the employers and businesses that you serve. This genuine curiosity and this follow-up can convey your sincere interest, and that can help then for the learner/the employer to trust you more, for there to be more of a give-and-take relationship, and all of that can strengthen your business development opportunities because you’re going to have a lot of conversation, a lot of background to draw on when you’re talking with anyone.
How Impact Interacts with Reach and Revenue
Jeff Cobb: [00:19:45] This is the third element of our triumvirate of reach, revenue, and impact. We certainly need to talk about how this third element, impact, interacts with the other two.
Celisa Steele: [00:19:59] If you’re having strong impact, that’s going to fuel your ability to reach learners. It’s going to help with word of mouth. It’s going to help with spontaneous peer recommendations. It’s going to help with the social proof. It’s going to give you that organic visibility. It’s going to help with trust. If you have that impact, that’s really going to help ease the way to reaching more learners.
Jeff Cobb: [00:20:25] Those are impact signals that draw people to you. If those signals are strong, if impact is strong, then your marketing is going to get much easier. You’re going to know much more about your market. You’re going to be more visible in your market. Attention becomes easier to earn. Trust deepens significantly. Conversion becomes much easier. On the flip side, weak impact is going to increase your marketing friction. If that impact is not visible, if it’s not felt out there in the marketplace, you’re going to have to work harder to get people to you.
Celisa Steele: [00:21:02] In terms of impact and revenue, if we look at how those two fit together (we touched on this a little bit earlier in the episode), if you’re having strong impact, that is going to support your ability to bring in revenue because you’re going to have higher retention; you’re going to have repeat enrollments. If you have those institutional sales, those B2B relationships, the likelihood of that renewal will be stronger if you can show impact. And, if you’re showing impact, individuals are likely to be more willing to pay and maybe even to pay at a premium price because they can see the results, the impact of what coming and learning with you might mean.
Jeff Cobb: [00:21:44] Weak impact can increase your churn. People come, but they don’t stay. They don’t keep coming back. It can increase price sensitivity. Basically, it can work against you from a revenue standpoint.
Celisa Steele: [00:21:56] This takes us back to what we started this three-part miniseries with—that reach, revenue, and impact are very tightly intertwined. If you have reach without impact, you might get that activity, you might get people there for that conference or course or whatever it is, but there’s not going to be any value there.
Jeff Cobb: [00:22:16] So you’re going to have a hard time getting them to come back. And that points to the next one—revenue without impact is fragility. It makes your revenue streams fragile because they show up that one time. The value is not there. They’re not coming back. They’re not spreading that good word of mouth to get others to you.
Celisa Steele: [00:22:37] Now, if you do have impact, if you are able to show results, but you’re not reaching all of the learners that you could or should be, then you’re going to be a best-kept secret because you’re delivering well, but not enough of the people that you could be serving are able to experience that impact.
Jeff Cobb: [00:22:54] And then the final system-view aspect we’ll mention is impact without revenue—as too many organizations experience—is unsustainable. You have to have the revenue to fuel what you’re creating, what you’re delivering in terms of learning experiences in order to have that impact. If you don’t have both parts of that equation going, it’s going to be unsustainable.
Impact Data as Strategic Intelligence for a Learning Business
Celisa Steele: [00:23:22] We see impact data as strategic intelligence for your learning business. Because our view of learning businesses is that you’re much more than a course provider. You are working in an environment, an ecosystem that is made up of learning, trust, and value. When you have impact data, when you’ve been measuring impact, that provides you with strategic intelligence to make good decisions about what to offer, what not to offer, when to stop offering it, what to start—all of those myriad decisions that go into running a learning business. Impact data is going to help inform and help you make those decisions.
Jeff Cobb: [00:24:04] If you’re getting good data about the impact that you’re having, it’s going to help you improve your marketing messaging, getting the right message out there to reach people and convert them; it’s going to help you refine your products, your learning experiences; it’s going to help you prioritize what goes into your portfolio; and it’s going to help inform business development conversations, all of which keep feeding back into this cycle of being able to reach more of the right people, being able to generate the revenue that comes from connecting with and converting those people, and then ultimately having that impact that keeps the cycle going.
Celisa Steele: [00:24:40] You want to collect impact data, but you also want to make sure that you’re thinking about it not just as data—that you’re taking action based on what you’re learning about the impact, that you’re using that data to strengthen what you offer. And you can do that because you have an evidence-based view of the real-world impact of what you’re offering.
Reflection and Invitation
Jeff Cobb: [00:25:02] We’ll offer up some reflection questions here before we close out this episode, starting with this: Whose definition of impact are you using? Whose perspective does that definition prioritize, and is it right for your learning business?
Celisa Steele: [00:25:21] It gets back to the different perspectives that we talked about—the learner’s, the employers’, your learning business’s perspective, and that of the broader field, profession, or industry. Are you taking into account all of those perspectives on impact, or are you only prioritizing one or two of those views? And, if you are prioritizing only a few of them, is that the “right” choice, or is it just what you’ve been doing, without necessarily having been thoughtful about why you’re prioritizing that view of impact?
Jeff Cobb: [00:25:51] A related question in my mind is where is impact assumed rather than demonstrated in your learning business? You may be assuming you’re having a certain level of impact on the learners directly. Do you have evidence for that, or is that just an assumption? And, for these other perspectives we’re talking about—the employers, your business itself, the broader field or industry—to what extent are you just assuming that you’re achieving the impact that you want to? Or to what extent do you have some of that data we’ve been talking about that helps make the case for, yes, we really are having that impact?
Celisa Steele: [00:26:27] Because we are believers in the interrelatedness of reach, revenue, and impact, think about your approaches to reaching learners and to generating revenue: what do those approaches reflect about what you believe about impact? This point is around making sure that you are being thoughtful about reach, revenue, and impact and how they relate to one another.
Jeff Cobb: [00:26:53] The proverbial bottom line on all of this is, when reach, revenue, and impact reinforce one another, learning businesses don’t just grow—they thrive.
Recap and Wrap-Up
Celisa Steele: [00:27:09] In this episode and two others, we’ve revisited reach, revenue, and impact as interdependent pillars of a learning business’s success.
Jeff Cobb: [00:27:17] None of them stands alone, and each is strengthened when aligned with the others.
Jeff Cobb: [00:27:29] If you found this episode or the three-episode miniseries helpful, please share with a colleague.
Celisa Steele: [00:27:36] Thanks again for listening, and see you next time on the Leading Learning Podcast!
To make sure you catch all future episodes, please subscribe on Apple Podcasts, Spotify, or wherever you listen to podcasts. Subscribing also gives us some data on the impact of the podcast.
We’d be grateful if you’d rate us on Apple Podcasts or wherever you listen. Reviews and ratings help the podcast show up when people search for content on leading a learning business.
Finally, follow us and share the word about Leading Learning. You can find us on LinkedIn.
Related Resources
- Learner Surveys and Learning Effectiveness with Will Thalheimer
- Evidence-Based Evaluation with Rob Brinkerhoff and Daniela Schroeter
- Revisiting Reach, Revenue, and Impact—Starting with Reach
- Revisiting Reach, Revenue, and Impact—Continuing with Revenue
- Rethinking a Dangerous Art Form with Dr. Will Thalheimer
- Reach, Revenue, and Impact
