
Artificial intelligence often magnifies what’s already happening in an organization. If your metadata is a mess, an AI recommendation engine might help you give prospective customers recommendations faster, but those recommendations aren’t likely to be effective.
In this episode of the Leading Learning Podcast, co-hosts Celisa Steele and Jeff Cobb place AI in the Learning Business Maturity Model to help you pick the next move for where your learning business is. While AI can’t fix an immature learning business, used thoughtfully, AI can help a learning business on its path to greater maturity.
To tune in, listen below. To make sure you catch all future episodes, be sure to subscribe on Apple Podcasts, Spotify, or wherever you listen to podcasts.
Listen to the Show
Access the Transcript
Download a PDF transcript of this episode’s audio.
Read the Show Notes
Celisa Steele: [00:00:03] If you want to grow the reach, revenue, and impact of your learning business, you’re in the right place. I’m Celisa Steele.
Jeff Cobb: [00:00:10] I’m Jeff Cobb, and this is the Leading Learning Podcast.
Celisa Steele: [00:00:17] Dr. Philippa Hardman, AKA Dr. Phil, has said of AI, “If early 2025 was the age of rapid adoption [for learning], late 2025 is looking like the age of constructive AI pessimism….” And I feel like “constructive pessimism” does describe what we’re seeing with learning businesses.
Jeff Cobb: [00:00:37] Yes, we do seem to be past any artificial-intelligence-as-silver-bullet kind of thinking. And arguably we never should have been there. AI is a technology. It’s a tool that can be used for many things, and it can be used well or less well.
Celisa Steele: [00:00:52] Right, and AI often magnifies what’s already happening, what’s already in place. If your metadata is a mess or your catalog of courses is cluttered with offerings that should have been sunset years ago, then an AI recommendation engine might help you give prospective customers recommendations faster, but those recommendations won’t necessarily highlight the best path for the learners or result in any higher sales.
Jeff Cobb: [00:01:19] Today we want to talk about AI and its use in learning businesses by placing AI inside our Learning Business Maturity Model. We believe this can help you pick the next move for where your learning business is.
Celisa Steele: [00:01:34] AI is not going to fix an immature learning business, but AI, used thoughtfully, can help a learning business on its path to greater maturity.
Jeff Cobb: [00:01:43] So here’s what we’d like to do in this episode. First, we’ll offer a short Learning Business Maturity Model refresher with stage-by-stage AI plays mixed in.
Celisa Steele: [00:01:54] Then we’re going to look at base requirements that you’ll need to address before making progress with AI, and we’ll talk about a few pitfalls to watch out for.
Jeff Cobb: [00:02:04] We’ll close with immediate steps you can take plus a rolling plan for the next 60 to 90 days.
Learning Business Maturity Model Refresher
Celisa Steele: [00:02:11] So let’s talk about the Learning Business Maturity Model as a quick refresher. This encompasses four stages of maturity across five domains. The stages are Static, Reactive, Proactive, and, ultimately, Innovative. The domains covered are Leadership, Strategy, Capacity, Portfolio, and Marketing. AI shows up in all five of those domains, and the use of AI influences how mature a learning business is, whether it’s stuck in that Static stage or whether it’s able to advance to the Innovative stage.
Jeff Cobb: [00:02:49] We’ll offer some quick descriptions of the stages. Just hearing these may help you self-place while you listen, but we also have an assessment that you can take to gauge your learning business’s maturity.
Celisa Steele: [00:03:02] We do. Be sure to check out the show notes. There you’ll be able to get access to fuller descriptions of the stages and the domains. And you’ll also be able to find out how to access that self-assessment that Jeff just mentioned. But, for now, we’ll do short descriptions.
Jeff Cobb: [00:03:21] The first is Static. The static learning business primarily maintains past practices, with limited response to changing learner needs or market conditions. At this stage, processes may not be strategically aligned to the changes occurring in your market. Growth efforts tend to be ad hoc, and opportunities for innovation, operational improvement, and learner engagement may be flat out overlooked.
Celisa Steele: [00:03:48] That’s the first stage. The next stage is Reactive. A reactive learning business is actively responding to market conditions, but it’s often doing so in a short-term, tactical way rather than thinking in terms of long-range strategy. At this stage, organizations may meet immediate needs and do that fairly effectively, but they may struggle to proactively shape offerings, build long-term relationships, or consistently align what they’re offering in terms of learning with their broader strategic goals.
Jeff Cobb: [00:04:22] At the next stage, a proactive learning business has shifted away from primarily reacting to market conditions and moved toward a more deliberate and strategic approach to managing and growing programs and services. At this stage, an organization has already established fundamental processes and is optimizing its offerings with foresight, innovation, and efficiency.
Celisa Steele: [00:04:46] And then the fourth and final stage is the Innovative stage. An innovative learning business is really leading. It’s setting trends. It’s continuously innovating and evolving to meet learner and market needs. At this stage, the organization has a strong foundation of processes, and it’s actively shaping the field, profession, or industry it serves through its offerings, strategic foresight, and a collaborative ecosystem, positioning learning as a core driver of growth and impact. Keep those stages in mind as we now talk about how AI use might look at each of those stages.
AI Plays for Each Stage
Static
Celisa Steele: [00:05:35] At the Static stage, AI tends to be used, if it’s used at all, in isolated instances, and it’s primarily used to solve immediate problems rather than being part of a broader, longer-term strategy. We’ve got sporadic AI adoption—no real, clear strategy or integration into the core functions of the learning business. Maybe AI is used to address surface-level issues, but it’s not being used to drive more fundamental business operations or transformation. And even though AI might be providing some data or helping a learning business to assess a situation, decision-making is still largely manual. AI data might be one input along with others.
Jeff Cobb: [00:06:22] All of this means that AI tools operate in silos, which is characteristic of static learning businesses in general. Those AI tools are being added to the silos, and that leads to inconsistent results. There’s a lack of AI expertise among staff, and that results in suboptimal implementation. And there is no long-term AI adoption roadmap.
Celisa Steele: [00:06:44] In terms of what AI might be doing in a static learning business, it might be doing things like automating transcription. If you have recorded educational sessions, webinars, or instructor-led online classes, you might be using a tool—if you’re at this Static stage—to capture what’s being said and create automatic AI-driven transcripts. Possibly you might also be making use of something like a chatbot to help with customer service inquiries, but odds are, if you’re doing that, it’s more internally focused rather than something you put front and center for your customers or learners. You might have some data analytics and data dashboards that are offering some AI-generated insights. Maybe you’re also making use of AI to help with some aspects of content curation. Maybe it’s helping suggest tags for learning materials, for example, and so you’re getting some auto-tagging happening.
Jeff Cobb: [00:07:42] Celisa, you were somewhat cautious in saying “might” before you went through that list of potential uses. Call me cynical if you want, but I think that most static learning businesses aren’t doing much more than that automated transcription. Maybe they’re using an AI-powered chatbot to help with customer service, but, in the case of learning businesses embedded in associations, for example, that bot is often owned and managed by another part of the association, like Membership. We know in general from our research—the data we got back last year and, so far, this year—that a lot of organizations are either not discussing AI at this point or are discussing it but have no clear plans for implementation. So far, that’s about half of the people responding to our research this year, and these static businesses are almost certainly going to fall in that 50 percent.
Celisa Steele: [00:08:37] You’re talking about research that we do on an annual basis to look at the learning business landscape. We’ve talked about that research in past episodes. We’ll definitely be talking about the data we’re gathering currently in a future episode, so stay tuned for more about that. And, yes, it’s absolutely worth pointing out that a static learning business may not be using AI at all. For a static learning business, AI use goes from zero up to what I mentioned. And most of those use cases I mentioned, even if they are in play, are probably in play largely because those AI features are baked into platforms the organization is already using. Transcription of an online meeting might happen simply because it’s easy to have an automated AI transcript created. Or, if you have data dashboards that you look at, the system behind them probably makes use of AI. It’s a little bit of a hidden use of AI. As a static learning business, you might not even be totally aware of the use of AI.
Jeff Cobb: [00:09:43] So, in that case, you’re probably not fully benefiting from it. Part of becoming a more mature organization is becoming a more conscious organization and being intentional about having access to these tools and using them. To continue in that static learning business context, a static learning business might set goals like establishing guardrails and looking for one widespread efficiency win if it is going to bring AI in. To do that, you might create a short, plain-language AI policy—certainly a good thing for an organization to have at this point. Identify safe tools and make that list available to the relevant stakeholders. Offer some internal training—maybe something on prompt hygiene or a do/do-not session.
Jeff Cobb: [00:10:35] You might establish a dedicated AI working group or task force to explore AI applications. You could begin cleaning up your metadata. You mentioned metadata earlier, Celisa. You need good, solid data to use with AI. It doesn’t have to be all your metadata but some. Clean up your titles, your summaries, your tags for your top products. Pick a number of them or a percentage of them that you’re going to tackle. You can begin tracking AI-generated insights on a more consistent and intentional basis for decision-making validation. And you might pilot AI-driven personalization in small-scale learning experiences. All aspirational things that a static learning business might strive for with AI.
Celisa Steele: [00:11:22] Yes, with all of these stages—from Static all the way up to Innovative—be thinking about how you’re going to measure and see what progress you’re making. Given what you were talking about, Jeff, at this point, at this stage, key performance indicators would be something like the percentage of staff trained on your AI policy—or, you mentioned that do/do-not session, how many of your staff have attended a training like that? It might be that you’re looking at how much you’ve reduced response time for customer service inquiries about your learning products and services. Or we were talking about metadata—what percentage of your metadata has been verified and cleaned up?
Jeff Cobb: [00:12:08] A major message is, even at that Static level, just start engaging with AI at some level. Even if you’re not directly applying it to your learning programs or actively incorporating it into how you’re running your learning business, just get your staff and your volunteers experience with it. We’re big fans of Ethan Mollick, and he talks about needing to spend that initial ten hours with AI to get a feel for what it can do. That’s going to be a big step toward seeing the path beyond Static for a lot of learning businesses. So that’s the Static level.
Reactive
Jeff Cobb: [00:12:41] Let’s talk about the Reactive level, the next stage in the maturity model. At the Reactive stage, AI adoption expands, at least in theory, to improve efficiency and automate repetitive processes. AI becomes that operational enhancer.
Celisa Steele: [00:13:00] At the Reactive stage, AI might help with things like automating some administrative or operational tasks. Maybe there are some standard e-mail responses that need to go out. That’s the kind of thing that AI might help with. Maybe AI is taking a look at your learning analytics, and that’s helping you make some decisions about what you offer. Again, at this stage, that’s probably still AI playing a role rather than being a real core driver of decisions—you’re making sure the AI insights are included in what you’re thinking about. You’re probably beginning to have AI tools integrated into some of your processes at this Reactive stage, but you’re probably not yet fully connecting those AI tools to all of your processes.
Jeff Cobb: [00:13:47] Some common uses, again, in a lot of cases, are going to be things that are baked into software and processes that organizations already have, but you may be becoming a little bit more intentional about it. You might have AI-driven recommendation engines that are suggesting content based on learner behavior. You might have AI-assisted LMS automation doing things like grading or feedback generation. AI-powered e-mail marketing personalization. Or basic AI-driven learning analytics to identify trends. Again, you might find all of these things already existing in platforms that you’re using, and you’re starting to take advantage of them.
Celisa Steele: [00:14:29] At this Reactive phase, to make a little bit more progress, a learning business might be trying to standardize what’s working. Where are they beginning to see some benefits from use of AI and AI tools? They might want to look at creating a roadmap that shows how the different AI-powered systems might feed into one another and become more integrated so that you’re better able to leverage the AI insights. You might begin to explore using AI for some adaptive learning. Is there a way that you can help personalize those learning experiences a bit based on what you know about the learner and the content that you have on hand? And then, certainly, a reactive learning business is going to be providing more training for staff. How can you help them get up to speed more fully on the tools that you already have in place specifically but also the general uses and possibilities of AI?
Jeff Cobb: [00:15:26] Some simple key performance indicators (KPIs) here—in addition to what we’ve already mentioned for Static—might include things like time to create your learning experiences. That might be the total time required to create a particular learning experience, a number of hours. It might also be the elapsed time: does it take one week, or does it take eight weeks to get something done? Has AI helped you collapse that process? Or things like reduction of routine tasks. If you know it took ten steps to get something done before, and now you’re getting it done in six, that’s a gain.
Proactive
Celisa Steele: We’ve talked about what AI can look like in static learning businesses and then in reactive learning businesses. Now we’re going to talk about stage three, Proactive. At this stage, organizations have begun to embed AI into their strategic decision-making processes. AI is playing a role in things like program design, personalization, and even business strategy. You’re making use of predictive analytics to drive learner retention and even to inform curriculum adjustments. And you’re using automation, powered by AI, to help with marketing, operations, and learner engagement.
Jeff Cobb: [00:16:49] Yes, this Proactive stage is probably where AI really starts to sing, so to speak. When we hear about organizations that are at this point implementing AI in one way or another, and it’s showing up in our research, this is the type of role that AI is playing. Some common AI use cases are things like adaptive learning systems that tailor content dynamically to learner preferences. When we ask people how they’re going to use AI, being able to adapt to learners and provide that more personalized experience is always top of the list, and they’re starting to do it in this Proactive stage. Next would be AI-powered career path recommendations, based on skill gaps and industry trends. Another one would be predictive analytics that forecast learner retention—are they going to be coming back?—and even course demand and pricing models. Then there’s AI-assisted instructional design, using AI to do things like generate your first drafts of learning materials and even going beyond that in many cases—obviously looping in your subject matter experts in that process. Finally, AI-driven customer segmentation for targeted marketing strategies—making sure you’re reaching the right audiences, making them aware, driving conversion with them.
Celisa Steele: [00:18:05] And because AI is starting to sing in this phase, there are challenges or things to be aware of. I talked about constructive pessimism at the opening of the show. That means being aware of some of the limitations or biases and making sure you’re addressing those. Thinking about ethics, bias, and compliance issues will be important at this phase because you are trying to rely on AI for some core and important things. You’ve got to make sure that you’re doing that in an ethical way, that there’s no untoward bias in there, and that you are complying with whatever guidelines apply to your learning business. You want to make sure that the AI-generated insights are being used effectively for decision-making—that they’re not necessarily dictating the decision but are being used to inform it. Of course, there are going to be some challenges around scaling the solutions you’re finding across your portfolio or throughout your learning business. This is about taking what’s been working, institutionalizing it, and making sure it’s happening at scale.
Jeff Cobb: [00:19:17] To advance in this stage with AI—and remember, at this stage, we’re talking about organizations that are reaching the point of being conscious and intentional in how they’re thinking about their maturity and the different domains in it—you’re going to want to look at integrating AI across all five domains and measuring the outcomes you’re getting. You want to implement AI governance frameworks to ensure ethical AI use. You’re going to want to invest in deeper AI-driven learner analytics and experience mapping. You’re going to want to develop AI-powered business models, such as AI-generated courses or AI-enhanced credentialing programs or AI experiences themselves. It may be something more like a custom GPT becomes your course environment rather than the traditional course or credential.
Celisa Steele: [00:20:11] In terms of what you’re measuring at this stage in those key performance indicators, you might be looking at things like, “Is our use of AI changing our completion stats? Are we seeing more learners complete? Are we seeing the satisfaction of learners improve? Are we converting more of our prospects into learners who sign up for and complete a learning experience? Are we seeing changes in the time-to-certification or the time-to-competency?” Whatever you’re measuring there, but looking at what is the larger-scale impact of the use of AI?
Innovative
Jeff Cobb: [00:20:45] So that’s that Proactive level. Finally, we get to the Innovative stage of the maturity model. To extend our metaphor, if AI is starting to sing in that proactive stage, this is where we start to hit four-part harmony. Things are really coming together here. In that Innovative stage, AI contributes to differentiation and, you guessed it, innovation. AI is central to business operations, product innovation, and competitive differentiation. At this point, you can use AI for things like autonomously identifying markets. You’ve given it the power to go out and do the research and find new opportunities for you. AI is going to be enhancing learner experiences through hyper-personalization, getting down to that individualized learning path that so many organizations are striving for right now. And then AI-driven insights are going to be used to continuously refine your business models and to drive your revenue growth.
Celisa Steele: [00:21:51] In action, this could look like AI-generated course content that adapts in real time to learner progress. Back to your point about individualization, that course is responsive to what I, Celisa, am doing in that moment, and it’s adapting and giving me the just-right content for me. AI might be powering coaching or even mentoring systems. I don’t know how many folks out there are Duolingo users, but there’s the Lily video call feature now in Duolingo, where you have someone on call who can respond to you in real time—an AI agent, but someone to assist, to coach, to mentor. You might be using AI-driven business intelligence to inform new product development: what products should you be developing, and what should those products look like? You’re leveraging AI to help automate decision-making, especially lower-level, very clear decision-making, but also to help with predictive modeling so that you’re not only making the right decision in the moment but also thinking, “In the future, where do we need to be, and what should we be doing now to make sure we’re in the right spot down the road?”
Jeff Cobb: [00:23:08] AI can easily take you through all sorts of scenario planning along those lines to play out what might happen with particular products and audiences. But, of course, this does come with challenges. You’re going to need to be able to do things like ensure that AI dependency doesn’t lead to a loss of human oversight. You need to keep humans in the equation there. You need to manage the risk of AI-driven automation errors. As you start to give up some control to AI, start to use AI agents more, you’ve got to have those mechanisms for checking for errors and then maintaining a balance between AI-driven efficiency and human-centric learning experiences. You’ve just got to keep the human in mind. You can’t create a learning experience that’s so efficient that it takes the inefficient human out of the equation.
Celisa Steele: [00:23:59] If you’re at this stage, if you’re an innovative learning business, to help advance your use of AI, you’re going to want to continue investing in AI research and AI development so that you can use that for ongoing innovation. You’re going to make sure that you have good policies in place that help govern your use of AI and help ensure that you’re using it in an ethical way, that you’re being as transparent as you need to be with stakeholders about how you’re using AI. And then you also want to be thinking about ecosystem partnerships and how AI can help you extend your market influence and your reach through some of those partnerships. You might be doing scenario simulations for practice. You might be integrating some copilots with employers so that you are surfacing just-in-time resources. You might be leveraging those coaching experiences. There are a lot of ways at this point that AI use could play out once you’re at that Innovative stage.
Jeff Cobb: [00:25:01] And you want to continue to develop KPIs that are a match for this stage. Obviously, you can build on everything that we’ve talked about for the earlier stages. But you’re going to want to look to the percentage of revenue that’s coming from AI-enabled products as you start to introduce those products. You want to look at the renewal and the retention rates around those products and what AI is helping to drive. And you want to be able to measure demonstrable performance gains. If you’re starting to individualize those learning experiences, those pathways, what kind of performance gains is that resulting in? Of course, you’re going to want to use those in your forward-looking marketing.
Summary of AI Use at Different Stages of Maturity
Celisa Steele: [00:25:40] To summarize, the story of AI use in learning businesses moves from zero, if we want to start there…
Jeff Cobb: [00:25:47] …and we probably should start there because we know a lot of organizations are at zero right now.
Celisa Steele: [00:25:51] That’s no conscious use of AI, up to ad hoc use of AI, where it’s much more hit or miss—it gets used sometimes but not by everyone. Then it moves to inconsistent use without clear goals at the less mature end. And then we’re moving towards much more consistent, embedded, strategic use of AI as we move towards the higher ends of maturity and up to that Innovative stage.
Jeff Cobb: [00:26:16] The movement is from initial engagement to experimentation to operational enhancement to strategic support to playing a real role in differentiation and innovation for your learning business.
Celisa Steele: [00:26:31] It’s important for us to acknowledge here that we don’t see doing nothing with AI as a viable option. Even if your learning business decides to do nothing with AI, AI still arrives. It’s going to be that curious (or rogue) staff person who’s going to be engaging in the use of AI. Or AI is going to be baked into some tool that you are already using. Again, it doesn’t seem viable to us to say, “We don’t use AI.”
Base Requirements and Pitfalls
Jeff Cobb: [00:27:02] No matter where your maturity is, you’re going to need to have some things in place if you’re going to take advantage of AI. There are some base requirements for any kind of AI use case that you might pursue.
Celisa Steele: [00:27:14] You want to have these prerequisites in place. First, you need that AI policy, and we advocate for being short and clear so that it’s the kind of thing that someone’s eyes don’t glaze over as they’re trying to read through it. You want it to be short and understandable. You want to cover what’s allowed, what’s off-limits. You want to have some rules around sensitive data. You want to make sure that there’s a clear path for a staff person who wants to do something that’s not covered by the policy—who do they go to, either to request a special use or to say, “Hey, I think we need to change this in the policy”? And you need to have a plan for revisiting that AI policy on a regular basis, on a short time frame given all that is happening with AI. We have a forthcoming episode where I talk with Tori Miller Liu, the CEO of AIIM, and they have a very short and sweet AI policy. It’s a good model in terms of having a handful of bullet points about how you can use AI. You’re going to want that AI policy. You’re also going to want a list of safe tools, and let your folks know how to access those. What are the approved apps? Some of these might be embedded in systems that you’re already using.
Celisa Steele: [00:28:34] If it’s more of a freestanding AI tool, then you’re going to make sure that folks know how to access it, log in, and specify any settings they might need to use in the tool—for example, checking a box that says, “Hey, don’t share my data back with the company that’s creating the AI tool.” You’re going to want to make sure that you are clear on privacy, intellectual property, and accessibility issues. What type of content is someone working in your learning business able to share with AI? What sort of consent, if any, do they need? What are the copyright issues? Those sorts of things. And then you’re going to have to have internal training. If you have this policy, if you have the safe tool list, you need to make sure folks are aware of all of that. In that internal training, you might also get into things like prompt basics and cover the dos and don’ts around intellectual property and personally identifiable information. There’s that, and then there’s also the data hygiene issue, which we’ve mentioned before—cleaning up your metadata, making sure that the inputs AI is going to use are accurate so that the AI can make good recommendations because you know that what you’re feeding into it is accurate.
Jeff Cobb: [00:29:57] There are two realities that you always need to plan for. This goes back to being conscious and intentional. First, shadow AI. People are going to use unsanctioned tools unless you provide safe options. To go back to those policies and planning, make sure you’re providing those options. Second, ambient AI. As we’ve already alluded to, vendors are baking AI into the tools that you already own, so you’re probably using AI without necessarily being conscious and intentional about it. You want to raise that up to the conscious and intentional level. And your policy and your safe list should cover both the shadow AI scenario and the ambient AI situation.
Celisa Steele: [00:30:43] We’ll touch on a handful of common pitfalls, things to try to avoid as you’re going further with your use of AI in your learning business. First, the doing-nothing, or thinking-you-can-do-nothing, pitfall. We’ve already covered this, but basically doing nothing doesn’t seem viable. AI exists in your organization, so don’t think that you gain anything by not taking the time to create that AI policy and do the other things that we’re talking about.
Jeff Cobb: [00:31:11] Definitely. As much as we encourage people to experiment and to pilot, you need to watch out for what we call pilot purgatory—lots of tiny tests, but nothing gets scaled or institutionalized. This is very easy to fall into because, once you start playing with AI, it’s easy to leap from one thing to another. “Oh, it can do this, and, oh, it can do that.” But you have to be strategic and figure out what is the thing that we need it to do for us right now.
Celisa Steele: [00:31:37] Another pitfall to avoid is feature shopping when it comes to what you might be looking for from platform vendors. If you are in the market for a new technology system, it can be exciting to see those shiny features around AI that a particular platform offers. But what you need to be thinking about is what are we trying to achieve? What are the outcomes that we want? And then how do we get to those outcomes, which may or may not be through those shiny, exciting features? There might be a non-AI way to do it or a better way to do it with AI.
Jeff Cobb: [00:32:13] It’s the same thing that occurs repeatedly with technologies. We’ve seen it with learning management systems again and again over the years. You go after the shiny bells and whistles, and you forget about what are those outcomes you were trying to achieve. Another one is basically the garbage in/garbage out data rule. Anybody who’s used any form of AI, mainly the generative ChatGPT-/Claude-type models, knows that you have to be careful about what you feed into it, how you contextualize what you feed into it, or what comes out the other end can be junk or worse.
Celisa Steele: [00:32:50] The last pitfall that I’ll mention is you want to avoid taking a set-and-forget approach or trying to totally offload to AI. You need to make sure that you or someone on your team is reviewing what comes out of these AI tools, that you’re making sure to review for bias or other issues, and that you remain engaged in actively assessing what’s working, how it’s working.
Sequencing
Jeff Cobb: [00:33:17] It’s probably worth saying at least a word about the sequencing of getting AI into your learning business, and we’ll propose a simple order of operations for that. First, start with policy. Make sure you’ve got those policies in place. Second, data. Make sure you’ve got your data in order, anything that’s going to be fed into the AI, that you’ve got that cleaned up, that it is good data, that you know how you’re going to contextualize it going into the AI. Then do those pilots. We said be careful with those, but you do need to do them to get your footing and figure out what the next steps are going to be. Next, review the results that you’re getting. Then, and only then, move on to trying to scale the positive that you’ve achieved out of leveraging AI. And then possibly—and this is not always the end game, but it may be an end game, and it’s something you should definitely keep in mind—productize. If you come up with something as you’re implementing AI that does represent a product that can be put out to your marketplace, you want to be able to go there. Be clear about ownership and responsibility, like anything else you would ever implement. You want clear roles. You want clear measures. You want clear KPIs at each step.
Celisa Steele: [00:34:31] With AI, things move fast, which means plans can age fast. That means you probably want a relatively short window for your AI planning—maybe a 60- or 90-day window. We’re talking months, not years. You’re going to want to revisit any plans or policies that you put in place—maybe that’s even as frequently as on a monthly basis. Just a quick look to see, “Okay, do we need to refresh the scope of what we’re covering or any of the tools in our safe list?” Or “Has anything changed about how we feel about risk related to any of our use of AI?”
Jeff Cobb: [00:35:08] We haven’t used the term much here, but this is about being serious about your AI governance: how are you managing AI throughout your organization over time?
Moving Ahead: This Week and Beyond
Celisa Steele: [00:35:17] We said we would talk about some next steps, both in the shorter term and then the slightly longer term. In terms of what you might do this week, first is identify your learning business’s maturity. You can either use the quick descriptions of the stages that we offered or—and we would love for you to do this—take the online self-assessment that will allow you to see your maturity, both overall and in terms of the five domains covered by the maturity model: Leadership, Strategy, Capacity, Portfolio, and Marketing. Once you know where your maturity is overall and in terms of those domains, you want to pick what’s the domain where you might have the highest amount of leverage if you were to begin using AI or to use AI in a more conscious way in that domain.
Celisa Steele: [00:36:09] This might be where you’re seeing some of your biggest bottlenecks, or it might be the area where you see a promising opportunity. You could also aspire, in the next week, to set up a meeting where you’re going to run through potential use cases that will speak to that high-leverage domain. You’re going to want to get the right folks in the room, you’re going to want to generate some ideas and use cases that AI might help you with, and you’re going to want to then narrow it down to just one. As part of that group meeting, define a KPI of what it looks like to have achieved something meaningful with that use case. Again, you can figure out your maturity, and you could begin to put the mechanics in place of having a 60-minute use-case workshop—get that on the calendar with the right folks.
Jeff Cobb: [00:37:03] Once you’ve done that work, you can move into a rolling 60- to 90-day plan to run the pilot that you’ve sketched out. In the first month of that, you’re going to want to get your ducks in a row; make sure you’ve identified all the elements of the situation that you’re trying to improve upon. Go beyond those use cases to flesh out the situation: how are we trying to change things, whether that’s making something better, filling a gap, or creating something that wasn’t there before? Make sure you’ve got all your data in order and cleaned up—anything that’s going to be used as part of this, whether it’s learner history or meta-tagging. Whatever the case is, get that data cleaned up, and then lock in on those KPIs that you’re going to use to measure the success of the pilot. In your second month, or maybe beginning in that first month, you’re going to be running the pilot, going through the necessary motions, collecting the outcome data that you get from it to be able to compare against that baseline situation that you had defined. And make sure all along you’re documenting what you learn in the process.
Jeff Cobb: [00:38:06] Then sometime in month two going into month three, if it’s all working, then you start looking towards scaling. That may mean investing more in it, putting more resources against it. It may mean ramping up marketing and communication around it if you’re going to take it out to a marketplace at that point. Or, if things aren’t working, making that rapid and disciplined decision to sunset. But be sure to recycle what you have learned in the process of that pilot—that’s one of the main reasons for running pilots—and then take that back to any backlog ideas you have and start the process over again with a new pilot. Now, the guiding principle in all this—to borrow a phrase that Mike Moss, who’s the president of the Society for College and University Planning, used—is that you have permission to play. This is something his organization gives to staff and volunteers at SCUP. You’ve got that permission to play, but make sure you’re learning from that play and carrying those learnings forward.
Wrap-Up and Recap
Jeff Cobb: [00:39:11] We’ll wrap up in just a moment with a recap of what we’ve discussed in our look at AI’s place in the Learning Business Maturity Model.
Jeff Cobb: [00:39:34] If you enjoy the Leading Learning Podcast, please share this episode or another episode with a colleague or co-worker you feel would appreciate and get value from it.
Celisa Steele: [00:39:44] AI won’t magically fix an immature learning business, but AI, used thoughtfully, can help a learning business on its path to greater maturity, on that move from Static to Reactive to Proactive to Innovative.
Jeff Cobb: [00:39:57] AI can play a role in all five domains covered in the Learning Business Maturity Model: Leadership, Strategy, Portfolio, Marketing, and Capacity.
Celisa Steele: [00:40:07] We hope that thinking about AI within the context of the Learning Business Maturity Model will help you think about where AI might responsibly help you progress in the near term.
Jeff Cobb: [00:40:17] Give that question some thought on your own and with your team: Where can AI amplify your maturity in the next 30 to 90 days?
Celisa Steele: [00:40:26] Thanks again, and see you next time on the Leading Learning Podcast.
To make sure you catch all future episodes, please subscribe on Apple Podcasts, Spotify, or wherever you listen to podcasts. Subscribing also gives us some data on the impact of the podcast.
We’d be grateful if you’d rate us on Apple Podcasts or wherever you listen. Reviews and ratings help the podcast show up when people search for content on leading a learning business.
Finally, follow us and share the word about Leading Learning. You can find us on LinkedIn.
