This is the last installment in our seven-episode series on learning science’s role in a learning business, and we’ve covered a variety of interesting topics. But we want learning science to be more than interesting. We want you to apply it so you can elevate the success and impact of your learning business, as well as the success and impact of the learners you serve.
In this episode, we revisit comments from the interviewees in the series about the one tenet or aspect of learning science they wished was better understood by those designing and delivering learning for adult lifelong learners. We connect their perspectives and uncover the common theme: actionability. And, to help your learning business take the important step of applying learning science, we offer a framework we developed—the MIDDLE ME learning product life cycle—to get you started.
To tune in, listen below. To make sure you catch all future episodes, be sure to subscribe via RSS, Apple Podcasts, Spotify, Stitcher Radio, iHeartRadio, PodBean, or any podcatcher service you may use (e.g., Overcast). And, if you like the podcast, be sure to give it a tweet.
Listen to the Show
Access the Transcript
Download a PDF transcript of this episode’s audio.
Read the Show Notes
Perspectives on Key Tenets of Learning Science
[00:27] – We spoke with Megan Sumeracki for the episode that focused on behavioral and cognitive psychology. She explained that we’re actually really bad at putting ourselves in the future context related to what we’re learning and assessing how well we’d do in that context.
We’re bad judges of our own learning. That is a significant finding. It shows that we can’t just ask learners if they’re getting it and if they’ll be able to do it when the time comes. We need to have them try to do it and see. That’s when we’ll know—and they’ll know—whether or not they’ve really learned it.
Celisa Steele
[04:04] – We asked learning designer and technologist Myra Roldan what aspect of learning science she wishes those designing and delivering adult lifelong learning better understood. Her comments connect with Megan’s. Myra explained that a lot of instruction focuses on theory and understanding concepts, but instruction tends to fall flat on the hands-on, application piece.
What both Megan and Myra homed in on is cause and effect—or at least a logical consequence. If learners aren’t good judges of whether they’ve learned something, you need to give them opportunities to try things out. Those hands-on opportunities allow them to not only learn but to also get some feedback about their learning.
By doing something, learners often produce something that an instructor can evaluate and provide feedback on too. There’s a lot of work to be done in helping instructors and subject matter experts learn how to scaffold hands-on, application opportunities well.
Not just any doing will have good results—learners need doing and application that are spot-on, relevant, and align with how they will use whatever they are learning in the future.
That can sound obvious and easy, but we know from our own experience that when you’re trying to design something impactful in a limited amount of time, it can be hard to determine appropriate opportunities for doing and application. It requires work over time, and learning businesses should help their subject matter experts understand the importance of practice and of scaffolding application opportunities.
[07:14] – The aspect of learning science Ruth Colvin Clark wishes learning designers better understood is the fundamentals of mental processes, along with a better appreciation of the limits and strengths of working memory.
This is one I’ve been guilty of, time and time again, and always have to pay attention to. So many subject matter experts, so many presenters, teachers, facilitators are guilty of this. Just stuffing too much into a learning experience, not taking the time to cut down to what’s truly essential and what’s truly reasonable for a learner to process…. Less content really can be a beautiful thing.
Jeff Cobb
[09:35] – The limitations of working memory that Ruth brought up are also an argument for microlearning, which Brenda McLaughlin, CEO at SelfStudy, raised with regard to designing effective learning experiences. She thinks we need more normalcy and systems around creating smaller interactions with education so that we’re not always thinking in terms of mapping to a course but of mapping to a competency or skill.
Brenda’s comments remind us of Cathy Moore’s take that we really need a mindset shift to get away from content-heavy learning options and offerings that can then favor more doing because there’s less emphasis on content. Microlearning embraces the less-is-more approach, and Brenda points out that, to realize its full potential, it isn’t just chunking longer content; it’s truly a different approach.
[12:04] – We asked evaluation experts Rob Brinkerhoff and Daniela Schroeter to pick one aspect of effective evaluation they wished was more broadly understood by those charged with looking at the impact of learning programs.
Daniela emphasized that there has to be a focus on performance—what the learning is about, what you want to get out of it, and if it’s actually being used.
Rob stressed that an evaluation must be actionable. You need to know what the return on investment of the evaluation is, and it has to be actionable because, if you can’t do anything with it, there’s no point. We struggle with this ourselves, and it’s not easy territory, but the Success Case Method is one approach that can be used.
Listen to Mark Nilles speak at the Leading Learning Symposium about how he’s made use of the method to evaluate training programs.
Sponsor: SelfStudy
[15:17] – If you’re looking for a technology partner to help you optimize the learning experiences you offer, check out our sponsor for this series.
SelfStudy is a learning optimization technology company. Grounded in effective learning science and fueled by artificial intelligence and natural language processing, the SelfStudy platform delivers personalized content to anyone who needs to learn either on the go or at their desk. Each user is at the center of their own unique experience, focusing on what they need to learn next.
For organizations, SelfStudy is a complete enterprise solution offering tools to instantly auto-create highly personalized, adaptive learning programs, the ability to fully integrate with your existing LMS or CMS, and the analytics you need to see your members, users, and content in new ways with deeper insights. SelfStudy is your partner for longitudinal assessment, continuing education, professional development, and certification.
Learn more and request a demo to see SelfStudy auto-create questions based on your content at selfstudy.com.
The MIDDLE ME Learning Product Life Cycle
[16:29] – We’d like to offer a framework we developed that can help you think about where and how learning science might play a more strategic, thoughtful, and intentional role in your learning business: the MIDDLE ME learning product life cycle.
The MIDDLE ME learning product life cycle has four phases:
- Market interface (MI): understanding what your learners need and want and communicating with them about the value you have to offer
- Design & development (DD): creating what your learners need and want
- Learning experience (LE): the learners interacting with what you’ve created
- Measurement & evaluation (ME): looking at the impact of learners interacting with what you’ve created
The MIDDLE ME framework can help you focus efforts among your stakeholders: the staff charged with providing the learning you offer, the volunteers, the facilitators, the subject matter experts, and the learners. The framework can give you insight into what’s going on at different points for each of those key stakeholder groups.
[18:32] – Here’s how learning science might fit into each phase of MIDDLE ME:
- Market interface (MI) is all about knowing and connecting with your audience: learners, customers, and your potential learners and customers. Here’s where learner needs assessment and/or market assessment come into play. Your internal team/staff will need to be engaged to conduct the assessments. Once you have the results of those assessments, you need to share them with your designers and developers.
- Design and development (DD) is where you plan and create the products and services that your learners need and want. This is where you cut and chunk to help with the limitations of working memory. You scaffold practice so that learners aren’t passive or getting theory only but instead engage and try things. The main stakeholders here are the people doing the design and development of your learning experiences.
- The learning experience (LE) phase is where learners interact with what you’ve designed and developed based on your market interface. Mostly, you have to make sure that the good design and development work of the previous phase doesn’t get thwarted. This is also where feedback and practice happen. Stakeholders here are the learners and those supporting them in the moment.
- In the measurement and evaluation (ME) phase, we look at the impact of learners interacting with what we designed and developed. We measure and evaluate our products and services and look to draw business insights. As we heard from Rob Brinkerhoff and Daniela Schroeter, this is where you can look for evidence of the impact of the learning experience, and, in particular, you might look for success cases and failures and analyze those. This is valuable in multiple ways—marketing potential, improving offerings, proving value, and more.
We’re providing this MIDDLE ME model as a tool because it can be a simple framework for helping you put learning science into action. Providing education and learning experiences for your adult lifelong learners is really at the core of being in a learning business.
To thrive in the learning business, what you deliver has to be effective. It has to get the learners to where they’re supposed to go, and that’s what applying learning science can make happen for your learning business.
Jeff Cobb
Evidence-Based Practice Requires Evidence
[24:41] – Learning science, as a science, is grounded in evidence. That evidence-based approach is consistent with how we tend to approach life and work—we strive to try and test and study what’s working or not working and why.
We hope you’ll choose at least one action to take in at least one of the four phases of the MIDDLE ME model and then look at the results. Use the evidence you gather to grow and improve your learning business in general and your offerings in particular. Get out and do/try/measure something so you can get some evidence that will allow you to test ideas and hypotheses and make informed decisions throughout MIDDLE ME.
Learning science has the potential to help you across the board with the reach, revenue, and impact of your learning business. It’ll help you as an organization be more successful. It’s going to help your learners be more successful. It’s going to help the field, profession, or industry that they’re working in do better. Then, of course, those served by the learners are going to fare better as well. It really is a win-win-win-win when a learning business leverages learning science.
Celisa Steele
[26:19] – Wrap-up
This is the last episode in the seven-part series on the role of learning science in a learning business. We hope you’ve enjoyed the series, and we’d love to hear your feedback and suggestions for the future. You can leave a comment below or e-mail us at leadinglearning@tagoras.com.
We’ll resume releasing episodes of the Leading Learning Podcast with a new series starting in October 2021, which means you have some time to experiment with infusing learning science more intentionally in your learning business before the next new series airs.
To make sure you don’t miss future episodes, we encourage you to subscribe via RSS, Apple Podcasts, Spotify, Stitcher Radio, iHeartRadio, PodBean, or any podcatcher service you may use (e.g., Overcast). Subscribing also helps us get some data on the impact of the podcast.
Recommendations offer evidence of the podcast’s impact, so please take a minute to rate and review the Leading Learning Podcast at https://www.leadinglearning.com/apple.
We personally appreciate your rating and review, but, more importantly, reviews and ratings play a big role in helping the podcast show up when people search for content on leading a learning business.
We encourage you to learn more about the series sponsor at selfstudy.com.
Finally, consider following us and sharing the good word about Leading Learning. You can find us on Twitter, Facebook, and LinkedIn.
[28:21] – Sign-off
Other Episodes in This Series:
- Learning Science for Learning Impact
- Effective Learning with Learning Scientist Megan Sumeracki
- Needs, Wants, and Learning Science
- Designing Content Scientifically with Ruth Colvin Clark and Myra Roldan
- Practice and Feedback: Evidence-Based Tools for Learning
- Evidence-Based Evaluation with Rob Brinkerhoff and Daniela Schroeter