This episode will take 30+ minutes to listen to—or less if you like to speed up the audio. But, however long it takes you to get through the episode, we hope you’ll take away at least three things you can use in your learning business. Find a sheet of paper or open a note-taking app, and write down the numbers 1, 2, and 3. As you listen to this conversation with Dr. Brian McGowan, fill in a takeaway beside each number.
Brian has studied clinician learning for the last 20 years, and, for the last 14, he has specifically focused on the intersection of behavioral science and learning science. Brian and Leading Learning Podcast co-host Celisa Steele talk about what Malcolm Knowles got wrong about adult learners, practical ways to work with less-than-perfect learners and less-than-perfect learning designers, working out loud, what the long-term impact of COVID might be on learning, the uses of generative AI to support learning businesses, podcasting as an efficient way to share research and ideas, and more.
As you listen, remember to go back to your sheet of paper or your app and take note of at least three takeaways.
To tune in, listen below. To make sure you catch all future episodes, be sure to subscribe via RSS, Apple Podcasts, Spotify, Stitcher Radio, iHeartRadio, PodBean, or any podcatcher service you may use (e.g., Overcast). And, if you like the podcast, be sure to give it a tweet.
Listen to the Show
Access the Transcript
Read the Show Notes
Brian McGowan: [00:00:00] I don’t like wasting time, and it just feels like, if you’re not learning actively, intentionally learning when you’re building content or running training programs, you’re just at dramatic risk of wasting time.
Celisa Steele: [00:00:19] I’m Celisa Steele.
Jeff Cobb: [00:00:20] I’m Jeff Cobb, and this is the Leading Learning Podcast.
Jeff Cobb: [00:00:28] You’re going to be listening to this podcast for the next 30+ minutes, maybe less if you’d like to speed up the audio. But, however long you take to get through the episode, we hope you’ll take away at least three things you can use in your learning business. Find a sheet of paper or open a note-taking app, and write down the numbers 1, 2, and 3. As you listen to this conversation with Dr. Brian McGowan, fill in a takeaway beside each number. We invited Brian back to the podcast—this is his fourth time—because he is always thoughtful and often provocative. If you work in medical education, you should definitely follow Brian and know his work. If you don’t work in medical education, you should still follow Brian because he has a lot to offer anyone thinking about designing adult learning that works and creates desired behavior change.
Jeff Cobb: [00:01:25] By way of background, Dr. Brian McGowan has studied clinician learning for the last 20 years, and, for the last 14, he has specifically focused on the intersection of behavioral science and learning science. Brian co-founded ArcheMedX, where he serves as chief learning officer. In this episode, number 381, Brian and Celisa talk about what Malcolm Knowles got wrong about adult learners, practical ways to work with less-than-perfect learners and less-than-perfect learning designers, working out loud, what the long-term impact of COVID might be on learning, the uses of generative AI to support learning businesses, podcasting as an efficient way to share research and ideas, and more. As you listen to Brian and Celisa, remember to go back to your sheet of paper or your app and take note of at least three takeaways. Celisa and Brian spoke in September 2023.
Celisa Steele: [00:02:33] So tell us a little bit more about ArcheMedX and the work that you do there.
Brian McGowan: [00:02:37] Sure. ArcheMedX was actually intended to be a SaaS e-learning and analytics platform that was built around my original research, referred to as the Learning Actions Model. It’s an instructional design model that brings together, hopefully, the best in behavioral science and the best in learning science. And so we started it. At that time, I’d just left Pfizer and Wyeth, where I oversaw all continuing medical education at Wyeth and all oncology education at Pfizer. So I had this 10-, 12-year experience in continuing medical education. So we started ArcheMedX servicing the medical education community. And then, about four or five years ago, we spun up a new market sector, brought in some really, really talented clinical research professionals, and now spend about half of our time powering the continuing education activities and about half of our time powering a lot of things related to clinical trials, which I didn’t know much about six years ago, but, man, is it rewarding to manage site initiation visits and the training that starts up a study site or clinical research associate training. There are so many opportunities for training and performance improvement on that side of the business.
The Impact of COVID on Learning
Celisa Steele: [00:04:01] Very cool. So, Brian, you’re a return guest on the Leading Learning Podcast, a triple-peat, in fact. It goes back a ways: episodes 29 and 45. And then, in episode 100, you turned the tables on me and Jeff and interviewed us for that milestone episode. But even that one, episode 100, was back in 2017, so it’s been a while since you’ve been on the podcast, and a lot has happened in that time, the coronavirus pandemic being one thing that pops to mind. I’m curious to get your take: What have you seen as the impact of the pandemic on learning? Or, to the extent that you want to take a more future-oriented lens, what do you predict the lasting impact of the pandemic will be?
Brian McGowan: [00:04:58] Yes. I was saying to someone earlier today, “If I come back for a fifth time, do I get one of those Saturday Night Live robes?” The Steve Martin tribute. If you listened to the prior podcasts, you may not be surprised that I tend to aggressively question these types of things. I absolutely believe that the pandemic provided or required experimentation across all learning businesses. Everybody had to do things differently overnight. I use the word experimentation, and it’s doing some heavy lifting in that sentence, because you would hope that experimentation often meant intentional, proactive planning and then some learning that came out of it, whether it was A/B testing or just convenience sampling, and lessons were learned, and those lessons were then integrated back into practice. You would hope that’s the case.
Brian McGowan: [00:06:00] I think, by and large, my general takeaway is that, within three or four years, there may not be much impact. We had virtual training before. Maybe there was this dramatic uptick in hybrid programming during COVID, and we’re already seeing that it’s all but disappeared. There are very few people, for good reasons, I think, who are managing or playing around with hybrid programming. Learning science didn’t really change that much. I think some people got more comfortable with technology, and hopefully that helps, but I’m not sure they got good with the technology in a way that made training or teaching better. I think they just got a little bit more familiar with the technology. I don’t know. I would hope that there’s some impact, but the cynic in me thinks that the people who learned are actually going to rise to the top, but maybe 90 percent of the community is going to fall back to their prior processes.
Celisa Steele: [00:06:57] I agree. I think there was a lot of reactive experimentation (to use your loaded word there) and not a lot of intentional thought. And part of that was, just in the moment, a lot of organizations didn’t feel like they had the time to be intentional, but obviously we’re well past that now. There is plenty of time to be intentional, so it will be interesting.
A Focus on Learning Science and Research-Backed Practices
Celisa Steele: [00:07:18] Your work is evidence-based. It’s science-backed. That’s an emphasis that I think runs throughout what you do. Would you talk a little bit about how you came to have that focus on research and on science?
Brian McGowan: [00:07:36] Is it trained, or is it inherent? It’s an interesting question. So I earned my PhD in biomedical physiology, and for five or seven years combined between the PhD training and postdoctoral fellowships I was in a research lab doing experiments or in a transplant center doing experiments. And so I only know one way to think about these things. And when you’re doing experiments that may take 12 hours, and each step in the process is critical to make sure variation is limited, you learn quickly not to make assumptions, to measure twice and cut once. And then, when you get out into the real world, where you’re not in this white lab coat, not surrounded by test tubes, it just transitions into some kind of critical-thinking skill set. When I was on the agency side or overseeing editorial services and bringing in new medical writers, whether it was confirmation bias or not, I almost always found myself having more success bringing in PhDs, who were trained how to ask questions and trained how to think more than trained how to apply knowledge. And so I think that training to ask questions, to leave no stone unturned, helps a lot.
Brian McGowan: [00:09:01] And so, when I got into teaching as a graduate assistant, I realized that some people are learning, and some people are engaged, and some people aren’t. And yet some people do well on the tests, and some people don’t, and there’s not a perfect correlation between those things. It starts to raise questions—cognitive dissonance. You’re trying to figure out what’s working for some people, what’s not working for others. And it’s almost like overnight the types of questions I used to ask about cardiomyocytes and electrophysiology were now being replaced in my ruminating, chatter-filled mind as I was lying in bed thinking, “I wonder why Fred in the third row didn’t seem interested in learning today even though I tried to scaffold him? What was it that wasn’t working?” And so the next day I would come in to teach that course, and I’d try something a little bit different, and I had a lab notebook, just like I had as a research scientist, where I tracked what worked and what didn’t work. And so maybe I was inclined to that, which led me to the PhD, or maybe the PhD just did that for me and taught me how to think through those things. But I don’t like wasting time, and it just feels like, if you’re not learning actively, intentionally learning when you’re building content or running training programs, you’re just at dramatic risk of wasting time.
Partner with Tagoras
Jeff Cobb: [00:10:31] At Tagoras, we’re experts in the global business of lifelong learning, and we use our expertise to help clients better understand their markets, connect with new customers, make the right investment decisions, and grow their learning businesses. We achieve these goals through expert market assessment, strategy formulation, and platform selection services. If you’re looking for a partner to help your learning business achieve greater reach, revenue, and impact, learn more at tagoras.com/services.
Barriers to Adopting Evidence-Based Approaches to Learning
Celisa Steele: [00:11:01] There are all these arguments on the side of taking an evidence-backed approach to learning, and yet I feel like there are still plenty of examples of learning products that aren’t really based on learning science or certainly don’t have necessarily an understanding of behavioral science. So why do you think that is? What are some of the barriers and/or hurdles that are preventing organizations from really saying, “Yes, let’s go all in. Let’s really take an evidence-based approach when we’re developing our professional development, our continuing education, etcetera”?
Brian McGowan: [00:11:36] I have two answers to the question. One is I think you have to respect the fact that most people are doing a job. And their job is to deliver training, or their job is to execute on some project management plan. It really is—overusing the word intentional—a very intentional life choice to view your job as a career and your work as a profession. And, when you do that, when you elevate your thought of what you’re trying to accomplish, then the idea of evidence-based practice and critical thinking is truly a no-brainer. But, if it’s a nine-to-five, and you’re trying to get through what you need to get through, the idea of measuring twice and cutting once just means you have to do three things now. If you just cut, you have to do one thing. And in some of these workplaces, when you may be overloaded with 20 different projects and different timelines, the idea of measuring is a tomorrow fix. The idea of cutting is a today fix. And there are a lot of people who don’t have the bandwidth, the resources, or the interest to focus on tomorrow fixes. That’s my first answer. My second answer—and this could derail us for a second, so I will absolutely defer to you to get us back on course—is that your argument about evidence-based learning science suggests that the learning science we all know and are familiar with is actually evidence-based. And I’ll just throw Malcolm Knowles under a bus for a split second.
Celisa Steele: [00:13:29] All right. Let’s hear it.
Brian McGowan: [00:13:31] Between 1975 and 1980, Malcolm Knowles introduced andragogy, and basically for the last 45 to 50 years, anytime someone says “learning science” at any of the conferences you and I have ever been at, the conferences that you have run—which are great, by the way—when they say “learning science,” it’s basically a euphemism for andragogy and adult learning. And, if we summarize andragogy to two or three points, it’s that adults are self-directed, that they’re autonomous, that their learning is largely experientially based or anchored. At the same exact time Malcolm was explaining andragogy to us all, behavioral scientists and cognitive scientists were saying, “You know what adults aren’t? They aren’t self-directed. They aren’t autonomous. Their perceptions versus their realities are very different. They make choices that are often based on quick thinking, fast thinking, heuristics, and biases.”
Brian McGowan: [00:14:34] And the idea that an adult is so rational that they can identify problems in their own practice, seek out the best education that’s available, sit through that education, and connect it through all these schema constructions that we talk about in neuroscience-based education, pardon my French, it’s bullshit. So it’s a framework that I think helps educational designers, instructional designers to lay out best practices, but they’re laying out best practices for best-case learners. And I would argue that 95 percent of the people in your classrooms or participating in your activities are not best-case learners. They’re Fred in the third row who’s more worried about his kid’s soccer practice or the five projects he’s got to get up and running. They’re not automatons, not rational, perfect learners. And so the very premise, the core tenet of what we think about learning science being evidence-based, I could strongly argue that there’s a lot less evidence behind it than we tend to believe there is.
Celisa Steele: [00:15:52] With both of your answers to that question, there’s the underlying theme of the ideal developer or the ideal learner. There’s this ideal. And so, if you’re really bought into developing effective learning products, then you’re going to put in that extra time to measure twice and then cut once. And then, if you’re that ideal learner, and you really are highly motivated, and you know what you want, and you’re going to be able to fully engage, then you’re going to get what you need to out of that learning experience. And then I think what you’re making a very cogent point about is the fact that those ideals are very, very rare. So what do we do with that?
Brian McGowan: [00:16:33] On the designer side, on the educator side, best intentions are, “I’m going to do something different. I’m going to do something innovative. I heard this thing that someone said works better.” But, if your ears are open, your eyes are open, you can come across 25 of those things. What is the innovation that’s actually best? And the only way you can sort through all of that is to dedicate the time through your own professional development to ask over and over again, “Why?” or “Tell me more” or “What’s the evidence behind it?” Clark Quinn has done a great job with his myth busting. Will Thalheimer. Julie Dirksen. There’s the Learning Guild. The Debunker Club on LinkedIn is another good one. There are a lot of people out there who are wired to unravel these myths. I would suggest that each project we run is an opportunity for a small experiment. Just write down, before you run the project, “I’m going to try this, and I think learning will be better,” or “I think engagement will be better,” or “I think satisfaction will be a bit better.” And, if there’s any way to test it, great. Maybe you do it for the first 50 learners, and you make one small tweak, and you watch the next 50 learners.
Brian McGowan: [00:17:55] These small things, I think, will help people start to recognize how often their assumptions are wrong. And once you start seeing your assumptions are wrong frequently enough—hopefully it doesn’t become paralyzing because failure hurts—these small, little failures should lead you to recalibrate your place on the Dunning-Kruger curve. Let’s make sure that our confidence and our competence are appropriately aligned for the learner. The other side of it is that I think a lot of adult learners go through training—and let’s take the check-the-box training off the list, people who are just there because they have to be—but, even for the people who are actually going to learn, I still don’t think they’re at their best as learners most of the time. And I say that because it’s so easy to validate that there was some effect of learning, and that confuses or confounds the learner into thinking that learning was effective. And so, if you sit through a half-hour course, or you sit through this 25-minute podcast, maybe there’s one thing that you take away that you’re like, “Aha, I’m going to apply that.” Hopefully, you and I can come up with one thing that people can apply, but, in reality, in 25 or 30 minutes, there are probably six or seven important takeaways.
Brian McGowan: [00:19:13] And so a learner walks away thinking, “I remember this one thing—that was a good learning experience.” And that inefficiency, I think, is the poison in the andragogy model. Yes, adult learners can usually find one thing that they take away, and, if you’re like I am, you’ve talked to thousands of learners that will say, “I attend a conference to learn one thing I can do better in practice.” Meanwhile, the flight was $1,000, the hotel was $2,000, the time out of office, the missed patient visits—all of that, and you come back with one thing. We have to do better as a profession than that. I think there are broad fixes for that, but, by and large, it’s the same problem that stops us from exercising when we know we need to exercise or eating right when we know we need to eat right. It’s just the fast-thinking reality of our human irrationality. You make the smallest steps you can. You help when you can. But don’t make assumptions that all of our learning science is evidence-based.
Celisa Steele: [00:20:19] Wow. You’re being a bit of a downer here, Brian, but I think your excitement and what you see as the possibility of effective learning also come through. So it’s very interesting to hear both in your answers.
Brian McGowan: [00:20:37] What if our learners could learn two things? So the optimism is there. People won’t be perfect learners. I’m not. I’m closed-minded, open-minded, closed-minded. Attention ebbs and flows, like everybody else’s. But I don’t look at it as a failure. I look at it as an opportunity. A year from now, I’d like every learner I talk to to say, “I took two things away from your….” That just doubled the impact of learning because their expectation wasn’t, “I want to learn one thing.” Their expectation and mindset coming to the training was, “I want to learn two things.” So start each activity by saying, “We’re all here. We’re going to be in this room for 45 minutes or an hour. I want everybody to write down on the side of their paper three numbers: 1, 2, 3. And, as you hear something important, just write a note: 1, 2, 3.” Just that framing of the learning experience, providing that structure and setting the expectation: we’re going to talk about a lot, but I principally want you to boil your experience down to three things. Well, now we’ve just tripled the effectiveness of the education. There’s a lot of optimism there.
How to Effectively Spread New Ideas and Research
Celisa Steele: [00:21:57] Sharing new research, sharing new ideas is a big part of what you do, and it can be related to the learning science side of things. It can be related more to the specific fields. I know that on LinkedIn, for example, you often share an article and then make a note about a few things that you took away from that. So talk a little bit about how you think we can effectively spread the word about new ideas, new research, new information.
Brian McGowan: [00:22:28] Either my tombstone or, if I can get permission from my wife, a tattoo right across the shoulder blades would say, “His entire life was focused on working out loud.” I think that’s part of the difference between a job, a career, and a profession. As a profession, I feel like what I learn is probably as important to the profession as it is to me. So, if you go on my Twitter account or look through LinkedIn, almost on a daily basis, certainly multiple times a week, I learn something new. And my guess is—I hope this comes across as a humble comment—that, if I learn something new, there’s probably somebody else out there who would benefit from learning that same thing. And so I work out loud. I take notes. Usually, if I’m using my note-taking app, I will just copy the note I took for myself and post it on LinkedIn. Or I post it on my company’s Slack channel. And often I do all three. And so that idea of working out loud doesn’t feel like any additional work. I think a lot of people look at all this stuff that I’ll post either on Twitter or LinkedIn and think, “Is there a marketing campaign behind that or some kind of effort?” And it’s not. There’s nothing. There’s no time I dedicate to creating social media content, but I dedicate an hour to 90 minutes a day to learn. And the byproduct of that is I’m taking notes and building my own personal learning network, and it doesn’t take more than a copy and a paste to add a note to Twitter or to my LinkedIn profile.
The Uses and Implications of Generative Artificial Intelligence (AI) for Learning
Celisa Steele: [00:24:12] One of the things I’ve seen you sharing about—so I’d be curious to hear your thoughts on this—is generative AI and its uses and implications for learning.
Brian McGowan: [00:24:26] Yes, it’s really been fun over the last 10 months to see these things come out. And, if you want to look under the hood a little bit, my TikTok feed is 70 percent GenAI influencers telling me every day, “These are the 10 most important GenAI apps,” or “These are the 10 prompts that you’ll need,” and it’s getting rapid-fired at me in these 32-second soundbites, minute-long soundbites. And so, if I find something that I’ve heard two or three times from two or three different people, then I’ll take a note to do something about it. I’ve tried ChatGPT, Claude, Perplexity, DALL-E 2, and remove.bg, which I just used yesterday for a headshot, so I’ll probably send you a new headshot. But who needs Photoshop anymore? You can upload a photo, and it perfectly removes the backdrop for you. You can do the same thing with videos. Maybe a mellow note, but my wife and I spent a half an hour one day—her mom’s got end-stage dementia—and we thought, “Let’s see what we can do with her eulogy.” This was my wife’s idea. She said, “I’ve got all these stories in my mind, but I’m not a good public speaker, and I’m not a really good writer.”
Brian McGowan: [00:25:46] And so she started to dictate the stories and things that she loved about her mom. And she’s like, “Okay, now make this a little funnier. Now add a joke here.” And that was an introduction. It was a real-life working experience. So I play with it all the time. As far as learning businesses, which I know your audience is really interested in, I think it’s transformative. For ops, marketing, messaging, content development—not learning content development but general business content development—I think it’s transformative. If you’re not using it already, then you’re probably not working as smartly or productively as you could be. From a learning side—not the learning business, but from learning science, to close that loop—I’m unsure. I think that, certainly, the Claude, Perplexity, ChatGPT, LLaMA, the major models that are out there, I don’t think they’re trained in enough focused expertise to be that valuable right now when it comes to professional development training.
Brian McGowan: [00:26:57] When I wrote #SocialQI, which is now 11 years ago, I had a world-renowned cardiologist say to me, “Why do I need social media? There are only five people in the world that know what I know.” So, when you have expertise at the top of the pyramid, your expertise isn’t in a trained AI model. Your expertise may not be in a community of more than 100 people. For more subjective exercises, I think GenAI is a really good brainstorming starting point. But, when it’s so objective, specific, and evidence-based that it’s “How do you treat somebody with third left toenail fungus?”, then, certainly on the healthcare side, I think we’re going to need to get to a place where we’re training models on different content. I think that’s on the roadmap, but ultimately the art and the magic of teaching and training are always going to rely on some kind of human interaction. Again, what’s the old saying? “You’re not going to lose your job to AI, but you’re going to lose your job to someone like you who knows how to use it effectively.” And that’s a pretty powerful takeaway. That might be our third takeaway of today’s podcast. People are writing those notes.
Celisa Steele: [00:28:21] Taking their notes. That’s right.
Possibilities of Podcasting for Learning Businesses
Celisa Steele: [00:28:29] I do want to talk a little bit about podcasting because I know you’ve been involved in podcasts for a while, beyond the three appearances here on the Leading Learning Podcast. So talk a little bit about your experience with podcasting and the role that you see podcasting potentially playing in support of learning or in support of learning businesses. Your thoughts there?
Brian McGowan: [00:28:52] I started a podcast for the Alliance for Continuing Education in the Health Professions about three and a half years ago and started interviewing authors of papers related to continuing education and healthcare, wherever the papers were published. I would just work out loud. I’d find a paper I thought was meaningful and that other people could benefit from. Wouldn’t it be helpful to get that person, the lead author, to actually try explaining that publication in a way that isn’t the science-written, passively worded mush we put in journals? And, for most of the audience, for the Alliance specifically, there are research scientists, and then there are the people who do the education. They help build the slide decks. They’re junior medical writers, or they’re project managers. They’re not research scientists. So even reading a research article can be really intimidating, and, again, it’s a tomorrow fix, not a today fix. The idea was, could we get those authors to deconstruct, simplify, and really focus on actionable insights? That podcast ended up morphing: I interviewed 20+ of these 60-, 70-, even 80-year-old educational research scientists whose entire careers had lifted our profession, and I really wanted to tell their stories. A really, really powerful set of interviews. And so those stories can be told at scale, or those articles can be deconstructed at scale. What’s the lift of a podcast? Celisa, you and I have e-mailed back and forth two or three times, a couple LinkedIn messages, a brainstorming of four or five questions, and then a conversation.
Brian McGowan: [00:30:41] I didn’t put makeup on today. We’re not even worried about the video part of this. So the lift of the podcast is not that resource-intensive. And so, if you can do this, it offers another channel. I myself am an avid podcast listener. I walk for almost an hour and a half every morning, and I’m listening to podcasts. I split my time between sports podcasts and professional podcasts. I think the format is growing exponentially. I think the format as a means of sharing and disseminating research to different audiences is as powerful as any format because it can be almost real-time. It’s much more responsive. It’s much quicker. And so we took that experience with the Alliance, with the Legends Interview Series, and then The Journal of Continuing Education in the Health Professions, which is really our landmark journal—it’s managed by the Alliance, SACME, and AHME, hospital medical educators—they brought me in. And now, each issue of that journal, I get a list of the 13 or 15 publications about 60 days before the journal is published. I reach out to those authors, and we’re beginning to coordinate it now.
Brian McGowan: [00:32:01] So now it’s not just a podcast that people can listen to, but it’s actually a supplementary asset to the journal. And kudos to Wolters Kluwer, the publisher of that journal. When we proposed this, the production assistants were like, “Oh, we have an entire workflow for how to support journals with podcasts.” They had training on their Web site that helped the editorial board embrace what this was all about and a checklist to work through. So I reach out to the authors. We sit down. In two weeks I’m doing an episode on Monday, on Tuesday, and on Wednesday, just coming off the back of summer break. And they’ll all be released every three to four weeks. The content is a perpetual motion machine. People are submitting their content to the journal. It’s a highlight of their year. Each journal article that you get published is like, “Let me add that to my CV. Let me add that to my resume.” Now, imagine someone from that journal reaches out to you and says, “Would you like to have a conversation about your research?” It’s a win-win-win-win, probably four ways: it’s winning for the journal, for the profession, for the author, and for the readers. And the lift isn’t that heavy.
Personal Approach to Lifelong Learning
Celisa Steele: [00:33:18] I just want to ask a question that we ask of all interviewees. You’ve had this question before, but, again, it’s been a while since you’ve been on, so I’ll be curious to go back, review what you’ve said in the past, and find out if there are differences. But the question is simply this: What approaches, habits, and sources do you use as you continue to grow professionally and personally?
Brian McGowan: [00:33:44] It’s a really important lesson to learn, and I think it is not learned by many people until it’s way too late. It’s one of those bedside regrets; it’s almost like saving for retirement. If you invest early, it compounds. It’s not linear. It’s exponential if you invest early in building a professional or personal learning network. Who are the people I want to follow? What are the resources I need to stay abreast of? If you build that early enough in your career, then, as new channels become available—TikTok wasn’t available a year and a half ago, and now I spend as much time on that as LinkedIn or other things—I know how to cultivate those resources because I’ve now been doing it for 25 years. And so, on any given day, a half dozen Google Alerts show up in my inbox. It’s passive. I don’t have to do anything. Each week I get about a half dozen saved searches coming from PubMed or various lit libraries that I’ve saved searches in. I don’t have to do anything. It just shows up in my inbox. I open up Twitter for my professional Twitter account. It is littered with med ed, clinical trials, researchers, health policy folks. I don’t have to do anything. TikTok now, as I said, is littered too. So each of those streams is set up and funneling directly into my frontal cortex.
Brian McGowan: [00:35:16] And there are days that go by where I don’t have time for any of it. Sometimes there’s a week or two that goes by, and I don’t have time for any of it. But, when I do, whether it’s 5 minutes, 10 minutes, or 15 minutes, it’s there for me. I think what a lot of people believe about their lifelong learning is that they have to go ask questions, and they have to go look for information from time to time. I don’t think they recognize just how much energy, effort, or willpower that takes in the moment. Because, again, that’s a tomorrow fix, and that tomorrow fix is never going to usurp a today fix. But my learning network was a yesterday fix, and it’s been serving me really well for 20 years, and it’s there. I’m basically swimming in lessons I can learn every day, and, if I make a little bit of time for it, it almost always proves valuable. But I’m going to take five seconds of my final comment to say go and look at University of Penn’s Nudge toolkit. I just want everyone out there to go search “University of Pennsylvania’s Nudge toolkit” and spend five minutes looking through that. I just want a free advertisement for University of Pennsylvania.
Celisa Steele: [00:36:36] Cool. So “nudge” as in trying to push people to perform in certain ways or do certain things?
Brian McGowan: [00:36:43] Correct. Nudge [noudj] if you have a certain proclivity. Nudge [nuhj] if you’re a behavioral scientist. But, if you go to the Nudge toolkit, it’s basically all the same premise. Is there a way that we can make the easiest decisions, the right decisions, the best decisions really easy for people to take? If we all can think that way, then I think our profession is going to move pretty far forward, pretty far fast. If that makes sense. It doesn’t, but I’m sure I said some smart things earlier, so we’ll let that one go.
Jeff Cobb: [00:37:21] Dr. Brian McGowan has indeed said some smart things in this interview with Celisa and elsewhere. Brian is co-founder and CLO of ArcheMedX, which helps life sciences and healthcare organizations better equip, evaluate, and predict team and clinician performance. Connect with Brian on LinkedIn or X (formerly Twitter).
To make sure you don’t miss new episodes, we encourage you to subscribe via RSS, Apple Podcasts, Spotify, Stitcher Radio, iHeartRadio, PodBean, or any podcatcher service you may use (e.g., Overcast). Subscribing also gives us some data on the impact of the podcast.
We’d also be grateful if you would take a minute to rate us on Apple Podcasts or wherever you listen. We personally appreciate reviews and ratings, and they help us show up when people search for content on leading a learning business.