Episode 22

CET Talks: Accreditation, Learning and Leadership

August 27, 2024 · 23 minutes

The Metrics of Change: Navigating Purposeful Measurement in L&D

In the dynamic field of Learning and Development, understanding and applying effective measurement strategies is key to demonstrating value and enhancing organizational performance. Join Jess Almlie, a seasoned Learning and Performance Strategist, as she unveils how L&D professionals can shift from a “prove it” mindset to a more strategic, decision-making approach. In this episode, Jess explores best practices for defining, implementing, and leveraging measurements that align closely with business goals, ensuring that L&D initiatives drive tangible outcomes. Tune in to learn how you can start measuring what truly matters and transform the impact of learning in your organization.

Listen to the Podcast

Transcription

Host: Welcome to CET Talks, the International Accreditors for Continuing Education and Training’s podcast, where we convene thought leaders in the continuing education and training ecosystem to share ideas, research, best practices, and experiences that promote the creation of a world that learns better. Enjoy the episode.

Randy Bowman: Hello, and welcome to CET Talks. My name is Randy Bowman, president and CEO of IACET, and I am here today with my co-host, Mike Veny, a certified corporate wellness specialist and CEO of an IACET-accredited provider. Good afternoon, Mike. How are you doing today?

Mike Veny: I’m doing wonderful. How are you doing, Randy?

Randy Bowman: I’m doing great. I love that we have Jess Almlie with us today to talk to us about measurement. I think that’s a topic you have a lot to say about.

Mike Veny:  Yeah. You know it’s interesting. When our company was in the process of becoming accredited, I remember the whole thing about measurement. We have to measure stuff. And so I remember saying to myself, do we just need to make up numbers for certain things or what do we need to do? And if you are an aspiring provider out there who’s looking to get accredited, this is a very important episode because it’s really going to help you understand that you don’t just need to make up numbers like I thought I needed to.

Randy Bowman: Oh, that’s always what we struggle with in data. What do you measure? How do you measure? How do you know if the measure is accurate and if it’s meaningful? So, I’m glad we have Jess Almlie with us today. She’s the founder of Learning Business Advisor Consulting, an organization that’s dedicated to helping learning and development leaders and teams to work more strategically, intentionally, and with measurable impact. She is a learning and performance strategist with over 25 years of experience across multiple industries. She holds an MS in educational leadership, a BA in organizational communication, and a certificate in improving human performance from ATD. Jess, welcome.

Jess Almlie: Thank you. I am thrilled to be here, and it is very fun to listen to your preliminary conversation and also to hear that lovely introduction. Thank you.

Randy Bowman:  Well, great. To get us started, can you help our listeners understand the real purpose of measurement in learning and development and why shifting from a ‘prove it’ mindset to an ‘informed decision making’ approach is so crucial?

Jess Almlie: Yeah, of course. This ‘prove it’ mindset, let me describe that. What happens, and Mike, it’s what you said: do I need to make all the numbers up? We, as learning and development professionals, HR, people development, whatever we call ourselves and wherever we find ourselves, oftentimes feel this pressure to provide measurement, but we don’t go into this profession so that we can work with numbers. For most of us, that’s not the reason we started this work. It’s not the reason we continue this work, and so it feels very foreign to us. Yet because we’re getting pressure to provide measurement, we start to feel like we have to provide it in order to prove that we exist, and to prove that we should have jobs. That feels like the reason we need to provide this measurement, because the ask is coming from higher up. It’s coming from the C-suite, from the people who hold the decision-making power about where resources are allocated throughout the organization. When we start to get nervous about this, we don’t know how to do it, and our emotional brain takes over. We’re thinking, “Okay, I have to provide this to prove it.” But that actually causes us more panic than productivity. It’s not helpful to think about it in that way. It’s also not the reason why we should be measuring. We shouldn’t be measuring to prove our existence. We should be measuring so that we can make informed business decisions, and so we can help our business stakeholders do the same. Ultimately, this work isn’t about us. It’s about helping the business move forward and helping people get better at the jobs they do. We need to think about it in terms of making informed business decisions, and some of the business decisions that can come out of our measurement are decisions about our L&D strategy. So it can inform what strategy we create, what strategy we follow, and what’s most important to the business.
It should inform our decisions about the programs, courses, and offerings that we create and maintain. So, what changes might need to be made moving forward? Do we need to iterate? Have we spent too much time and energy in the wrong places? Are we meeting the needs of our stakeholders? Are they clicking on what we are putting out there? Are they finding what they need to do their jobs better? Are there trends in participation that show some times of the year are better than others to engage with our business? Are there certain assets that are clicked on more than others? Those are some of the decisions about programs, courses, and offerings that our measurement can help us make. Then the last business decision is the decision about how we maximize the resources on our own team. Sometimes I think we forget that as a learning and development function, we also need to be measuring the performance of our own team, the capacity of our own team, and what we can contribute. When we take those things and reframe measurement from something we have to do to prove our worth to something we need to do in order to make decisions about our work, it sheds an entirely new, much more positive light on the subject. We approach it from our logical brain as opposed to our emotional brain. When we ask a question like, “What data do I need in order to make business decisions about how to best partner with the business, run my team, and do our best work?”, that’s very different than “What data do I need to prove that I’m worth the expense?” When that happens, there’s a sort of magic, because if you focus on data for decision making, proving your worth isn’t needed any longer; it’s obvious.

Mike Veny: I love your answer. It’s so to the point so we can make informed business decisions. But for the listener who might be part of an L&D team or an instructional designer, where do they begin and what challenges do they face when they’re starting this process of putting in effective measurement strategies?

Jess Almlie: I love that you asked that, and it really goes back, first and foremost, to this need you talked about: do I need to make everything up myself? I’m going to answer this with four different challenges that I see learning and development teams struggle with. One is thinking you need to make everything up yourself. The second is trying to measure after the fact. The third is a lack of access to the data you need. And the fourth is a lack of data in the organization overall. Let me address each of those briefly. So, first, thinking you need to make everything up yourselves. This is part of where that panic comes in. We need to start with the data that already exists in the organization, because if we’re truly going to help improve the performance of people, how are we measuring that performance to begin with? Those are the measures we want to try to impact. We don’t need to create those measures ourselves. We need to look at what already exists in the organization and then work with the organization to determine whether or not our solutions and interventions are helping to move that performance data. Secondly, when we try to measure after the fact, that is so much harder to do because we don’t have any benchmarks at that point. We don’t know if we’ve moved the needle or not. We only know what the outcome was. We don’t know where we started from. When I first started to measure and was first asked to provide more measurement, like a quarterly business report, there wasn’t anything that existed at that moment in my organization. I had to go back and find it, and the legwork you have to do, the time you have to invest in order to go backwards, is so much more intensive than determining what the measure is upfront. And remember, you don’t always have to make it up. It can be something already in the organization that you look at again after the fact.
The third challenge I see is this lack of access to data. For those of us in learning and development, I’m saying you need to look at your business’s performance data. Well, what if I don’t have access to that, maybe because I’ve never asked for it before? First of all, my advice is to start asking to see the data and explain why you’re asking: because you want to better understand the business challenges and successes, and that’s going to make you a better partner in helping people get better at their jobs and move the business forward. So ask the stakeholders, ask your boss, get a business mentor. Some of the most successful people I have met in the L&D industry have a business mentor. It doesn’t have to be a formal mentoring relationship; it’s somebody they meet with on a regular basis who is outside L&D. They’re in the business, and they’re somebody you can start to ask about how the business functions, just to learn more about it. You are also listening for data in every meeting that you sit in: how are they measuring what it is they’re talking about? It’s kind of learning by overhearing. You’re turning over rocks, trying to figure out how the rest of the business is measuring, and to get access to that data, you’re asking questions, you’re asking for it, and you’re explaining why. Then the fourth one that I have found is that there’s simply a lack of data in the organization. I see this most commonly in smaller organizations that have grown very quickly, and they just don’t have the data in place. They know now that they need to change some things. They need to get some systems and processes in place. This is often when learning and development is called in to help create some additional training or structure. Then we’re looking for data and it simply doesn’t exist. So when this happens, there are really two things we can do. We can partner to determine a current performance benchmark before we begin.
So we’re going to work with them, and I have done this before, working with an area to determine this. They said, “We know we need to shorten our ramp time, our onboarding for our new employees.” Well, what is the ramp time now? That’s one of the questions we’re asking. We worked with them to determine what that was, and then we had a benchmark to begin with. The other thing we can do is gather data while we’re creating and delivering. One example: I had worked with an onboarding team that was trying to automate the learning process for customers. This was for a customer organization after a very large sale. As we were rolling out the program, we were gathering data about who was participating and how many hours they were spending in this automated program, versus how many hours our team members were spending one-on-one with customers, and then we had comparison data after the fact. So we were gathering data while we were doing the project. That was maybe a longer answer to the common challenges, but there are quite a few out there.

Randy Bowman: Those are definitely challenges that I think every L&D organization has encountered, at least one or most of them, in its lifetime. You gave us a few examples, but can you share how you approach this at Learning Business Advisor Consulting, and how your approach helps L&D leaders design and implement measurement strategies that are not only strategic but practical and actionable?

Jess Almlie: Yeah, very much so. If we’re looking at an overall measurement strategy, I like to start by outlining three buckets of measurement. I can’t take credit for these. They came from the book Measurement Demystified by Peggi Parskey and David Vance. But this helped me to have some buckets to put what seemed like a very confusing world of measurement into. The first of the three buckets is activity measures. Activity measures are how many, how much, and how long. How many people are participating in your programs? How much is it costing you to create them? How long does it take to create them? How many hours a year are people spending in training? These are the measures that most of us in L&D have the easiest access to because they’re our own data. However, activity measures alone don’t necessarily show if we’re moving the needle or if our programs have benefited anyone; they only tell people that we’re busy. When I put out some activity measures in one of my first attempts, many years ago, to come up with some business reports, my boss said, “You know, that’s great. Those are all great measures, but all they tell me is, you’re busy. Everyone is busy. So what do these measures mean? What else is happening?” So, activity measures is the first bucket; effectiveness measures is the second. That is, did people gain knowledge and did their behavior change? You’re going to hear some parallels to the Kirkpatrick model here; we’re talking about Levels Two and Three. So, did they gain knowledge? Did their behavior change? These are things that we are going to look at, and we want to have some measures in place for this. Sometimes it comes through survey data. Sometimes it comes from talking to supervisors after the fact: has their behavior changed? Sometimes it comes from comparing to those performance metrics that I talked about earlier. There are a number of different ways we can analyze and compare effectiveness measures.
Then, the third bucket is outcome measures. So: activity measures, effectiveness measures, and outcome measures. With outcome measures, we’re asking, did we move the needle? Did performance improve? Did we reduce expenses? Did we increase revenues? Those kinds of questions. The thing I like to think about when I’m designing an overall measurement strategy, or helping someone to do that, is that we want to have activity and effectiveness measures for the majority of the work we do, but the outcome measures we want to reserve for the big rocks, the strategic initiatives the organization is working on. Outcome measures take the most time and effort to attain, so we don’t need to spend all of our time and effort on them. We can do that just for the biggest, major strategic initiatives. The way I recommend anyone start designing and implementing their measurement strategy is with a simple audit of yourself. What measures already exist and what measures are you already gathering? Get a big, long list of that, and a couple of things will likely happen. When I first did this, one thing that happened was I realized I had access to more than I thought I did, which helped me know I could get more answers. The second thing is it really illuminated where the gaps were. I could tell that we did great on activity measures, we did okay on effectiveness measures, and on outcome measures we weren’t as strong. It also illuminated where we needed to implement more of those, so then I could make sure we had measures tied to all of those major strategic initiatives, some outcome measures, and we could see if we needed to add in any effectiveness or activity measures. So, do we have all the buckets? And what business decisions do the measures we are gathering help us to make?
If they don’t help us with any business decisions, it may not be worth it to continue to do the work to gather that information either. That’s a little bit about the strategy of how we design a measurement strategy, if you will. I don’t know if that’s meta or not, but that’s what I’m thinking when I’m working with different organizations to help them.

Mike Veny: You are an amazing guest because you keep answering my questions, and I don’t get to ask them. It’s great. I love it. You’re making my job easier, but I want to ask you this question. For the L&D professional out there that’s listening to this going, “This is great. You’re presenting this so clearly. Is there some type of template you recommend or checklist that someone can use to get started with all these steps in creating effective measurements?”

Jess Almlie: The book Measurement Demystified also has a handbook, or field guide, you can purchase alongside it, and there are some templates in there. If you are just looking at activity, effectiveness, and outcome measures, you can create your own template in an Excel spreadsheet or a Word document. But I know there are some templates available. When I was starting to implement our measurement strategy in my role as vice president of learning, I did use some of the templates out of that resource, Measurement Demystified.

Randy Bowman: Wow, this has been so chock full of information in such a quick time. I look over at the clock and I’m going, oh no, we’re nearing the end, and I feel like we’re just getting started. Here at IACET our vision is a world that learns better. So in your mind as a learning and development professional with your vast experience, when you hear the phrase, ‘a world that learns better’, what does that look like to you?

Jess Almlie: A world that learns better, to me, is one where we are able to eliminate what I call ‘legacy thinking’ about how learning happens: legacy thinking in L&D, in our organizations, and in education. The majority of us grew up in formal classrooms, where a teacher stands in the front of the room, lectures, and then tests us on our learning. We have carried that same format, if you will, into our organizations. This is where we spend our time as L&D. We spend it creating formal e-learnings and formal classroom learning. However, one of my favorite phrases is, “No one comes out of a classroom or an e-learning with a bow on top.” We do not learn everything we need to know in a classroom or in an e-learning. More and more, we have easy access to information outside of work to help us learn, but we are not mimicking that within our organizations. I know I’m not waiting for a class when my vacuum cleaner breaks. I’m going to go on YouTube and figure out how to fix it myself, or I’m going to call a friend. So, why are we thinking that learning at work should be different? I also think sometimes we get this attitude, and this came out of our formal education system, that learning is something that someone does to us or for us. But learning itself is a personal thing. It’s something we do for ourselves. We don’t have to wait for somebody else to tell us to learn. Usually it’s in our moment of need, because that’s when we’re most motivated, or it’s some challenge that’s presented itself and now we want to learn more about it. So, a world that learns better, for me, looks like understanding that learning happens in all places and spaces, and not putting so much emphasis on our formal education and learning systems.

Mike Veny: Jess, thank you so much for delivering an engaging and informative episode. Randy, I don’t know about you—I know this is about measurement, but this was also a mental health episode too, because it gave me a lot of clarity. It reduced my anxiety around this topic and the thing that I got from it is, it’s all about making informed decisions. That’s it. That’s all we’re trying to do. What about you, Randy?

Randy Bowman: You know, what I took out of it was that so much of the data is already there if we just take the few minutes to go look for it. Once we started organizing the different types of measurements into those three buckets, it almost became a lot easier for me to see the data we have.

Mike Veny: As we wrap up today’s discussion on measurement and assessment, and the reassurance that you don’t have to wing it, we’d love to hear from you. What future trends or innovations do you see shaping the way L&D measures impact and effectiveness? Please share your experiences and insights on our LinkedIn page. Your stories can provide invaluable lessons and inspiration for others navigating similar paths. And don’t forget, you can submit topic ideas, suggestions for guests, and other feedback on the CET Talks podcast page of the IACET.org website. We certainly hope you’ll subscribe, tell your friends and family about the show, and leave us lots of good reviews, so we can get this podcast to more people and you don’t miss any episodes. Thank you so much for joining us today.

Host: You’ve been listening to CET Talks, the official podcast of IACET. Don’t forget to subscribe to the podcast on Spotify, Apple Podcasts, or wherever you listen to podcasts. To learn more about IACET, visit IACET.org; that’s I-A-C-E-T.org. Thanks for listening, and we’ll be back soon with a new episode.
