CET Talks: Accreditation, Learning and Leadership

Episode 20

July 30, 2024 · 24 minutes

[Episode graphic: Manny Straehle, Founder and President of AERE, pictured at the bottom left, with the episode title, “Outcomes to Achievement,” in the middle.]

Outcomes to Achievement: Crafting Tomorrow’s Workforce Through Competency Models

The creation and application of competency models can bridge the gap between education and the evolving demands of the global market. Yet for as long as practical application has been a feature of learning experiences, few organizations have been able to implement models that have the desired impact, resulting in mismanaged learning experiences that fail to inspire. Join Manny Straehle, educational pioneer and competency model master, to discover how well-crafted competency models can not only meet industry standards but anticipate tomorrow’s needs today.


Transcription

Host: Welcome to CET Talks, the International Accreditors for Continuing Education and Training’s podcast, where we convene thought leaders in the continuing education and training ecosystem to share ideas, research, best practices, and experiences that promote the creation of a world that learns better. Enjoy the episode.

Randy Bowman: Hello, and welcome to CET Talks. My name is Randy Bowman, IACET’s president and CEO. I’m here with my co-host, Mike Veny, a certified corporate wellness specialist and CEO of an IACET-accredited provider. Hey, Mike, how are you doing today? Glad to have you again.

Mike Veny: Hi, Randy. It’s good to be here with you. Before we get started, I want to ask you, as a CEO, a CEO question. How do you find the right people to hire? How do you know what you’re looking for when you hire them? How do you go about that stuff?

Randy Bowman: Man, that is probably one of the most important things a CEO does: bringing in talent. It’s also one of the scariest and riskiest. People are complex, organizations are complex, and when you bring someone into an organization, you never know what the impact is going to be. You want to make the right choice, but there are no perfect choices out there. How do you make those decisions? How do you weigh all of the pros and cons and tensions around a person’s experience, their personality, their abilities? I don’t know. I’d love to get a lot more guidance on that.

Mike Veny: Okay. We have a special guest who I think might be able to help us with some guidance here. His name is Manny Straehle. Manny is a consulting assessment, educational, and research expert with a broad range of work experience with licensure and certification organizations, over 25 years of experience in developing assessments, and a specialization in developing competency models. Among his key accomplishments: as founder and president of AERE, Manny leads a team of almost 30 experts across psychometric testing and assessment activities, competency modeling, marketing and survey research, aptitude and employment selection testing, performance-based assessments, stackable credentials, and other research and evaluation projects. I love all that you do. This sounds amazing, and we have a very short interview, so I’m super excited to have you here. Welcome to our show.

Manny Straehle:  Well, thank you for having me, both of you. I’m thrilled to be here.

Randy Bowman: Well, we’re so glad you are. Manny, you and I have talked before and have gone on forever about different things, but many of our listeners might not be familiar with what a competency model is. Do you mind taking a few minutes to explain, for someone who may be brand new to this, what a competency model is and why they’re becoming increasingly important in today’s education systems and workforce planning initiatives?

Manny Straehle: So, as you were both discussing, part of why this is so important is that the traditional paths or traditional ways of assessing and qualifying individuals are becoming less and less valuable for answering the question, “Did I hire the right person?” That applies to the hiring process and the retention process as well. Essentially, a competency model helps us answer that question.

But what are competencies at the end of the day? Typically, they’re the things you do and know, plus aptitudes and other characteristics, such as personality. There are technical and non-technical things, which we like to call cognitive and non-cognitive skills. We want to know what you can do at a certain level. Are you entry-level, middle, expert, master? And at those levels, are you able to perform the work well, or are you capable of doing it?

The way we show that in a competency model, which a lot of competency models don’t necessarily do, is we bring in the evaluation folks, the researchers, to look at the outcomes. We have a set of three or four outcomes that we call C-suite outcomes; we came up with this not too long ago. Folks in the C-suite want to know: if we hire someone, will they perform effectively, efficiently, and with high quality? Will they make us money or save us money? Those are some of the higher-level, C-suite outcomes. We have other outcomes, too. If you’re doing something, how do we know that you’re doing it well for the organization? How do we know you’re doing it well at your own level? At the end of the day, a competency model really asks: can you do the work, and for what reasons can you do the work? That’s how we define it, and we’ve gone through several iterations of that definition as well.

Mike Veny: I’m just curious; is there a way to measure if someone is going to be toxic to the work environment?

Manny Straehle: That’s where you look at your personality or your non-cognitive assessments. There are a lot of good ones out there. We tend to use the Big Five; there are five dimensions or areas along which you measure personality. The one we find most interesting is what they call ‘narcissism.’ That’s just the label, but it indicates whether someone has exhibited psychotic behaviors. What makes that really interesting is that if you have a certain lightness of those narcissistic qualities, you’re typically someone in the C-suite. Now, there’s a spectrum, right? It’s on the very low end of it; you have to have a sprinkle of those sorts of things. At the same time, someone who exhibits a high score on that may be someone you don’t hire as an FTE, a full-time employee, but you bring them in for an assignment, for instance. It’s a different way of hiring. They can get the job done, but they can’t work with other people. You have an emergency, and you bring them in to troubleshoot and fix a problem. Maybe that’s what you do. Or you find people who can work with them, I’m not sure. They’re geniuses, but maybe they have bipolar, or maybe they just don’t think like the rest of us; they think outside the box, and that’s why they might have a little bit of delusional thinking. Those are the individuals you try to stay away from, for the most part. I think there’s value in everybody, but we also have to orient toward what the problem is, what problems we’re trying to solve in the organization. That person could be useful; they might be the leading expert, but you can’t work with them, right? That tends to happen often, and we don’t really talk about it that often. That’s a great question you asked, Mike.

Mike Veny: Yes. And just a quick follow-up to that. In your experience, how accurate are these competency models? If I use one to predict the personality traits of someone coming in, has it proven to be fairly accurate?

Manny Straehle: Yeah, we don’t develop a lot of the personality assessments ourselves. They’ve been developed over many years and have a lot of what we call ‘validity’ or ‘psychometric soundness’ to them. What you’re measuring is rather accurate. And typically, if you really want to focus only on personality, you would probably use more than one type of personality measure, not just the Big Five. You might want to pair it up with the 16PF or some other scale that’s out there, like DISC. Myers-Briggs, I’m not a fan of that, and most psychometricians aren’t, because it’s not a valid examination, but it’s the one that’s most used. I would not recommend it, but people have used it. We usually pair those things up when we develop a competency model. We bring in the Big Five, and we can look at the profile of someone who’s an expert versus someone who’s a novice and look at the personality differences. And they do differ across fields. You know, engineers have a mindset, software or IT folks have a mindset, nurses have a mindset. It’s interesting, as well.

Mike Veny: Okay. When it comes to organizations that are developing and implementing these competency models, what are some of the most common challenges they run into?

Manny Straehle: I think it’s determining what those competencies are. And when you move into the expert level, believe it or not, a lot of them are very similar, because you’re looking at people who are exhibiting leadership competencies. Those are very difficult to measure sometimes. You’re managing a group of people, so how do you manage a group of people at a competent level versus an expert level versus an intermediate level? Those things become challenging.

Then, what are the specific outcomes for a specific field? You have the general outcomes, the ones I mentioned, like making money, saving money, and high-quality performance. Then there are field-specific ones. If you’re in healthcare, there are patient outcomes and survivability rates; well, there are also contextual factors around those. You can be a great surgeon, but if you’re working with the most frail group of patients and half of them are dying on your table, is that really a good outcome to look at, even though you’re the leading surgeon in the world on this? You have to be careful with the interpretation.

I think a very difficult part of competency modeling is figuring out what the competency model is going to be used for; the end result. I’ve seen it used in a lot of different ways. I think there are endless uses for them, and I think that becomes a challenge for the organization as well. They can be used diagnostically. They can be used to look at which outcomes you think are the most important. It might be at the department level that you’re saying, “Well, we think making money is the most important, so we don’t care what personality we’re selecting for. We have this goal to make this much money, so we’re going to select only people who have the competencies to make this amount of money.” Then patient outcomes and patient satisfaction go down, those sorts of things, or even customer satisfaction, because they’re less emphasized. Those are the pieces of it.

The beginning of a competency model is very difficult for an organization. It’s very visible to any organization that begins one. It also has a lot of stakeholders who want different directions for it, and defining those competency levels is specific to that profession, too. What does it mean to be a beginner? What does it mean to be intermediate? What does it mean to be expert? Are those the right words to describe that competency performance level? You know, some people use emerging, some people use established. So it depends. Is it three levels? Is it more than that? Typically, I like to develop three levels, but is it more? These are some of the questions we begin with when we’re developing a competency model at the very beginning.

Randy Bowman: Sounds very, very complex to do one of these. You’re managing all of these different competing tensions and competing values against one another. How do you use analytics and data to give you a foothold in those tensions, so that you can decide what the specific metrics and data points are that allow you to make an effective competency model?

Manny Straehle: Actually, we use a lot of typical research methods. We’ll use interviews, we’ll use focus groups, and we’ll try to get the right people in the room. We rely on the organizations or associations who have folks related to that profession, or even the paraprofessionals around it, to guide us in that process. We use surveys, we analyze the data, and we use advanced statistics to look at some things. We always do it with a steering committee as well, made up of individuals who are well-known, seminal, very influential, and easy to get along with. They have a lot of influence and power to help us throughout the process if we get stuck on something. Let’s say the research says one thing, but they’re not quite agreeing with it. We have to figure out, “Do we need another focus group? Do we want another survey?”

We believe that every competency model should be dynamic, meaning that if there’s something new, it needs to be addressed. Let’s say we just found out about Sully landing the plane in the Hudson River, right? Water landings. Now we want to make sure that our competency model covers water landings. That’s really important to include, but your competency model doesn’t talk about it. So we always need to be dynamic about it. What’s the process for keeping it dynamic? Maybe the steering committee decides to add it, and it’s just them making the decision. Maybe it goes to a larger group, maybe there’s a survey. It depends how powerful you want to make your validity argument. The more experts you have saying that it’s okay, the stronger your validity argument is, and that’s how we validate a competency model. A lot of research activities help us develop our validity argument. We borrow a lot from the job analysis world, the evaluation world, the HR world, and the IO psychology, industrial-organizational psychology, world. We tap into those four different fields to help us come up with a valid and dynamic competency model.

Mike Veny: Well, speaking of dynamic; given the dynamic nature of industries, job roles, and trends, in your opinion, how should competency models evolve to remain relevant? And can you discuss any recent trends or updates in competency modeling that organizations should be aware of?

Manny Straehle: I think, again, it’s being aware of those changes. We know in medicine that prescriptions or new drugs are coming out all the time, so how do we make professionals aware of them, how to use them, and everything that comes along with them? That steering committee will have to decide that we need to put this in the competency model, and we need a communication plan to send it out, validate it, and do it as quickly as possible. It might be the steering committee that says we need to add this new class of drugs, or these new ways of doing RNA-type vaccinations, and things like that. Being on top of what’s new is the continuing education piece, which I said Zoe is working on, as Randy’s smiling here. But yeah, it’s a very difficult thing to do, and it all has to do with getting the right decision makers to say, “Yes, we’re going to do it,” and communicating it as best as you can through your channels, the effective channels. Your MarComm group, your marketing communication group, is really going to have to be savvy with that and make it as valid as possible.

That’s especially true for things that are critical, in the sense that they’re life-threatening. If we’re doing these surgeries incorrectly, your competency model has to be updated to say, “Well, we have to take this method away. It’s killing people.” That’s a critical piece to remove, and we have to do it as quickly as possible. It’s kind of like an FDA recall; you have to do it as quickly as possible. They’re probably not as quick as possible, but if you’re a pilot with the FAA, you have all these updates that you read constantly. That’s a really good model of updating your competencies for pilots: “Okay, we’re going to send out the communications. This is what you need to know that has changed on the 737 or the 747. Well, we don’t deploy those that much anymore; it’s the 777s or the 787s, or whatever aircraft they’re using.”

Mike Veny: Well, Manny, you’ve given us so much here today to kind of chew on and think about, but we always like to ask our guests one last question. So, can you just tell us: what does a world that learns better look like to you?

Manny Straehle: I think we lost a lot of that from the guild society. I always profess about real-time learning and real-time assessment, and they go hand in hand. It’s very much like, and I’m going to nerd out a little here, what Vygotsky envisioned when he talked about the zone of proximal development: you have a more capable peer, someone who knows a little bit more or a lot more than you, and they’re watching you and they go, “Oh wow, you’re stuck. Let me help you get unstuck with a lesson that’s very accessible to you, so you understand how to get unstuck and achieve that next level, the next zone of learning.” I’m a big Vygotsky fan; I studied under one of the leading experts on it. I think having that apprenticeship-mentorship model is incredibly useful, and I think we lost a lot of that over the past 30 years. When I entered the workforce, I had mentors, and they would watch me and say, “Whoa, Manny, you’re not doing that quite right. Let me show you how to do it a little bit better.” When you have parents who are interested in your life, they do this really well, and we do it naturally, I think, really well. Most parents do. I think it’s that type of model we have to come back to and revisit.

I know Workcred with ANSI is doing some great work there with Roy Swift, coming up with a standard around that. It’s very similar to your standard on competency-based learning, which is the next increment. A lot of the learning theorists, from Vygotsky to Skinner, talked about it. Skinner talked about successive approximation, which is very similar: it’s the next piece of learning, but you’re assessing and teaching at the same time. You’re giving real-time feedback to that person. This is real-time learning, real-time feedback, real-time assessment. I think we should go back to that guild society model. A lot of our old-school, learning-type psychometricians believe in that model, and it’s there. Co-op programs work very well, and mentorship and apprenticeship models are incredibly valuable. I hope that’s where we’re headed; it’s very expensive to build those things effectively, but hopefully we’ll move more into that, corporations will go back to leading it, and the commitments have to be long-term. A lot of times we create short-term solutions, whereas in the past we had a lot of long-term funding for great programs like this. Hopefully, we’ll see that turn around as well.

Mike Veny: Manny, thank you for such a great interview. One of the reasons I enjoy hosting this show with Randy is that I learn so much, and today was one of those episodes. I’ve been taking a lot of notes, and the main thing that I got out of it—and I think this is important for you listeners out there, especially if you’re still overwhelmed by this topic, because some of these topics can be a little overwhelming for those of us who don’t dabble in them all the time—is figure out the end result you’d like the competency model to achieve. That’s the start of it right there. Figure out the end result, and you can start. What about you, Randy? What was your takeaway?

Randy Bowman: I really enjoyed that last little bit with Manny and the relationship of real-time assessment to real-time learning. Being involved in the lives of the people around you in a meaningful way, almost that ‘iron sharpens iron’ mentality of relational learning, I might call it. I love that idea. I think it’s very learner-centric, very human-centric. That’s going to be my takeaway for today, to make sure that I’m engaging in that kind of learning, both for myself and the people who are around me. Thank you so much, Manny, for being with us. Thank you for sharing your insights. I know there’s so much more we can talk about and, hopefully, we can have you back sometime to delve in, really nerd out on some of these issues, and go deep. To our listeners, as we wrap up today’s discussion on competency modeling, we’d like to ask you: how does your organization promote competency development? Do you have any frameworks in place, or are you trying to develop them for tomorrow’s workforce? Please share your experiences and insights with us on our LinkedIn page. Your experiences and your stories help others who are walking these same paths. Let’s all work together, build each other up, and do some of that proximal learning on our social media. Don’t forget, you can submit your topic ideas, suggestions for guests, and other feedback on the CET Talks podcast page of the IACET.org website. We certainly hope you’ll subscribe to this podcast on your favorite platform, so you don’t miss any of these great episodes. Thank you so much for joining us today, and we look forward to the next time you’re with us.

Host: You’ve been listening to CET Talks, the official podcast of IACET. Don’t forget to subscribe to the podcast on Spotify, Apple Podcasts, or wherever you listen to podcasts. To learn more about IACET, visit IACET.org. That’s I-A-C-E-T.org. Thanks for listening, and we’ll be back soon with a new episode.
