Ep.16: Keeping Up with the Benchmarkers
Remember, you can always listen here or follow us on Apple Podcasts or Spotify. Either way, thanks for supporting us.
About this episode. It’s human nature to be curious about what other people are doing. But is it helpful? And is benchmarking just a modern-day, corporate version of our natural human instinct to “keep up with the Joneses”?
In this episode of The Better Way?, Zach and Hui dive deep into the widely used, yet often over-valued, practice of culture and compliance benchmarking. They define it, explore why organizations do it, and acknowledge its potential to inspire (and maybe even help you find “better ways”).
But they also caution against over-reliance, arguing that benchmarking-based comparisons usually lack the context necessary to be truly meaningful. Understanding what others are doing can be helpful in charting your own course; but when things go wrong, there is rarely strength in numbers. And who the heck are the Joneses anyway?
Who? Zach Coseglia + Hui Chen, CDE Advisors
Full Transcript:
ZACH: Welcome back to The Better Way? Podcast brought to you by CDE Advisors. Culture. Data. Ethics. This is a curiosity podcast for those who ask, “There has to be a better way, right? There just has to be.” I'm Zach Coseglia, and I am joined as always by Hui Chen. Hi, Hui.
HUI: Hi, Zach. How are you doing today?
ZACH: I'm good. I almost started laughing during the intro because as I was getting to saying my name, I remembered that last time we did this, I forgot, I guess, my name.
HUI: That happens.
ZACH: But we got through it.
HUI: I hope everyone out there is doing well. Thank you so much for joining us.
ZACH: Indeed. And it's just us today, and we have a fun topic. I think it's something that comes up a lot in our world. It's something that you and I both have pretty strong views on. And the topic is: benchmarking.
HUI: It certainly is a favorite topic that frequently comes up. I think this has been true going back through my entire compliance career; everybody's always talking about it.
ZACH: So, when we say benchmarking in the context of today's discussion, why don't you just give us a little bit of a definition? What do we mean?
HUI: Well, I guess I'll say it really is making a comparison. So, people want to make a comparison, whether it's about how their program is set up or how much budget they're getting, what the program's scope is, what the function of the program may be. It's in essence: comparison. And oftentimes this benchmarking is also used in a way that is specific to the industry.
Very rarely, if ever, do I hear people say, “I just want to do a general benchmarking.” They always want to benchmark within their industry and/or with their peer companies.
ZACH: And there's a lot of different ways to go about doing benchmarking. Sometimes it's done through discussions with folks like us who have a broad reach. So, people will ask us, you know, “what's happening in our industry” or “what are peer companies doing?” Sometimes benchmarking will happen in a much more structured way. For example, there are certain culture surveys that may have a benchmark built into them, trying to provide comparative data points at scale.
Sometimes people will actually ask us to help them do a benchmarking initiative by using our contacts. And I'm sure law firms do this, and other larger consulting firms do this—using your contacts and client relationships to try to develop a requested benchmarking analysis. And sometimes it's done, you know, by the folks who are in-house who are asking these questions themselves, just in the context of their formal or informal conversations with friends and peers and counterparts at other companies.
You know, sometimes it's what happens in those actually valuable moments at conferences. We often talk about all of the not-so-valuable things that happen at conferences, but one of the valuable things that happens at conferences is that they bring people together. They give people a forum to talk about some of this stuff. So, there's a lot of ways to go about doing benchmarking.
Let's start, because I'm sure everyone who listens to us knows that we're going to get to the part where we talk about why benchmarking is problematic. We wouldn't be talking about it; we wouldn't dedicate a whole episode to it, if that weren't what's coming. But why don't we start with the sort of glass half-full version of benchmarking and talk about some of the benefits of it. Hui, what in your mind are some of the benefits of benchmarking?
HUI: Well, I certainly will say the most commonly used reason for benchmarking is often for resource advocacy. So, you're in a company; you feel like your function may not be adequately resourced; or you wonder whether it is. And how do you know what is adequate resourcing? You know it by comparing to other similar, peer companies in your industry; and you say, well, “you know, if we consider ourselves comparable to that company, then let's see how we compare, when it comes to how we structure our function, how we resource it.” So I think the most common usage, or at least the most common motivation for doing it, is to make that resource comparison so that the compliance professionals can advocate for the appropriate level of resourcing for their function.
ZACH: Yeah. And I like it for that purpose. And I also recognize that part of the reason why folks use it for that reason (and for other reasons) is because, for better or worse, their internal stakeholders find it compelling.
HUI: Exactly.
ZACH: And especially in that context. One of the other contexts where I see it having some value is this: you know, we always talk about how compliance and ethics and culture are not these exclusively legal or regulatory or enforcement exercises. But I do see value in benchmarking from a more legalistic, more regulatory standpoint, when there is a law, a regulation, or a government enforcement priority that is ill-defined.
HUI: Exactly.
ZACH: Where there's gray area and where that benchmarking helps us understand, not so much, well, they're doing it this way, so we're going to do it this way; but helps you understand, especially in an industry, how your industry is reacting to uncertainty / to a gray area. To something that isn't a clear, definitive pronouncement, but instead requires interpretation.
HUI: Very much so. And related to that, I also see this desire for benchmarking as, in some ways, a desire to seek better ways, right? Because you're out there trying to compare what other people are doing vis-a-vis what you're doing. And in that process, I think what people are hoping to discover is, “oh, here's a different way of doing something that seems worth trying in our organization.” And I think that, in that respect, I highly applaud that desire.
ZACH: Yes. I fully agree with that. In fact, the phrase that we often hear used in the context of benchmarking, and I hate this phrase, but the phrase is: “best practices.” I do think it is a positive thing, as you said, for people to seek out an understanding of what others are doing as part of their own brainstorming, as part of their own ideation process, as part of their own journey or search for better ways. But when we frame these things as “best practices,” we've judged them completely unfairly.
Calling something a best practice is to say that it is a practice that is the best way to do something. That, to me, is an outcome determinative statement; and yet, very rarely are we actually empowered with outcome-related data to prove that that practice is the best.
HUI: Well, I have the same issue with “best practice” as I do with the way people use the word “effective.” Because I also want to ask, “best at achieving what?” Which is the same question we ask about effective: “effective at doing what?” Because driving really fast is a best practice for someone who wants to get a thrill out of driving fast. That's the best practice if that's the end you want to achieve. But it may not be the best practice when it comes to safety. So that's my issue with the whole terminology of best practices.
ZACH: Absolutely. Let's talk now about some of the challenges that we have with benchmarking. I guess I'll share one first. I get disappointed in the focus on the benchmark in the context of culture work. We talk all the time about how complex culture is. And the benchmark, I feel like, leans into a more reductive approach to culture. Just because someone is in the same industry as you doesn't mean that their culture has anything valuable to say about yours. Just because someone is of the same size company as yours is, doesn't mean that they have anything valuable to say about your culture.
So, I really take issue with that at times because culture is so complex and you don't know what's going on inside some of these benchmarks, especially when they're blind. You don't know if they've recently had a CEO turnover. You don't know whether they've recently gone through some sort of meaningful scandal. You don't know whether they are performing really, really well from a business perspective, or really, really poorly. You don't know what their strategy is in terms of hiring people or retaining people. You don't know what they value and how that translates into the way in which they run their company. So there's all of these things that you just don't know. And without knowing those things, I just don't see a whole lot of value in comparing myself to this often undisclosed stranger, especially for something as intimate and personal and complex as culture.
HUI: Yes, I would even take something that you would think is relatively straightforward and comparable between two organizations, something like the participation rate in a culture survey. You would think, oh, that's relatively straightforward. But then you start taking into account the type of issues and contextual considerations you have to think about: What just happened in this organization recently? We don't know. How geographically spread out is the organization? How big is the organization? Take a 65% participation rate. I would interpret that differently if I'm talking about a 100,000-person organization versus a 100-person organization.
That same 65% is, to me, different in those two differently sized organizations. And then if you layer in all the other contextual considerations, that really does begin to raise questions. Are these two 65% figures really comparable?
ZACH: I also think about when it comes to culture, the benchmarking that often happens with these surveys raises a couple other thoughts for me. One is it leans into I think a more simplistic version of the culture assessment, because a . . . quantitative benchmark really only works if you're doing a more rote version of a survey that is different from the way that we do our assessments. It just is.
And then the other thing is, especially with certain aspects of culture, aside from something like participation rate, most of the substantive things that we're getting into when we're talking about culture, you have to ask yourself, do you even want to be like this other company? You may very well value very different things, and because you value different things, what is good for them may not be what is good for you. And so again, I just think that it overly simplifies things in a way that I don't like, and I really think that when it comes to culture reviews in particular we could just do a lot better. I just don't think it adds a whole lot to the conversation.
HUI: I think what we're saying here is, we need to be mindful of the complexities and the nuances of the context when we are looking at these comparisons. Because look, to compare, or to want to compare, is I think human nature. And there are certain things that, regardless of the context, we would universally say are really good or really bad. So, for example, going back to participation rate again: if an organization did a culture survey and they had a 15% participation rate, then you would say, okay, regardless of what the context is, that really doesn't sound so good. But then don't forget the second step is to dig into what's been happening. What may be the contextual factors here? We're not saying don't ever compare. We're saying really be mindful of all these things when you're making that comparison.
ZACH: Yeah, for sure. What else from a maybe more negative position or some of the fallacies or nuances that you see?
HUI: Or I would say, yeah, a more critical position.
ZACH: Yeah, critical.
HUI: So similar to this first point that we raised about culture surveys, to this day, I remember one of the interview conversations I had years ago when I talked to the CCO of a very large pharmaceutical company. And he said something like, you know, my stakeholders always want me to benchmark. They want to benchmark, for example, the size and budget of my function. But when I start talking to my peers, we're not really comparing apples to apples. Which is entirely true. And again, we're talking about nuances that oftentimes require much more digging-in than the initial numbers tell you. Let's say two similar companies, A and B, similar in size, same industry.
One company has a 100-person compliance department; the other company has 50. You cannot be so quick to jump to the conclusion that the second company is investing only half of what Company A is investing, because you've got to start digging into who is doing all the compliance-related work. For example, Company A, with the 100, may have an investigation team with full-time assigned forensic auditors who work as part of the compliance department's investigation team. Company B may not have that, but they have their corporate auditors, who cover that responsibility, and they have adequate resourcing for that coverage, so you . . .
ZACH: Yeah. Or they're spending significantly more on outside consultants or outside counsel.
HUI: Exactly. So that is just one example of how what look like two different numbers can really mean very comparable things, even though the numbers don't match up. So, it is about, again, a lot more digging behind the scenes. Don't ever just compare number to number: this is how much we have for our compliance function, this is how much we spent on this, this is how much we budgeted. The second company may have budgeted in a way that's dispersed throughout different functions. You have to take those things into consideration if you really do want to make that comparison.
ZACH: For sure. I mean, I'm going to be a little cynical here, but I think when it comes to resourcing related issues, I think people are probably fine with just making the 50 to 100 comparison because if you're on the 50, you want to advocate for more resources.
HUI: Exactly, yes.
ZACH: Which, actually, I say it cynically, but I actually respect that game. A related point that I think of, and that I share with folks a lot when we talk about benchmarking, is that, as I said before, a lot of the benchmarking that happens, happens less quantitatively. And even if it does happen based on discussions with your outside advisors or with your counterparts, and you in fact work to quantify some of that, I think the conversational nature of a lot of benchmarking means that people are often putting forth the best version of what they're sharing with you.
HUI: So true.
ZACH: So, you talk to your counterpart at X company and they're going to tell you the public friendly version of what they're doing and how things are going. And most of us will probably do the same. We're not going to air all of our dirty laundry to a competitor, to a peer even in the context of a generally confidential discussion about benchmarking. And so, I think that what often happens is we actually just aren't getting super valuable or valid information because it's coming through a bit of “rose colored glasses.”
HUI: Very much so. Another area is this focus on industry that people tend to have when they do benchmarking. On the one hand, that desire to compare within your industry is completely understandable and rational. When you compare the compliance risks, and how they need to be managed, of a pharmaceutical company versus a financial services institution versus a manufacturing facility, they're completely different. You would not need a team of money laundering experts and analysts at a pharmaceutical or a manufacturing company.
You probably wouldn't need a lot of the research-related compliance risk management that life science companies require when you're working in a bank. Understandable. But the problem when people compare within their own industry is: if the entire industry is not really strong on compliance, you end up comparing yourselves with each other and all feeling pretty good about yourselves. The problem then is you don't even realize that you're so inadequate in delivering on your mission to prevent and detect and remediate misconduct until something bad happens to your whole industry.
ZACH: Yeah.
HUI: And I saw this, and it really hit home for me, when I was serving as an expert witness in some of the opioid litigations, because I had the occasion to review many companies' drug distribution compliance. And I saw records of them comparing notes with each other, saying, hey, look at our peer company, they're doing this, and yeah, they're doing what we're doing and we're generally on par—and they were. They really were doing pretty similar things, except none of them was adequate in preventing the massive, massive shipments and distribution of opioid drugs to where they shouldn't go.
So I thought that was a really good lesson in thinking through: should I be looking beyond just what my peers are doing? And I saw this earlier in my career as well. I started my compliance career in high tech. High tech came to the compliance game much later than, for example, life science companies. And at that time, I did, air quote, “benchmarking” with the peer companies, and we all were sort of in similar places.
But I remember, and this goes back to one of the benefits of going to a conference, I went to a conference, as someone new in the profession tends to do. And I heard presentations from life science companies, and I talked to their compliance officers; and that was when I realized, boy, we have quite a few steps to go here to reach that level of maturation. So, it was only by looking at other industries that I realized what some of the challenges and opportunities were in my own space. And for that reason, I think it's very, very important that you don't just look to your industry.
ZACH: Interesting. It's just based on this fallacy, really, that there's strength in numbers. There really isn't strength in numbers. If you're all doing it badly, you're all doing it badly, and there's probably not going to be a whole lot of forgiveness for that if you do find yourself in an enforcement action or in a litigation. But the solution to that really comes back to, I feel like, one of our core values, which is just building data-driven, outcomes-focused, evidence-based programs. It's fine to do benchmarking, as we said, to get ideas, to brainstorm more broadly than just with your peers internally. But if you take those ideas and actually test them out, if you actually prove that they accomplish a predefined goal, who cares if anyone else is doing it?
HUI: The challenge comes when you have to explain that to your stakeholders and your stakeholders say, well, if no one else is doing that, why should we be doing it? And this is why you need to understand and be able to explain the nuances.
ZACH: Yeah, of course. But it's one thing to say we're going to do it this way. We're going to be innovative. We're going to try something different. It's another thing to say we've got a hypothesis that this way will help us accomplish that goal. And this is how we're going to go about measuring it. These are the steps we're going to take to give you data to show you how it's performing. And this is how we're going to manage for the possibility that it doesn't, so that we can change course. I mean, I don't know very many people except the most risk averse sort who would hear that, and think, well, we don't want to do that. Let's just do what the other person's doing, because at least there's some false sense of safety in the numbers. Anything else?
HUI: Yeah. I thought it was interesting when I did the opioid cases. Many of them were very focused on things like, are the drugs kept in locked cages? Are the shipping labels correct? Do they match a certain list of this and that? But not one company was looking at, why are we shipping four million pills to a town of 200,000 people? Not one. I just felt like this is truly missing the forest for the trees. And they reinforced that approach with each other, because when they talked to each other, that's what everybody else was doing.
ZACH: Yeah. I mean, it's the very definition of a focus on output rather than outcomes. It's the very definition of focus on process, maybe over purpose.
HUI: Absolutely.
ZACH: Well, we could do a whole episode just on that.
HUI: Indeed.
ZACH: All right, so let's talk about, from everything we've shared here, and just from thinking about this topic quite a bit over the years, what guidance or ideas or recommendations we have for people who are going to go about benchmarking, in large part because they're under pressure from stakeholders internally to do it, and to be able to provide at least that part of maybe a much bigger story. What are some of the best . . . oh my God, I almost said best practices. Shame on me. What are some of the ideas or recommendations that you have for folks?
HUI: Well, before we go there, I think I want to mention one more piece of critical thinking that needs to be introduced: remember that much of the benchmarking data you get is a snapshot in time. And this is another part of the context that we need to remember, that everything is an evolution. So this in part folds into what we said earlier about context, but let's remember that what you're getting, what you're comparing, is a snapshot in time from different organizations, and you are doing so without the benefit of not only the general context at the time, but the longitudinal context, which is: where do we put these numbers in the context of the evolution of the organization?
ZACH: Absolutely. And look, I spent a lot of time on my soapbox several minutes ago talking about context in the context of culture. Did I just say context in the context of? I think I did. I think it's right. I think it's a completely valid way of articulating my point. Context in the context of culture, but risk needs to be emphasized as well when you're having the discussion about context. I mean every company, even if you're in the same industry, has a different risk profile.
There are not just macro-level things that are impacting your risk profile, but there's a lot of stuff that goes on internally that you're not going to necessarily know. Or that's going to get very nuanced in ways that impact the risk profile and therefore impact some of the decisions that folks have made. So yes, context is huge. Context changes everything.
HUI: Indeed, it does.
ZACH: Suggestions or recommendations or considerations that we have that folks can take away if they are under pressure, for example, to do benchmarking. If there are stakeholders who want to hear that part of the story, how do we do it well?
HUI: So, the first thing I would say is what we've already said, which is: embrace the complexity, and resist the urge to do the simple “they have this many people, we have this many people; they have this much budget, we have this much budget” type of comparison, or “their survey reached this percentage and our survey is at this percentage.” Those are not irrelevant, but they must be contextualized with a lot of the nuances that we talked about. So, resist that temptation to just do a simple benchmarking. You want to do benchmarking? Really dig in as much as you're able, so that the comparison is more meaningful.
ZACH: Yeah, I say really dig in; or, when you're communicating your strategy or your plan that is in part influenced by benchmarking, make sure that your stakeholders understand that you did the benchmarking as they were maybe asking you to do, but that the benchmarking has limitations. So, either go deep and get into the nuance, or just make sure that when you're having conversations with folks they understand the things that we've talked about today. That context matters and we don't have the full context. That culture matters and we don't actually understand everything that's going on within another organization. That risk profile matters, and we understand the risk profile to a certain extent, but there are nuances that are going to make the decisions we make different.
But there are all of these other considerations that go into why we make the decisions we make and that benchmarking is one of them, but also has a lot of limitations. Making sure folks understand that I think is really important.
HUI: Yeah. So, the second part, again based on what we have discussed, is: I completely understand why people are interested in benchmarking against peer companies within their own industry, but don't make that your only source of benchmarking. Do a little bit of a mix between your industry / peer companies and companies outside of your immediate industry, just to get a sense of what that may look like.
ZACH: Yeah, and it may be a non-industry benchmark that carries maybe not a similar risk profile, but a similar level of risk. To compare a biotech to, maybe, finance, where they're both regulated, where there's enforcement activity, and where there's just a lot of opportunities for things to go wrong: going to those sorts of benchmarks to draw inspiration from outside your industry makes a lot of sense.
HUI: Exactly. Think of it as drawing inspiration. Just think outside of the box.
ZACH: Yeah.
HUI: The other thing that I think is consistent with everything we've been saying is: use whatever you learn from benchmarking as one piece of the puzzle. Not the puzzle. But one piece of information that helps you form the picture about your overall performance and resources and governance and all of that. When you're making that overall assessment, whether it's about your program or about your culture, it should never be driven by just one set of information. So, remember that when you're doing this. With benchmarking, whatever data and information and stories you get, it feeds into a larger narrative.
ZACH: Absolutely. I got to tell you, I love it when we have guests, but I have so much fun when it's just us. So, thank you, Hui, for another great discussion on a topic that, I got to say, more than one person told me they thought felt a little boring, and I don't think it was.
HUI: I don't think it was either.
ZACH: I thank you for that.
HUI: Thank you.
ZACH: Any final words for folks?
HUI: Context matters.
ZACH: It does indeed. Thanks, Hui.
ZACH: And thank you all for tuning in to The Better Way? Podcast. For more information about this or anything else that’s happening with CDE Advisors, visit our website at www.CDEAdvisors.com, where you can also check out the Better Way blog. And please like and subscribe to this series on Apple Podcasts or Spotify. And, finally, if you have thoughts about what we talked about today, the work we do here at CDE, or just have ideas for Better Ways we should explore, please don’t hesitate to reach out—we’d love to hear from you. Thanks again for listening.