Ep.21: Critical Thinking: An Underused Superpower?
Remember, you can always listen here or follow us on Apple Podcasts or Spotify. Either way, thanks for supporting us.
About this episode. Critical thinking sounds simple—but in practice, it’s one of the most overlooked skills in compliance and culture work. In this episode of The Better Way?, Zach and Hui dive deep into what critical thinking really means: objective analysis, bias awareness, and evidence-based decision-making. They explore why curiosity is the foundation of critical thinking and why assumptions, incomplete information, and invalid metrics often lead organizations astray.
Through real-world examples—from FOFO (fear of finding out) to confirmation bias and groupthink—they reveal how these traps show up in compliance programs and what leaders can do to avoid them. The conversation closes with a practical framework for applying critical thinking every day: separating facts from perceptions, questioning assumptions, and seeking diverse perspectives. If you’re looking for strategies to improve decision-making, this episode is your roadmap. Oh, and Happy New Year!
Who? Zach Coseglia + Hui Chen, CDE Advisors
Full Transcript:
ZACH: Welcome back to The Better Way? Podcast brought to you by CDE Advisors. Culture. Data. Ethics. This is a curiosity podcast for those who ask, “There has to be a better way, right? There just has to be.” I'm Zach Coseglia and I am joined as always by the one and only, Hui Chen. Hi, Hui, and Happy New Year.
HUI: Happy New Year to you too, Zach, and Happy New Year to our friends out there.
ZACH: Absolutely. We've got all kinds of exciting things planned for 2026, including our discussion today, which is one we've been talking amongst ourselves about having for quite some time, in part because it's so deeply ingrained in our ethos. And that topic, a pretty broad one, is critical thinking.
HUI: Yes, we talk about this between the two of us all the time, and we thought it's time we opened this discussion up to all of you out there.
ZACH: All right, Hui. So, let's start as we often do with the basics and some definition of terms. This is going to sound a little silly probably to those listening, but what is critical thinking?
HUI: First of all, to start with that question is already the beginning of critical thinking, right? It's about making sure that we're talking about the same thing, making sure that we're all truly on the same page. So I like how we begin with this question.
ZACH: So I was being a critical thinker by asking you how you define critical thinking.
HUI: No question about it, in my mind anyway. So, I have a sort of multi-component definition in my mind for what critical thinking is. To me it is the objective analysis and evaluation of facts, information, and arguments in order to form an evidence-based conclusion or judgment.
ZACH: That is quite the definition. So, let's break it down into its component parts. The first thing that you mentioned was it's about objective analysis and evaluation. Tell us more about what you mean by that.
HUI: So, objective to me is being able to step out of your own assumptions and biases. That begins by recognizing that we all bring our own assumptions and biases to our thinking. So being aware of those is being objective. And it's analysis and evaluation—so it's not taking things at their face value but digging into them. So that's objective analysis and evaluation unpacked a little bit.
ZACH: Absolutely. So, then the next part that you mentioned was of facts, information and arguments. Again, terms that are kind of self-defining, but an important element of this overall definition.
HUI: I sort of lay them out as facts, information and arguments because these are the data points, shall we say, that are present when we have to apply our thinking to a situation. And oftentimes people confuse them. Sometimes they take an argument to be a fact. They take information without questioning whether that information is based on facts. So, facts and information and arguments are all different. They bring different elements to this problem that you're trying to apply your thinking to, but you have to understand that they're different.
ZACH: I like the way that you laid them out individually, and I think that much of the hypothesis behind our interest in this topic is that folks often treat information as facts and arguments as proof, which leads to the next element that you mentioned: forming evidence-based conclusions or judgments. Now, those of you who know CDE and who know Hui and me well know that we're always talking about outcomes-based, evidence-based approaches to the work that we do. So unsurprisingly it is sort of the gravamen. It is sort of the heart of critical thinking.
HUI: Very much so. It is ultimately about what is there to back up this conclusion or judgment that I'm making?
ZACH: Yeah. You know, when I originally asked myself, well, what is critical thinking? How would I define critical thinking? The word that came to my mind immediately to maybe more simply define it was curiosity. And not surprisingly, because we always start this podcast by saying this is a curiosity podcast, and so much of what we do is driven by curiosity . . . and no doubt curiosity is a huge part of critical thinking. But talk to us about what else other than curiosity goes into critical thinking in your mind.
HUI: I would say curiosity is not just a huge part, it is the very foundation of critical thinking. It's also awareness of biases—and we're going to talk about all kinds of biases that all of us bring to our thinking. But it's recognizing that those biases exist, which is the first step of being able to correct those biases. And another element would be the need to use logic in our thinking, which, surprisingly, you do not see as often as you would like.
ZACH: Yeah. I mean, I talked about how anyone who knows us knows our focus is often on outcomes-based, evidence-based approaches. Our focus also is often on human-centeredness, especially when we're talking about compliance and culture topics, and especially in a world in which they are often so dominated by, you know, legal and regulatory and enforcement considerations. What I love about critical thinking as a topic is that it's uniquely human. The ability to think critically, the ability to apply logic is uniquely human. And yet the flip side of that is: bringing those biases to the table is also uniquely human.
HUI: Indeed.
ZACH: And so critical thinking is this seemingly obvious thing that most of us think we do really well every day, and yet it's quite complex.
HUI: Very complex.
ZACH: And it is especially human. Yeah. Well, again, it feels so silly to ask this question, but let's talk about why critical thinking is important. I mean, part of me wants to just say, well, obviously it's important because it's uniquely human and it's the foundation of everything that we do, but Hui, break it down for us. Why is critical thinking so important?
HUI: So, I somewhat have to go back to the definition . . . if we are not applying an objective, logical, bias-free, or at least close to bias-free, analysis and evaluation to a problem.
ZACH: Or at least bias aware.
HUI: That's right. That's a much better way to put it. Thank you, Zach. If we're not doing that type of analysis, if we're not distinguishing facts from unverified information from arguments that have yet to be backed up, and if the conclusions and decisions we make are not based on evidence, oftentimes we will be making decisions without truly understanding what the problem is, because the lack of all those things has caused us to misunderstand or misstate the problem. And we cannot apply the right solution to the problem if we never understood the problem correctly. Even if we have understood the problem correctly, if we fail to do all those things that we call critical thinking, we may not be applying the right solution to it. So, when we are trying to tackle problems and solve them, if we're not understanding the problem correctly and we're not applying the right solution to the right problem, then what are we doing?
ZACH: Yeah, fully. Let's bring it down to the ethics, compliance and culture level. What I did in preparation for our discussion today, Hui, was actually just quickly write down five or six of the most obvious examples of a lack of critical thinking that have arisen over the course of conversations that I've had in the last year or so. And I think that some of these very much highlight all of the issues and traps that you were just articulating.
So, one of the ones that stood out to me was someone saying, more than one person, in fact, saying: when people at my company say they're not comfortable speaking up, let's say in a survey or something like that, what they're really talking about is issues that are unrelated to compliance. They're talking about just business ideas and innovation. This is an actual example of something that we've heard. And I mean, it's the definition of an assumption.
HUI: It is very much so.
ZACH: In fact, when we hear this, it's often taking a data point, evidence which we search for, namely folks expressing a discomfort in speaking up, and sort of bastardizing it or interpreting it in a way that is advantageous to the narrative that you want to tell, without, in fact, backing that narrative up with additional contradictory or supplementary evidence. It's very much . . . Yeah, go ahead.
HUI: I hear at least two biases in there. One is confirmation bias, which is you select information to get to the answer that you want. The other is self-selection bias. Even in those situations where they would be able to cite some kind of data that says, look, here are all the reports that came in, and 80% of them, 90% of them, whatever it is, have nothing to do with the core ethics and compliance issues that we're interested in and refer to these other issues, what is not included in that data is all the people who didn't report anything, right? So, the selection bias is you're only looking at what is present in your selection. You're not thinking about what's missing in this picture.
ZACH: Yeah, I mean, I think it's actually oftentimes even more of an authority bias, because it's the narrative that someone has heard a person in a position of power or a leader express when they are digesting the data, and then it ultimately gets sort of adopted.
HUI: Yes.
ZACH: The other thing about this, in terms of a lack of critical thinking, that stands out to me, separate and apart from the obvious assumptions that are being made and the biases that are being applied, is the idea that if folks are uncomfortable talking about business things, if they're uncomfortable raising innovative ideas, if they're uncomfortable being a voice for maybe doing something differently . . . that still is a meaningful and actionable data point as it applies to ethics and compliance. The fact that your folks are uncomfortable speaking up in one context doesn't mean that they're comfortable speaking up in another. If anything, it's a signal that there is discomfort that could potentially boil over into your ethics and compliance program. And applying curiosity to that would mean asking more questions and digging deeper, not trying to explain it away.
HUI: Without question, absolutely.
ZACH: Should I share another one?
HUI: Please. By the way, folks, these are the kinds of things we talk about all the time.
ZACH: All right, here's another one. This is one we hear a lot, which is: we shouldn't do that, because if we find something, we'll have created a record of misconduct that could be problematic for us in the future. Now, it sounds silly, and maybe you smiled, maybe others sort of smile or snicker when they hear that. But we actually hear this quite a lot, and we've talked about it on the podcast before. One of our old colleagues, a cultural psychologist we used to work with, would often call this FOFO, the fear of finding out.
And what I find so interesting about this example is, one, the concern about the record as opposed to the concern about the misconduct. Doing the monitoring or the questioning or the assessment that uncovers this activity isn't really the problematic record; it's the misconduct that's the problematic record. And when I hear that, it's less about sort of the assumptions that we were talking about before and more just about not fully thinking all the way down the line about the potential, you know, impact of this decision.
HUI: You know, this FOFO, which is also a term that we use a lot, is just irrational. It just seems not logical to me, because the problem is there whether you find it out or not. What you're really saying is, I don't want to fix this problem. Right? It's basically like saying, you know, I have some of these symptoms, but I don't want to get tested in case it's a bad diagnosis . . . well, whatever it is, it is there, and if you go get the diagnosis, you have the opportunity to treat it.
ZACH: That's right.
HUI: If you don't, it does not go away. I mean, you know, this is really the ultimate definition of just burying your head in the sand.
ZACH: Absolutely. If you don't get the diagnosis, it's definitely going to kill you. But if you do get the diagnosis, at least you have a shot.
HUI: Right, this is simply not logical.
ZACH: So, Hui, some of the common traps that are reflected in these examples and in others are things that you've actually written about and spoken about a lot over the course of the past decade. Why don't you take us through some of those? Let's start with incomplete information. This is often a hallmark of a lack of critical thinking.
HUI: Right. This is when you don't realize that you do not have the complete picture, and that complete picture is what gives meaning to the piece of information that you do have. So, a typical question that DOJ has often asked companies, and that companies have often kept data on, is: has the company disciplined employees for a particular type of conduct . . . for violation of this type of policy or whatever, right?
Oftentimes what is missing is the question: has the company not disciplined employees for this type of conduct? And that is the truly meaningful question that puts in context the number you do have for those who were disciplined. So, let's say a company has disciplined 20 employees for this type of conduct in the last five years. You don't know how to interpret that 20 unless you know how many have been found to have violated the same policy but not been disciplined. If you have found 20 and disciplined 20, 20 is a good number. If you have found 100 and disciplined 20, it's not such a good number. So, in order to judge that number, you have to have that context. And oftentimes we find that people are not asking for that context. They look at the 20 and they say, that's a good number, we disciplined 20 people for violating this policy, when the rest of the company, at least the people who have interacted with the 100, know that violating it is not such a big deal.
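[Editor's note: to make the arithmetic in this example concrete, here is a minimal Python sketch. The numbers are the hypothetical ones from the conversation, and the function name is illustrative, not an established compliance metric.]

def discipline_rate(disciplined: int, violations_found: int) -> float:
    # Share of known violators who were actually disciplined.
    if violations_found == 0:
        raise ValueError("No known violations: the ratio is undefined, not 'good'.")
    return disciplined / violations_found

print(discipline_rate(20, 20))   # 1.0 -> every known violator was disciplined
print(discipline_rate(20, 100))  # 0.2 -> most known violators faced no consequence

Same numerator, very different stories: the denominator is the missing context.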
And in fact, it was interesting: someone sent me a link this morning about China having executed a senior executive of one of its state-owned companies for bribery, and everybody knows that China actually imposes the death penalty for bribe-taking. Why is bribery then still so prevalent? Because, I believe, and this is certainly not based on official data, right, but based on conversations with people, the perception out there is that out of 1,000 or 10,000 people who take bribes, maybe one gets executed.
ZACH: I mean, for what it's worth, those are actually not odds that I'm particularly interested in. But still, yeah. I love that you use the term context, because we've talked here a lot, and we talk elsewhere a lot, about how context matters, and context matters is sort of the core of incomplete information. And when I hear the examples that you provide, there's a whole host of other things relating to context that I think come into play. Yeah, I suppose if 20 people engaged in this misconduct and 20 people were disciplined, that's better than the alternative. The flip side of that is: what are you doing to actually uncover this misconduct? Do you have monitoring in place? What does oversight look like? What does the auditing protocol and schedule look like? All of these things contribute to a better understanding of how likely you are to identify the misconduct, and therefore shape our perception of the quality of your efforts to remediate it. So, I love the context matters element of this. Another trap that you often write about and talk about is the invalidity of certain data points. So, talk to us about what you mean by invalid metrics.
HUI: Invalid metrics is when you're simply using the wrong metrics. So, this is something that comes up in almost every training discussion we have: how do we measure training effectiveness? Well, it's not by training completion rate, because what are you measuring with training completion rate? It's how many people showed up and completed the training. It is a measurement of your power to compel people to do a training to its conclusion. That's what it measures. Effectiveness has to do with what you're trying to achieve through that training. Now, I will say, when I have asked people, what is the goal of your training, which is directly relevant to the question of effectiveness, there have been people who are honest enough to say, “to satisfy the regulators.” Now, if that is your goal, then yes, training completion would be the right metric.
ZACH: I mean, I guess. I would like to think, though probably naively, that we're living in a world where even the regulators, the enforcers, are interested in more than just whether you did it; they're interested in whether or not you have any reason to believe that it's doing something meaningful to actually prevent misconduct. But your point is well taken, nonetheless.
HUI: Right. If your goal is to show that you are able to compel so many people, a certain percentage of your organization, to do a training to its completion, then completion is the right metric. The other popular one, which I heard all the time when I was at DOJ, is: we have a great tone from the top. Here are the 35 messages that our senior management have sent out supporting compliance, right?
ZACH: Yes, the scripted messages that were written by compliance that were then put into a teleprompter that the leader then read dutifully.
HUI: Exactly. Sometimes not so dutifully. But this is also really another form of incomplete information. What I really would want to know, and this is a question that I often did ask companies that presented that kind of measurement, is: how many times did they say, “meet your goals by all means possible,” right? By all means necessary. Did you count those?
ZACH: Yeah. Hit your target, hit your target. Make sure we're, you know, meeting our financial goals and objectives. You know, and those are often repeated probably much more frequently than, “and also remember, we are committed to ethics and integrity.”
HUI: Exactly. So, I analogize this to counting calories only when you eat vegetables. Yeah, that does not work. If you really want to know what your calorie intake is, you cannot count only when you're eating vegetables.
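[Editor's note: as a toy illustration of the calorie analogy, here is a short Python sketch with entirely made-up meals and numbers, just to show how counting only a favorable subset skews the total.]

meals = [
    ("salad", 150),
    ("burger", 800),
    ("carrots", 50),
    ("cake", 600),
]

# Counting calories only when eating vegetables: the selective measurement.
veggie_only = sum(cal for food, cal in meals if food in ("salad", "carrots"))

# Counting every meal: the actual intake.
actual_total = sum(cal for food, cal in meals)

print(veggie_only)   # 200  -> what the selective count reports
print(actual_total)  # 1600 -> what was actually consumed

Counting only the scripted pro-compliance messages, and never the “hit your targets” ones, works the same way: it measures a favorable subset, not the tone.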
ZACH: I want to talk about the lack of critical thinking in this context but in a slightly different form, which is less about measurement on the back end and more about the decision-making process that goes into what we prioritize. You know, I see all the time folks saying, hey, let's have a goal next year where we are going to create an enhanced code of conduct, or we are going to create a new digital portal and library for all of our policies. That, in isolation, sounds like an interesting idea and something worthy of consideration.
You ask why, and we might hear an answer like, well, we're getting all of these questions coming to us. This is also sometimes the dynamic with, you know, folks who are interested in maybe using a chatbot or an agent to answer questions about policies. So: we're getting all of these questions, and we want people to not come to us for these answers, because it's burdensome and we have other things to do. We want them to be able to sort of self-serve. And this applies to just about any element of compliance that could be self-served.
The challenge that I have oftentimes when I'm looking at these goals is that while they make sense in isolation, and they're often, you know, thoughtfully designed and seemingly a good idea, they're very rarely finely calibrated to the problem, which is: we want people to stop coming to us. And one of the reasons I know that it's not finely calibrated is that it is so very rare for that goal to be informed by meaningful conversations with the stakeholders to ask them what they want and what they need.
That to me is the sort of centerpiece of human-centeredness. It is designing solutions not just with the human in mind, with the stakeholder in mind, but by actually talking to them about what they want. And what we often see is people investing time and money into new products, let's call them products, that they put out to their stakeholders, and that the stakeholders don't want, because the stakeholder just wants to call compliance and get the answer to the question.
So, the goal that has been developed, the thing that has been invested in, and the time that has been spent wind up sometimes being wasted, because it's not actually solving the problem that they started with when they were at square one.
HUI: I think it starts with people oftentimes not actually articulating that problem. They don't ever say the problem we want to solve is that we are getting too many calls. They actually pose it as some other problem. They say the problem is that people just don't have the self-help tools; if they only had them, they wouldn't be calling us. But they're not framing the question at the start as “we're getting too many calls; how do we reduce that?” Once you start with an assumption, which is that they're calling us only because they have no tools, then that is that confirmation bias: you work toward that assumption. You're not actually questioning your assumption. If you were framing your question honestly, which is “we're getting too many calls,” then hopefully your curiosity would lead you to gather information from the people who are calling: “Why? Why are you calling? Why are you choosing to call?” Not the question itself, right? “Why are you choosing this method of communication?”
ZACH: Yeah.
HUI: “What other option would you have liked to have had, before you called, that you don't already have?” right? At least gather that information. Without framing the question correctly, you're not then out to collect the relevant information that would help you solve that problem.
ZACH: Yeah. Well, let's talk a little bit more about confirmation bias, because I do think that that is an element here. This is sort of the idea of looking for the outcome that you want, sometimes looking for the outcome that's going to be the path of least resistance for you.
HUI: Yes.
ZACH: What are some other examples of where you've seen confirmation bias in compliance? I think, you know, investigations come to mind as the obvious place.
HUI: Very much so. It's not just in internal investigations; in criminal investigations, too, we see people starting with an assumption and wanting that to be the answer, then proceeding to ignore or trivialize evidence along the way that does not support that assumption, and selecting only the evidence and information that confirm the kind of conclusion they want. It's a very dangerous thing.
ZACH: It is a very dangerous thing, and one of the most dangerous ways in which I see it used, or inadvertently exercised, is in the culture context. You know, culture is such a personal thing, especially to leaders, especially to people who see themselves as truly being responsible and accountable for promoting a certain kind of culture: a culture that supports their business goals, where people want to work, and all that good stuff. And I see folks all the time saying, with respect to culture, “well, we have a really great culture here.” And you ask them, well, how do they know that? And they can't really tell you. Or, you know, “I feel . . .” Whenever someone starts a sentence with “I feel,” you can be pretty sure that they're not exercising critical thinking in that moment. “I feel like we all just, you know, really support each other.” Or “I feel that we have a culture where people can speak up and, you know, feel safe raising issues and coming forward and talking about potential problems before they become the problem.” It's really dangerous in that context because, one, culture can be quantified, we know this. It can be measured, we know this. But it can also be a softer concept, and, you know, going with your gut, going with your feeling, when it comes to culture is just not a good strategy. And yet we see that confirmation bias, we see blinders, when it comes to culture all the time.
HUI: It's important to always remember when you hear the word “I feel,” you classify that statement not under facts. That is not fact. That is not evidence. It is somebody's perception. And not that that's not important, but it's critically important that you recognize that as a perception, not a fact.
And we do hear this type of confirmation bias often. We have very committed, well-intentioned leaders who say, “we have a good culture because I'm out there talking about it all the time, because I'm trying to lead by example,” which is not untrue. But just because you're making an effort, that piece of information in itself is not sufficient to draw a conclusion about your overall culture. You as a leader have made some efforts, maybe very significant efforts, but how are they landing? What about the other people, the other leaders in your organization? What are some of the perceptions out there that, when you're very passionately advocating for your point of view, people may not be expressing to you? Those are the questions that a curious person, a critical thinker, would want to be asking.
ZACH: Absolutely. And, you know, when you start that sentence with “I feel” or “I think,” what I try to do . . . because look, I do this all the time. I'm deeply human. We do this all the time. I feel a lot. But what I often try to do in those moments is look for behaviors. Look for behaviors, because behaviors are data points that are a heck of a lot more reliable than just what I'm feeling or thinking in the moment.
I want to switch gears here and talk about another concept, which you've hit on: confusing legal accountability with compliance effectiveness. We've talked about overgeneralization too. I want to talk about those two in a slightly different form, and this comes up a lot: using legal concepts as a placeholder for critical thinking. So, one of the examples of that that I often see is people saying things like, you know, “we can't do that because of privilege.” Okay, “say more” should be the response of anyone on the receiving end of that statement. Or, you know, “I'm worried about the future litigation risks, so no, we can't do that.”
You know, this actually very much overlaps with the whole FOFO, worried-about-the-record thing. And what happens in these moments, and I see them all the time . . . I guess I'm in some ways lucky, but in most ways unlucky, to have been a practicing lawyer and a litigator for two decades, so I can apply a level of critical thinking to that which, you know, someone who isn't a lawyer may not feel comfortable applying. But I see lawyers all the time using the law as the reason why we can't do something, or the reason why we have to do certain things in a certain way, in ways that at times are probably based very much on critical thinking and analysis, but oftentimes aren't. It's just the law as a shield rather than as a tool for critical thinking.
HUI: You know, I don't know if I should be surprised by how often the word privilege is used for exactly the kind of purposes that you have indicated. And oftentimes I have pushed back, and I would say, can you give us the privilege analysis? And I do mean analysis: cite cases, you know, cite court decisions that interpret this. Give us truly a legal analysis for your legal opinion. And I'm trying to think back . . . I cannot think of one instance where somebody actually came back to me with a fulsome legal analysis. And I do feel like this is missing not just as a critical thinking piece, but as legal work, as part of the legal profession. That really concerns me as someone who, you know, is not practicing as a lawyer but is legally trained and proud of that training. I don't know what happened that we can accept lawyers rendering legal opinions without a fulsome, citation-filled analysis. I have yet to see a fulsome analysis that backs up these privilege claims.
A similar issue, which also overlaps with the incomplete metrics point that we were discussing earlier, is that oftentimes people cite DOJ cases, right? Compliance professionals and legal professionals in the space love to say, look, here are the DOJ cases, this is what they did, as if the world of references is all the DOJ enforcement cases. What is missing from that data set is all the cases that were never reported to DOJ, right? DOJ never knew about those cases. Something happened, somebody remediated it or not, but they never self-disclosed, and it was never investigated. Many of us know about companies and organizations that have gone through that process of having found misconduct and dealt with it one way or the other, without ever self-reporting and without it ever being found out in a public way. We don't have the data from those precisely because they're unknown. But when you look at these kinds of numbers, you always have to think about what else is out there that we simply don't know about.
ZACH: Yeah. I mean, that goes back to the sort of self-reporting, self-selection bias that we referenced before. I think the other example of this that I know irks you probably to your core is when people say, “oh well, we have to do that because it's in the ECCP,” DOJ's Evaluation of Corporate Compliance Programs. Like there's never been a better example of a lack of critical thinking than simply citing the ECCP as if it is a word from . . .
HUI: Yeah, it irks me to my core. You're right. There's no better way to rile me up than to say something like that.
ZACH: Yeah, that's also an example of another type of bias or trap. Maybe this will be the last one that we talk about, but it's groupthink; and in a lot of ways the approach to the ECCP, the way in which it's used, the way in which it's talked about in our community, is an example of groupthink. We, we being the sort of royal we of our profession, folks who operate in this space, have decided that this is how we're going to treat it. But there are a whole host of other examples of groupthink within individual organizations that can be very dangerous to critical thinking and to the decisions that are made from an ethics and compliance perspective.
HUI: We see it all the time in the corporate meetings that we go to, right? I mean, this is a very, very common phenomenon. We all sit in the meeting, and, you know, the senior leader or a person with a very strong voice or a heavy presence in the room just says, you know, I think we should go this way, and one by one everybody falls in line: yes, this is what we should do. And this is the kind of thing that's gotten a lot of organizations in trouble, precisely because no one is raising questions.
ZACH: Absolutely. It reminds me also of a slightly different scenario. As a sort of side note to folks, I think we're going to do an episode at some point in the not-too-distant future about meetings, about corporate meetings. And this is another example that comes to mind in that context, but it's very relevant to the critical thinking piece. It's less when the, you know, person of authority or the person who has a strong presence drives the groupthink, and instead when you have a room full of people who are in healthy disagreement about something, but to try to resolve things . . . to try to check the box . . . get the meeting to end . . . move on, you hear, “I think we're all saying the same thing. I think we're all saying this.” To me, that's another trap for critical thinking, because what it often does is actually undermine critical thinking and try to force consensus where, you know, one or more people are dissenting or are sharing a conflicting view.
HUI: Yes.
ZACH: Yeah, all right, so as we are wont to do, we've talked about a lot of the problems, but let's talk about some of the solutions. Let's talk about a framework for critical thinking. And so, Hui, what's the sort of first thing that you do and that you think others should do to really hit pause and say, no, let's make sure we're making a critical, thoughtful decision here?
HUI: We referenced this earlier, but the first step to me is always recognizing and distinguishing between facts, assumptions, interpretations, and perceptions. All of those things are different, and you have to correctly identify and label them so you know what to do with them.
ZACH: Yeah. Another way that I might put that, and this is another thing that we talk about a lot, is listening. Meaning stepping outside myself and listening to what other people are saying, or listening to other data, other documents, and other inputs that could potentially help shape the analysis. What else?
HUI: You just did something that is critical thinking, which is you pointed out an assumption underlying the statement I made. The assumption that I made was that you listen, because without listening you wouldn't even be getting the facts and assumptions and interpretations and perceptions. So, bravo for that!
ZACH: Okay, what's next?
HUI: Well, for the assumptions and claims that are being made, right? And perceptions . . . everything but facts: what exactly is the claim being made, and what is the evidence that supports it? And what is the evidence that's missing?
ZACH: Yeah. Absolutely.
HUI: We gave many examples of that earlier.
ZACH: Yeah. And then I guess an extension of that, and this is very much connected to the listening piece, is: where is this information coming from? Who's saying it? Who's supplying it? Are there motivations behind it? What do those motivations tell me about the quality of the information that I'm getting? Are there voices missing from this conversation that are important, that I need to seek out to make sure, to your point earlier, that I have the full context?
HUI: Yes, it's what's being said, what's supported by evidence, who is saying it, and what's behind those voices . . . and, you know, what's missing from the picture.
ZACH: Yeah, what else?
HUI: What other explanation could there be for the phenomenon that we're seeing, right? So sometimes we see a problem, everybody says, oh, it's because of blah. Well, could there be other explanations? Ask that question.
ZACH: Yeah. Or if you're sort of on the front end, as we were talking about, trying to decide what the project should be or how to scope something or how to design something, asking yourself: am I actually solving the problem that I just articulated? Am I doing something that I think is going to fix this problem, and do I have a good evidence-based reason to believe it? Or am I creating an explanation or an outcome that is not supported?
HUI: Yeah, and I think, get precise, right? Whether it's in your definition of what the problem is or in your framing of what the solutions might be, be very precise about what you're saying. Try not to generalize. Be very specific as to: this is the problem we're trying to solve. Not something vague and general, but something much more specific.
ZACH: Yeah, let's talk about getting an outside view. I think this is something that a lot of us do. This is a sort of natural thing, but it's this idea of getting out of your own head and bringing other people into the analysis with you, right?
HUI: A lot of us do it, but I feel like not enough of us and not often enough, because we all tend to get stuck, if not in our own heads, in our own echo chambers. We often seek out the views of people who share our views and perspectives. What I love about our working relationship, Zach, is that we do share similar views and perspectives; if we had wildly different perspectives and views, it would be quite challenging to work together. We share that, but we are not afraid to challenge each other. You know, we constantly say to each other: “Explain that more.” “Well, I'm not sure I would completely agree with that,” or “I might say it differently, in this way.” So, it is important to have those people in your discussions and your consideration of the decisions you're making.
And, you know, another example that I often give is when I first became a compliance officer, somebody asked me to draft a policy that governed a particular type of marketing activity. So, I sat in my ivory tower, and I drafted what I thought was a pretty darn good policy. And then I thought, huh, I should probably ask those who do this kind of marketing activity to see what they think. Well, I think within 20 minutes they tore my policy to shreds. I mean, there were very few facets of that policy left that didn't need significant revision, because they brought to me their perspectives . . . their way of thinking . . . and the reality of how they do things. All of those were missing. And when I did my little policy drafting, I ran it by my legal colleagues, and they were fine with it. But once I brought it to the people who do this type of work, it was a whole different thing. So, this is also part of our human-centered approach, right? If you're trying to design policies and procedures and programs that shape people's behavior, then it's very important that you get the perspectives and the realities of those people.
ZACH: Yeah, fully. You know, it's interesting, because as we're nearing the end of our conversation, I'm thinking that one of the things folks might experience or be worried about when it comes to critical thinking is that there's a way of doing it that might actually make you feel like, or might have you perceived as, a dissenter . . . as, you know, this person's a problem. Everyone else just goes with the flow, and this person is stepping out of that. And yet, the way you just described it, I think, is the way that I think of critical thinking, which is the exact opposite of that.
Critical thinking is actually about stepping out of yourself, looking more broadly than within, and it's often a deeply collaborative exercise. Especially in the ethics and compliance space, where we're ultimately looking to shape behaviors in ways that, you know, prevent and remediate misconduct. And so, there's a world of people out there whose behaviors we want to shape. And how do we do this without . . . how do we critically think without bringing them into the equation? So, I hope that those who maybe think, oh gosh, like if I really do all this, like people are going to think that I'm a problem. They're going to think that my critical thinking is critical rather than thinking.
HUI: Well, I actually think some of these efforts will be very much welcomed by your stakeholders, right? Because you're really seeking their input, you're seeking their partnership. This reminds me of something else that I often heard when I was at DOJ: companies, compliance folks, would come in and say, well, we built this website, but nobody came, right?
ZACH: Yeah, for sure.
HUI: And the clear statement they were making was: we've done what we could; something's wrong with those people, because they wouldn't come to the wonderful website we built.
ZACH: Yes.
HUI: Almost every time I have gone back with the question of, “if you came up with a product and nobody bought it, is that more of your problem or the consumer's problem?”
ZACH: 100%. I mean, I've probably told this story on here before, but I remember talking to someone a few years ago. They were a chief compliance officer, and they were really critical of a business leader within their organization who didn't know about a certain policy that they had. And they were like, this person is not fit to be a leader within our organization. And I don't know whether that's true or not, but what I do know is that if you're the chief compliance officer of an organization and a senior leader within your organization isn't familiar with a key policy, maybe you have some responsibility for that.
HUI: Indeed. Unfortunately, that is something that we see quite often.
ZACH: Yeah, all right, Hui, we've reached the end. We will continue to talk about this topic throughout 2026, but what are your final takeaways for our listeners?
HUI: I would say, first, separate facts from everything else. Assumptions, interpretations, perceptions. Question, question, question. Have that curiosity. Question your first conclusion. Question why people arrive at the conclusions they do. Just be curious. And finally, always ask for evidence.
ZACH: I couldn't have said it any better. Thank you so much, Hui. It's always good to have these conversations. I'm excited for the year ahead and thank you all for listening.
HUI: Thank you. Have a wonderful 2026.
ZACH: And thank you all for tuning in to The Better Way? Podcast. For more information about this or anything else that’s happening with CDE Advisors, visit our website at www.CDEAdvisors.com, where you can also check out the Better Way blog. And please like and subscribe to this series on Apple Podcasts or Spotify. And, finally, if you have thoughts about what we talked about today, the work we do here at CDE, or just have ideas for Better Ways we should explore, please don’t hesitate to reach out. We’d love to hear from you. Thanks again for listening.