Client Management For Nice People: Jaw-dropping client experiences (and how they changed us).

Transcription of Christian Espinosa’s episode (That time when a client hired you to expose vulnerabilities and it ended up blowing up in their face)

This transcription belongs to Episode #53: That time when a client hired you to expose vulnerabilities and it ended up blowing up in their face (with Christian Espinosa). Please watch the complete episode here!

Morgan Friedman (Host): Hey everyone, welcome to the latest episode of Client Horror Stories. I’m very excited to have with me, Christian Espinosa. How are you doing today? 

Christian Espinosa (Interviewee): Hey, I am doing great, Morgan. How are you? 

Morgan Friedman (Host): I’m excited for our podcast. We had a great preparatory talk the other day, so now let’s jump right in. Tell us all about your biggest or craziest client horror story. 

Christian Espinosa (Interviewee): Well, I’ve had a lot of crazy client horror stories, but I’ll tell you about one that is very memorable for me. My company does ethical hacking, or penetration testing, and we often do it against healthcare facilities. And this particular story is about a healthcare facility in Louisiana.

They hired us to use any means possible, which includes physical access, technical means like hacking them across the internet, social engineering, any means that we could use to prove or disprove how secure they are.

And we have very clear rules of engagement, but maybe not clear enough based on this scenario. Because we say what kind of tactics we’ll use, we say, “We’re gonna emulate an adversary and we’re gonna find data on the internet about your employees, and then maybe use that in a method that entices somebody to click on something or to do something, that would give us control of their computer.”

So in this scenario, we came up with a plan. And it’s pretty common for us to come up with something like this. 

Morgan Friedman (Host): Okay. 

Christian Espinosa (Interviewee): We created an Excel document that had malicious code in it. It was our malicious code. 

Morgan Friedman (Host): Right. 

Christian Espinosa (Interviewee): So it didn’t spread automatically. And we scraped the internet and found a list of employees for this hospital.

So we put a pretty big list of them in this Excel document. And we labeled the document “2023 Layoffs”: the name of the organization, space, “2023 Layoffs”. And then we put this document on… 

Morgan Friedman (Host): By the way, I just wanna interrupt and say that’s a brilliant marketing strategy. I feel like a bureaucrat or a mid-level employee at any company on the planet, just seeing the layoffs at my company this year, I’m assuming this was in 2023, it’s like… 

Christian Espinosa (Interviewee): Yeah. 

Morgan Friedman (Host): I wanna click on that.

Christian Espinosa (Interviewee): Yeah, exactly. I mean, that’s the whole thing: you try to make it enticing and realistic, so people can’t help themselves but click on the document or open it up. So, we put that Excel document on five USB thumb drives, and then strategically dropped them at various points in the hospital. 

Like, left one in the bathroom, put one outside of the data center, one by the drinking fountain. And the whole idea is somebody would pick up this thumb drive, put it in their computer, and then open up that document because it has an enticing name. And of course… 

Morgan Friedman (Host): I love it. I think I would fall victim to that. 

Christian Espinosa (Interviewee): Yeah, a lot of people do. The criminals are good at this stuff, and that’s what we’re trying to test: how good their training is and… 

Morgan Friedman (Host): Exactly.

Christian Espinosa (Interviewee): And about, number one, picking up a mysterious thumb drive. You know, that’s okay by itself. But then putting it into your computer, that’s typically a violation of policy. In this case, it was a violation of their policy. And then on the file, if you open up the Excel document, we have a big warning that pops up that says, “This is confidential data.”

If you run across this, contact this person… you know, we made up this whole storyline. And if you’re certain you wanna see the data, click on this button. But the button they clicked on to reveal the document is actually the macro that installs our… or runs our malicious code. 

Morgan Friedman (Host): I see. 

Christian Espinosa (Interviewee): So, it’s set up in a way where it looks like it’s legitimate, but it actually tricks them into doing that.

So, we had a number of people open up that file and then what happened is, because we had real people on the list, and some of these people probably knew those people. What happened is, one of the individuals took the file off the thumb drive and emailed it to a number of employees, and then it basically spread throughout the entire hospital.

And from our perspective, when somebody clicks on or opens up the document, we see their computer pop up on our end as one we can remotely control. And we ended up infecting most of the hospital.

And somebody even took it home, and we inadvertently infected their personal computer at their house.

But like I said, it wasn’t anything malicious, we’re the ethical hackers. 

Morgan Friedman (Host): Right, right. 

Christian Espinosa (Interviewee): But what was interesting about this is, the CEO of the hospital, when we were debriefing what happened, got super upset with us. Like, to the point where she was threatening to sue us. 

Morgan Friedman (Host): Whoa. Hold on. Was it the CEO previously that hired you or was it someone else in the organization?

Christian Espinosa (Interviewee): So the CEO and the CIO are the people that hired us. They knew this was going on. And this is the challenging thing in my industry: people often have this false sense of security, that my staff, my team, will never do something like that. 

So, that’s probably what the CEO thought: nobody’s gonna fall for anything. But the fact that we were able to compromise like 90% of the computers at the hospital caused the CEO to get super angry and upset with us. 

And to the point where, in the debrief, she was yelling at us and saying, how dare we use those kinds of tactics. These are real people on the team and they may think they’re gonna be fired and have a mental breakdown.

I’m like… and I understand the perspective, but if I see my name on a document, I’m gonna go talk to my boss. I’m not gonna just freak out about it initially. And that’s not really the point; the whole point is, if this were a malicious actor, they could have compromised the entire hospital and stolen everybody’s records.

And really put that hospital in a very exposed position from a liability perspective… 

Morgan Friedman (Host): Right. 

Christian Espinosa (Interviewee): For HIPAA data and protected health information. So yeah, that was one of the challenging things we had with the CEO, and the CEO threatened to sue us. And then there was also a super zealous security guard who knew we were doing this physical… Yeah.

Morgan Friedman (Host): Before we get to the super zealous security guard, let’s dive in a bit to the CEO because what’s curious about this situation and what I wanna unpack for a few minutes… 

Christian Espinosa (Interviewee): Yeah. 

Morgan Friedman (Host): Is, the CEO hired you to do this? Like, did I… So the surprising thing is I would understand the rank and file of frontline employees being angry that they took advantage of me for a test, but like, this is exactly what the CEO signed up for.

So like, where do you think the breakdown happened?

Christian Espinosa (Interviewee): I think we did not explain… I think two things. One, we are pretty explicit about a list of things we could try, we may try. We don’t list everything, because we don’t know what intel we’ll have until we do the research on the company… 

Morgan Friedman (Host): Right. 

Christian Espinosa (Interviewee): And scrape the internet and find things.

But you know, in the future, after this scenario, we started doing this: we explain, “Here’s a type of tactic we may use. Are you okay with this?” And we go through the exact same scenario I just went through. Most people say, use whatever you want, because my staff aren’t… they’re not gonna fall for anything.

They’re well trained on this. So I think… 

Morgan Friedman (Host): Yes. 

Christian Espinosa (Interviewee): Her main contention was that we used real people’s names on a document, and those people might have leaked it to the… this is one of the things she told us. She was concerned that one of her staff who was on that list would go to the newspaper and tell them the hospital is laying off nurses and staff, and it would cause a panic.

So there’s all these… 

Morgan Friedman (Host): Right. 

Christian Espinosa (Interviewee): Like weird scenarios that were going on in her head. And she almost forgot the whole point of what we were trying to do, which is show the risk and the vulnerabilities to that hospital, because people weren’t doing what they’re supposed to do.

Nobody was supposed to put a thumb drive that they picked up into a computer. And nobody’s supposed to open a document that they don’t know anything about. And then on top of that, they’re not supposed to email it to everybody else either. So, there were a number of policies that were broken that she thought these people knew better than to break.

 But yeah, that’s the whole point, is to expose these vulnerabilities so we can fix them in the hospital and help make the hospital more secure in the future. 

Morgan Friedman (Host): So, that’s interesting. I feel like there are a lot of people and a lot of leaders that sign up for something in theory, but as soon as it happens and there’s a fire under their ass, guess what?

When there’s a fire, people run far away from the flame, even if it’s freezing and you need the fire, and even if you purposely lit the fire. 

Christian Espinosa (Interviewee): Yeah, that’s a very good point because it was their idea to have this done, but the fact that we were successful is what I think was the driving factor of… 

Morgan Friedman (Host): And this is also a good example of the common pattern of “blaming the messenger.” You were the hired messenger; they paid you to figure this out. But when it turned out that 90% of the computers got infected, rather than saying, how do we improve our security or train our staff, et cetera, they got angry at the messenger. A very common pattern.

Christian Espinosa (Interviewee): Yeah. And that’s exactly what happened. And I can understand that and take it, but my team was getting the brunt of it as well, and they weren’t prepared for that sort of fury from our client, especially when they thought they did a good job. 

They accomplished the objective, my team did. And then they’re getting yelled at by the client. 

Morgan Friedman (Host): I just wanna call attention to what you said because it’s an interesting point. I’ve done maybe 70 of these episodes now, or 80, and almost always when there’s a problem, both sides know. Ignoring who’s to blame, usually both sides know when something’s wrong.

So, this is an interesting case where your team was like, yeah, we passed the test. We did it awesome. They’re super happy. When instead, it’s the exact opposite extreme. 

Christian Espinosa (Interviewee): Yeah. Because our objective is to find their weaknesses so they can shore them up for the future. And that’s what we did. And my team was very proud of themselves for coming up with this whole scenario, doing the research, finding the names, crafting the malicious code that wouldn’t set off their antimalware. 

And there’s a lot of work involved with this. And then, to have the client be ungrateful, even though we did find things that they need to fix, obviously.

Morgan Friedman (Host): Yeah. Maybe a related lesson here is, whenever you have to confront someone on a weakness, whether it’s an institutional-level weakness, like unenforced policies, or a personal one, “Hey Chris, you suck at x, y, z,” that is fundamentally a situation that’ll put the other side on edge, so you always need to tread doubly carefully in those situations. 

Christian Espinosa (Interviewee): Yeah. Yeah, and I think there are a couple layers of dynamics with these scenarios. We do this kind of test a lot, and typically the CIO or the CISO, or whoever’s in charge of information security or cybersecurity, will consistently tell the CEO that everyone is trained in cybersecurity, that they’re way more secure than they really are.

And then the CEO’s like, okay, let’s trust what you’re saying, but we’re gonna verify. We’re gonna hire a third party to come in here and verify everything that my team is telling me, and we should find nothing, right? 

So then we find something… and it unravels another layer there: they’ve been misled by their team into this false sense of security.

Morgan Friedman (Host): You know? I love it. That is a good point. Because it’s not just that you’re exposing a weakness, but a weakness where they were even led to think that they were secure there, which makes it doubly hard. If you suspect you have a weakness, then someone telling you that you have that weakness is not so bad.

Like, if I really think I have some terrible disease and the doctor confirms it, it’s not as terrible as me thinking I’m healthy. 

Christian Espinosa (Interviewee): Right.

Morgan Friedman (Host): And then discovering I have a terrible disease. So that adds yet another emotional layer on top. So dealing with this is very tricky. I wanna share a 30-second story and then you can tell us about what happened with the security guard.

There’s a classic business analysis called a “SWOT analysis,” where you have a business look at its strengths, weaknesses, opportunities, and threats, the SWOT acronym. I’ve seen and been involved in hundreds, maybe thousands of them over the years. 

And just a few days ago, I was presented with a SWOT analysis where, for the first time, instead of the W standing for “Weaknesses,” this team labeled it “Watch outs.”

I’d never heard it before. And what I thought was fascinating about that was, no one wants to be told their weaknesses or the weaknesses of their business plan. So, they made it California-style euphemistic: no, these aren’t weaknesses, these are things you need to watch out for. And it was kind of brilliant, I have to say.

Christian Espinosa (Interviewee): Yeah, a simple change of words might change the perspective, but it’s still the same thing. We can call it something different, but yes, of course. 

Morgan Friedman (Host): Yes. 

Christian Espinosa (Interviewee): Maybe it’s a little more socially acceptable today to call them watch outs versus weaknesses. 

Morgan Friedman (Host): So, what happened with the security guard?

Christian Espinosa (Interviewee): Yeah, so the security guard knew we were doing the test, and he was overzealous in a lot of good ways. Like, he put the entire team through active shooter training. So, he’s very proactive. But one of my team members’ jobs was to walk around the hospital and look for closets they could get into, a closet that had computers in it or a wiring closet, and plug in a device that would give us control as well.

So, my guy was walking around checking doors, and I always tell people… I train my team on how to do this: you have headphones in and act like you’re listening to music or you’re on your phone. So then if someone notices you, you just keep walking and act like you can’t hear them.

So, the guy on my team did this. He got into a comm room. He plugged in a device that allowed us to remotely get into their computer system. So, there were two ways we had in. And then as he was walking out, the security guard saw him on the camera, ran down to where he was, and told him, “Hey, stop.”

And my guy just kept walking. So he was walking out the door of the hospital and the security guard pulled his gun on the guy… 

Morgan Friedman (Host): Whoa. 

Christian Espinosa (Interviewee): Which I think was a little bit too much, and told him to freeze. He was walking like he couldn’t hear, but he could actually hear. But, you know, I’d told him how to handle these situations. And never did… 

Morgan Friedman (Host): Question. Question. Did the security guard know this particular guy was part of the project, or the test, or did he actually think it was someone trying to hack the system? 

Christian Espinosa (Interviewee): So the security guard knew that this test was going on. Ideally, people don’t know, but he did know. So there was a heightened sense of awareness, and I think a heightened drive to demonstrate how secure the facility actually was, when there was a vulnerability that we found, or a watch out, or whatever it is you wanna call it. So, watch out. So, yeah. 

Morgan Friedman (Host): Feel free to start using the phrase, “Watch outs”, if you want. 

Christian Espinosa (Interviewee): Yeah. So, I think… again, it’s a thing where he’s been telling the CEO as well how secure the facility is from a physical standpoint, and then we find a door unlocked and we plant something in there.

So it’s not making him look the best either. So he’s trying to kind of overcompensate. And I asked, after the fact, if it was standard protocol, if somebody walks into a comm closet at a hospital, for you to pull a gun on them, and they said, “No, not unless the person’s armed.”

So I’m like, “Well, what made you think the person was armed? Because he had no gun.” So, you’re breaking your own protocol, just to sort of try to prove that you’re more secure than you are.

And that’s the challenge with a lot of these tests. Everyone thinks they’re secure, as I mentioned, and we come in and show them that they’re not, but that’s the whole point of the test, so they can become more secure after… 

Then we go through our results, and they hopefully learn the lessons, and change procedures, and change combinations, and maybe get rid of some of the staff, and make improvements. 

So again, this is my teammate who had a gun pulled on him. He was a little bit freaked out about it, you know? It’s like, not everyone wants to have a gun pointed at them.

I mean, it happens sometimes, but normally, somebody will call the cops and say, this person is in the parking lot versus try to do something like that on their own. Especially when it was very evident the guy on my team did not have a weapon. 

Morgan Friedman (Host): So, there’s an interesting dynamic here that’s also worth exploring of people needing to demonstrate their power. And knowing that the security guard knew that a test was going on, it sounds like he had this emotional desire to really show his power. 

No, it really is under control. We’ll shoot people who try to get into closets. And what’s interesting about that is, it’s hard to control that or compensate for that.

Like, if someone is looking for your weaknesses, there’s this very natural human desire to go to the exact other extreme. And so, you know, I’m not just strong, I’m very strong. 

Christian Espinosa (Interviewee): Yeah, I think that’s what happened. It was like, oh, we caught him on the camera, so we’ll show him how we handle the situation. From my team’s perspective, they all thought they did a great job, but having a gun pulled on you, getting yelled at by the CEO, is a little bit of emotional trauma, I think, for some of my team members. Yeah. 

Morgan Friedman (Host): So, I’ll make another observation on someone pulling their gun out. I feel like that is unprofessional at an obvious level, because it’s not called for and it’s a violation of the hospital’s protocol. But it’s also unprofessional at a more subtle level that’s worth calling out or mentioning, which is intelligence agency or military training 101, like the first day… like what the first years of training emphasize.

Don’t use your power, never use the gun. I remember when I was a little kid, I took karate classes, I hated it. But, I remember like half of every class was all about when not to get in a fight and avoiding being in a fight. So I feel like the real trained, serious professional… if you have a gun, it’s a bit like a magic power.

You have life and death in your hand. So the real professional should have it very internalized that you only pull your gun as the absolute last resort, when every other possible means to solve the problem has failed. 

So, the security guard running to it shows that he’s letting his emotional state overwhelm the professional training he was surely taught… 

Christian Espinosa (Interviewee): Yeah.

Morgan Friedman (Host): Of only using the gun as the last resort. And that in itself is even scarier. 

Christian Espinosa (Interviewee): I agree that professionalism should override that emotional state, like you said. But I think, if you kind of peel back the layers, he had been telling the CEO that everything was locked down and we wouldn’t find anything, as well.

So, it’s like a little bit of the ego involved in both these scenarios. The one with the CEO and the one with the security guard. 

Morgan Friedman (Host): Well, it is, yeah. These are both people that have challenges managing their ego. So there’s this other factor, that people tend to hire similar people. Like, asshole CEOs that I know tend to have asshole VPs and directors working right under them, while nice guy CEOs I know tend to have nice guy VPs and directors working under them.

So, it could be that there’s an even more subtle layer to the onion that we’re peeling around this hospital: maybe there’s just a fundamentally asshole, unprofessional culture, and that’s the sort of person they hire for, starting with the CEO, and that actually makes real change difficult.

Christian Espinosa (Interviewee): Yeah, that’s a good observation and I like that. And that’s probably true, you know? Because people tend to wanna work with people, or hang out with people, like them. So yeah, there’s probably a lot of truth to that. 

Morgan Friedman (Host): People wanting to work with, and hang out with, people like them isn’t a bad thing. It’s very natural. If I’m someone who really values, whatever, having a high IQ and logical problem solving, I’m naturally gonna hire people that I think have a high IQ and are good at logical problem solving, for good or bad. 

That’s just the way humans approach their problems. If I happen to be a guy that values loyalty to my ethical and moral code above everything else, guess what? I’m gonna hire people who are really loyal to their moral and ethical code above everything else. 

So, it’s natural that teams, companies, hospitals congregate around the particular personality characteristics of, especially, the leadership. In fact, there’s a saying, and I’ve heard this in a whole bunch of Eastern European languages, like Russian, but I don’t think I’ve ever heard it in English, where they say, “A fish rots from the head down.” 

Something like that. It gets at this same point: if the CEO is this sort of person, then the whole organization is going to be filled with the exact same type of problems. 

Christian Espinosa (Interviewee): I agree with that. Like when people complain about the service they get and they ask to talk to the manager. My point of view is the manager’s part of the problem, just like you said. So, there’s no point. 

Morgan Friedman (Host): Yeah. As for escalating to the manager, I had never thought about it that way, but that actually makes perfect sense: if there is such bad behavior, then the manager isn’t doing a competent job.

Christian Espinosa (Interviewee): Right. 

Morgan Friedman (Host): So therefore, what do you expect from talking to the incompetent manager? 

Christian Espinosa (Interviewee): Right. 

Morgan Friedman (Host): Okay. So, interesting question, based on this experience, how did you go about changing your company’s policies, or communications, or what have you done differently to avoid these two types of scenarios going forward?

Christian Espinosa (Interviewee): A couple things. One is to be way more clear on the tactics, the types of tactics, and the scenarios that may come up. So, we walk through those and get buy-in from whoever hired us at the company. They typically don’t tell everybody we’re doing the test, so it’s whoever the trusted source is.

And then… if we get buy-in, we also bring up scenarios like, “Well, what’s your policy? If we get into a communications closet, we plant a device and we’re in the building, how do you normally handle that?” So, we try to ward off these things before they happen and alter our tactics.

Because the last thing I wanna do is put my team in danger, or have them get yelled at by the client again for using too realistic of a tactic. 

So we do that, and then we also have a step where we say, if you want us to, before we launch anything, before we do any scenarios, we’ll have you green light them. If you want to go to that level of detail, we’ll do that as well.

So we make sure we… 

Morgan Friedman (Host): You mean green light at the specific level of, we’re going to do this test in this way? 

Christian Espinosa (Interviewee): Yes. Yep. So then the CEO, in that case, would say, “Oh, I had full awareness that they were gonna use this specific tactic. I had full awareness that they were gonna try to get into the comm closet, plant a device, and then leave the facility.” So, that doesn’t ward off everything, but yeah. 

Morgan Friedman (Host): So, I have a question on that. Since you’ve implemented that, does that stop them from getting angry again, or not so much? Because my guess would’ve been, even if they knew, the asshole CEO would still be angry, because there’s this type of person that, in my experience, is really into blaming the messenger when there’s a fire under his ass.

He needs to blame someone else rather than himself, so it just doesn’t matter if he knew or not. 

Christian Espinosa (Interviewee): Yeah. So to answer that, we’ve had super confident, almost borderline arrogant people say, “Oh yeah, use that tactic, that will never work with us. My team is trained.” So we do get the green light, and then the tactic does work, and they still get mad at us. 

Yes. And that’s the nature of the business I’m in. People have this false sense of security, and one of the things we have to do is let them know, you haven’t been secure in a really long time. So, it kind of freaks people out. You think every night, all the doors are locked in my house.

But the reality is, someone’s been in your house pilfering through your stuff for a year and you never knew, because you thought you were secure. So, it’s a big pill for people to swallow as well. 

Morgan Friedman (Host): So, just brainstorming here. I know nothing other than what you told me, so I’m riffing, trying to think how I would go about minimizing clients getting angry at me and threatening to sue me like they did here. 

Because telling them at this level of detail doesn’t really solve the problem. Here is one possible way, which is just pure gut instinct riffing, that I think I would try.

What if, after the green light conversations, after all that, before you get started, you have another conversation with them that goes something like this, “Hey, after we do the test, you…” I’m being you in this case, talking to the client. Hey… 

Christian Espinosa (Interviewee): Sure. 

Morgan Friedman (Host): When we finish this test, you, sir, are going to get really, really angry at me. Because I’m going to demonstrate the hole in your security, and you are going to blame me. So, before I even do the test, I just wanna get that out of your system now, so we’re all prepared for it. 

And then continue from there, and just assume the conclusion first and have him get angry then and there.

And I feel like explicitly saying that so directly, and letting him get that anger out like that, has one of two results. First, if it so happens that they are secure enough and you don’t find anything, great, there’s no harm, you wasted 10 minutes on that talk beforehand. 

But, if you do find something, and I’m assuming the CEO or whoever isn’t a total idiot, even with the highest ego and even needing a scapegoat, he’s going to remember what you said. He’s like, Chris told me I would be furious about this work. And I am.

And just knowing that lessens the fury incredibly, because then suddenly the fury is part of the show that Chris is putting on, and that is what he wanted. That’s what he knew was going to happen. And knowing that just makes him so much less angry. What do you think of my suggestion?

Christian Espinosa (Interviewee): I think it’s a brilliant suggestion, and yeah, I wish I would’ve thought of it. Because I’ve used that tactic in relationships and other things in the past, where I said, later I’m gonna do this and you probably will get upset with me, right? But then when it actually happens, they’re not as upset, like you said.

Morgan Friedman (Host): This is my, this is my… 

Christian Espinosa (Interviewee): It’s a weird human psychology thing. Yeah. 

Morgan Friedman (Host): This is my exact point. Telling someone, you are going to be angry at me over X, Y, Z, just makes them half as angry, or half as intense, as they would’ve been. Just because, by you saying it beforehand, it suddenly doesn’t become this random bad thing that Chris did. It’s rather part of Chris’ plan. 

Actually, I like how I’m saying it. This is good. Because the difference between predicting it beforehand or not is whether Chris fucked up. If you don’t predict it, Chris fucked up. But if you do tell him it’s going to happen, it’s part of Chris’ plan. 

And guess what? You get angry when the person that you’re paying all this money to and hired fucks up. But when the person you hired executes on the plan, and you not just know the plan, you know the emotional reaction to the plan, you know you’re getting angry as part of the plan, guess what? It’s still a plan. 

He was still following the plan, so it’s no longer him fucking up. So, it makes you feel so much better.

Christian Espinosa (Interviewee): Yeah, I think setting the expectations on the emotional response, given a couple of scenarios, yeah, I think that’s a great improvement. 

Morgan Friedman (Host): Okay. Come to the Client Horror Stories podcast to share a horror story, leave with awesome advice on how to prevent these scenarios even more.

Okay. Again, that was a fun riff and a fun brainstorm. To wrap up, are there any other lessons from this incident you wanna share? Or, stepping back, are there any other related lessons about professionalism or dealing with difficult clients in general that you wanna share to wrap up the episode?

Christian Espinosa (Interviewee): Yeah, so what I do now, with much more scrutiny, is look at whether a client, when they’re a prospect, not even a client yet, is difficult in the prospecting part of the sales cycle. If they are, they’re probably gonna be a very difficult client during delivery. So, I am much more selective now about which clients I will even take on, to prevent these sorts of scenarios.

Because from my experience, with anyone that’s difficult in the sales process, those emotions come out, the asshole factor comes out. I don’t want to deal with them and I don’t want my team to deal with them on the delivery side. Because it’s the same, if not worse, because they paid you, so they typically think they can treat you however they want.

Morgan Friedman (Host): That is… yeah, that is a very good point. I do the exact same thing. How people treat you before you even start is, unfortunately, almost always the high point. 

Christian Espinosa (Interviewee): Yes. 

Morgan Friedman (Host): So like no matter what that is, it’s going to get worse. So, if it already starts at a lower than average place, that is a massive risk factor.

There are a few techniques around that. One of which I’ll mention is, have you ever heard of the phrase that some of the people I’ve worked with call “The Japanese No”? Do you know that one? It’s… 

Christian Espinosa (Interviewee): The Japanese No? 

Morgan Friedman (Host): The Japanese No. I actually don’t even know if this is a known thing. I’ve never Googled it; it’s just how people in my circles talk. The Japanese No is basically when you don’t wanna take on a project, so you give them a crazy high price. 

What’s interesting about that is, sometimes they say yes to the crazy high price. And it’s kind of brilliant because it’s like it’s win-win. They’re gonna be an asshole and really difficult. That’s not worth it for $10,000. But, for $60,000? That might be worth it. So… 

Christian Espinosa (Interviewee): Yeah, I’ve actually tried that tactic, and I’ve had a couple people say yes. I wish I would’ve made the price 10 times higher than what I already thought was outrageous, because the asshole factor was magnified sometimes.

Morgan Friedman (Host): Yeah. So… 

Christian Espinosa (Interviewee): And my team had to deal with it. It just burned up a lot of people’s time and energy. Yep. 

Morgan Friedman (Host): But like often in business, that’s okay to do if there’s a price. It’s just about setting the right price. And one of the learnings from that is, the price is often higher than you think.

Rather than setting it at 5x the normal price, next time try setting it at 20x the normal price. And guess what? Instead of 10% of the assholes saying yes, maybe 5% of the assholes will say yes, but you’ll make four times as much per project and you’ll have fewer assholes to deal with. 

Christian Espinosa (Interviewee): It’s the asshole tax, I guess, is what we’re adding to the bill. Yes. 

Morgan Friedman (Host): So, I think those are the tactics. There’s an earlier part where you wanna first detect how much of an asshole someone is, and then there’s adding on the asshole tax. And it’s also not a black or white thing; there are gradations of being an asshole. 

For example, I have a family member that I’m related to by birth, by genetics, whose name I will not use for the very, very, very small chance this person might watch this, but this person is not a particular asshole and would never rip people off, would never do any of these terrible things we’re talking about.

But this person is the sort of person that asks 4 billion questions and sends messages on every little needless, stupid detail, nonstop. So, this isn’t a huge asshole where you need 20x the price and he’ll be miserable, but it’s just this mid or lower level where there’ll be this constant stream of annoyingness…

Christian Espinosa (Interviewee): yeah.

Morgan Friedman (Host): For no reason, that you’ll just need to deal with. So, maybe instead of a 20x, that’s a 10x. So, there’s a gray scale here. 

Christian Espinosa (Interviewee): Yeah. We need to chart out the scale. 

Morgan Friedman (Host): That would be pretty funny. Okay. Chris, so many interesting insights have come from here. This was a fun episode and I love your story of having a file, “Layoffs 2023.” It’s genius. Well done. 

Thank you for coming on the show. It was an honor. And to everyone who’s watched the episode up until this point, I hope you’ve had as much fun watching us as we’ve had talking about this. Bye-bye everyone. Thank you.

This transcription belongs to Episode #53: Christian Espinosa’s Story, please watch the complete episode here!