3555: Immersive on Why Incident Response Plans Break Down in Reality
Tech Talks Daily, January 15, 2026
28:37, 21.85 MB


What really happens inside an organization when a cyber incident hits and the neat incident response plan starts to fall apart?

That question sat at the heart of my return conversation with Max Vetter, VP of Cyber at Immersive. It has been a big year for breaches, public fallout, and eye-watering financial losses, and this episode goes beyond headlines to examine what cyber crisis management actually looks like when pressure, uncertainty, and human behavior collide. Max brings a rare perspective shaped by years in law enforcement, intelligence work, and hands-on cyber defense, and he is refreshingly candid about where most organizations remain unprepared.

We discussed why written incident response plans often fail when they are needed most. Cyber incidents are chaotic, emotional, and non-linear, yet many plans assume calm decision-making and perfect coordination. Max explains why success or failure is often defined by the response rather than the initial breach itself, and why leadership, communication, and judgment matter just as much as technical skill. Real-world examples from major incidents highlight how competing pressures quickly emerge: whether to contain or keep systems running, whether to pay a ransom or risk prolonged downtime, and how every option carries consequences.

One idea that really stood out is Max's belief that resilience is revealed, not documented. Compliance and audits may tick boxes, but they rarely expose how teams behave under stress. We explored why organizations that rely on annual tabletop exercises often develop a false sense of confidence, and how that confidence can become dangerous when decisions are made quickly and publicly. Max explained that the best-performing teams are often the ones that feel less certain in the moment because they question assumptions and adapt faster.

We also dug into the growing role of crisis simulations and micro-drills. Rather than rehearsing a single scenario once a year, Immersive focuses on repeated, realistic practice that builds muscle memory across technical teams, executives, legal, and communications. The goal is not to predict the exact attack, but to train people to think clearly, collaborate across functions, and make defensible decisions when there are no good options. That preparation becomes even more important as cyber incidents increasingly spill into supply chains, manufacturing, and the physical world.

As public scrutiny rises and consumer-led legal action becomes more common after breaches, reputation and response speed now sit alongside forensics and recovery as business-critical concerns. This episode is a candid look at why cyber crisis readiness is a discipline, not a document, and why assuming you will cope when the moment arrives is a risky bet.

If resilience only shows itself when everything is on the line, how confident are you that your organization would perform when the pressure is real and the clock is ticking?

Useful Links

Thanks to our sponsors, Alcor, for supporting the show.

[00:00:04] Today, I'm going to be joined once again by Max Vetter, VP of Cyber at Immersive. He's also a friend of the show, and one of the reasons for that is he brings a rare combination of frontline experience from his time in law enforcement and intelligence work through to leading cyber resilience programs for global organizations.

[00:00:25] I think he has somewhat of a unique vantage point here with his career. And let's be honest, over the last 12 months, we've seen a huge increase in cyber incidents from high profile retail breaches to supply chain disruptions that ripple far beyond IT teams.

[00:00:43] But today, I want to explore why a cyber crisis never unfolds the way that those incident response plans suggest. Yeah, everyone loves a good plan. We have everything in place. But when a cyber crisis hits, pressure, psychology, and competing priorities all collide in the exact same moment. So maybe resilience is something that you can only truly see when things actually start to go wrong.

[00:01:11] So I want to talk about crisis simulations today, decision making under stress, and the growing role of AI in both attack and defense. And also why compliance alone is never enough when the stakes include reputation, revenue, and trust, and everyone goes into panic mode. If that sounds familiar, you're going to love this one. So buckle up and hold on tight, because I'm going to beam your ears directly into the conversation.

[00:01:40] But before I get my guest on today, I want to give a quick thank you to my friends at Denodo. The data world is louder than ever. AI hype, lakehouse complexity, and pressure to deliver more with less. These are things that I talk about every day on this show. But Denodo is helping businesses make sense of it all, because they provide a unified data foundation for trustworthy AI, lakehouse optimization, and data products to finally bring self-service to life.

[00:02:08] So if you're ready to unlock real outcomes, simply visit denodo.com today. But now, it's time for today's interview. Let me introduce you to today's guest. So a massive warm welcome back to the show, Max. You've been on before. I think it was around a year ago. But for anyone that missed our previous conversation, can you remind everyone listening a little about who you are and what you do?

[00:02:33] Yeah, thanks, Neil. Yeah, good to be back. My name is Max Vetter. I'm the VP of Cyber at Immersive. Before Immersive, I did various things. I was in the police. I did some open source intelligence investigations, both in the private sector and the police. And then I was a contractor into GCHQ as well. So a bit of a potted history of interesting things and investigations.

[00:02:56] Then I joined Immersive, and that was seven years ago now. And we've grown a lot since then. And yeah, I look after the cyber team. So covering all the exercises, simulations, labs and everything we create from the cyber side. Yeah, across all the different products we have. Well, it's a pleasure to have you back on the podcast. It's been a big year for cyber incidents and breaches and incident response plans.

[00:03:25] They've been in the news all year. And particularly, I think the Land Rover incident cost over a billion pounds, or a billion dollars, a lot of money. M&S, something like 300 million as well. So I'm curious, from everything you've seen this year, how do cyber incidents differ from what an incident response plan anticipates? Because there seems to be some kind of mismatch here from some of the headlines I've seen. But what are you seeing? And explain the difference there.

[00:03:50] Yeah, well, the plans are always typically written down, right? And the problem with a written down plan is it's linear. And incidents are not linear. They're pretty chaotic. It doesn't factor in things like psychological stress, doesn't factor in novel attacks, doesn't factor in coordinating legal, PR, tech, board, everything.

[00:04:15] If you're around the world, if someone's on a plane, all these little things that are really difficult to understand before an attack happens. And typically, what we see is the response. So whether a company's done well or badly after a cyber attack is usually actually the response, not the crisis. So it's not actually how bad it was to start with. It was how they deal with it and their response.

[00:04:45] And obviously, you mentioned M&S and Jaguar Land Rover. There's so many things we can talk about in both of them. But essentially, a huge amount of money lost for the company, and a threat to people's jobs and everything. And with Jaguar Land Rover, one of the things that came out was their supply chain. They seem to have the cash to be able to get over it, but they thought tens of thousands of their suppliers might go out of business.

[00:05:15] So, yeah, they clearly hadn't run it to that detail. If they were running crisis management tabletops or simulations, they clearly hadn't run it to that extent that they could really understand the impact of a cyber attack like that. Yeah, it reminds me of the old Mike Tyson line. Everyone's got a plan until they get punched in the face. It's kind of right, isn't it? When that attack hits, everything goes out the window. Everyone goes into panic mode.

[00:05:46] Yeah. And what would you say are some of the competing technical and business pressures that teams are facing simultaneously during a severe cyber event? I'm from an IT background. I've been in those P1 situations. Lots of people running around with a laptop with a screen open, running into the data center, etc. But what are you seeing here, some of those pressures?

[00:06:07] Well, I think JLR is a good example of the question: is it containment or continuity? Do you pull the plug? Because obviously, very quickly, if you're talking about ransomware, the interesting bit is always: would you pay the ransom? And we do this in all the conferences we go to.

[00:06:35] And all the stats show that 85% of people say, no, don't pay the ransom. And as an ex-police officer, obviously never pay the bad guys. That's what we always said. However, when you look at the actual data, 85 or 90% of people actually pay. So it's all very well and good saying we don't want to pay the bad guys, because everyone knows you shouldn't pay. But the choice is between: are you going to have a job tomorrow or not?

[00:07:02] And that is the choice all the way up to the CEO when it comes to these attacks. Well, do you pay, say, 5 million for someone like Jaguar Land Rover? I don't know the actual amount. But, you know, that's nothing next to the money being lost. And then, yeah, do you turn it off or do you try and continue? Do you want forensics or do you want to recover?

[00:07:26] There's all these kind of totally competing things that are happening during an incident. And obviously the technical people will have one opinion, but really the board or the exec are the ones that have to make all the decisions at that level. Knowing techies and being a techie, they would always be, oh, let's do forensics. Let's do containment. Let's take the probably less economical choices for the business, like, yeah, never pay the ransom, et cetera, et cetera.

[00:07:55] Whereas the exec have to make those harder decisions, I suppose, to say, well, it doesn't matter, paying the ransom doesn't matter. We just need to get back online faster. And it's kind of the same when it comes to the PR bit. Are you totally transparent? Does that put you at other risks? Yeah. So every kind of decision is a trade-off, and we call them wicked problems.

[00:08:23] You have four options and they're all terrible. So which one do you pick? And that's something that everyone has to get used to. This is a terrible day and you're going to have terrible choices, whatever happens, but you've just got to pick the least terrible one. And when I was doing a little research on you, just having a bit of a catch up, one of the lines that you use, which I absolutely love, is resilience is revealed, not documented.

[00:08:51] So what does an incident actually expose about an organization's readiness that reports or compliance checks just cannot reveal? Tell me more about that. I'm sure on the last episode, I had a rant about compliance, but I'll go again. And I'll say the small print: there's nothing wrong with compliance. I think compliance is good, et cetera, et cetera.

[00:09:16] But yeah, compliance, in my opinion, is a minimum. You know, if you need to be compliant, sure, get the compliance. It's a minimum bar to me, not a maximum, for readiness and resilience. Usually you can be compliant with very little actual security or resilience in your organization. You can always get the tick box. And that's the point really.

[00:09:44] If you're doing the resilience tests just to get that compliance check, it's not going to be adequate. If you're just doing the document to pass the audit, you're not going to be measuring the right things, and you're not going to be thinking about how you actually do it in the real world when it happens. Same with the mindset: are you thinking if this happens, or are you thinking when this happens? So all those things go into it, yeah.

[00:10:13] So I'm not a fan of security by compliance. I actually think security and compliance are often opposed to each other. Obviously you need to meet your regulatory commitments and everything else, as Jaguar Land Rover will, but I would say this is a whole different ball game, actually being resilient. Yeah, so you can document what you've done, but it's not really about a document.

[00:10:42] It's about an ongoing thing. You know, security is a process that is ongoing all the time, not a static document. And in your experience and some of the conversations you've had with your customers, how do you distinguish between teams that panic and those that perform under pressure? Are there any metrics or signals that indicate a strong crisis response capability?

[00:11:09] Any good or bad examples you can share there, and the kind of difference that it makes as well? Yeah, I think the panic versus performance is a key thing. And it's hard to measure because, as we've said, this crosses every part of the organization. So any one metric, you know, we've got things like mean time to detect, mean time to contain, mean time to acknowledge, all the MTT-whatevers for the SOC team or the incident response team. They're all really good to measure.

[00:11:39] But obviously that's only in isolation to the technical team. So you kind of have to measure a lot of different measures across each team. And then there's confidence: making bad decisions with high confidence is probably a really good indicator that things are not great.

[00:12:03] Because after the fact, if you can measure it, you measure your decision-making and you measure your confidence. The worst case is when people are making bad decisions with high confidence. And you can measure that as well. So that's one of the things that we definitely target: like, oh yeah, you're really confident, but you're terrible. That's the most dangerous. Whereas if you're unconfident, you're like, oh, I'm not sure if this is a good or bad decision.

[00:12:31] At least then you're better prepared to deal with that. So yeah. Other things like false positive rates and your drill score, if you have one. If you don't have one, you're not measuring your tabletops. This is what lots of companies do: most companies will do tabletops, either once a year or once every six months. Are they actually measuring anything, or is it, oh, that's good, we've had a chat?

[00:13:02] We feel like we're better prepared. So you should be measuring and scoring yourself, because how else would you know? Every other metric the exec have to show, they'll have to score and say, oh, we're getting better or worse on this metric. So it should be the same for tabletops and drills really. How do you know you're getting better? So if you're not scoring it, definitely do that.
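As an editorial aside, not from the episode itself: the MTT-style metrics Max mentions are simple to compute once drill timestamps are recorded. This is a minimal Python sketch with entirely hypothetical drill data, showing how mean time to detect and mean time to contain could be scored across repeated tabletop exercises:

```python
from datetime import datetime, timedelta
from statistics import mean

# Hypothetical drill records: when the simulated incident started,
# when the team detected it, and when it was contained.
drills = [
    {"start": datetime(2025, 3, 1, 9, 0),
     "detected": datetime(2025, 3, 1, 9, 40),
     "contained": datetime(2025, 3, 1, 12, 0)},
    {"start": datetime(2025, 6, 1, 9, 0),
     "detected": datetime(2025, 6, 1, 9, 25),
     "contained": datetime(2025, 6, 1, 11, 0)},
]

def minutes(delta: timedelta) -> float:
    """Convert a timedelta to minutes."""
    return delta.total_seconds() / 60

# Mean time to detect and mean time to contain, in minutes.
mttd = mean(minutes(d["detected"] - d["start"]) for d in drills)
mttc = mean(minutes(d["contained"] - d["start"]) for d in drills)
print(f"MTTD: {mttd:.1f} min, MTTC: {mttc:.1f} min")
```

Tracking these numbers per drill, rather than per real incident, is one way to get the "drill score" trend Max describes: the exec can see whether the organization is actually getting faster over time.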

[00:13:28] And I also think many organizations have traditionally seen planning as a one-off exercise, or even annual compliance training where users just hit next, next for an hour. So we're moving away from that, thankfully. But what does ongoing preparation look like in practice? And how does it shape performance in an actual incident? Any examples there? Yeah.

[00:13:52] So, thinking of it that way, you can never simulate everything. You can never run enough drills; it will always be the thing that you haven't practiced. And the point isn't about having practiced the exact thing that happens, because that's just never going to happen.

[00:14:11] What the practice, what the simulations are about, is getting you in the right mindset, building that muscle memory of how you deal in a crisis: who are the people who are good under pressure, and who maybe aren't. And throughout the year, we call them micro drills. So how do you test yourself in a simulated environment multiple times? And you can do this in a cascading way.

[00:14:38] So we don't expect the execs to be doing this every month or every two months, because they typically don't have time. But you can definitely practice your sub-teams over and over again, both the technical teams and others: what do they do, in as realistic a simulated incident as possible?

[00:14:58] And so, yeah, you just continually have a cadence of that. So rather than your classic training, you're micro drilling. That's really the key there. The point of training is to get an outcome and get people better skilled at something, but actually skip over the compliance training that, as you said, you'd learn nothing from, and just get to the simulations, where they're always more engaging.

[00:15:25] Anyway, people actually enjoy them more, and therefore they learn more as well. So those micro drills, you know, we have all those labs that we talk about, and essentially each one of them is a micro drill. Yeah. And then just continual threat-informed updates. So how do you continuously understand the threats against particularly your organization, understand things like cognitive load,

[00:15:54] do people burn out in a stressful environment very quickly? And then, yeah, the whole when it happens, not if, is the key really. Just assume a breach will happen or has already happened, and how do you deal with it then? This month, I'm proud to be partnering with Alcor, and anyone who's tried to scale an engineering team across borders will know firsthand how messy it can get, because they deal with endless providers.

[00:16:23] Then there's confusing rules to deal with in each and every region and fees that always seem to surface at the last minute. Now Alcor, they solve that by acting as a partner rather than just an intermediary. And they focus on tech teams that expand in Eastern Europe and Latin America. And they bring employer of record services together with recruiting. So essentially they help you pick the right country, source the right engineers and assess them properly.

[00:16:53] And then get them active for you and your company within days. And one of the things that stands out for me is the financial transparency. Around 85% of what you pay goes directly to your engineers. Their fee goes down as your team grows. And if you ever wanted to bring your team in-house, you do so with no exit costs. That kind of clarity is why Silicon Valley startups, including several unicorns, have chosen Alcor.

[00:17:21] And you can find out more by simply going to alcor.com slash podcast, or follow the link in the show notes below. And crisis simulations, I think they're worth mentioning as well. They're becoming more and more popular this year. So what is it that makes a simulation effective now? And how can it help reveal gaps in both leadership and technical teams before one of those attacks hits? Yeah, I think it's the gaps that you don't see.

[00:17:51] We had one large organization where we'd run a drill. So we had the technical teams and the exec running at the same time. And they had their playbook, and they were using the playbook for this crisis simulation. And the book says, go and phone the legal team. But what it didn't have was the number of the legal team. And they spent half an hour trying to find the number of the legal team.

[00:18:20] Because it was just one of those gaps: in the book, they probably didn't want to put the number in, or whatever. And so, yeah, it's not actually useful when it actually happens. Because everyone, I'm sure, on the tabletops went, yeah, yeah, we just phone the legal team at this point. But they actually didn't have that number. So, things like that: injected realism, like I talked about, no good answers or all bad answers.

[00:18:50] What do you do? And then, yeah, cross-functional friction. From my work in the police, especially forensics, working with very technical people and then trying to coordinate that with the police officers, we always came across friction there, because technical people talk in a different way to police officers. And the board will talk in a very different way to the forensics and the incident response team. Do you know that when they talk to you, you actually understand what they're saying?

[00:19:18] That's a really interesting point of friction, where it's like, I don't understand what these guys are telling me. How do I make decisions? They're not speaking English, you know, if the board are talking to the technical people. So, yeah, usually in big organizations, each function runs very well. But in these crises, or the crisis simulations, it's revealed that they have to cross over.

[00:19:45] You know, every team has to cross over and interact with the other teams for it actually to be effective. So, it does reveal gaps in your technical teams, your process. Yeah. And then obviously a debrief after as well. Because the tendency is, getting the exec in a room is very rare, because in most of these large organizations, people are flying all over the world all the time. And they do it and they feel great: we've done it.

[00:20:12] But actually, if they, including the exec, don't have a debrief and don't actually learn from what they've done before, it's just everyone's in a room. They're not getting the learning. So, definitely debrief and learn from it each time. And as you know, I love busting myths and misconceptions on here. Maybe I'll pave the way to let you have a little rant here for a moment. I'm sure you've seen a few on your news feed.

[00:20:40] So, based on everything that you've seen over the last 12 months, is there a single misconception about crisis readiness that often leaves organizations exposed? Any misconceptions that we can finally lay to rest and help you sleep over the holidays? Anything you can share?

[00:20:57] Yeah, I think what we've seen with M&S and Jaguar Land Rover is the connections to the real world, operational technology and supply chain. Companies are just not ready. You know, they're not ready for what happens when, as clearly happened there, it gets from the IT to the OT.

[00:21:25] And what would happen if you had to close down all your manufacturing plants for five weeks? And I'm not blaming Jaguar Land Rover. I truly believe most organizations, if they get hit, would do as badly. And, yeah, I think the point is it's not a paper exercise. And it's not good or bad luck.

[00:21:52] The idea that you've done well because the hackers weren't good enough, that's a fallacy. It's really learned discipline that you need to practice. You know, are you ready for what happens if you have to close down every manufacturing plant? What happens with all our suppliers? And supply chain comes up over and over again.

[00:22:17] CISOs are really concerned about the supply chain, but it's a really difficult problem, because some of them have thousands and thousands of suppliers. But the good thing is it is a learned muscle. You can build that muscle as an organization, but you have to go to the gym, I suppose, for want of a better analogy.

[00:22:42] You can't assume you're going to have that muscle without going to the gym and actually practicing. So, yeah, organizations seem to just assume they'll be fine. But they really need to understand that you have to practice, and you don't get muscle memory if you don't use it. Completely agree. And before I let you go, you might not be able to answer this; I don't know if you've heard anything around it.

[00:23:11] But one of the things I've heard over the last few weeks is repeated ads on the radio telling consumers to join a claim against breached retailers. I think there's one for M&S and one for Co-op. I think the website's jointheclaim.com. And this got me thinking: regulatory fines were once the financial threat, predictable and often manageable for businesses that have been the victim of a breach.

[00:23:36] But group actions driven by consumers, that's introducing a different kind of exposure, one shaped by emotion, media attention and momentum. So, reputation, transparency and response speed suddenly matter as much as some of the forensic detail that we've discussed today. And a slow dismissive response could fuel legal action rather than calm it. But have you heard these ads? Anything about that that you'd like to comment on?

[00:24:04] I think, talking about M&S, they did a very good job in terms of PR. They just had trust from their customers. Now, obviously, that started to wane the longer it went on. And it was about a month or so, but I think they turned it around in time.

[00:24:30] But it's fascinating how much of a PR thing it is, compared to a technical or a board thing. I think people will give you some leeway, especially with all the attacks; I think it's accepted now in public that these attacks are not possible to stop. And so, in a way, you can use that, with good PR. But you have to have a good level of transparency.

[00:24:58] If people think you're lying to them, you lose that very quickly. And, yeah, it's a really interesting question about group actions. Maybe we'll see more and more of that, because obviously, I'm sure people lost a lot of money. I don't know if cyber insurance will cover group actions. It'll be interesting to see if that develops more.

[00:25:25] Obviously, it depends on the organization affected, whether they have consumers as such as well. But, yeah, I think that's really where PR comes in: keeping the public on side and saying, you know, this is a really bad day, we're doing as much as we can, other organizations are affected as well.

[00:25:45] And so I think that's just as important as the really important decisions and the technical teams doing their work: getting that kind of PR right as well. Yeah, completely agree. I think it's a great moment to end on. It's been a blast as always, Max. And before I let you go, for anyone listening wanting to learn more about Immersive Labs, connect with you or your team, et cetera, anywhere you'd like to point everyone listening?

[00:26:12] Yeah, immersivelabs.com is the best place, or, yeah, contact us. And you can get a demo there. You can have a look at all our products, the labs and the ranges and the crisis simulations; all that's on the website. Perfect. I'll have links to everything there.

[00:26:34] And love chatting with you about how a cyber crisis can throw an organization into an almost fog of war, grappling with dozens of competing priorities all at once, facing a split second technical and business decision that will be scrutinized for months to come. And maybe that's why organizations should be using crisis simulations to stress test the leadership and security teams and build the judgment, clarity and collaboration needed. That's a few takeaways for me there.

[00:27:03] But I'd love everyone listening to get involved in this conversation and share their experiences. But once again, Max, thank you for starting this conversation. Pleasure as always. Thanks, Neil. Cheers. It's always a pleasure speaking with Max, and I appreciate how clearly he brings to life the reality of cyber crisis management. It's not a technical checklist; it's a human and organizational challenge played out under incredibly high pressure.

[00:27:32] And as always, I'd love to hear your thoughts on today's conversation. Have you experienced a cyber incident firsthand? Did that incident plan go out the window as soon as everyone went into panic mode, and the boardroom execs came down, piling on the pressure, wanting continuous updates right in the middle of the investigation process? Or have you taken part in a crisis simulation that revealed a few uncomfortable truths?

[00:28:00] Please, I want you to share your perspective, your stories, your insights, your experiences and join this conversation. So pop over to techtalksnetwork.com. You will find nearly 4,000 interviews. But most importantly, around this episode, I want you to hit record on Send Neil an audio message and let me know your thoughts. But that is it for today. So thank you for listening as always. And I'll speak with you again tomorrow. Bye for now.

[00:28:35] Bye for now.