2843: Securing GenAI: Strategies and Solutions from Zscaler
Tech Talks Daily, March 25, 2024
37:27, 21.66 MB

2843: Securing GenAI: Strategies and Solutions from Zscaler

How are organizations harnessing the transformative power of generative AI (GenAI) while navigating the labyrinth of security risks it introduces? In this episode of Tech Talks Daily, we're joined by Sam Curry, the Chief Information Security Officer at Zscaler, who delves into the intricate dance between innovation and security in the age of GenAI.

As we unfold the layers of Zscaler's latest global study, "All Eyes on Securing GenAI," Sam offers an enlightening perspective on how businesses are rapidly integrating these tools, the security implications at play, and the steps crucial for safeguarding their digital ecosystems.

The study's findings are a wake-up call, highlighting the enthusiasm for GenAI across sectors, alongside an acute awareness of its potential security pitfalls. With an overwhelming majority of organizations jumping on the GenAI bandwagon, the conversation shifts to the delicate balance between seizing opportunities and mitigating risks.

Sam shares his insights on the pivotal role of IT in driving GenAI adoption, the pressure points from various stakeholders, and the transformative impact of a zero-trust approach in securing GenAI usage.

Through a deep dive into the challenges of data leakage, the necessity of comprehensive visibility and control over AI applications, and the strategic implementation of data protection measures, this episode is an essential guide for businesses at the frontier of the GenAI revolution.

[00:00:00] Are we racing too quickly towards the future of artificial intelligence without considering

[00:00:07] the consequences? Well today on Tech Talks Daily, we're going to explore the world of

[00:00:13] Generative AI but most importantly, the angle we're taking today is the security challenges

[00:00:20] that it poses. Because joining me today is Sam Curry, CISO at a company called Zscaler,

[00:00:27] or Zedscaler, depending on where you're listening, of course. We've had Nathan Howe on the

[00:00:32] podcast from the company I think three times over the years but today it's Sam that's

[00:00:37] going to be unraveling the findings of their global study, titled All Eyes on Securing

[00:00:44] Gen AI. We're going to shed some light on how organisations are currently navigating the

[00:00:50] promises and the perils of Gen AI tools. But as we sail through the waves of technological

[00:00:57] advancements, through uncharted digital waters, today Sam's going to guide us through the

[00:01:03] sea of opportunities that Gen AI presents too. And yes we will explore the risks that

[00:01:10] loom over businesses and some of those essential steps that we must take to secure our digital

[00:01:16] future. Now before I get today's guest on, quick shout out to the sponsors of Tech Talks

[00:01:21] Daily, because in today's remote-first world, I think settling for outdated managed file

[00:01:26] transfer solutions means ultimately you're risking your sensitive data. But if you

[00:01:31] upgrade to Kiteworks, the gold standard in secure MFT, boasting FedRAMP Moderate Authorization,

[00:01:38] Kiteworks isn't just secure, it's a complete transformation of how your business handles

[00:01:43] file transfers and communications. So say goodbye to compromise and hello to unmatched

[00:01:49] security and efficiency, and you can do that by making the switch to Kiteworks. Visit

[00:01:55] kiteworks.com to begin; that's kiteworks.com, to secure your data and empower your business.

[00:02:02] But now let's get today's guest on. Buckle up and hold on tight as I beam your ears

[00:02:08] all the way to Boston. Sam Curry, CISO at Zscaler, is waiting to join me today.

[00:02:15] So a massive warm welcome to the show. Sam can you tell everyone listening a little

[00:02:20] about who you are and what you do? Absolutely, thanks for having me, Neil. My name is Sam

[00:02:25] Curry, I am VP and CISO for Zscaler, or Zedscaler. And yeah, most of my day is spent buried,

[00:02:33] I think at least hip deep in cybersecurity of various flavors. And I've been in cyber

[00:02:39] for, I'm ashamed to say, 31 years now. So yeah, there's that. I say I'm not ashamed

[00:02:44] that I've been in it, it's just a shame that it's that long. It doesn't feel like it's that

[00:02:47] long, Neil. But I've been a CTO. I've been a chief product officer. I've been a CISO,

[00:02:53] for startups, for big companies: RSA was one of them, CA was one, McAfee, CIGAS; CA isn't

[00:02:58] called CA now. And I have a number of patents in the space, I've been an architect,

[00:03:03] etc., so I'm going to shut up now because I think conversations are probably more interesting

[00:03:06] than monologues. And I'm going to break the rule first of all, because it is a tech

[00:03:12] podcast and we're going to mention the word AI within the first few seconds. It's almost

[00:03:16] the law now isn't it? But I mean if we go back what just 12 months ago we were talking

[00:03:21] about ChatGPT and Gen AI being able to beat lawyers at passing the bar exam. 12 months

[00:03:28] later we're now talking about text-to-video, where you can literally type any command

[00:03:32] at all and it will automatically create a video for you. One of the things that put you

[00:03:37] on my radar was having read your recent report. I think it was All Eyes on Securing

[00:03:43] Gen AI and one of the reasons I wanted to get you on here is everyone's excited about

[00:03:48] AI, and as an ex-IT guy there's also that side of me that knows the dangers of moving fast

[00:03:53] and breaking things. What were the key findings in this report? I'm curious. Well there

[00:03:59] were a few. I think maybe the most interesting thing is that almost everyone in the report

[00:04:06] thought that it was a significant technology, that in other words it was going to have a

[00:04:11] massive effect on their business, right? So, you know, what they said was that 95%... it

[00:04:17] was 95% who said that in some way, shape, or form it would have a huge impact on their

[00:04:22] business but over half of them considered it to be one of the top risks for the business

[00:04:27] which seems like a contradiction, but maybe it is speaking to the rate of change and how fast

[00:04:34] this has hit us. Disruptors, when they come, rarely wait on our timetable, right? And the

[00:04:39] sense is that this is going to change how we work and whom we work with, and I think

[00:04:46] the scramble has been on for how we get this into our organizations in a responsible way, and

[00:04:52] that was at the heart of it. There's a lot more data in the report itself,

[00:04:58] but that was the significant experience. For instance, a third of organizations failed to act

[00:05:04] on security concerns; in other words, they were embracing this

[00:05:08] before they started to really do the security journey. I've yet to

[00:05:12] speak with a security executive or an IT executive who hasn't been part of the decision and

[00:05:18] the remarkable thing about that is the biggest problem in cyber security continues to be

[00:05:23] lack of alignment with the business, meaning cyber executives in particular are seen as

[00:05:30] the people that do the firewall thing, right, or, you know, they do the antivirus thing;

[00:05:35] they're off on the side and they're not seen as business people first. And yet

[00:05:40] they were all reached out to, because everyone sort of inherently knew

[00:05:46] there's something there: we have to consider privacy, we have to consider security,

[00:05:52] we're going to be putting the guts of our business into these things how could we not and

[00:05:57] that's new because our previous disruptive technologies haven't been through that

[00:06:02] and there was a huge stat in that report, I think it was something like 95% of organizations are

[00:06:06] using Gen AI tools, and I suspect anybody who isn't is appropriately sat on the sidelines thinking, right, how

[00:06:12] are we going to do this responsibly and safely? So I've got to ask, for any business leader listening

[00:06:18] who's getting their ear chewed off at the moment because they want to sample Gen AI, what are the

[00:06:23] most significant security challenges that they face, and how might these tools possibly be amplifying

[00:06:29] existing vulnerabilities? Without wanting to pour scorn on everyone's excitement at the moment,

[00:06:35] just doing it in a responsible way, really. Yeah, the biggest concerns are, you

[00:06:41] know this is reminiscent of when we were using browsers and search engines so or even the cloud

[00:06:48] itself, for those who remember it. It's: our data is leaving our control, who will we be giving this to,

[00:06:54] what can be inferred from it? It seems like, to some extent, people have understood that

[00:07:00] data is the new commodity of choice you know that the phrase we hear a lot is the data is the new oil

[00:07:06] and if that's the case, then AI is like the combustion engine. Because, by the way,

[00:07:12] historically there was an oil rush before the combustion engine; it was used for all sorts

[00:07:16] of industrial purposes, for instance as a lubricant, and as a general industrial thing. But when

[00:07:22] the combustion engine came along that's when the value of it skyrocketed and by the way look at the

[00:07:28] film Killers of the Flower Moon and the book, which kind of reflect the insanity, as well as other

[00:07:34] insanities, of that, if anybody listening is interested. And the book is excellent, by the way,

[00:07:39] because it's all from the historical record, even if it's fiction. But the real danger here is, I

[00:07:45] think we kind of don't know, with all of its uses popping up everywhere, how we should be onboarding

[00:07:54] this, what sort of data should go into it, under what constraints, what, if any,

[00:07:59] are the new business terms and contracts that we should have around this. And that emerges over time,

[00:08:05] and it may go as the cloud went and as search engines went, that they just become a normal part of

[00:08:10] business and we have standard ways of doing it; it may evolve differently, and we think

[00:08:16] about it in that light, like how you should use it. But the theory is you're going to put your IP in there,

[00:08:19] someone's going to be putting PII in there. And the funny thing is, we've been

[00:08:23] using AI tools in the AI toolkit for a long time; gen AI is the first thing that crossed the

[00:08:30] uncanny valley what I mean by that is the uncanny valley is as things look progressively more human

[00:08:36] you go oh that's cool that's cool that's cool but then they hit a creepy point where they're

[00:08:40] almost human and that's where most things if you see them you go that's a little off but this

[00:08:45] actually crossed it. Even though, in AI terms, gen AI and LLMs, they're a big deal, but

[00:08:52] they're not that big a deal compared to some things that we can expect in the future

[00:08:57] but it was the first thing the mass public could see that crossed that valley

[00:09:00] and could be used, and that took our culture by storm. But what's funny is, as people started to use

[00:09:06] it, they started to realize, hey, this is elsewhere in the toolkits we've been using;

[00:09:12] in other words, ML, machine learning, has been used for a long time, and now we realize that

[00:09:17] collectively we should go back and look at those policies, because that stuff was leaving the

[00:09:23] company already and many people didn't realize it and you were talking about history there and I

[00:09:28] feel sometimes it feels like an AI gold rush of sorts, where media, science, and business are

[00:09:34] all desperately chasing that shiny allure of its hype. But sometimes it reflects some

[00:09:40] of the lessons from the California gold rush, because it was those that sold shovels and other

[00:09:45] essential supplies that enjoyed more success than those scouring the fields or the

[00:09:50] mountains looking for those specks of gold. A quick scan through the headlines in the last couple of

[00:09:54] weeks it's things like Nvidia's chips that are powering AI rather than the solutions of tomorrow

[00:10:00] sometimes, isn't it? Yeah, and, you know, the other thing to think about: when you actually use

[00:10:07] some of these tools, the amount of computing required is enormous, which means the amount of energy

[00:10:12] required is enormous. So it's not just that others are selling the shovels; it's the damage.

[00:10:18] To continue the analogy, it's not just panning for gold in the river; the digging is damaging the

[00:10:23] environment, in that analogy. And so responsible use of AI, ways to use it, and looking at, you know,

[00:10:30] things like Nvidia and the chip makers making more efficient chips from an energy consumption point

[00:10:36] of view has a material impact on things like climate change yeah so I love the lessons of history

[00:10:42] many ways. It's not that we repeat history, it's that it rhymes, so there's an awful

[00:10:48] lot that we can learn from it. We can also get caught up if we read too much into it, but there's a lot

[00:10:55] there. And Cory Doctorow, who's a futurist and science fiction writer, wrote an article recently

[00:11:03] saying this is definitely a bubble. Now, he said, don't panic at that, and I'm paraphrasing,

[00:11:10] and it's worth looking this up online he said there's a lot of bubbles the question is does it

[00:11:14] leave value behind afterwards? Like, dot-com was a bubble; yeah, it caused damage, but it left value

[00:11:20] behind afterwards. So too was the telco bubble, but it left fibre optics that people

[00:11:25] use every day. There have been bubbles, like Enron, that didn't leave value behind, and so you could look through

[00:11:32] history and find cases where it did or didn't. My bet is this is going to leave value behind. That

[00:11:37] doesn't mean people won't have fortunes made and broken, as with other rushes like the gold rushes,

[00:11:42] but yeah good observation on your part yeah yeah you too so it's a great way of looking at it

[00:11:47] and I think for many business leaders listening one of the early risks of unwittingly allowing corporate

[00:11:53] data to be fed into machine learning algorithms is a huge risk of course but in terms of early adoption

[00:12:02] particularly in terms of intellectual property and customer data, I wonder if you can share

[00:12:07] what can happen when it's implemented without the proper security measures that we're talking about,

[00:12:12] because I think this seems to be a big theme of our conversation already about responsible

[00:12:16] usage of AI. Yeah, it is; responsible use is the correct phrase, I think. And I think many people had the

[00:12:22] initial reaction: just hold it at the door. And I wouldn't do that; you're going to wind up,

[00:12:26] if we had shadow IT, you're going to have shadow AI. And when I first said that, I threw up in

[00:12:31] my mouth a little bit, but now it's a thing and it doesn't feel quite as bad. Shadow

[00:12:38] AI will be a real thing if you do that. So what I recommend is you do some security awareness, you

[00:12:44] get a policy about it and you tell people here's how you do it correctly this is the right way to

[00:12:50] engage. And, just like, let's take insider trading: yes, there are some instances where you hard-stop

[00:12:56] it, but by and large, people can do insider trading, but they know they're going to get caught;

[00:13:01] there's monitoring in place and so what you want to do is is get very good at the categorization

[00:13:08] and monitoring of that and give people the tools right because unlike other security threats where

[00:13:14] there's a malicious opponent, an insider for instance, that isn't the case here. Now, there may be

[00:13:19] malicious uses, like poisoning models or trying to crack the models to get the IP inside; that's

[00:13:25] different. When it comes to using them, tell people the correct tools they can use and

[00:13:30] the consequences of using incorrect tools, and have a means by which you can have

[00:13:37] a discussion about the things that are missing that they need: I need a tool for this, I need

[00:13:42] to develop with this. These are things you should be prepared to answer. And so it's important to understand

[00:13:49] what is the impact of various consumption or use of AI on your business, and that's

[00:13:56] going to be based on the market it's going to be based on a rate of change the efficiencies in

[00:14:01] the business, what your customers need. But if you hold it at the door, you're probably going to be

[00:14:05] left behind as with most disruptive technologies and bubbles and with the majority of businesses

[00:14:11] eagerly adopting gen AI tools, would you say the potential benefits as well as the drawbacks are

[00:14:17] worth weighing up, especially from a security perspective? Because it's not all doom and gloom; there are

[00:14:21] a lot of potential benefits. Yeah, you know, I think it was Tim Berners-Lee who said that the web

[00:14:27] was both the most over-hyped and simultaneously under-hyped technology ever. I think that's been

[00:14:34] said of a few things, and I think that's true of any similar disruptive, bubble-

[00:14:39] like technology, and I think it's true here. What we saw was, you mentioned 12 months ago;

[00:14:45] well, if you go back 14 months, yeah, nobody entered 2023 (by the way, I may be dating your

[00:14:52] podcast, I apologize), but going back to the beginning of 2023, nobody thought that was going

[00:14:58] to be the thing of the year, and it became that. Certainly, barring, you know, war and geopolitics, it

[00:15:05] became that in the tech world to a massive degree. And just look at the growth of ChatGPT,

[00:15:11] the fastest growing application certainly consumer application I think in the history of online

[00:15:16] services who would have thought that by I think it was April they would have a hundred million

[00:15:22] paying subscribers. Yeah, crazy, right? And so the question is a really good one;

[00:15:28] I think it is: how are you going to consume it, not if you're going to consume it. I don't know,

[00:15:35] I don't know if that addresses it for you but it is given the rate at which this happened

[00:15:41] the rate at which advances are happening and the number of advances happening are having a

[00:15:47] combinatorial effect; there's something called the law of accelerating returns, and so we have to

[00:15:54] get used to in our business being able to absorb and adapt to and be agile with our business

[00:16:01] changing radically in very short periods of time. Yeah, it's funny you should say that, because I've been

[00:16:07] doing this podcast for, what, five, six, seven years now, and one of the things that always amuses

[00:16:12] me is, come November, December, my news feed just fills up with every self-proclaimed

[00:16:17] futurist and technologist all making those predictions for the next year. And I think it was

[00:16:22] during the pandemic, of course, nobody saw hybrid working, remote working. Primark lost, I think

[00:16:28] it was 800 million in a month, because they had no income whatsoever, because

[00:16:33] they had no online presence. I think they also missed Zoom, of course, and Microsoft Teams; I think

[00:16:39] at one point Zoom was worth more than all the world's airlines combined. And then last year,

[00:16:44] well, not last year, just before ChatGPT came out, around the October and November time,

[00:16:51] all those predictions for 2023 were all Web 3.0 and the metaverse, so we all kind of know what happened

[00:16:58] there, right? Yeah. So I saw a futurist who wrote a book about the next hundred years; he said

[00:17:04] it's really easy to predict a hundred years out; it's very hard to predict 10. Yes, right. And by the

[00:17:10] way, it's hard to be held accountable for 100 years; you can be held accountable

[00:17:15] for five or 10 years. In the cyber world, we actually have to deal with opponents who rise to

[00:17:21] intelligent adaptation, and I've done a sort of risks-of-gen-AI presentation,

[00:17:28] and the technologies you can use to defend against those risks one of the things that bugs me a lot

[00:17:33] is is when people make these prophetic guesses as to which threats will be the big ones because if

[00:17:42] you're right and people actually respond to your prediction you will be wrong as the bad people

[00:17:47] choose other things. Yeah, right. So are you doing it to be a smarty-pants and say, see, I told you so,

[00:17:55] or are you doing it because... yeah, you know, if you do it and people actually listen to you, then

[00:18:01] they won't listen to you again. And so the kinds of predictions that are useful are projections

[00:18:05] of trends that help you make strategic decisions, rather than tactical things like: I'm forecasting

[00:18:11] the use of exploits X or Y, or, you know, everybody should be prepared for an increase in

[00:18:18] SQL injection attacks. If everybody did that, it would be a waste of resources. Yeah, and

[00:18:24] another thing I noticed last year: I was going to tech conferences all around the world, and

[00:18:29] every tech conference in the first six months of the year was suddenly changing its

[00:18:33] agenda to say, hey, this isn't new, we've been working on this all year, we've got a new product

[00:18:39] coming out towards the end of the year, you know. But I mean, when there is such a rush for this next

[00:18:44] big thing, to get that AI narrative against their business and brands so they remain relevant despite

[00:18:51] all those acknowledged security areas, where do you stand on that? What can be learned from

[00:18:56] past technological adoptions in this context? I'm not sure if there are any similar comparisons

[00:19:02] around the dangers of rushing into new tech. Well, there's always danger in rushing into new tech.

[00:19:08] Yeah. People aren't going to like this answer: it's that it will start to produce

[00:19:13] results when it's boring. Yeah. And the reason I like it is, I can hear people say, that's dull, how

[00:19:18] am I going to know it's going to be boring, I will be paying attention. It's like

[00:19:22] the people who first got in self driving cars I think I heard someone say the first five minutes

[00:19:27] were exhilarating; after that, I was bored. Yeah, right, because now what am I? I'm in the car, it's

[00:19:32] driving, okay, so what? The novelty of it wore off. And I actually tell people I trust AI

[00:19:40] not when someone uses it in what I call inductive use, which is a giant pulsating brain

[00:19:47] you put data into; it's what I would refer to as deductive use. It is the instance where

[00:19:53] it becomes a useful cog in a machine, maybe a machine made up of other cogs, or another orchestrated

[00:20:01] system of applications of the AI toolkit, and I use that phrasing very purposefully, because

[00:20:07] that's when it is useful and generating business results. For instance, inside cyber,

[00:20:12] it's helping me with anti-malware, it's helping me with data protection, it's helping me with

[00:20:17] authorization policy, you know, with GRC; it's helping me with very specific

[00:20:23] things, in other words, that I can count on as reliable functions in the system.

[00:20:28] By the way, if it's a copilot, that's great; if it's something naming

[00:20:33] things in a deception context, that's great. These are all great uses, and I can wrap language around

[00:20:39] that for contractual purposes; I can talk to a vendor about where they take the data and what they

[00:20:43] do with but a giant pulsating brain that is going to be everything in cyber I don't trust

[00:20:49] and I don't know who sees it and I can't talk to a vendor about so what are you going to do

[00:20:54] with the data very easily they're just going to say trust me no thank you right it's when

[00:20:59] you get to that specific use later in the game where it's fairly boring but it's generating

[00:21:04] business results, that's interesting. Now, in the meantime, I would advise businesses, as

[00:21:10] you onboard this stuff, to create a committee or a panel, not for bureaucratic purposes,

[00:21:17] but to monitor these rapid changes and to much like you would do in a hospital doing research

[00:21:23] you would say which ones are we going to bring in that we're going to enter into our practice

[00:21:28] which ones are we going to say we're not going to do right now, and how are we going to change

[00:21:34] our processes and things like our security reviews, our privacy reviews, our attestations;

[00:21:39] which are the ones we're going to use, what data we're going to put into it, not put into it.

[00:21:44] And that's a moving target, and so that's why I would say convene once a quarter and have

[00:21:50] projects that are looking into this because there's a plethora of small companies generating new

[00:21:55] applications of this technology; it's worth staying on top of. For the business, sticking with that

[00:22:02] security mindset that we have right now, how does the lack of monitoring and proper data classification

[00:22:07] in gen AI tool usage impact an organization's overall security posture?

[00:22:15] Well, it can be done, by the way, and it is done, and it will get done better over time.

[00:22:21] I'd be remiss if I didn't say that at Zscaler we in fact do that. But if you don't have the capacity

[00:22:27] or the capability to do it in practice, then you can't get fine-grained

[00:22:34] in the policies; you can't, for instance... I'll give you, let's use an analogy: if you have a

[00:22:40] hospital you're doing surgery and you don't know what the tools can do there maybe there's a new

[00:22:44] scalpel (I'm not a doctor), but maybe there's a new anaesthetic, maybe there's

[00:22:50] a new way of keeping the theater clean then it changes the kind of surgery you can do then the

[00:22:55] benefits you can give your patients and so the hospital policy about what kind of hospital you are

[00:22:59] changes what what types of grants you can do what kinds of research you can do what kinds of services

[00:23:04] you can give the community the cost of those things so that the tools affect what you can do

[00:23:11] from a business perspective and the demands of the business affect what tools to use so if you're

[00:23:16] not looking into, and able to do, things like data classification granularly, then you can't,

[00:23:23] for instance, say this type of data doesn't go anywhere unless it's this group, and even in that

[00:23:28] group it shouldn't under any of these circumstances. See, that level of control becomes trivial when

[00:23:35] you can do that, right? And you shouldn't have to, for instance, go into an eternal project of

[00:23:41] tagging everything; that's a useful tool, but this gives us the ability to do so in a closer to real-time

[00:23:49] manner with a high degree of certainty so you in other words you're not going to interrupt legitimate

[00:23:55] business you're going to be able to say well now that that is part of the fabric of what we're doing

[00:24:00] we can now actually write policy and use it and move on with business. The tools, in other words, aren't

[00:24:05] soaking up our time and the processes; instead, the tasks we want to be performing as a business

[00:24:10] are where we put our energy and I must admit I come from a time where IT used to be seen as the

[00:24:16] guardians of the network, the team that said no more than yes, and said yes when it should be

[00:24:22] no and no when it should be yes, and of course took too long to deliver

[00:24:29] projects but now I think it's increasingly seen or maybe I'm too hopeful here and optimistic

[00:24:34] but I think they're seen as the business enabler. So, looking forward, how do you see the role of

[00:24:39] IT in maybe driving adoption of gen AI tools, rather than just saying no, and how does this impact

[00:24:46] the overall strategic approach to security and privacy concerns across an organization?

[00:24:52] Well, I think IT has to go through a transformation, as does cybersecurity;

[00:24:57] DevOps has been through one, and there's going to be another one. So what we want to do is have

[00:25:02] IT move from being custodians of the infrastructure to being brokers of services, right? That's

[00:25:08] the ideal place to be. I mean, there will always be somebody with a spanner or a screwdriver,

[00:25:15] but it shouldn't consume most of our time and it shouldn't be everyone. So what we want to do

[00:25:21] is go from perhaps 70% of our time managing the infrastructure and 30% on the quality of

[00:25:27] the services, and flip that around to begin with. Well, if we only spent 30% of our time managing the

[00:25:33] infrastructure troubleshooting it what could we do with the time we freed up to provide better

[00:25:37] quality services now what happens when that can be 10% that's an interesting thought so how do we use

[00:25:44] these technologies and how do we use new architectures like zero trust architecture how do we do

[00:25:50] that, and when we do it, how does it elevate what it means to be an IT person in the first place?

[00:25:55] 100% with you on that and we will have people listening from all over the world a lot of

[00:26:00] business leaders. As someone who is very passionate about this space, what steps should organizations

[00:26:06] take to secure their gen AI tool usage? And how do you say it:

[00:26:13] Zscaler or Zedscaler? It all depends on where you are, to be honest, and so

[00:26:19] either is fine, either goes fine. So your question was, how do you make sure that it's secure,

[00:26:25] is that it? Yeah, yeah, and how the Zero Trust Exchange platform could maybe contribute to that

[00:26:30] process too. Yeah, so first I'll answer the general case and then the

[00:26:34] specific case with respect to Zscaler. So, first of all, secure is not

[00:26:41] a binary state, yeah, and people need to be aware of that. So it's not that you buy three secures

[00:26:48] and you're good, right? Instead, it is about the active practice of security as part of

[00:26:55] the business. The most important thing you can do is close that gap between security and the business;

[00:27:01] there's an opportunity with this to do so, and to invite the security department in, and there's a

[00:27:06] responsibility on their part to show up and do this right, in business conversations,

[00:27:13] and that means to some extent cyber has to care about non cyber things right they have to care

[00:27:19] about what the business cares about, and then that has to influence the cyber program. But the

[00:27:23] business should welcome them in, and then whoever the chief information security

[00:27:29] officer is has to begin to think of themselves as a business person first and realize they're

[00:27:34] always going to be a security person; that's not going anywhere; everyone knows they're the smartest

[00:27:38] security person around. But that's one side: become a business person and be seen as that. It's

[00:27:44] about lateral relationships that will go a really long way because I believe that leadership teams

[00:27:51] are fundamentally problem solving teams and when they have trust which means that they they are

[00:27:56] lying with each other they talk the credibility and reliability things that people usually blame

[00:28:01] when lack with lack of trust are they're not real honestly by the time you're a sea level in a

[00:28:05] large organization or in a mature organization everybody knows their job and you can count on them

[00:28:12] right otherwise they wouldn't be there what it requires is alignment and and discussion and intimacy

[00:28:17] at that level and then and then they are fundamentally problem solving teams and this is a problem

[00:28:22] they will solve so that's the general case when we start to get more specific it means

[00:28:27] not going straight to the tools first. It means the right architectures, and I'm a big believer

[00:28:33] in having a simplified, transformed IT infrastructure and security infrastructure, which is

[00:28:40] where the Zero Trust Exchange comes in, and so that's one of the things we do to help people do that.

[00:28:46] And what's important there is you have to start removing options for attackers, you have to

[00:28:52] take yourself out of the equation. The business way to summarize that is: only enable what

[00:28:58] the business needs, when it needs it, where it needs it. "Only" is hard, but the closer you get to that with your

[00:29:04] controls and your infrastructure, the better: it becomes more observable, it becomes cheaper,

[00:29:10] and frankly the risk goes down as the amount of inherent trust goes down. It's a paradigm shift

[00:29:15] in how you think. So I hope that answers it. If you want to get technical, we're actually an authorization

[00:29:21] tool at scale, doing fine-grained authorization, and so every request is uniquely resolved according to

[00:29:28] policy, and we resolve that close to the edge, so you get the best pathway from a quality-of-

[00:29:35] service perspective and from a security perspective. And you can write policy that is interpreted to

[00:29:40] say what people should be doing at any point in time, and on top of that you can now start to build

[00:29:45] value. You know, you can look at the data and you can make data classification decisions as part of

[00:29:49] that policy, you can do risk scoring, you can look for malware, you can do sandboxing, you can do

[00:29:54] firewalling in a virtual sense for every connection in the company: users, devices, workloads, etc. So
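
[Editor's note: to make the fine-grained, default-deny authorization idea described above concrete, here is an illustrative sketch. It is not Zscaler's actual implementation; the rule set, field names, and classifications are all invented for illustration. The point it shows is that every request is resolved individually against policy, and anything not explicitly allowed is denied.]

```python
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    group: str            # e.g. "finance", "engineering"
    device_trusted: bool  # device posture check result
    app: str              # destination application
    data_class: str       # classification of the data in the request

# Each rule is a predicate plus a decision; the first match wins,
# and anything unmatched falls through to deny (zero trust).
RULES = [
    # Block confidential data from flowing to a GenAI tool.
    (lambda r: r.app == "genai-chat" and r.data_class == "confidential", "deny"),
    # Untrusted devices get nothing.
    (lambda r: not r.device_trusted, "deny"),
    # Finance may use the ERP application.
    (lambda r: r.group == "finance" and r.app == "erp", "allow"),
    # Anyone on a trusted device may use GenAI with public data.
    (lambda r: r.app == "genai-chat" and r.data_class == "public", "allow"),
]

def authorize(req: Request) -> str:
    """Resolve a single request against policy; default-deny."""
    for predicate, decision in RULES:
        if predicate(req):
            return decision
    return "deny"
```

[Under this kind of model, data classification, device posture, and destination app all feed the per-connection decision, which is the "only enable what the business needs, when it needs it" principle in code form.]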

[00:30:00] that's what we do. I think that's an incredibly powerful moment to end on, but before I let

[00:30:06] you go, a big thank you for sharing your insights, but I want to see if there's anything

[00:30:11] else you can share with us today, because I always ask my guests to leave everyone listening with

[00:30:15] one final gift, and that is either a song that we can add to our Spotify playlist that

[00:30:20] means something to you, or a book that we can add to our Amazon wish list that you'd recommend.

[00:30:24] I'm going to ask you though, what would you like to leave everyone listening with, and why?

[00:30:28] Well, I'll give you, I think I'll give you both, yeah, and I'll give you both because

[00:30:32] the song that I would recommend is a counter-culture song. So I'm Canadian originally,

[00:30:39] I say originally because I became an American as well. So the song is the 2112 Overture

[00:30:46] by Rush. Rush is Canadian, and there are only three people in the band, which is amazing when you hear

[00:30:51] all the tempo changes and things they did. This song is longer, so you may not want to put

[00:30:57] it on Spotify, or you may, because they were told short songs sell, and they didn't want to make

[00:31:05] just some ear candy that you can digest; nothing wrong with that,

[00:31:10] they wanted to tell a story. And the late Neil Peart was an amazing lyricist actually, who was also

[00:31:15] an amazing drummer, truly; if you look him up on YouTube he's incredible. He loved science fiction and he

[00:31:21] wrote as well. He actually put together a story around this, and it is set in the future,

[00:31:30] and it is about a dark age that music upends and brings light to. And so if you start

[00:31:38] with that story, I highly recommend reading it and then listening to it, and you'll hear something

[00:31:42] very different, as well as the music being amazing. The second recommendation is a book, and so the theme

[00:31:48] between these is dark ages, and I think it's relevant as we talk about data, and what data is

[00:31:56] kept or not kept, and what history is, and what role AI plays in that. And this is going backwards

[00:32:03] in time rather than forwards. It's 1177 B.C. by Eric Cline, and it's about the late Bronze Age,

[00:32:10] which has so many parallels to us today; you really have to read it to see it. It is a globalized

[00:32:16] world, which is hard to believe in that timeframe, but globalization refers to how few obstacles there are

[00:32:21] between any elements in an economy or culture, and so it doesn't have to mean around the world;

[00:32:27] that meant things like Canaan, Egypt, Babylon, and the Hittites, those sorts of things. Yeah,

[00:32:34] well, it collapsed, and it left a dark age from about 1200, actually literally 1177 B.C.E.,

[00:32:43] through, think of it as maybe four or five hundred B.C.E., and there's this gap in the historical

[00:32:51] record there, where nobody fully understands what the main trigger was, but the history that's

[00:32:58] written in stone is fascinating, literally in stone. And so I would recommend those two things, and

[00:33:05] I think there are so many lessons from both, as well as them both being cool stories: one is of course

[00:33:11] future fiction and the other one is past and history. Wow, you spoiled us today, there you go,

[00:33:17] not only some classic prog rock but also an intriguing book to check out as well, well played. Yes,

[00:33:24] so I will get that song added to the Spotify playlist, if only to immerse myself in it more

[00:33:29] than anything as well. It is 20 minutes, it's an investment. It's an investment for the ears, yeah.

[00:33:34] Yeah, I was thinking it's 2112 actually, yeah. As you were talking now, I was trying

[00:33:38] to think of some Rush songs, I mean, there was Tom Sawyer, was that them? It was, yeah, and of course Tom Sawyer,

[00:33:43] you know you're in. And why zee? Why zee, sorry? As a Canadian I use both interchangeably, but yeah,

[00:33:50] a lot of Canadians are out there going "it's zed", right. Yeah, yeah, well, you're talking to a Brit,

[00:33:55] so I'll give you a pass. I think my parents, my parents were Brits, so I can do that to you.

[00:34:00] And for anyone listening who wants to find out any more information about you, your work,

[00:34:06] the Zero Trust Exchange platform we mentioned, or to contact you or your team, what's the best

[00:34:10] starting point for them? Well, for everything Zscaler, zscaler.com. Yeah, and I invite everyone

[00:34:16] to connect with me on LinkedIn. I will help people; I believe in paying it forward in our

[00:34:21] industry. So if anyone's trying to break into cybersecurity, or if they have questions about something

[00:34:26] that they heard here or anywhere, please just message me. It seems like LinkedIn is becoming

[00:34:30] the professional network for social media, so reach out and I will connect you and help you

[00:34:35] make connections within Zscaler. And by the way, I made a conscious choice to come here

[00:34:41] to Zscaler, because I actually decide where to go on the basis of the impact that we can have,

[00:34:47] and so far the opponents in cyber conflict have been advancing faster, and all we're doing is

[00:34:54] incrementalism in the cyber industry. When I looked around for something that could change

[00:34:58] the equation, where I wanted to put my weight, it's here. And so it's great people, it's great tech,

[00:35:03] and it actually makes a difference; I couldn't ask for more. Well, again, thanks for joining me today.

[00:35:08] I learned so much, from how organizations are approaching GenAI tool usage, the security

[00:35:14] implications it's having, some of the security concerns and where they're coming from, and how IT

[00:35:20] can take back control of GenAI use and make sure it's used securely and, that word we kept using,

[00:35:26] responsibly. Just a big thank you for sharing that with me today. Thank you. So as we draw our

[00:35:33] conversation to a close today, I think it's clear that the journey towards embracing GenAI in

[00:35:38] business is fraught with both unparalleled opportunities and a significant amount of risk too. But

[00:35:44] Sam Curry from Zscaler has illuminated the path forward for us today, and I think it's so important

[00:35:51] that he emphasised the importance of strategic adoption and vigilant security measures, because

[00:35:57] yes, the digital landscape is evolving, and so must our approaches to securing it too. But the big

[00:36:04] question is, what steps will your organization take to harness the potential of GenAI responsibly?

[00:36:11] I'd love for you to share your thoughts with me today, whether that be on generative AI,

[00:36:16] unpacking the study's findings that we drew from today, or navigating the security landscape.

[00:36:22] How are you exploring the specific security concerns raised by GenAI adoption? And what is

[00:36:29] the role of IT, and your future outlook around the main drivers of GenAI adoption, and pondering

[00:36:36] the shift towards IT as a strategic partner in business innovation? We covered so much in, what,

[00:36:42] 30 minutes there. So if you want to continue the conversation, please email techblogwriter@outlook.com,

[00:36:48] or on Twitter, LinkedIn, Instagram, I'm just @NeilCHughes. Let's carry this conversation on, I'd love to hear

[00:36:53] your thoughts. You've heard from me, you've heard from Sam; now it's your turn, so let me know.

[00:36:59] Other than that, I'll be back tomorrow with another topic of how technology is transforming our

[00:37:05] lives, businesses, and even the world. Hopefully you'll join me again, but thank you for listening, and until

[00:37:11] next time, don't be a stranger.