Stuart Seymour (Group CISO, Virgin Media O2)

Season 3, Episode 3
15th October 2024

In this insightful episode, Stuart Seymour, Group CISO and CSO at Virgin Media O2, joins Andrew Ash (CISO, Netacea) to discuss how his experience as a British Army Captain shaped his unique leadership style in cybersecurity. Stuart also shares his passion for building diverse, neurodiverse teams, drawing from his own experience with dyslexia. He dives into the growing importance of AI in SOCs and the complex challenges of navigating global cybersecurity regulations. A must-listen for anyone looking to understand the evolving role of a CISO in today’s landscape.

Host

Andrew Ash

CISO, Netacea

Guest

Stuart Seymour

Group CISO, Virgin Media O2

Episode Transcript

[00:00:00] Andrew Ash: Hello, and thanks for tuning into the Cybersecurity Sessions podcast, season three from Netacea. I'm your host, Andy Ash, CISO at Netacea. For the new season, we've decided to invite peers from the world of cybersecurity to join us and discuss the issues that are affecting them and the cyber industry.

[00:00:20] Today, I'm delighted to welcome Stuart Seymour, Group CISO at Virgin Media O2. And, Stuart, do you want to do a quick intro to yourself?

[00:00:30] Stuart Seymour: Thank you very much. Yes, delighted to be here. My name is Stuart and I am the Group CISO and CSO at Virgin Media O2. Prior to that, I was at BAT. Before that, Centrica,

[00:00:48] Lockheed, risk consultancy, and the British Army. I've been at Virgin Media O2 now for about a year and a half. And I'm also proudly dyslexic and champion neurodiversity. So I think, Andy, that's me.

[00:01:15] Andrew Ash: Excellent. Thank you, Stuart. Really interesting career. I'll just go through what we want to talk about today.

[00:01:22] So we're going to cover the evolving role of the CISO, the next generation of SOC and defensive AI, and then emerging challenges within the cybersecurity industry. But before we jump into those topics, I always ask a question about AI. It's slightly tongue in cheek, and normally people on the podcast go right for the middle of the scale.

[00:01:46] So let's see where you land, Stuart. On a scale of 1 to 10, what is the logical end game for AI in human society? One being that we succumb to our robot overlords and enter an age of servitude, and 10 being humanity is freed from the shackles of earth and goes off to explore the universe in peace.

[00:02:04] Where do you sit on that scale?

[00:02:07] Stuart Seymour: I'd say a healthy seven.

[00:02:11] Andrew Ash: Oh, good.

[00:02:12] Stuart Seymour: So I do not believe that Skynet will arrive anytime soon. I believe that AI will be used as a supplement and an enabler, as opposed to something that completely replaces humans. And like I said, I believe we will utilize AI for the benefit of humanity, to speed things up and to remove waste. But I really do not believe that we will be looking at a T-1000 anytime soon.

[00:03:12] Andrew Ash: No, exactly. I swing between a five and an eight depending on the day and what I've just read, but my feeling is that AI will be an enabler, with some downsides as most technology has. Look at the internet and the huge benefit it's brought human society. Obviously there are some darker sides to the internet, as we all know, working in security, but I think the benefit outweighs the bad.

[00:03:50] Okay, thanks for that. So we'll talk about the evolving role of the CISO. You served as a British Army captain before you went into IT and cybersecurity. How did that affect your journey, and how did aspects of that experience shape your approach as a CISO, your thinking around crisis and incident management, building teams and all of the things the British Army is renowned for?

[00:04:22] Stuart Seymour: Thanks, Andy. First and foremost, I think one of the things the army taught me, and allowed me to bring with me, was this concept of servant leadership. It's a concept where, when we were in the regiment, we always ate last, we always ensured that we looked after our soldiers, and if anything was running out, we would be the ones to dip out, not them.

[00:05:06] Moving through that servant leader journey has helped me immensely throughout my career. If you think about the cap badge that you're given when you first arrive at Sandhurst, it says on it very clearly: serve to lead.

[00:05:33] So that aspect helped me a lot. And as I've grown and gone through my career, my philosophy and my perspective have significantly changed, and I'm now more focused on the how than the what. In other words, the team and how you achieve the aim and deliver the project, as opposed to achieving a project at all costs, leaving a ton of dead bodies behind and nobody ever wanting to work with you again. That, I think, has its seeds in servant leadership.

[00:06:31] So I think servant leadership is incredibly important and it's something I've taken away from the army. And then the ability to administer myself, to prioritize a lot of competing imperatives and to differentiate quickly between what's urgent and what isn't.

[00:07:02] Because again, one of the things that the army teaches you is you have to maximize your time and you have to be efficient with your time because you don't get a lot of it. And then finally, when you mentioned crisis management, yes, I think the Army's been incredibly helpful in grounding me in things like crisis management, but also giving me perspective.

[00:07:29] And with what veterans nowadays have experienced and seen, and how they've come back from where they've been, I think the ability to have perspective on what really is important, what really is critical and what isn't, has really helped me.

[00:08:03] If I look at some of the bosses I've had previously, I remember one who comes in during an incident and the first thing is, his hair's on fire. Everyone's hair then catches fire because his is on fire: if we do this badly, we're all going to get dismissed, et cetera.

[00:08:28] And all that angst and stress is brought into the incident, which in itself is complicated and stressful. Whereas me being able to have the perspective of, actually, nobody's going to die today.

[00:08:51] Andrew Ash: Yeah.

[00:08:53] Stuart Seymour: Then that allows, I think, that element of calm and control. So I'm incredibly grateful for having served in the army.

[00:09:06] I loved every second of my 12 or so years. I came away with some amazing lessons and groundings, and it was the foundation of what made me, so I feel incredibly blessed. And of course, it is a privilege to serve, not only your country, but also your soldiers.

[00:09:35] Andrew Ash: An observation from me, in terms of ex-military personnel in cyber, and I don't know whether this is true, it's a feeling I have. In the UK we do get ex-military personnel interviewing at Netacea, and obviously there are a lot of ex-military personnel in the cyber community in the UK, but in the US it seems much more prevalent.

[00:10:01] Is that correct? I'm not ex-military, so I don't have that insight. Does that feel true, or...

[00:10:10] Stuart Seymour: Yes, I think it is, because of the US and the development of cyber as an offensive capability. I think the US are world leaders in that, just by the size and scale of their armed forces, their development, being at the cutting edge of cybersecurity technology and the like, and using that in terms of national security. So yes, I think you do see that.

[00:10:55] I also think there's something the US does incredibly well, which we don't as a nation: they recognize and support their veterans. So you'll hear a lot more of it, which might lead to that perception. Not only are people around the supermarket with their veterans' caps on, or the number plate or stickers on their car saying I'm a veteran, which you don't really see in this country, but veterans are also celebrated.

[00:11:42] You go into the States and pretty much every single shop I've ever been in offers a veterans' discount. So I think it's a combination of size in terms of their military capability, moving cyber into the military as an offensive capability, setting up the organizations that deal in this, and then acknowledging people when they do come out of that.

[00:12:20] So yes, in the US I would leave with the same perception.

[00:12:27] Andrew Ash: Yeah, every other person you speak to is either ex-CIA, ex-Army, ex-Marines, and ex-Air Force in a lot of cases as well. It's really interesting. The reason for that question is that the armed forces teach you, and you've just been through it,

[00:12:50] that kind of servant leadership practice. But I think the really key bit for many people in our industry is incident management. Incidents are inevitable, and that's true not just in cyber but in all technology. My experience of managing incidents comes from being a service delivery manager for years and years before I actually came into cyber.

[00:13:17] And how an organization manages something that is not good is actually a differentiator in the marketplace. It's a real business skill to be able to succinctly manage a crisis or an incident, whether that's some downtime or a cyber attack.

[00:13:39] And that kind of management you described, the big boss coming in and shouting at everybody, saying we must fix this: you can't drive that through fear. You can't drive that from the back foot. You have to be calm, you have to be collected, and you have to understand the subject matter as well, and how it actually affects the business.

[00:14:04] At Netacea, we actually train on this. We do incident management training so people know how to escalate, know the routes that are open, know the bridges they need to be on, and have the runbooks for the technology that's affected. But yeah, I was interested in the military piece because those skills are so prevalent in ex-service personnel.

[00:14:30] Stuart Seymour: And Andy, just building on what you're saying, which I completely agree with: if you take it one step further and think about the role of a CISO, you very quickly ascertain that you can't secure everything. So you're in the game of risk management. And when you're in the game of risk management, you're thinking about limiting the blast radius and limiting the impact, which then takes you to resilience, which then takes you to recovery.

[00:15:14] Even if it's not cyber attack led, even if it's an outage, or an update that got pushed out and suddenly isn't compatible with other applications on that system and brings it down, it's all about resilience nowadays, not just security, certainly in my mind. And certainly where I've seen security departments be most effective is with a focus on response and recovery, because things will go wrong even with the best will in the world, and not necessarily due to a cyber attack, like you mentioned; it could just be an outage.

[00:15:59] Andrew Ash: All of my learnings from incident management in big infrastructure, back in the old days of data center outages, scale directly into cyber and managing cyber incidents.

[00:16:15] The base principles are the same, and that's why we train on it: we want people to be calm, to understand what they need to do, what their part in the team is, and how they reach out for help. So yeah, it's really interesting stuff.

[00:16:34] Just to move on a little bit, on the evolving role of the CISO: there's a lot of new legislation across the globe at the moment around data protection, California, GDPR, different standards across different states in the US. Has this added more responsibility onto the shoulders of the CISO? It has, okay.

I was going to say, or is that the DPL part of the privacy around data protection?

[00:17:09] Stuart Seymour: No, I think... if you think about the biggest issues a CISO has to deal with, especially the CISO of an international company, it's global regulatory frameworks. When I was at BAT, I think we were present in a hundred and eighty plus countries.

[00:17:38] And a data spill in one country was incredibly different to a data spill in another. There were some countries that, from a regulatory point of view, just didn't care, literally, whatever happened, and other countries that cared significantly.

[00:18:09] So I think that, on the whole, regulation is there because industry hasn't really kept its house in order. I think regulation predominantly helps us as an industry be better, and I don't believe that major economies predominantly want to regulate for the sake of it.

[00:18:53] So I think regulation is one of the key things that CISOs need to be mindful of. And also, if you look at what's happening in the States with the Securities and Exchange Commission, there's the fact that CISOs now carry personal liability and personal risk.

[00:19:24] And when you see what's happening to really good people in the US, the personal liability question really takes hold. So yes, regulation is very definitely something that should be on every CISO's radar.

[00:19:55] Andrew Ash: Okay. So just moving on, a question around building teams: what do you look for when building your team, Stuart?

[00:20:04] I know you care a lot about neurodiversity. How does your own experience shape your outlook and how do you attract diversity?

[00:20:17] Stuart Seymour: I think, fundamentally, if we're thinking about diversity and inclusion and the necessity for it, in essence it's all about having different viewpoints and the ability to take the best out of the different thoughts and contributions that are put on the table, to make whatever you're doing

[00:20:49] the best concept or strategy it can be. And I'm very blessed, because 50 percent of my leadership team is female, I think a third of my leadership team is neurodiverse, and I think a third of it is LGBTQ. All of those different points of view, not just in terms of the what but also in terms of the how, which we talked about earlier in the podcast, are really important to me.

[00:21:38] And I believe that's one of the reasons why we've moved so far, so fast at VMO2. But specifically, a cause very dear to my heart is neurodiversity and dyslexia. I'm dyslexic and very proudly so, and I talk about it quite a lot. My daughter's dyslexic too

[00:22:06] and she's absolutely amazing. Dyslexia is all about looking at and thinking about something from a different point of view, a different optic. And in terms of neurodiversity, one of the things I actively do is look to recruit neurodiverse individuals for my security operations center and for incident response.

[00:22:38] In my career, I've had some really amazing neurodiverse people who have added incredible value. There are some traits that some of them had; for example, some neurodiverse people are very uncomfortable with change. But if you think about what a good incident responder does, they look for changes in behavior.

[00:23:12] This machine doesn't normally behave this way. This system doesn't normally do this. There is the absence of the normal and the presence of the abnormal. And being uncomfortable with change and comfortable with routine maps onto that: if you think about the very essence of malicious software, it's software that operates in a way it's not designed to, or software that changes configurations it's not meant to.

[00:23:46] So neurodiverse people are very good at spotting that divergence from normality, that change. That's one area where they've been super successful and where the neurodiverse people in my teams have added immense value. The other is hyperfocus: the ability to follow something right to the end, wanting to make sure that done is done, and not being satisfied with 99 percent but taking it to a hundred.

[00:24:37] Again, I had a couple of incident responders who were neurodiverse and had that characteristic. What it ended up doing was inspiring the other people in the SOC to go that extra mile: if this person is really looking at every possible avenue and every path this piece of malware or this intruder took, shutting every single door and not taking shortcuts, then neither am I. They were inspired to really go that extra mile. Clearly, though, there are things that aren't neurodiverse people's strength.

[00:25:30] So for example, please don't ask me, as someone who's dyslexic, to handwrite you a 3,000-word essay and ensure it's all spelt wonderfully and grammatically correct, because that's not my forte; my dyslexia does not lend itself well to that. But give me seven different things that are seemingly urgent, ask me to prioritize very quickly, to ascertain what's important and what's not,

[00:26:02] and look at a problem from multiple angles, even backwards, then yes, that is something that my dyslexia lends itself to.

[00:26:12] Andrew Ash: I was going to ask about the prioritization piece: urgent tasks, it being immediately apparent to you what is urgent, and how you structure the work so the most important bit gets done first.

[00:26:28] How do you deal with communicating that to people who may not have that ability to prioritize very quickly? Because there's often pushback. I have a similar way of being able to say that's important, that's not, and I don't know if it comes from neurodiversity, I'm not sure.

But it is something that I have, and my experience is that you can quite often get pushback from people who haven't quite got that ability to do it very quickly.

[00:27:00] Stuart Seymour: I think it's all in the communication, and again, this is something that I learned quite far into my career. It wasn't something I learned during my time in the army or in my subsequent two or three jobs afterwards.

[00:27:17] But in communicating things like priorities, if I get pushback on communicating a priority, I always look at it as though I haven't explained myself properly and the communication barrier is me, not them, first of all. Second of all, what I think most people miss is what it means for the individual you're communicating the priority to.

[00:27:55] A lot of people predominantly play to the head, the logic, as opposed to the head and the heart.

[00:28:08] If you think about it, if you see a speed sign that says 30 miles an hour, that appeals to your head: you cannot go over 30 miles an hour, and if you do, you're breaking the law.

[00:28:23] But if you see a sign that says children playing, 30 miles an hour, that appeals to your head and your heart, because of the fact that there are children there and they're playing, that a ball might go onto the road, and that you might inadvertently be part of a tragedy. I have children of my own, and if I didn't, I might have nephews and nieces, or I'd remember when I was a child. The fact that it says children playing appeals to the heart as well as the head.

[00:29:00] And in terms of communication, being able to see the priorities is one thing, but being able to communicate them, utilizing both head and heart, is something that certainly wasn't immediately apparent to me in the early to mid stages of my career. But then I had this aha moment when speaking to the head of HR at Centrica, actually.

[00:29:29] She was my sort of unofficial mentor, and she started talking to me about how you get people to buy into something, as opposed to obtaining malicious compliance. She started talking about head and heart, and it was a real aha moment for me. So in turn, I think neurodiversity helps you sort the

[00:29:58] wheat from the chaff and prioritize in order, but the other bit that I subsequently learned is head and heart.

[00:30:08] Andrew Ash: That's really interesting, thank you, and I agree. Yes, that immediacy of decision doesn't always go over well, as I say, and doesn't always lend itself to support, so that's a really good strategy for dealing with it. Thank you for going through your journey. We'll just spend a little bit of time talking about a bit more tech: the next gen SOC and AI. We touched a little bit on AI at the start. What I want to do is understand what AI means to us as CISOs and security professionals.

[00:30:55] For Netacea, defensive AI is an absolute necessity due to, one, the complexity of the problem, and two, the sheer volume of data we're analyzing. We're taking trillions of requests through our server-side defense product every year, and we can't respond quickly enough without AI.

[00:31:20] Humans cannot read this data. Humans cannot act on this data. I'm just wondering, do you have a similar experience in how you process and respond to threat data and indicators of compromise within your SOC?

[00:31:33] Stuart Seymour: Yeah, and speaking of the SOC and AI, I think a point of view to be had is that AI is not new. Machine learning isn't new; processing data, learning from those processes and making them better and more efficient isn't new; and automation isn't new. When people speak about AI, I think it's because it's burst into the public consciousness with things like ChatGPT. But an important thing, certainly in SOCs that I've led and run in the past, is that an element of artificial intelligence, machine learning and automation has always been there.

[00:32:43] For me, it's not something that's suddenly new. Like I said, we've always used machine learning to enhance our SOCs, and things I've put a lot of value and a lot of stock in are things like user behavioral analytics, which again goes back to that

[00:33:13] absence of the normal, presence of the abnormal, right? The machine is meant to behave like this; it's not, therefore let's investigate. In terms of user behavior analytics, that's where I think AI is a huge help, also in terms of, like you said, analyzing terabytes of data efficiently and quickly to cut through the noise.

[00:33:55] Because one of the things throughout my career, especially in the SOC, is that you need to ensure your analysts don't get snow blind. In other words, they just keep on seeing the same things, most of them false positives, and therefore there's that analysis fatigue and what have you.

[00:34:25] However, with the top-drawer analysts I've been privileged to work with in the past, what really gets them excited is a bit of a threat hunt, a bit of using their brain to work out a puzzle or a problem. So where I see AI's greatest value is in removing that noise and snow blindness and allowing analysts to really focus on interesting problems.
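
To make that concrete, here is a minimal, illustrative sketch of the user behavior analytics idea described above: build a per-user baseline of "normal" activity and only surface large deviations, so routine noise never reaches an analyst. The feature choice, threshold and sample data below are assumptions made for illustration, not a description of any vendor's or VMO2's actual tooling.

```python
# Illustrative sketch only: a toy per-user behavioural baseline.
# The feature (hourly event count), the z-score threshold and the sample data
# are all hypothetical.
from collections import defaultdict
from statistics import mean, stdev


class UserBaseline:
    """Learns what 'normal' hourly activity looks like per user and flags outliers."""

    def __init__(self, z_threshold: float = 3.0, min_samples: int = 10):
        self.history = defaultdict(list)   # user -> observed hourly event counts
        self.z_threshold = z_threshold
        self.min_samples = min_samples

    def observe(self, user: str, hourly_events: int) -> None:
        self.history[user].append(hourly_events)

    def is_anomalous(self, user: str, hourly_events: int) -> bool:
        counts = self.history[user]
        if len(counts) < self.min_samples:   # not enough data yet to define "normal"
            return False
        mu, sigma = mean(counts), stdev(counts)
        if sigma == 0:                       # perfectly flat baseline
            return hourly_events != mu
        return abs(hourly_events - mu) / sigma > self.z_threshold


baseline = UserBaseline()
for count in [3, 2, 4, 3, 3, 2, 4, 3, 2, 3]:   # the routine, "normal" behaviour
    baseline.observe("alice", count)

print(baseline.is_anomalous("alice", 4))    # False: within normal variation
print(baseline.is_anomalous("alice", 45))   # True: the "presence of the abnormal"
```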

[00:35:02] Andrew Ash: Yeah, there are a couple of interesting quotes I've heard. One, last year, was around AI enabling humans to get out of the cockpit and into air traffic control: literally taking away the mundane so that you can focus on the strategic, or the tactical way we're going to handle this attack.

[00:35:23] And like you say, taking away that white noise. Alert fatigue is real, and again, not just across cyber but across infrastructure; it's a real challenge. Look at the speed with which companies are innovating. Even a company like Netacea, where we don't have 5,000 developers and we're reasonably small, produces a lot of software and a lot of services. We have a lot of public endpoints, and those public endpoints are only growing.

[00:35:54] So every time you add one, you have to refine that noise back down, and for me, AI is what solves that problem. But there's also a quote, or a theory, I heard recently, and I hope I don't butcher this: essentially, with LLMs, we're used to prompting the AI to come up with a solution for us.

[00:36:21] But actually, what might work better within a SOC is the AI producing prompts for those people who love to investigate, i.e. collating all that data in a succinct way, presenting it so it's easy to investigate, and then potentially automating that action as well. But I think we're a little bit away from that.

[00:36:45] But yeah, it's a role reversal: it's prompting humans to do some good work, as opposed to asking the AI to find something when you've already got that information. Does that resonate?
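
As a rough sketch of that role reversal, the example below collates related raw alerts into a short, analyst-facing brief, so the pipeline effectively prompts the human rather than the other way round. The alert fields, hosts and suggested next step are invented for illustration and are not any real SOC's schema.

```python
# Hypothetical sketch of "the AI prompting the analyst": collate related alerts
# into one succinct brief per host. Field names, hosts and wording are invented.
from collections import defaultdict


def build_investigation_brief(alerts: list[dict]) -> str:
    """Group raw alerts by host and summarise each group into one prompt for a human."""
    by_host = defaultdict(list)
    for alert in alerts:
        by_host[alert["host"]].append(alert)

    lines = []
    for host, items in by_host.items():
        kinds = ", ".join(sorted({a["type"] for a in items}))
        first_seen = min(a["timestamp"] for a in items)
        lines.append(
            f"{host}: {len(items)} related alerts ({kinds}) starting {first_seen}. "
            f"Suggested first step: review process and network activity on {host} "
            f"around that time."
        )
    return "\n".join(lines)


alerts = [
    {"host": "web-01", "type": "unusual outbound traffic", "timestamp": "09:02"},
    {"host": "web-01", "type": "new scheduled task", "timestamp": "09:05"},
    {"host": "db-02", "type": "failed admin logins", "timestamp": "09:10"},
]
print(build_investigation_brief(alerts))
```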

[00:36:59] Stuart Seymour: Yeah, it does a bit, but with a fairly significant caveat. It's been documented and well proven that if you ask an AI for the best-looking person in the world sitting by a beach, it will give you a certain type and a certain demographic, because that's what it's learned. Like anything, it's all about what you teach it

[00:37:32] and how well the AI is taught, how you can teach it, and the ability to re-educate it.

[00:37:44] Andrew Ash: Yeah, so those feedback loops are really important. Training the models is massive, but bias and hallucinations are real within any AI system. They're also real within humans as well.

We probably wouldn't call it bias or hallucinations. We might call it mistakes or human error.

[00:38:09] Stuart Seymour: No, you would call it bias, conscious or unconscious bias. Maybe not hallucinations, but bias. And I think that's a really important point.

[00:38:21] A lot of what people discuss in relation to AI treats it like this amazing silver bullet that does everything and is the singular cure for all ills. People need to be very cognizant that, just like a human, if you bring up a human not to care for its sibling, to be selfish, to be what have you, you're going to get one product.

[00:39:03] And if you train AI in a certain way, you're going to get a similar product. Then AI hallucinates, and it has issues. As long as people go in with their eyes open to that, as opposed to, I've just bought this amazing security tool and it has AI, and therefore my risk of breach has gone

[00:39:29] down to manageable levels or is mitigated, or this and that, then there's a conversation to be had, right? Because I don't believe AI is the silver bullet.

[00:39:40] Andrew Ash: I don't believe it's the silver bullet either. I think the bit about the prompt is the collation of information that is then presented to a human,

[00:39:50] and that human can then detect bias, detect hallucination.

[00:39:55] Stuart Seymour: Yeah.

[00:39:56] Andrew Ash: And yeah, we always talk about AI like it's going to be, or already is, perfect. So I think evaluation of the product is very important. We always ask our customers to do POCs with us just so they can validate and look at the efficacy of the solution.

[00:40:14] And that's massively important.

[00:40:17] Stuart Seymour: And of course, from a CISO's point of view, and with the sort of pen test, red team mindset that I have, whenever someone talks to me about AI, the first thing that comes to my mind is: how can I poison it?

[00:40:34] Andrew Ash: Yeah.

[00:40:36] Stuart Seymour: How can someone poison it, how inoculated is it against someone teaching it something or teaching it to look the other way, and all of those kinds of things.
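
To show why that red-team instinct matters, here is a deliberately tiny, synthetic example of training-data poisoning: flipping the labels on a couple of "malicious" training samples shifts a naive threshold classifier enough that the same suspicious score slips through. The scores, labels and classifier are toy assumptions, not a real detection model.

```python
# Toy, synthetic illustration of training-data poisoning. The "classifier" is a
# naive threshold placed midway between the mean benign and mean malicious score.
def fit_threshold(samples):
    """Return the midpoint between the average benign and average malicious score."""
    benign = [score for score, label in samples if label == "benign"]
    malicious = [score for score, label in samples if label == "malicious"]
    return (sum(benign) / len(benign) + sum(malicious) / len(malicious)) / 2


clean_training = [
    (0.10, "benign"), (0.20, "benign"), (0.30, "benign"),
    (0.80, "malicious"), (0.90, "malicious"), (0.95, "malicious"),
]

# An attacker with influence over the training pipeline relabels two malicious
# samples as benign before the model is retrained.
poisoned_training = [
    (score, "benign") if score in (0.80, 0.90) else (score, label)
    for score, label in clean_training
]

suspicious_score = 0.60
print(suspicious_score > fit_threshold(clean_training))     # True: flagged (threshold ~0.54)
print(suspicious_score > fit_threshold(poisoned_training))  # False: slips through (threshold ~0.71)
```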

[00:40:49] Andrew Ash: Yeah, I think that is a current, but more future challenge of the poisoning of models and how those models proliferate that, there are lots of copies of a lot of copies of AI models out there now.

[00:41:06] There's a whole world we could talk about on that, but unfortunately I think we're just up on time today. So thank you to Stuart for joining me. If you have any questions for us, please leave a comment if you're listening via Spotify or YouTube, mention us on our X account @cybersecpod, or email podcast@netacea.com. Please do make sure you subscribe wherever you get your podcasts. And finally, thank you for listening. We'll see you next time for more Cybersecurity Sessions.



Related Podcasts

Podcast
S03E04

Dr. Christoph Burtscher (AI Researcher & Author)

Join us for an engaging discussion on how AI is reshaping cyber defense. Learn about the shift from human-led security to machine-led defenses.
Podcast
S03E02

Arve Kjoelen, CynomIQ (former CISO, McAfee)

Get valuable insights into the world of CISOs with guest Arve Kjoelen (former CISO, McAfee). Topics include compensation, governance, and preventative security.
Podcast
S03E01

“Bot’s the Story, Morning Glory?” Oasis Ticket Scalper Bots

Discover the behind-the-scenes battle against bot-driven ticket scalping. Learn about the challenges and strategies for managing high-demand events like the Oasis reunion tour.
