3234: Arcium and the Rise of Encrypted Supercomputing
Tech Talks Daily · April 07, 2025
35:02 · 28.06 MB


What if the internet could run fully encrypted computations without sacrificing performance or control? In this episode of Tech Talks Daily, I sit down with Yannik Schrade, CEO and Co-Founder of Arcium, to explore how confidential computing is reshaping the future of digital infrastructure.

Arcium is building what Yannik describes as an encrypted supercomputer, a decentralized network that allows data to stay encrypted even during processing. This approach gives developers and organizations a way to build privacy-first applications across sectors like healthcare, finance, artificial intelligence, and government services.

We talk through how Arcium leverages secure multi-party computation to make this possible and why that matters in a world where sensitive data is often a liability.

Yannik explains how Arcium enables companies to collaborate without revealing proprietary data, and how it allows for mathematically provable trust in digital processes. From private order books in financial systems to AI models that train on encrypted health data, the scope of this technology is already expanding into real-world use cases.

We also explore the importance of decentralization, not just as a technical feature, but as a way to reframe the conversation around digital sovereignty, compliance, and individual freedom. Yannik reflects on his own journey from app developer to cryptography leader, and how his background in computer science, mathematics, and law helped shape his vision for Arcium.

If you're navigating challenges around data privacy, regulation, or AI ethics, or if you're curious about what confidential computing can unlock, this is a conversation that brings clarity to an area often clouded by hype. What would your organization build if privacy was no longer a limitation but a default capability? Let me know after you've listened.

[00:00:04] In a world where data is the new currency, how do we protect it without compromising speed, efficiency, and trust? Well, try to imagine an alternate future where sensitive information remains encrypted even while being processed. That's exactly what Yannik Schrade, CEO and co-founder of Arcium, is doing. They're building an encrypted supercomputer that's designed

[00:00:31] to power a hyper-secure digital world. So from AI and healthcare to finance and blockchain, confidential computing is already reshaping industries by enabling secure, verifiable and trustless interactions. But what does all this mean for the future of digital privacy? And how can businesses, and indeed individuals like you and me, prepare for this shift? Well, join

[00:01:00] me as we explore how Arcium's decentralized approach to confidential computing is revolutionizing the internet as we know it. But enough from me. Time to get Yannik onto the show now. So thank you for joining me on the podcast today. For everyone listening, hearing about you for the first time, can you tell everyone a little about who you are and what you do? Yeah, thanks for having me here. I'm Yannik, CEO and co-founder at Arcium. And with Arcium, we are building the encrypted

[00:01:28] supercomputer. So what it means is that we allow for any type of computation to be executed in a fully encrypted way with Arcium. Be it encrypted DeFi and TradFi finance applications, where it becomes possible to hide what you're trading and what your positions are, or at the same time enabling fully encrypted AI training and inference in order to preserve privacy.

[00:01:57] So our mission at Arcium really is to allow for every computation to be encrypted, in order to enable entirely new kinds of applications where privacy is key. It really is such a huge topic right now. And that's what put you on my radar. I think I read about you round about the same time that Apple got into a little bit of trouble over here in the UK for all but saying that they were lowering encryption for law enforcement.

[00:02:25] So when I read that Arcium is pioneering encrypted supercomputing, it just seemed like the perfect solution. So I've got to ask: there's got to be a story behind the company. What inspired the idea? And what challenges did you face in bringing it to life? Because it's quite a complex feat that you're going after here. Yeah, so we are a founding team at Arcium that's really passionate about the underlying technology and what it entails.

[00:02:55] So I think it originally started with us wanting to build stronger privacy technology and make that technology available not only to individuals, but to everyone, right? So the way I see it is that with this encrypted supercomputer, what we're enabling is for every individual to have more privacy.

[00:03:16] But at the same time, industry and governmental players, right? Big players being able to use this kind of encrypted computing to operate more safely and build more powerful applications, because it becomes possible to process more sensitive data without ever having to see the data.

[00:03:37] I think that's the powerful element that really inspired us: realizing that with this new form of mathematics and cryptography, it becomes possible to run these full black-box computations, where someone runs a computation without seeing the data. And that enables an entirely new wave of applications.

[00:03:57] And especially with the advent of ChatGPT and others, I think it has become quite tangible for a lot of people that there is a lot of data out there that is being exploited.

[00:04:10] And if we were to build frameworks where this data can be used more securely, without any trusted intermediaries, we could build more powerful AI applications, because we can now give this AI our most sensitive data without ever having to risk losing the data or giving access to third parties who shouldn't have it. I was just going to say, everything you're highlighting there.

[00:04:38] That's one of the things that's holding many businesses back from sharing company data, isn't it? Yeah, it is. It's interesting because I think privacy really is this big area. You touched on that with government surveillance. I think that's an important aspect as well, right? And in the history of privacy on the internet, there's a reason why we are using HTTPS on every website instead of HTTP, right? Because it's more secure.

[00:05:06] So encryption and privacy bring multiple aspects to the table. One is individual freedom; being able to speak your mind freely, I think, is the most simple form of privacy, right? Me and you being able to have a private conversation. This right here is a public conversation where we explicitly want this conversation to land on the internet. But I think there's a lot of conversations that people don't want to publicly land on the internet.

[00:05:35] And so for these, individuals require privacy. At the same time, as a business, you need privacy for your day-to-day operations. As a government, you need privacy for some of your operations, right? And so the interesting thing that we introduce is being able to operate over encrypted data without having to see that data and adding verifiability.

[00:06:00] So a good example really is every human being in the UK, let's say, having their sensitive patient data and then collectively training some predictive models on top of that sensitive patient healthcare data. And not having to give the data to some provider, not having to give the data to OpenAI or anyone else, instead being able to just train those models.

[00:06:27] And then having those models not even public, but keeping them encrypted. And then, let's say, a doctor being able to infer using those models, again with highly sensitive patient data, without having to share the data with anyone, without seeing the model, without seeing the data used for training. So this is a crazy new concept, I think, of entirely encrypted computations. And that enables a lot of new types of applications.

[00:06:55] And as a business, I think you're extremely cautious right now, because you can't really give all of your customer data and things like that to those operating platforms like ChatGPT from OpenAI, right? So I think there's a lot of privacy concerns in that regard. And our approach really is the most secure kind of approach, where we don't rely on any trust assumptions, but instead solely on mathematics.

[00:07:24] And this verifiability aspect, I think, is important as well, because although it's this black-box computation, this encrypted computation, where let's say you're training a model using Arcium, at the end you can still verify the correctness of the results you get, without the party running this computation, in that case the Arcium network, seeing the data.

[00:07:50] Still, you can mathematically verify the correctness of whatever output is being produced. And that's highly significant, because currently, in our proprietary internet infrastructure, you have a lot of trust assumptions and a lot of single points of failure. If I outsource some computation to AWS, I have to trust that the execution is correct.

[00:08:12] But here I can just mathematically verify the output and be convinced that, OK, this execution has been carried out correctly, this is the correct output, without me even knowing the input data. So it's a very strange concept, but it enables a lot of new use cases. And confidential computing is also gaining a lot of traction across industries, from AI and healthcare to blockchain and so many others.

[00:08:40] Why do you think this technology is so critical for the future of digital security and trust? I know we hinted at it a few moments ago, but for people listening who are unaware of why this technology is so important, could you just expand a little on the importance of it? Yeah, sure. So when you have any kind of data, usually how it works is that when the data is at rest, so it just sits somewhere, it's just being stored somewhere, it's encrypted, right?

[00:09:08] If I'm storing a password on some server, or if I'm storing images in my iCloud, maybe not in the UK, I guess, but in general, it would sit there in an encrypted, secure way. The reason for that being you don't want hackers, unauthorized third parties, to breach the security and get access to all of that. And secondly, there's this basic human right to privacy, I guess. And for businesses, it's important as well.

[00:09:36] But when you store data somewhere, it usually is securely stored using encryption, which is nice. The problem really arises when you now want to do something with that data that is stored somewhere, because it's in this scrambled encrypted representation. And usually what happens nowadays is this data is being decrypted.

[00:10:02] So it exists in this very vulnerable, publicly viewable state where anyone who now runs this computation sees the data and basically becomes the owner of the data because they can just store a local copy, I guess, right? So processing data is extremely vulnerable right now. And what we're enabling is we're enabling a way to process the data where it never has to leave this secure encrypted representation.

[00:10:31] It never has to be decrypted. And that's a significant paradigm shift because now the data can always, even when it's in use, remain in a fully encrypted, fully secure state. And that's important for privacy and security. And hence can enable quite a lot of new applications because the data you're now using in computations can be way more sensitive.

[00:10:56] And although many companies claim to prioritize privacy, true end-to-end encryption at scale is rare. And even those that do master it usually end up with performance overheads that just frustrate users. So how are you enabling fully encrypted computations without compromising performance? Can you tell me a little bit more about that? Yeah.

[00:11:19] So the key primitive that Arcium uses is so-called secure multi-party computation, which basically circles around the concept of multiple parties, each holding sensitive data, running a computation collectively while individually keeping their secrets private.

[00:11:46] And so secure multi-party computation, MPC, is a family of cryptographic protocols that has been developed over the last 20 or so years. And within the last few years, we've seen a lot of significant breakthroughs in that space. At Arcium, we have on our team a lot of PhDs who've done some of the most foundational work in that space.

[00:12:12] And so through those mathematical and cryptographic breakthroughs over the last few years, it became orders of magnitude more efficient to run these kinds of computations. And that's really what enabled us to build all this technology in a way where it's way more seamless than before.

[00:12:35] And it really introduces this new concept of, in the context of AI, I really like to call it end-to-end encrypted AI, right? So people are familiar with end-to-end encryption, where it's me sending a message to you, Neil, which would mean I have some key, you have some key, and I can encrypt my messages with a key you give me. You can do the same thing with a key I give you, combined with your secret key.

[00:13:02] And so we can create this messaging channel between the two of us, which is one of the most basic forms of privacy. You can have this end-to-end encryption, I guess, one level above just encrypting your data, storing it somewhere, and then decrypting it yourself. End-to-end encryption allows for an exchange.
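[An aside for the technically curious: the "key you give me combined with my secret key" exchange Yannik describes is the idea behind Diffie-Hellman key agreement. A minimal sketch, with the caveat that the modulus below is a toy value far too small to be secure — real systems use constructions like X25519:]

```python
import hashlib
import secrets

# Toy Diffie-Hellman key agreement. The small prime p is illustrative
# only; it offers no real security.
p = 0xFFFFFFFB  # public prime modulus (2**32 - 5)
g = 5           # public generator

# Each side keeps a secret exponent and publishes g^secret mod p.
a = secrets.randbelow(p - 2) + 1   # sender's secret key
b = secrets.randbelow(p - 2) + 1   # receiver's secret key
A = pow(g, a, p)                   # sender's public key ("a key I give you")
B = pow(g, b, p)                   # receiver's public key ("a key you give me")

# Combining the other side's public key with your own secret key yields
# the same shared secret on both ends, without it ever being transmitted.
shared_sender = pow(B, a, p)
shared_receiver = pow(A, b, p)
assert shared_sender == shared_receiver

# The shared secret is hashed down to a symmetric session key that can
# then encrypt the actual messages.
session_key = hashlib.sha256(shared_sender.to_bytes(8, "big")).digest()
```

An eavesdropper sees only `A` and `B`; recovering the shared secret from those is the discrete-logarithm problem, which is what makes the exchanged channel private.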

[00:13:21] Our idea really is that you can encrypt stuff and give it to Arcium without Arcium actually being able to decrypt it, right? So it's like end-to-end encryption, but the other party you're communicating with, Arcium, isn't someone who then decrypts and processes it. Instead, it remains encrypted, and they can process it and then send it back. And so Arcium, using secure multi-party computation, is a distributed network.

[00:13:50] So it consists of distributed nodes that anyone can communicate with and task with running encrypted computations in a permissionless and trustless way. And so I guess the two key factors really are distributed network plus secure multi-party computation as this cryptographic principle.
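[To make the MPC idea concrete, here is a minimal sketch of additive secret sharing, one classic building block of secure multi-party computation. This is a simplification — Arcium's actual protocols are far more sophisticated, and the three-node "cluster" here is purely hypothetical:]

```python
import secrets

P = 2**61 - 1  # public prime modulus; all arithmetic is done mod P

def share(x, n=3):
    """Split secret x into n additive shares that sum to x mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

# Two clients "encrypt" (secret-share) their inputs for a 3-node cluster.
salary_a, salary_b = 68_000, 74_000
shares_a, shares_b = share(salary_a), share(salary_b)

# Each node adds the two shares it holds, locally, seeing neither input.
node_sums = [(sa + sb) % P for sa, sb in zip(shares_a, shares_b)]

# Recombining all partial results reveals only the agreed output.
assert sum(node_sums) % P == salary_a + salary_b

# One-honest-party intuition: any subset of n-1 shares is uniformly
# random, so colluding nodes missing even one share learn nothing.
```

This is also why the scheme only makes sense in a distributed setting, as Yannik notes later: one machine holding all the shares could simply add them up and recover the inputs.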

[00:14:16] And combining those two is incredibly powerful because now there isn't a second party who can decrypt. Instead, it's this entire network that collectively processes your data without anyone seeing it. And decentralization is a key theme in your work and a huge change, a positive change, I might add, for many business leaders listening.

[00:14:40] So can you just expand on how decentralized confidential computing solutions do have the power and opportunity to reshape industries that have traditionally relied on those more centralized trust models? Because it's a phrase that's overused, but it feels like somewhat of a game changer moment.

[00:14:59] Yeah, I think this kind of technology really shows that, and I mean, MPC and this kind of encrypted computing isn't possible in a centralized setting. I think that's important, right? So if you were to run two computers, Neil, in your basement, and run multi-party computation with those two, that's a single point of failure.

[00:15:24] So what you need, basically all you need with Arcium, is two distinct parties running a computation. So if you have 100 computers within Arcium, or 1,000, all you need is what we call one honest participant. So there's this notion in distributed computing of malicious players taking control over some network.

[00:15:52] So as long as there's one single computer, doesn't matter which computer it is, within the entire network that isn't controlled by some attacking malicious party, you have full security. And that's a very, very, I guess, powerful concept if you compare it to, I have this one single centralized computer which can be attacked.

[00:16:16] So that for one is important and you wouldn't be able to run these kinds of computations in a centralized setting because then they don't make sense. This cryptographic principle only makes sense in the distributed setting. But distribution can also mean that it's multiple centralized parties, I guess, collectively running a computation.

[00:16:39] So this kind of technology that we are offering within Arcium really allows anyone to use the encrypted supercomputer, but also enables people to sort of spin up their own smaller encrypted supercomputers with other third parties.

[00:16:59] So a good example would be healthcare data owners and providers each running a computer in their own, I guess, permissioned Arcium cluster, and then running those computations, with each one of those parties having their own computer dedicated simply to processing this healthcare data. And none of those actors is then able to see any of the other parties' data.

[00:17:26] So it can be quite nuanced, this kind of decentralization. But what matters is that there's always at least two parties in the actual computation involved. And before you came on the podcast today, I was doing a little research on you, and I quickly saw online that you spoke at many major events from the World Economic Forum and so many other different areas as well. And always around privacy and decentralization.

[00:17:54] So I've got to ask, do you ever see any resistance from traditional financial institutions about the work that you're doing here? And if you do, how do you address that to increase adoption and get more people on board? So I think that's an excellent question. And most interestingly, in traditional finance, I think nowadays it's around 60% of daily US spot volume that goes through so-called dark pools, right?

[00:18:21] Which is off-exchange trading where some big player, for example, wants to hide their large asset movement and doesn't want to impact the market. So instead they go to JP Morgan and basically privately trade it on their books and then settle it at the end of the day. And one of the big problems in that area is that while you're not impacting the market and getting front-run there,

[00:18:51] JP Morgan and others offering those dark pools, who have access to this private information, are able to trade based on that data. And a lot of those big institutions have for that reason been sued and had to pay very high penalties over the years, because they undermined this private trading. So with the technology that we are offering, you can build this kind of trading in a fully trustless way.

[00:19:19] So none of the players actually need to go through JP Morgan. JP Morgan would just run the order book in a fully encrypted way. So JP Morgan couldn't undermine the privacy and that would be beneficial for all players involved. And I think what this kind of technology also enables is giving more access to this kind of financial activity where even smaller players can get access to these kind of private order books,

enabling them to also keep their trading strategies private, I guess, in any kind of financial setting. So I think that's very interesting in the financial area. But what's also interesting is that our technology can't just be used to add privacy. It can also be used to add compliance. So I think that's what people need to understand about this concept.

[00:20:12] While our technology can be used by the individual to be more private, you could also employ it in a surveillance setting, where the surveillance technology that you employ becomes more privacy-preserving for the individual. Because, for example, you define some rule that if and only if a certain face on a governmental blacklist is detected by a camera,

[00:20:40] then you do some action, and you run it with Arcium. So nobody ever sees the camera feed, except if in a video frame there's the face of the bad guy, and then you see it. So you can really define those strict rules in a privacy-preserving way. And I think that would help individuals a lot if we were to employ this kind of technology in this way.

[00:21:05] Same for compliance or credit scoring or all these kinds of things where we can actually keep the data private and add verifiability because we can predefine a rule set. And based on a rule set, then the data is being processed and results are being produced. So that adds a lot of transparency while keeping a lot of data more private.

[00:21:25] So I think that's something that everybody should be very excited about, be it traditional financial players, be it individuals not wanting to be surveilled by the government, be it government players. Because I think at the end of the day, everything can become more transparent by everything being more secure, more encrypted, more privacy-preserving. And as we said earlier in our conversation, with AI advancing rapidly, many businesses are concerned around data privacy.

[00:21:54] Those concerns are increasing, especially with all their corporate data spread around the world. How does confidential computing protect sensitive information but still enable AI innovation? Can you expand on that? Because, again, it's a big topic right now, isn't it? Yeah, so I think one of the big issues that we've seen really is data exploitation, I think, to a degree.

[00:22:21] And so, I guess, the worst state we could be in is every single bit of information you have on your phone being exploited. And I guess Apple introduced Apple Intelligence, which can automatically infer from all of your text messages and emails when your next meeting is, right? Which sort of, if you think about it, can be quite scary. There's this intelligence running somewhere that's reading all of your messages.

[00:22:48] And so Apple has actually gone to great lengths to convince people that their AI is private, but their system relies on trusted execution. So basically, what they did is they designed a hardware chip where, simply through hardware security modules, they claim that those systems operate in a privacy-preserving way.

[00:23:15] And the way they describe it is basically that there are more or less armed guards protecting those chips when they leave the manufacturing plant and go to the data centers. So you always need to protect physical access to those chips so that no hacker gets access to them. And so that's the trust model they're operating in, which is a quite scary trust model, I have to say.

[00:23:39] And so that's where fully encrypted computing comes into play, where you don't need this kind of physical access protection, because you can now securely process computations in the cloud. I think for AI, the big advantage really is being able to operate over way more sensitive data.

[00:24:01] And I think a very, very tangible example for this is really healthcare, because there we see some of the most sensitive information, where it's easy to understand why it's sensitive, right? And so these kinds of models become more powerful, because we can give them way more data. Why can we give them way more data? Because there's no one who could exfiltrate the data.

[00:24:30] There's no one who can steal the data. If you look at companies like, I think, 23andMe, right, that take DNA samples. What if we were able to sample and sequence every human's DNA strand and map that with the diseases and everything, but do so where nobody has direct access to it? Everything is encrypted.

[00:24:57] And the models and databases, everything is fully encrypted, shielded from anyone's access. And then we can just algorithmically define rules on how this data or the outputs can be accessed. And then, only based on those algorithmic rules, people can run inference, I guess, into this large encrypted black box and get outputs back.

[00:25:22] So there's no risk of information exfiltration, and only shared benefits. And I think that's what's so powerful. There's this new concept where everybody can win. We don't need these intermediaries that control data. And I think it also enables companies more, because they can think of those actual innovations, besides just getting, I guess, value accrual from stealing data at the end of the day.

[00:25:51] So it's actual technological advancements that we can push with that. And I think it's very interesting because of this kind of confidential computing. We always call it encrypted shared state. So you can really think of every data point in the world as something that down the line could be encrypted, with models then being trained on it. And I think that's also partially the answer for building responsible AI systems.

[00:26:21] Because in my opinion, there are multiple aspects to responsible AI. One is privacy. Another one is explainability. So those sort of seem to be conflicting ends, if you want a privacy-preserving AI system and an explainable AI system, but it's actually possible to build both at the same time. We just recently published a cryptographic research paper about building explainable, privacy-preserving AI models.

[00:26:50] And so I think that's the direction where we'll be heading, where everything is encrypted and the models can only be trained on this encrypted data. In consequence, if we train models that way, the models themselves also remain encrypted. There's no central party who controls that model.

[00:27:13] And I think the closer we move towards AGI, the more important that becomes, because it makes it, I think, a way less scary scenario where there's no single actor in the world who can have access to all of this data, because everything is encrypted. And hence, there's no player who can be corrupted by having access to this all-powerful model.

[00:27:35] Instead, we can have some rules for how this encrypted model, which only exists in the encrypted space, can be used from the outside. So I think that's what fascinates me, I guess, down the line: being able to enable these kinds of things that will greatly benefit humanity.

[00:27:56] And if we look further down the line, if we take into account every conversation that you're having with business leaders, with entire industries, what do you see as the biggest challenges and, equally, opportunities for building secure and verifiable digital economies? Because that's like the grand vision. But any challenges and opportunities you see on that journey?

[00:28:18] I mean, the biggest challenge is actually, I think, all of the cryptography and mathematics involved. And that's why we at Arcium have undertaken the challenge of solving that problem, so that others can build on top of it without having to worry about it, so that companies and applications can use Arcium without even requiring the slightest bit of understanding about cryptography and confidential computing.

[00:28:46] So our goal really is to make this technology as accessible as possible. And that also ties into us building this permissionless encrypted supercomputer, so that anyone can use it, which is a different approach from, I don't know, becoming this proprietary company that then, I guess, gives exclusive access to this technology. Instead, we want anyone to just be able to use this kind of technology.

[00:29:14] And that's why we are going to great lengths to abstract everything away from the teams using our technology. And what we did last year, for example, is we acquired one of our largest competitors. So Arcium has been around for three years, and we've acquired our largest competitor from the US.

[00:29:39] And they've done some of the most foundational work in the AI space for confidential computing, whereas we have focused more on, yeah, different kinds of computations to be executed.

[00:29:54] And so we merged this technology together and offer it in the most simple interface, where now developers building software systems don't have to learn anything new to basically turn the computer programs they already have into encrypted computer programs.

[00:30:14] And I think that's what excites me about this: that now everyone can think about, OK, where can I implement this kind of confidential computing? Where can I protect privacy more? Where can I make my systems more secure and resilient against attackers, right? Where can I maybe collaborate with competitors?

[00:30:36] Because by using confidential computing, we can find some set intersections of, I guess, markets we haven't captured yet, right? So I think there's a lot of things to think about, where there's actually new value accrual that gets created on top of those systems, even by competing players, because you don't have to expose your data. So maybe there are some win-win situations that can be produced.

[00:31:02] So in general, I would say what excites me most about it is that we've reached a point where this kind of technology now becomes incredibly accessible. And so anyone can use it without first having to get a Ph.D. in mathematics. Well, thank you so much for sharing your invaluable insights today. It's something we could talk about for a lot longer. But before I let you go, I want to have a little bit of fun with you now.

[00:31:31] We have a Spotify playlist. I always ask my guests to add a song to that list, something that means something to them. Or, hey, just a guilty pleasure. It can be anything at all. But what would you like to add to our Spotify playlist? I'm going to go with Nightcall by Kavinsky. Yes, that is a great tune, isn't it? It is. It is. Yeah. Man, that needs to go on straight away. I will get that on.

[00:31:59] And anybody that's not heard it, I don't know who that might be. But if you haven't heard it, get on that playlist. Listen to that now. It is a great tune. And for anybody listening that just wants to find out more information about anything we talked about today and get their hands dirty and have a play with it as well. Where would you like to point everyone listening? Yeah. So you can go to Arcium.com, A-R-C-I-U-M.com.

[00:32:23] There's our homepage where we have a lot of documentation, examples, and ways to use the encrypted supercomputer, which in its current state is in what we call private testnet. And yeah, within the next month, we'll go fully public. Yeah. Also, I think the best channel to actually stay up to date would be on X, looking at both our Arcium feed and my personal feed. I think we have a lot of activity there.

[00:32:53] So if you're interested in what we're building and how things are developing, I think, yeah, checking out our Twitter would be the best way to go about it. Yeah, I'd urge anyone listening to check you out on there. There's over 100,000 followers, I think, and very active, engaged, and passionate discussions going on there. So I'll add links to everything so people can find you nice and easy. But just thanks for coming on here.

[00:33:20] Demystifying how decentralized confidential computing can reshape the AI, healthcare, security, and blockchain industries. But most importantly, that grand vision for the role of privacy-enhancing technologies in safeguarding our personal freedom in a digital world, not just our work, of course. Thank you for shining a light on this crucial topic. Thanks again. Thanks for having me, Neil. It was my pleasure.

[00:33:45] So as the digital world becomes increasingly interconnected, the need for privacy and security has never been greater. And as my guest and his team at Arcium are doing here, they're proving that encryption doesn't have to be a tradeoff. It can be the foundation of a more secure, verifiable, and trustless digital economy.

[00:34:04] Whether it be safeguarding AI models or financial data, or even enabling private healthcare analytics, confidential computing is set to transform the way that we interact online. But what do you think? Will encrypted computing become a new standard, or will businesses struggle to adopt it at scale? Let's keep this conversation going. Reach out. Share your thoughts on X or LinkedIn.

[00:34:32] I'm at Neil C Hughes on everything, and Instagram too. And if you enjoyed today's episode, please don't forget to subscribe for more insights into the future of tech, business, and beyond. But thank you for your time today. I've taken up far too much of it already. Please join me again tomorrow. We'll do it all again. See you then. Bye for now.