2985: How SSDs Are Powering the Future of AI
Tech Talks Daily · August 06, 2024
2985
27:10 · 18.76 MB

In this episode of the Tech Talks Daily Podcast, we are joined by Roger Corell, Senior Director of AI and Leadership Marketing at Solidigm. Roger brings his extensive expertise to discuss the transformative role of solid-state drives (SSDs) in enabling artificial intelligence (AI) and enterprise workloads. With a new product launch on the horizon, Roger provides an inside look at how Solidigm is pushing the boundaries of storage technology.

Solidigm, a leading provider of NAND flash memory solutions, offers one of the most comprehensive portfolios of enterprise SSDs. These solutions are designed to accelerate workloads, including AI, from the core to the edge, unlocking unprecedented performance while lowering costs and scaling efficiently. Solidigm's commitment to quality and reliability is evident through its rigorous testing and validation processes, ensuring their SSDs meet the highest industry standards.

Roger highlights the critical importance of storage for AI applications. As AI models and datasets grow exponentially, the need for fast, dense, and power-efficient storage solutions becomes paramount. SSDs offer significant advantages over traditional hard disk drives (HDDs), including vastly superior speed, density, and energy efficiency. Solidigm's SSDs, particularly the upcoming PCIe Gen 5 D7 series, are engineered to deliver industry-leading performance across all metrics, making them ideal for demanding AI workloads.

The conversation also delves into Solidigm's customer-centric approach. Roger explains how Solidigm collaborates closely with enterprise and cloud customers to optimize firmware and testing for real-world conditions. This collaborative effort not only accelerates time-to-market for their solutions but also ensures that the SSDs perform optimally in practical applications, beyond just peak specifications.

Roger shares compelling examples of the real-world impact of Solidigm's SSDs. For instance, one hyperscaler was able to cut AI data preparation time by a factor of 50 by switching from HDDs to Solidigm SSDs. Similarly, Ocient reported significant energy reductions in AI and HPC workloads using Solidigm's QLC SSDs. These examples underscore the tangible benefits that advanced SSD technology can bring to AI and other high-performance computing environments.

Looking ahead, Solidigm is poised to sustain its leadership in high-capacity and high-performance SSDs, with plans to introduce next-generation drives exceeding 61TB. The company is also investing in edge data processing capabilities, recognizing the growing importance of data gravity in the AI landscape. By maintaining performance leadership, Solidigm aims to maximize the utilization of costly GPU servers, which are critical for AI training and inference.

Join us for this enlightening episode as Roger Corell provides a detailed overview of how SSDs are revolutionizing AI and enterprise workloads. How is your organization leveraging advanced storage solutions to enhance its AI capabilities? Share your thoughts and join the conversation!

[00:00:01] [SPEAKER_00]: How is the evolution of AI dependent on advancements in storage technology?

[00:00:07] [SPEAKER_00]: This is a topic I want to explore today because I'm going to be joined by the Senior Director

[00:00:12] [SPEAKER_00]: of AI and Leadership Marketing.

[00:00:15] [SPEAKER_00]: at a company called Solidigm, a leading global provider of innovative

[00:00:20] [SPEAKER_00]: NAND flash memory solutions. And as AI and enterprise workloads demand faster and more efficient

[00:00:28] [SPEAKER_00]: data processing, Solidigm's state-of-the-art SSDs are right at the forefront of enabling these

[00:00:35] [SPEAKER_00]: advancements. So I want to explore the importance of SSDs in accelerating AI, how Solidigm

[00:00:42] [SPEAKER_00]: is collaborating with customers to deliver unparalleled storage solutions, and dive into a world

[00:00:48] [SPEAKER_00]: where cutting-edge storage meets groundbreaking artificial intelligence. The cost of hosting

[00:00:54] [SPEAKER_00]: a daily show for 140,000 monthly listeners can be significant and I'd like to take a moment

[00:01:00] [SPEAKER_00]: to thank those who make it possible for me to keep delivering this content every day.

[00:01:05] [SPEAKER_00]: And I also want to talk about the fact that legacy DRM fails to securely enable external collaboration

[00:01:11] [SPEAKER_00]: on sensitive files, and I think it's important to recognise that organisations in this digital age

[00:01:17] [SPEAKER_00]: face a risk-trust contradiction. Yep, they must share content with untrusted third parties

[00:01:23] [SPEAKER_00]: while also protecting that data. So it's time for a more modern DRM solution, one that

[00:01:29] [SPEAKER_00]: solves this dilemma without compromising security and productivity. So just

[00:01:35] [SPEAKER_00]: imagine editing files externally without losing control. It streams zero-latency video renditions

[00:01:41] [SPEAKER_00]: to authorized users, but without any actual file transfers needed. A co-author can view it remotely

[00:01:48] [SPEAKER_00]: while you retain full ownership. Your files stay protected in Kiteworks' secure enclave.

[00:01:54] [SPEAKER_00]: Ultimately they never leave your environment. So you can say goodbye to data leakage risks

[00:02:00] [SPEAKER_00]: and experience seamless editing across all file types, not just native applications, without

[00:02:05] [SPEAKER_00]: any plugins required. So say goodbye to deployment headaches, file transfer risks, collaboration

[00:02:10] [SPEAKER_00]: barriers and all those productivity constraints and experience a more modern way to collaborate

[00:02:15] [SPEAKER_00]: on sensitive content without sacrificing control or security. And you can do all that by visiting

[00:02:20] [SPEAKER_00]: kiteworks.com to get started. And with my thank yous out of the way, I'm now officially

[00:02:26] [SPEAKER_00]: excited to introduce you to today's guest. So buckle up and hold on tight as I beam your ears

[00:02:34] [SPEAKER_00]: all the way stateside, where today's guest is waiting to join me.

[00:02:39] [SPEAKER_00]: So a massive welcome to the show. Can you tell everyone listening a little about who you are?

[00:02:45] [SPEAKER_01]: And what you do? Sure, Neil. Thanks. Thanks for the opportunity to be on your show. I am Roger

[00:02:51] [SPEAKER_01]: Corell. I work for Solidigm, a US-based storage company for data centers with a global presence.

[00:03:02] [SPEAKER_01]: And specifically in terms of what I do, I am the Senior Director of AI and Leadership Marketing at

[00:03:08] [SPEAKER_01]: Solidigm. So what does that mean? It means the team kind of develops solution-level value

[00:03:13] [SPEAKER_01]: propositions for our storage products, not just across the AI data pipeline but really for

[00:03:20] [SPEAKER_01]: all data center workloads, whether it's in the cloud or whether it's enterprise, that our

[00:03:25] [SPEAKER_01]: drives support. And then diving in a little bit more in terms of actually what that means, what we

[00:03:30] [SPEAKER_01]: deliver: again, we come up with solution value propositions. We drive influencer marketing programs.

[00:03:37] [SPEAKER_01]: We manage product launches, product campaigns and kind of lead the development of our

[00:03:45] [SPEAKER_01]: kind of technology and thought leadership narratives. It's a pleasure to have you on the podcast today.

[00:03:50] [SPEAKER_00]: There is so much hype around AI right now. We're already seeing in the last few weeks and months

[00:03:57] [SPEAKER_00]: that everyone from Samsung to Apple is ensuring that the next generation of smartphones

[00:04:02] [SPEAKER_00]: is AI-ready, ready to handle that. So I'd love to set the scene for our conversation

[00:04:07] [SPEAKER_00]: today as we move into this new era. For people hearing about Solidigm for

[00:04:12] [SPEAKER_00]: the first time? How do your solutions specifically enhance the performance of AI applications?

[00:04:18] [SPEAKER_00]: And also what role do they play from the core to the edge of enterprise operations? Because

[00:04:24] [SPEAKER_00]: this is the stuff we don't talk about enough sometimes. You're absolutely right, we're not talking

[00:04:29] [SPEAKER_01]: about it enough. I think, rightfully so, the conversation up to this point has been largely focused

[00:04:36] [SPEAKER_01]: on Nvidia for obvious reasons and specifically their GPUs. But it feels like now the market is

[00:04:45] [SPEAKER_01]: just kind of beginning to get its arms around understanding that storage plays an incredibly

[00:04:51] [SPEAKER_01]: important role. And as you said, an important role all the way out from what we call the core data center

[00:04:56] [SPEAKER_01]: to the mid tier infrastructure and all the way out to the edge. And why does storage matter?

[00:05:04] [SPEAKER_01]: Well, you know, stating the obvious, AI gets better with more and more data.

[00:05:10] [SPEAKER_01]: And I don't want to bore you with how much training data sets are growing in size year-on-year

[00:05:17] [SPEAKER_01]: and model sizes are growing year-on-year, but the amount of data feeding these model developments

[00:05:24] [SPEAKER_01]: and then doing the inferences is just really growing an order of magnitude every year.

[00:05:30] [SPEAKER_01]: And that's going to get even, I guess, more data intense when you talk about multi-modal AI.

[00:05:38] [SPEAKER_01]: So multi-modal AI is when you're combining things like text, image, video, audio all into the same

[00:05:48] [SPEAKER_01]: AI model. And as we've kind of alluded to these compute resources to run these,

[00:05:56] [SPEAKER_01]: to do the model development and then infer the results at the end of the pipeline,

[00:06:03] [SPEAKER_01]: they're incredibly expensive on the front end. When you talk about training, I don't think

[00:06:07] [SPEAKER_01]: you can acquire a DGX H100 server from Nvidia for under $300,000 US.

[00:06:14] [SPEAKER_01]: So you don't want those incredibly expensive resources sitting idle.

[00:06:22] [SPEAKER_01]: And as you alluded to, these workloads are moving to the edge to improve experiences,

[00:06:28] [SPEAKER_01]: reduce costs, reduce network traffic, et cetera. And so what does this environment mean?

[00:06:34] [SPEAKER_01]: What does this landscape mean to storage? It means that you've really got to move a massive amount of data

[00:06:40] [SPEAKER_01]: and efficiently store and access it at speed, seemingly everywhere. And this is where high-performance,

[00:06:49] [SPEAKER_01]: dense, highly efficient SSDs come in. Just a quick comparison to kind of the legacy storage

[00:06:57] [SPEAKER_01]: that is supporting a lot of these AI workloads, which is HDDs:

[00:07:03] [SPEAKER_01]: depending upon where you are in what we call the AI data pipeline, an SSD

[00:07:09] [SPEAKER_01]: can be 10x to 4,700x faster than an HDD. A high-cap SSD can store four times as much data

[00:07:18] [SPEAKER_01]: in the same rack space as an HDD while consuming almost three times less power per terabyte of

[00:07:28] [SPEAKER_01]: data stored. So just incredible gains when you talk about the workload intensity of AI and

[00:07:39] [SPEAKER_01]: the need to keep these very expensive resources fully utilized. Wow, so many powerful stats in that.
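As an aside for readers who want to sanity-check those density and power figures, here is a minimal back-of-envelope sketch. The drive wattages and the HDD capacity below are illustrative assumptions, not numbers given in the episode; only the 61.44 TB SSD capacity is a figure that appears later in the conversation.

```python
# Back-of-envelope comparison of HDD vs. high-capacity SSD storage density
# and power per terabyte. Drive specs are illustrative assumptions, NOT
# figures quoted by Solidigm in this episode.

def power_per_tb(watts: float, capacity_tb: float) -> float:
    """Operating power normalized per terabyte stored."""
    return watts / capacity_tb

# Hypothetical drives, one per bay.
hdd_capacity_tb, hdd_watts = 16.0, 8.0    # nearline HDD (assumed)
ssd_capacity_tb, ssd_watts = 61.44, 12.0  # high-cap QLC SSD (assumed wattage)

hdd_w_per_tb = power_per_tb(hdd_watts, hdd_capacity_tb)  # 0.5 W/TB
ssd_w_per_tb = power_per_tb(ssd_watts, ssd_capacity_tb)  # ~0.195 W/TB

print(f"HDD: {hdd_w_per_tb:.3f} W/TB, SSD: {ssd_w_per_tb:.3f} W/TB")
print(f"Power advantage: {hdd_w_per_tb / ssd_w_per_tb:.2f}x less power per TB")
print(f"Density advantage: {ssd_capacity_tb / hdd_capacity_tb:.2f}x capacity per bay")
```

With these assumed specs, the ratios land near the roughly four-times density and almost-three-times power-per-terabyte advantages quoted above; different drive models will shift the exact numbers.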

[00:07:47] [SPEAKER_00]: Here we are talking today in August. You've got a new product launching and for anyone

[00:07:53] [SPEAKER_00]: that missed the announcement, can you tell me a little bit more about the launch,

[00:07:56] [SPEAKER_00]: and also how it differentiates itself from existing solutions out there in the market?

[00:08:03] [SPEAKER_01]: Yeah, sure. So we're really excited that we're launching a new family of products. It's

[00:08:13] [SPEAKER_01]: PCIe Gen 5. So these products kind of fit into what we call our D7 family and these products

[00:08:23] [SPEAKER_01]: will fundamentally deliver the highest, not to get too geeky here, but in the storage industry,

[00:08:32] [SPEAKER_01]: Neil, people talk about four-corners performance. So that is the IO pattern between compute and

[00:08:40] [SPEAKER_01]: storage. Is it 100% writes? 100% reads, 100% random reads, and 100% random writes. Those are

[00:08:51] [SPEAKER_01]: the four corners. So not only is this drive going to deliver the industry's fastest

[00:08:58] [SPEAKER_01]: four-corners performance in a PCIe Gen 5 product, but, and I think we might be getting into this a

[00:09:05] [SPEAKER_01]: little bit later, what sets Solidigm apart is that we engineer for the real world. In reality,

[00:09:12] [SPEAKER_01]: no workload is going to be at those four corners 100% of the time. So understanding the real

[00:09:21] [SPEAKER_01]: IO patterns of our customers, we do special things in firmware to optimize the drives for these

[00:09:29] [SPEAKER_01]: behaviors in a real production environment. And so this really is going to set these drives apart

[00:09:37] [SPEAKER_01]: in terms of just massively accelerating a range of enterprise workloads. And getting back to AI

[00:09:44] [SPEAKER_01]: really quick, we talked about GPU utilization. We believe that this product is going

[00:09:50] [SPEAKER_01]: to be really, really well positioned as direct-attached storage inside of that, you know, using the

[00:10:00] [SPEAKER_01]: DGX H100 example, you know, inside that same rack to really maximize GPU utilization during critical

[00:10:10] [SPEAKER_01]: AI steps like checkpointing and restore. And I suspect you've probably got a lot of

[00:10:18] [SPEAKER_00]: customers who are communicating with you that they're going all in on AI. They have evolving needs and

[00:10:24] [SPEAKER_00]: requirements. I'm curious, how do you collaborate with your customers to tailor storage solutions

[00:10:30] [SPEAKER_00]: to meet those specific needs, and why is this approach important in the context

[00:10:36] [SPEAKER_00]: of enterprise storage? Because I imagine it's a conversation you're hearing more and more.
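[As an editorial aside: the "four corners" Roger described a moment ago are commonly expressed as synthetic benchmark workloads. A minimal sketch using the open-source fio tool might look like the following; the block sizes, queue depths, and device path are assumptions for illustration, not Solidigm's actual test suite.]

```ini
; Sketch of a four-corners fio job file: 100% sequential writes,
; 100% sequential reads, 100% random reads, 100% random writes.
; All parameters here are illustrative assumptions.
[global]
ioengine=libaio
direct=1
time_based=1
runtime=60
filename=/dev/nvme0n1   ; WARNING: destructive -- point at a scratch device

[seq-write]
rw=write
bs=128k
iodepth=32

[seq-read]
stonewall
rw=read
bs=128k
iodepth=32

[rand-read]
stonewall
rw=randread
bs=4k
iodepth=256

[rand-write]
stonewall
rw=randwrite
bs=4k
iodepth=256
```

[Run with `fio four-corners.fio`. Note that real drive qualification measures steady-state performance after preconditioning, which this sketch omits.]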

[00:10:41] [SPEAKER_01]: Again, it's a really good question. What sets us apart? Well, let me back up a couple

[00:10:47] [SPEAKER_01]: steps. So when you talk about kind of our legacy, our DNA, or heritage, whatever we want to call

[00:10:55] [SPEAKER_01]: it: Solidigm has a bit over three decades of storage industry experience. And that

[00:11:07] [SPEAKER_01]: storage experience is at multiple technical levels. We are

[00:11:15] [SPEAKER_01]: talking engineer-to-engineer in terms of strategic product planning, engineer-to-engineer in terms of

[00:11:22] [SPEAKER_01]: design, test, validation, etc., in these kind of long-standing, deep technical relationships

[00:11:32] [SPEAKER_01]: spanning enterprise and cloud service provider customers. This really kind of gives us,

[00:11:39] [SPEAKER_01]: we think, kind of a leg up on our competition to do things that are unique in the industry

[00:11:47] [SPEAKER_01]: that really set us apart in terms of differentiating value. It's things like firmware optimizations

[00:11:53] [SPEAKER_01]: that I alluded to a minute ago that improve performance in real-world conditions. It's things like

[00:12:00] [SPEAKER_01]: incorporating more and more of our customers' tests into our test suite to get, you know,

[00:12:10] [SPEAKER_01]: multi-level benefits there: provide a higher quality product to our customers. And because we're doing

[00:12:16] [SPEAKER_01]: kind of, if you will, more work for them, we are accelerating their time-to-market opportunity as well.

[00:12:25] [SPEAKER_00]: And before you came on the podcast, I was having a look at, I think, one of your social

[00:12:28] [SPEAKER_00]: channels, probably LinkedIn, and there's a great tagline there: that Solidigm expands the

[00:12:34] [SPEAKER_00]: possibilities of data to fuel human advancement. So we just talked about real-world conditions there.

[00:12:40] [SPEAKER_00]: Are there any real-world applications where Solidigm SSDs have significantly accelerated workloads

[00:12:47] [SPEAKER_00]: while lowering costs? Are there any of those kinds of stories you can share with us today? Because I suspect

[00:12:51] [SPEAKER_00]: that many tech projects now, especially in an era of economic uncertainty, are challenged with

[00:12:56] [SPEAKER_00]: doing more with the same, or more with less equally. They've got big, expensive tech projects they

[00:13:02] [SPEAKER_00]: need to get over the line. So I'd love to bring that to life a little if you've got any use cases or

[00:13:06] [SPEAKER_01]: any of those things. Yes, I can share a couple with you. Yeah, again, sticking with kind of the theme

[00:13:12] [SPEAKER_01]: that we're on here today, Neil, you know, in terms of AI: there is a hyperscaler customer of ours

[00:13:21] [SPEAKER_01]: who I can't name. But this hyperscaler moved their AI ingest and data preparation from HDDs

[00:13:32] [SPEAKER_01]: to Solidigm SSDs and reduced the time that it takes to complete those two key AI data

[00:13:43] [SPEAKER_01]: pipeline phases by nearly 50x. We have a customer in the US named Ocient.

[00:13:55] [SPEAKER_01]: Ocient is a really interesting customer and I would encourage your listeners to look them up.

[00:14:03] [SPEAKER_01]: They are claiming that with a combination of their really powerful software along with the hardware

[00:14:16] [SPEAKER_01]: stack they use, which includes our QLC SSDs, they can reduce the energy consumption of intense

[00:14:23] [SPEAKER_01]: workloads. So things like HPC, data analytics, AI, they can reduce the energy consumption of these

[00:14:30] [SPEAKER_01]: intense workloads by 50 to 90%. And when you talk about the, I guess, kind of the power constraint

[00:14:40] [SPEAKER_01]: or power challenges AI data centers are up against, you can see how this is tremendous value.

[00:14:47] [SPEAKER_01]: And then just across a range of OEM storage innovators and CSPs, companies like Supermicro, VAST,

[00:14:56] [SPEAKER_01]: Dell, Kingsoft, HPE, ByteDance, and others, all of these customers kind of value the

[00:15:08] [SPEAKER_01]: unique combination enabled by our high-cap leadership in terms of density, efficiency and performance.

[00:15:18] [SPEAKER_00]: And again, to drill down on that, is there anything you can share, or maybe help businesses understand,

[00:15:24] [SPEAKER_00]: about the kind of business value and ROI that we're talking about in our conversation with you?

[00:15:29] [SPEAKER_00]: Is there anything else you can share that will help businesses understand how Solidigm's technology

[00:15:36] [SPEAKER_00]: helps them unlock the potential of data for your customers? And also how that contributes to

[00:15:42] [SPEAKER_00]: the tagline we mentioned a few moments ago about fueling human advancement? Anything you can share around that?

[00:15:47] [SPEAKER_01]: I think maybe the best example is so let's assume that AI is primarily used for altruistic reasons.

[00:15:57] [SPEAKER_01]: So let's assume that. I think the best example then is if you just look at

[00:16:05] [SPEAKER_01]: the potential of AI to solve massive problems, to improve the quality of life.

[00:16:13] [SPEAKER_01]: Okay, so how do our products kind of play into that, as an example of fueling human

[00:16:20] [SPEAKER_01]: advancement? Well, again at the risk of getting repetitive, AI doesn't happen without massive amounts

[00:16:28] [SPEAKER_01]: of data. Data doesn't happen without storage and scaling of AI to reach its potential doesn't

[00:16:38] [SPEAKER_01]: happen without high-capacity, highly efficient, performant storage. So that's kind of AI as an altruistic

[00:16:48] [SPEAKER_01]: example of how we believe that our technology can help fuel human advancement.

[00:16:57] [SPEAKER_00]: Love that. And I'm curious, considering Solidigm's origins in Intel's innovation and SK hynix's

[00:17:04] [SPEAKER_00]: leadership, how has it influenced your approach to developing and delivering

[00:17:10] [SPEAKER_00]: cutting-edge storage solutions? I feel there's going to be some kind of

[00:17:13] [SPEAKER_00]: synergies there, or anything that you can share around that? Well, yeah, I think it's a combination of

[00:17:21] [SPEAKER_01]: this heritage within Intel, which I'll touch on a bit more, and then our ability to

[00:17:30] [SPEAKER_01]: invest to sustain leadership, you know, enabled by SK hynix, our parent corporation.

[00:17:38] [SPEAKER_01]: So kind of getting back to the Intel element of that, what makes us unique again is this

[00:17:48] [SPEAKER_01]: platform-solution, I guess, customer-centric approach that Intel took and instilled in us, and it is

[00:17:57] [SPEAKER_01]: part of our DNA. And this mindset, if I could, I love history so I'm going to kind

[00:18:07] [SPEAKER_01]: of click into a bit of history here, you know, this mindset has delivered this long legacy of

[00:18:12] [SPEAKER_01]: storage industry innovation. What many of your listeners might not know is that we shipped the first

[00:18:19] [SPEAKER_01]: flash disk into an IBM PC in 1992. Wow. Our heritage goes back to us co-creating

[00:18:30] [SPEAKER_01]: the PCIe spec in '03. We co-founded the Open Compute Project, which is an incredibly important

[00:18:40] [SPEAKER_01]: industry consortium that sets hardware standards for hyperscalers and more.

[00:18:45] [SPEAKER_01]: We invented the ruler, Neil. If your listeners have seen these long,

[00:18:51] [SPEAKER_01]: skinny SSD form factors, that was called the ruler, and then we donated that to SNIA and that

[00:18:56] [SPEAKER_01]: became the EDSFF specification, which is increasingly the preferred kind of modern form factor for

[00:19:04] [SPEAKER_01]: multiple reasons. We were the first to ship QLC SSDs into the data center. We lead in high cap storage

[00:19:12] [SPEAKER_01]: now with our 61.44-terabyte drive. So if you, I guess, step back and kind of look at all

[00:19:19] [SPEAKER_01]: these things coming together: this investment enabled by our parent corporation, this history

[00:19:30] [SPEAKER_01]: and legacy, this kind of customer-first mindset. All of this is now kind of coming together into,

[00:19:37] [SPEAKER_01]: we believe, our technology meeting the

[00:19:45] [SPEAKER_01]: AI moment in terms of storage challenges and opportunities. I'd love to know, Roger: if we look to

[00:19:53] [SPEAKER_00]: the future, what trends do you see in the future of SSD technology, especially in relation to AI?

[00:20:01] [SPEAKER_00]: And how would you say you're positioned to lead in this continuously evolving landscape that seems to be

[00:20:06] [SPEAKER_00]: unfolding in real time before our eyes at the moment? Yeah, that's an interesting observation.

[00:20:13] [SPEAKER_01]: We think that, you know, let's see, what is today, Wednesday? Let's go

[00:20:18] [SPEAKER_01]: back to last Wednesday. Boy, we really had an understanding of the AI landscape and what it means

[00:20:25] [SPEAKER_01]: to storage, and here we are a week later, oh my gosh, this new thing is emerging and that seems

[00:20:31] [SPEAKER_01]: to, you know, be having implications on storage. So this is just, this is the wild west. This is such

[00:20:38] [SPEAKER_01]: a rapidly evolving industry but super exciting. And so I think our portfolio is well positioned

[00:20:47] [SPEAKER_01]: now, and the investments that we'll make have us well positioned to continue our portfolio leadership.

[00:20:55] [SPEAKER_01]: So drilling into that a little bit: as we've alluded to several times, you know, efficient,

[00:21:03] [SPEAKER_01]: performance high cap SSDs is certainly critical. That's critical to overcoming the, you know,

[00:21:13] [SPEAKER_01]: it can be a key part of solving the power requirement challenges for AI data centers.

[00:21:20] [SPEAKER_01]: We will continue to invest; we're a hundred percent invested in sustaining high-cap leadership.

[00:21:28] [SPEAKER_01]: So I think we kind of start there. We will have the next capacity point beyond 61.44

[00:21:34] [SPEAKER_01]: terabytes in market by the end of the year. We talked about the edge a bit.

[00:21:41] [SPEAKER_01]: Your listeners are probably familiar with the term data gravity: more and more, as much as

[00:21:49] [SPEAKER_01]: possible, you want to store, analyze, and act on data where that data is produced. It's very expensive

[00:21:59] [SPEAKER_01]: to move data up and down the wire. AI workloads are moving to the edge. And so we're looking

[00:22:05] [SPEAKER_01]: at capabilities on what more can we do with data at the edge? So kind of stay tuned for more there.

[00:22:13] [SPEAKER_01]: And we think performance leadership is key as well. And it's especially key, you know, if we

[00:22:18] [SPEAKER_01]: talk about GPU utilization and that kind of direct-attached storage element to keeping these $300,000

[00:22:26] [SPEAKER_01]: GPU servers fully utilized. So we will invest to sustain our performance leadership as well.

[00:22:38] [SPEAKER_00]: Wow, so much food for thought there and a few teasers as well. So I think we will have to

[00:22:44] [SPEAKER_00]: get you back on towards the end of the year to find out more about some of those things. But

[00:22:49] [SPEAKER_00]: I want to say thank you, you know, for coming on here and sharing your insights around AI and the power

[00:22:53] [SPEAKER_00]: of SSDs in enabling it, and also mentioning the product launch of course. But before I let you go,

[00:22:59] [SPEAKER_00]: I'm going to ask you to leave everyone listening with one final gift. And I always ask my

[00:23:03] [SPEAKER_00]: guests: is it a book that you could leave to add to our Amazon wish list, or a song to add to our Spotify

[00:23:10] [SPEAKER_00]: playlist? Guilty pleasures are allowed there as well. So what I'll ask is what you'd like

[00:23:14] [SPEAKER_01]: to leave everyone listening with? Okay, I'm actually going to answer that in two parts and

[00:23:20] [SPEAKER_01]: you can cut one of them out if there isn't enough time. But in terms of a book, I often refer to

[00:23:26] [SPEAKER_01]: a book by the British author Richard Holmes called The Age of Wonder. I think your listeners,

[00:23:33] [SPEAKER_01]: you know, you having a technology podcast, I'm sure they're interested in scientific discovery.

[00:23:39] [SPEAKER_01]: This is about the wave of scientific discovery that occurred in England,

[00:23:43] [SPEAKER_01]: kind of between the late 1700s through mid 1800s and it really is a testament to kind of the

[00:23:49] [SPEAKER_01]: capacity of the mind and human inquisitiveness to solve seemingly intractable challenges.

[00:23:59] [SPEAKER_01]: And then, I love the question because I'm also a huge music fan. My favorite artist by far is

[00:24:05] [SPEAKER_01]: Ray LaMontagne, not sure if he's big in the UK, but I'd encourage your listeners to check him out.

[00:24:12] [SPEAKER_01]: Chill vibe, he's dedicated to his craft, he's hardworking. My absolute favorite song of his is Such a

[00:24:18] [SPEAKER_00]: Simple Thing. Wow, a fantastic artist. Yeah, I used to be a huge fan. Hold You

[00:24:25] [SPEAKER_00]: in My Arms, that must be a big one. Yeah, it feels like a few years ago. It was probably

[00:24:29] [SPEAKER_01]: like 20 years ago, you know? Yes. He just dropped a new album a couple of weeks ago

[00:24:34] [SPEAKER_01]: and my wife and I are going to see him in concert for probably the fourth time sometime later this year.

[00:24:41] [SPEAKER_00]: Wow, well I will certainly get both of those added to the Amazon wishlist and the Spotify playlist,

[00:24:46] [SPEAKER_00]: great choices. For anyone listening who wants to find out more information about Solidigm

[00:24:52] [SPEAKER_00]: and explore some of the topics we talked about today, anywhere in particular you'd like to point

[00:24:56] [SPEAKER_01]: everyone listening? Sure, I think the best spot to start is probably our website, www.solidigm.com.

[00:25:05] [SPEAKER_01]: That is how you pronounce it: Solidigm, a combination of solid and paradigm. It's spelled

[00:25:14] [SPEAKER_01]: S-O-L-I-D-I-G-M. And also, Neil, since we talked quite a bit about AI, we have a really informative

[00:25:23] [SPEAKER_01]: AI page. It's under the Solutions tab on our website. Do check it out. Well, I will have links

[00:25:29] [SPEAKER_00]: in the blog post that will accompany this episode so people can find all that stuff

[00:25:35] [SPEAKER_00]: with ease. One of the things I've loved about our chat today is talking about AI

[00:25:39] [SPEAKER_00]: but looking beyond the hype and how your story solutions are helping to unlock the performance

[00:25:44] [SPEAKER_00]: and are tuned for modern real world applications. But I think the end goal here is accelerating

[00:25:50] [SPEAKER_00]: workloads while lowering costs and scaling efficiently. I think it's a bit of a utopia for a lot

[00:25:55] [SPEAKER_00]: of businesses that are aiming for that, but you're proving you can do just that, so thank you for sharing

[00:26:00] [SPEAKER_00]: your story with me today, much appreciated. Thank you Neil, thanks for the opportunity, appreciate

[00:26:04] [SPEAKER_00]: it. I think as we've seen today, the future of AI is closely intertwined with the advancements

[00:26:10] [SPEAKER_00]: in storage technology, and a big thank you to Roger for giving us a glimpse into how their latest

[00:26:17] [SPEAKER_00]: SSDs are not only setting new standards for performance but also efficiency and reliability

[00:26:22] [SPEAKER_00]: across the entire industry. And I think by optimizing storage solutions for real world applications

[00:26:28] [SPEAKER_00]: they're not only accelerating AI and enterprise workloads but they're also paving the way for future

[00:26:35] [SPEAKER_00]: innovations. But the big question is: how do you see the role of storage evolving in the realm of

[00:26:40] [SPEAKER_00]: AI? I'd love your thoughts. So let me know, Tech Blog Writer at Outlook.com, Neil C Hughes on all the

[00:26:46] [SPEAKER_00]: usual social channels. But thank you for joining me today, stay tuned for more insights into the

[00:26:52] [SPEAKER_00]: technologies shaping our world. Join me again tomorrow where I've got another guest lined up

[00:26:56] [SPEAKER_00]: and I will speak to you all bright and early tomorrow. Bye for now.