In this episode of AI at Work, I catch up with Amanda Brock, CEO of OpenUK, for a wide-ranging conversation on the changing landscape of open technology, AI transparency, and international collaboration.
We explore how OpenUK is working ahead of the market, helping shape policies and support for open source projects while responding to rising geopolitical tensions and funding pressures. Amanda explains how the UK occupies a unique position between the EU and the US and what that means for future AI standards and regulatory frameworks.
We also discuss:
- The sustainability challenges facing open source communities and maintainers
- Shifts in AI development, including legal and ethical questions around IP and model transparency
- The role of tools like Roost and initiatives like Current AI in creating practical solutions for AI governance
- Why "tools, not rules" may offer a more realistic path than top-down regulation
- The importance of keeping open source accessible as a route into the tech industry
Amanda shares her concerns about the rollback of EDI efforts and highlights how open communities can still offer a clear path into tech for people from underrepresented and underserved backgrounds. We discuss OpenUK's upcoming skills report and how it aims to highlight open source as a solution to address the ongoing talent shortage.
Recorded ahead of International Women's Day, this episode also reflects on the slow progress around diversity and how leadership, policy, and community must come together to drive lasting change.
If you're interested in how policy, law, and open technology intersect with AI development, this conversation offers thoughtful perspective, clear examples, and real-world action.
🎧 Listen now and let us know where you think the future of open innovation is headed.
[00:00:03] Welcome to AI at Work, a podcast which is part of the Tech Talks Network. And in this podcast, we're going to venture into the transformative influence of artificial intelligence inside the workplace. And our discussions will focus on both the remarkable breakthroughs, but also the complex challenges of integrating AI into our everyday business functions and workflows.
[00:00:30] So, what does the future of open technology look like? How can business, regulators, and developers navigate the complex world of AI standards and global governance? Well, let's find out now with the always brilliant Amanda Brock. So, a massive warm welcome back to the show, Amanda. We've been trying to get you back on for some time now. Both of our schedules have been so busy, but for anyone that missed our first chat,
[00:00:57] can you just remind everyone listening with a little about who you are and what you do? Yeah, I'm Amanda Brock. I'm the CEO of an organisation called OpenUK. I was a lawyer mainly in the tech sector for about 25 years and then stopped all of that. I reformed six years ago and for the last five, I've been the CEO of OpenUK. And OpenUK is the industry body for open technology in the UK. We bring the community together.
[00:01:24] We do a lot of advocacy and policy work and help to develop skills. And as I said, it's been two years since we last spoke. It doesn't seem that long. I know. But I mean, looking back, the last time we spoke was about the future of open source. There have been a lot of distractions around AI since then, of course. But as regards open source, what have you seen? What new challenges and opportunities have emerged around that? I think it's super interesting because I would have said we spoke six months ago. Yeah.
[00:01:53] But I think that the work we do at OpenUK generally tracks around two years ahead of the market. And the conversation we had two years ago, there's all sorts of reasons for that. But the conversation we had two years ago is actually the stuff that's coming to fruition now. And we're seeing everybody start to go around those same topics that we covered.
[00:02:17] So looking at how the maintenance and the maintainers and the payment for open source can be improved. Otherwise, how and why can we expect people to continue to pay for this gift of technology that they give to society? And I think there's actually a load of stuff around geopolitics that we talked about that really is playing out. I don't know if you saw yesterday, Keir Starmer was in the White House with, almost forgot his name. How could I? Donald Trump.
[00:02:47] And they were talking about an agreement that's going to be put in place swiftly between the UK and US on technology. And they were also talking about the gap that is happening between the EU and the US. So I think some of the stuff that we absolutely hit on is actually playing out now and we're seeing it. And the geopolitical piece and the consequence of that is obviously absolutely massive. But then there's this practical piece as well about what is open source going to look like in the future?
[00:03:16] Will there be multiple different kinds of things like open source that aren't quite open source because it becomes more of a business model or becomes more, it's not a business model, but it becomes more a software distribution method associated with certain business models? Or will it really shift and become public interest or public good technology? And that was something I know we'll come to it later, something we really saw coming out of the AI Action Summit.
[00:03:43] And I think there's also a piece where AI isn't just distracting because we're looking at what does openness mean? What does open source mean in the AI context? But also because we're seeing the input and impact of people using AI tools in software development and contributing to open source projects. Yeah, and there's so much I wanted to talk with you about on that. I mean, most recently, of course, there was the incident where obviously OpenAI
[00:04:12] made a lot of their success out of hoovering up data across the internet. And yet when somebody else did that, it was deemed to be a bit of a problem. And we started talking about IP and AI all of a sudden, without any hint of irony. And I think when balancing regulation and innovation, there are those old stereotypes: the US innovates, the EU regulates, and China maybe imitates, as has been said before.
[00:04:39] And I know that you recently participated in discussions at Westminster about regulatory approaches, particularly how the UK is positioned between the EU's prescriptive stance and the US's more liberal approach. But what role do you see these standards and open source playing in shaping a more effective and ultimately innovation-friendly regulation? Every word you say, a different answer is going through my head. It packs so much into that sentence. That's difficult.
[00:05:07] I think, again, going back to the absolutely most recent, which is yesterday's conversation with Starmer and Trump, we saw them talking about EU on the basis that it's overly prescriptive. We didn't see any reporting about China, but their conversations will undoubtedly have included that. And we are seeing, for the first time, I think, since Labour came into power,
[00:05:35] Starmer actually, at least for the tech sector, starting to look like he might deliver on a conversation that everybody has constantly. I was at an all-party parliamentary group two days ago in Westminster, and they had John Edwards, who's the Information Commissioner, and Dame Melanie Dawes, who is the head of Ofcom, the communications regulator,
[00:06:05] talking, and they kept coming back to standards. So I think there's a couple of things that are big that you've picked up on in that question. One of them is around geopolitics, around the role that the UK might play and how we're now finally seeing some steps that could take us into fulfilling our potential. And our potential, don't take this as me being a fan of Brexit, but we have to live with what we've got.
[00:06:33] Our potential now is that we can leverage the position we have post-Brexit to sit as an almost independent player who has a long-standing and strong relationship with the US, particularly if this agreement comes into place, which sets out collaborative ways of moving forward. And at the Action Summit, we saw the UK and the US not sign the declaration at the end. And then the next day, we saw the UK change its AI Safety Institute
[00:07:01] to an AI Security Institute, following very much with the US. Now, we've been waiting for the AI bill, which I was being told would be any day now in December. The King, of course, announced in his speech in September that there would be an AI bill. There was no draft at that time. The draft, we know, was almost ready to go out in December, yet we haven't seen it. And there's all sorts of mutterings based on yesterday's press conference that we're not going to see it
[00:07:31] and that the UK won't regulate and will continue to push the AI regulation out to the different regulated sectors. So if you're in healthcare or finance or communications, one of the sectors that is regulated, your regulator may have some discrete regulation, but the UK probably won't have general regulation. And what those regulators were talking about in Parliament two days ago was this concept of standards.
[00:08:00] And of course, standards for open source come with challenges. People tend to think that a standard is something that's free, something that is easy to participate in, and something created by an NGO or a not-for-profit. The reality is that standards bodies are for-profit entities, that to be part of them you pay a membership fee, and that participation costs.
[00:08:23] You need expertise and experts who can attend meetings and that those meetings are understandably all over the world, bringing people together in the same way as open source has its conferences. But it means that if you want to participate, you need to have money and skills. And it's very difficult to see where the open source community would have that representation in the process.
[00:08:45] But also, some of what they do, because it has a cost, creates friction in the supply chain and breaks the free flow that open source relies on in its licensing. We also have issues with standard essential patents, which are often licensed on a FRAND (fair, reasonable and non-discriminatory) basis, which means everybody pays the same, but you still pay. So there's still a transfer of money and a license, which again interrupts that free flow and introduces friction.
[00:09:13] So standards aren't a magic wand. There are all sorts of challenges around how standards are created and implemented. And there's also an opportunity for some of big tech to almost dictate and dominate that and be given the opportunity to effectively create secondary regulation if governments rely too heavily on standards to do what policymakers ought to do. So there's a lot of complexity in getting that right.
[00:09:42] And I think we would really be advocating that all standards that are mandatory, that apply in AI, that apply in tech, are going to be open standards and not encumbered with standard essential patents. And to achieve that, we're going to need our lawmakers, policymakers, governments to invest and collaborate. And one of the key things here, I think, is how we manage global governance and cross-border governance for AI, because it has to be cross-border. Otherwise, it's not going to work.
[00:10:13] So many great points in that. And of course, I think we should also highlight that this episode is going live on International Women's Day. And I know you've been a vocal advocate for diversity in the tech industry. And your work with State of Open Con 24 continues to push for better representation. And the question I've got to ask, I mean, two years since we last spoke, are we making meaningful progress in this space or is the pace of change still too slow? That's a really difficult question. I don't have any statistics.
[00:10:42] So what I'm about to say is based entirely on a sense and a feeling as opposed to being statistically and factually based. No, I don't think we're making meaningful change. And I think we're about to enter a very difficult time. I think we actually talked about pendulum swings and how OpenUK is always focused on belonging rather than just issues around gender or a particular characteristic that would make you diverse.
[00:11:11] And we've tried to do events that focus on those, but also bring everyone together. And we have run State of Open Con 25. So we've done three years now. Our female or non-male speakers are over 40 percent, and the diversity of our audience is around 50 percent non-white, which is pretty much unheard of in tech.
[00:11:35] And it's just about being open to the right people being there and not being exclusionary. And I think it's about an attitude and an approach, which doesn't mean that there's any positive discrimination going on. There's just a very level playing field, which is something we don't see enough of the time. I suspect we're about to enter an extremely hard time for EDI, where a few years ago some of my sponsors were saying to me,
[00:12:03] we can't fund anybody who can't show EDI. And with what we're seeing coming out of the US, it's rapidly going out of fashion. And I think perhaps a pendulum has swung too far on some levels to make people not feel that they belong and that there is an exclusion, which is why that pendulum swing backwards is being backed so heavily by some. And ultimately, we need to find the balance in the middle.
[00:12:30] But it's going to be harder to do that as we move forward. I don't think it's about to get any easier. Yeah, I would agree, especially with so many organisations cutting back on the DEI initiatives. And one of the saddest aspects of that is open source has always been a gateway for talent for all backgrounds and a gateway to enter the tech industry as well. So what do you think could be done more to foster inclusivity and ensure that open source remains that accessible entry point for aspiring developers?
[00:12:59] Yeah, I actually think that open source is one of the solutions to this. Despite the fact that at points in our history, we've not had a great reputation. I think because of that, we shifted very clearly to structures that are based around codes of conduct and treating people equally, whoever you are, based on talent and skills.
[00:13:22] Part of the reason we don't have more women is that it's a shift that takes time to flow through. But it is tough.
[00:13:34] And I think that with open source, what we do is we create an opening, an access point that does rely on skills, but doesn't rely on funding or your socioeconomic background or being able to go to the right schools. We're doing an event in Cambridge in May.
[00:14:25] And I was with my team there yesterday, talking about how you access something. That's about understanding the portal or the doorway in. And I think with open source, we create a doorway, an entrance and opening into the tech sector that requires you to have some level of skill and talent, but doesn't require you to be in a particular place in the world, doesn't require you to have funding. It requires you to have the time to be able to contribute.
[00:14:50] And one of the things that we are very keen to work on is bringing people together with the open source community, the contributing community, people who've been taught to code, but haven't yet had the opportunity to work in tech companies and to enable them to see that doorway into the tech sector and build skills.
[00:15:41] Where you've got access to a really thriving, growing tech sector. So for people who are cast all over the planet and in the UK, cast across rural areas, there is this opportunity to become part of something without leaving where you're from. And we'll have a skills report coming out in March that actually focuses heavily on the ability to access the tech sector through open source.
[00:16:07] And things like that are about showing the door that you can step through to become part of something. And I think that process and project of highlighting to people, how do I get to do this? What is this? What does it look like? is the step that they need to be able to build a career. And for the UK and other countries, it gives you an access point as a country to fill your skills gap.
[00:16:34] And it's absolutely clear that outside of the Bay Area, there's a skills gap. Yeah, there really is. And a question I'd love to go back to, because with your background as a lawyer and your deep involvement in open source that we're talking about here, how do you see legal frameworks evolving to better support open innovation while protecting intellectual property? And a few moments ago, I did pack a lot into a sentence and I referenced OpenAI and DeepSeek.
[00:17:00] I'm curious, as someone with a legal background, what you think of how this is playing out? Well, it's quite interesting when you mentioned DeepSeek. I said you put too much in one question. Yeah. When you mentioned OpenAI, you didn't say DeepSeek first time around, but you talked about OpenAI having used lots of existing intellectual property for the inputs that allowed it to build its tools, right? Build its products. And I don't know if you saw, at the end of January.
[00:17:28] I was quite pleased after DeepSeek launched and had its phenomenal success and knocked 17% off Nvidia's share price in a day. What we saw was Sam Altman saying that OpenAI was, in his view, on the wrong side of history. I'm going to get that printed on a T-shirt. He was talking about open weights and how he personally thought that OpenAI ought to open up its weights.
[00:17:54] And he then went on to point out that that was his personal view, that it is not something that's high up OpenAI's agenda. But there's been increasing conversation around this in the last couple of weeks. And we are seeing a shift to better understanding of openness, which I think is fundamental to this. And you see that this very week. And I haven't looked this morning yet, and I don't know if anybody else can tell me, but this has been DeepSeek's open source week.
[00:18:23] I'm going to get nerdy for a minute. So there's a whole debate around what open source means in the context of AI. And really, we should stop using the term open source because it has a meaning which relates to software. And if we do what OpenUK does and talk about AI openness, we sort of decouple the different parts of AI. We decompile them almost. And by disaggregating them in that way, you can look at the data, you can look at the weights, and you can go around
[00:18:52] and find the different pieces. And you can assess, if they are software, whether or not they're open source using the traditional definition. And we saw DeepSeek share its weights under an open source MIT license. So it is open source. But it didn't share everything. And this week, what they've been doing is each day they've been releasing a repository on GitHub, which has an open source MIT license, a permissive license, allowing people to use it broadly
[00:19:19] so that they have as many of the tools as they're able to release around that basic central weight code to build up the open source repository. Now, what they also did was they shared very detailed instructions. And straight away, we saw Hugging Face build Open R1. So R1 is the DeepSeek product that was launched in January. And Open R1 is a Hugging Face project, which sort of decompiled it.
[00:19:45] It took the instructions and went back through the process of building R1 with different data. So obviously, DeepSeek is constrained, because it's in China, in what it can use. There are all the usual data protection concerns about what it takes in when you're using the tools. But with Hugging Face, they've been able to go and rebuild it and build their own version. And that's the beauty of it being open source.
[00:20:10] So in creating R1, DeepSeek took other code, other AI models, and they distilled them. And that distillation process allowed them to build something for 5 million as opposed to 100 million, supposedly. So 5% of the overhead in creating it. And then you create the cycle of innovation: they've been able to move the market forward, the next one builds on top of that, and we layer it again and again.
[00:20:37] Now, obviously, that creates a lot of governance conversations. Suddenly, you've got China doing something, which actually they've been doing in the open and they've talked about publicly for five years. And if you were following it, you would know it was coming. But they've suddenly shifted the market. And we'll see, and I think increasingly quickly, more and more of these market shifts, which is part of the reason that I've never been keen on trying to define a sector and why I think open source is a bit of a misnomer.
[00:21:05] Because the Gen AI we knew and loved in ChatGPT with GPT-4 in March 2023, if I get the year right, is not going to look anything like R2, which is probably going to come out in May from DeepSeek, if the rumors are to be believed. So we're seeing a constant shift and change in technology, which legislation can't keep up with. And super interesting at the Action Summit, America and China were entirely aligned in what
[00:21:34] they were saying, which you wouldn't necessarily expect, right? They're coming from very different places, but they're both saying we're going to innovate. Don't really care about regulation. We're going to innovate and then we're going to ask for forgiveness. Yeah, exactly. So if you are not going to follow that route and you're going to innovate with huge amounts of bureaucracy and regulation around it, you are not going to be able to keep up.
[00:21:59] And J.D. Vance, when he spoke at the Action Summit, said that the U.S. was number one and planned to stay number one in that marketplace. There were no holds barred in the way he described that. The only way that you are going to be engaged with the U.S. is to partner with them. You're not going to compete with them on a standalone basis. So the partnership offer was also added to that speech, but you are going to have to be in there in the partnership.
[00:22:25] And hopefully for the U.K., Starmer this week, going back to the conversation with Trump, is doing the right things to make that happen. But if you don't do that and if you overregulate, you are going to be left behind. And I do worry for the EU. And you mentioned the AI Action Summit there, and it was called one of the most important AI events of the year. Were there any other messages that you were trying to champion there? And anything else that you learned about why that event was so critical?
[00:22:52] And what did you walk away from that event thinking about and reflecting on? It was so interesting that the U.K. had set the event in motion with the first summit, the safety summit at Bletchley Park. And then you get to summit two, which was pretty quiet, in Korea. Summit three, obviously, is this one, the AI Action Summit, which was on the 10th and 11th of February. And it's moved so far in that 14, 15-month period.
[00:23:19] You go from 100 people in a room in Bletchley Park on a very closed, concerned basis to, I think there were 1,000 people invited to different parts of the summit between the two days. And it was very celebratory. You had Macron walking amongst the innovators, shaking hands with them. He's a great showman. You couldn't take that away from him. And being able to announce 109 billion of investment.
[00:23:45] When I was watching him deliver this speech, he's talking about 109 billion of investment for France, which in light of recent investments, and he was referring to Stargate in the U.S., where they've announced 500 billion, although only 100 billion is actually committed from OpenAI and SoftBank and others. You see Macron saying that's about right. A fifth of what the U.S. is doing is about right. And that's the kind of level you're going to have to be at. So I got the sense that within Europe, France is really taking leadership.
[00:24:14] And that leadership is coming partly from Macron understanding that openness is going to be the way forward. There are all sorts of stories that it's Yann LeCun from Meta who's influenced him to believe that. But what we saw at the summit were two major announcements. One of those is on a new foundation called Current AI, run by a chap called Martin Tisné. And Martin is a long-term data person. He was on the board at the Open Data Institute a few years ago.
[00:24:43] And Martin is driving Current AI with 400 million of initial investment that's expected to go to 2.5 billion in five years. He's driving it to look at the data aspects. And when we look at that disaggregated AI that I've been talking about, the biggest concern, the biggest space that we don't understand, is inputs, outputs, intellectual property,
[00:25:07] the rights you're talking about, and liabilities, and whether those align, whether you can have ownership rights but not liability, et cetera, and outputs, whether AI itself should have rights. All of that is going to be discussed in the data work that they're doing. And that's going to be critical. And Current AI has been set up as public interest AI. So that's a really interesting focus that we haven't seen from any other summit, backed by real money.
[00:25:35] And then there was a second piece where I felt really privileged. I hadn't understood, when I'd signed up to be at the launch of Roost, that's R-O-O-S-T, that there were only going to be 20 or 30 of us in the room. And we had Eric Schmidt and Yann LeCun. And they were there as two of the three people with Camille François, who's a Columbia University professor running Roost. And Roost is super interesting.
[00:26:01] Roost is open source software tools that can be used to manage and govern AI. And what Camille was saying was tools, not rules. So instead of building regulation and governance through legislation, what you look at doing is creating dev tools. So a bit like a GitOps for regulation, putting the tools that the engineers will actually use day to day
[00:26:27] and which will achieve your goals rather than giving them a document they won't read or will struggle to pay people to create compliance for. So it actually works in the way the tech sector works. And I've always been a big advocate that when you have a technology problem, the solution is more likely to be a technology one than a legal one. So when you have a challenge, you fix it through technology itself. And I think that this kind of governance, even more than standards,
[00:26:56] I suspect that if this is well managed, it will jump ahead of standards. And what you're seeing is the big tech companies donating the internal tools that they use, to create an ecosystem that has a certain level, that is a de facto standard, using these free open source tools. And to have the kind of people backing it that they had, and I think it's 40 million of funding, it might be 30, backed already from day one
[00:27:25] to ensure that that foundation has got the money to go and do the work it needs to do and to now engage with the open source communities and build that out, as well as engaging with AI companies. To me, that's the answer to regulation. It's building infrastructure within the technology that becomes a norm, becomes a de facto standard, doesn't require people to have money to comply with, doesn't require people to be massively financially supported
[00:27:53] or to have massive skills to engage with from the start. I think that's the future of how we manage the AI security that we're looking at now. And I think you'll see, over time, it almost becomes like certifying, saying it's risk compliant by using those tools, and it becomes something free to the ecosystem. So when you go back to the Action Summit, for me, those were the two biggest things.
[00:28:17] And they are very clear manifestations of a practical approach coming out of France, enabling more and more openness around AI and open source, which is why we see France having pipped the UK last September to be number one in Europe in open source AI. I don't know how that was defined in the study that Tortoise did, but they have them as number one ahead of the UK. And when we look at GitHub,
[00:28:44] we see France as the fastest growing open source software contributor. The UK is still by far number one, so we're not going to panic, but we're seeing France pushing harder than anyone. And I think that the policy decisions, the practical decisions that Macron is making are what's enabling that. And that's what his Action Summit delivered. And I love that line, tools, not rules. In fact, I think I need to get that on a T-shirt. You could rock the event circuit in our geek T-shirt. But we have to attribute Camille François,
[00:29:14] because it's her line, not ours. Oh, brilliant. And obviously, I know you're big on the event scene there, and there's so much happening at the moment. So I've got to ask, for you personally, what's next for you? What's next for OpenUK in 2025? Any major initiatives, events, collaborations, or anything that you're particularly excited about right now? I'm always excited about lots. And don't tell my team that, because when we do social media, nobody's allowed to say we're excited.
[00:29:44] But I am excited. I'm excited about 2025 and a lot of opportunity coming down the line. I think a couple of pieces. Obviously, I want to engage with Roost and with Current AI, because I think the work they're about to do is fabulous. Then there's the AI for Good Summit taking place in Geneva on the 7th to 9th of July, which is run by the ITU. Last year was so oversubscribed, people were queuing out of the door. It had Princess Beatrice and Sam Altman as part of the content last year.
[00:30:13] It will be that kind of level again. But I'm currently in fundraising mode to see if we can bring in enough money to have a decent space for AI openness within the exhibition space there. So focusing on that, and really bringing some of the open projects and tooling into that space, will be really exciting. So I'm hoping that we can do that. And then we're also still working on sustainability and picking up on the work that we did for COP26
[00:30:42] when we had an open technology for sustainability day and delivered a data center project. We have an ongoing project that we hope to take to COP30 in Brazil in November. And I'm hoping that we'll be engaged with the local open source community in Brazil and have something interesting on the AI and data center and infrastructure side to share at that. So I think right now, those are my big things. But of course, we've always got OpenUK's awards and we have our thought leadership days,
[00:31:11] probably in Edinburgh in September and Cambridge in May. So there's always interesting stuff going on and all the reporting we do. But those two big events, the AI for Good Summit in Geneva, which was actually called out in the press release of the Action Summit as one of the key events this year. I think for us, that's one of the biggies. Wow. And you've mentioned everything from the UK to Geneva to Brazil. And you are on the road a lot. So if anybody listening wants to find out more information about anything that you're doing,
[00:31:41] maybe help with the fundraising or just find out more information and work with you or meet you at one of these events, where would you like to point everyone listening? Yeah, please do help us with the fundraising. That would make my life so much easier. I would send them to openuk.uk or amandabrock.com if they want me. And LinkedIn has increasingly become a space that we use, on the OpenUK page and my own page. But also, I think the conversation is shifting now to Bluesky. So you'll also find us on Bluesky. Awesome.
[00:32:11] Well, hopefully we can meet in person the next time that we speak. We'll both rock up in our IT geek t-shirts and maybe record something live. We won't leave it two years again before we speak. But thanks so much. Brilliant. Thank you very much. What a conversation. Every time I speak with Amanda, I walk away with a new perspective and even more questions. From AI regulation and geopolitics to the vital role of open source
[00:32:38] in fostering innovation and inclusivity. There's no doubt that these discussions are shaping the future of technology. And if there's one thing that stands out, I think it's that the battle for the future of AI and open source isn't just about technology. If you strip that back and zoom out for a moment, it's actually about governance, collaboration, and making sure the right people are in the room. And that's why I love speaking with Amanda. She's not only in the room,
[00:33:07] she's influencing the conversation and driving positive change. But what do you think? Should AI regulation be industry-specific rather than a one-size-fits-all? And will open source remain that freely accessible gateway for talent? Or are we seeing a shift towards more controlled models? And do you believe standards are the key to responsible AI governance? Or will tools like Roost play a big part
[00:33:35] in redefining how we regulate AI? Let me know your thoughts. Email me at techblogwriter@outlook.com, or find me on LinkedIn, X, and Instagram, just search Neil C. Hughes. I'd love to hear your thoughts on this one. And to bring it full circle, let's make sure we all keep filling the room with people smarter than ourselves. Because that's where the real learning happens. Bye for now.

