3202: How OpenUK is Driving Open Technology, AI Transparency, and Global Standards
Tech Talks Daily, March 08, 2025
35:05 / 28.09 MB

How is open source shaping the future of technology, governance, and AI regulation? In this episode of Tech Talks Daily, I sit down with Amanda Brock, CEO of OpenUK, to explore the evolving role of open source in the age of AI, the shifting geopolitical landscape, and the ongoing push for diversity in tech.

Since our last conversation, AI has taken center stage, raising urgent questions about transparency, intellectual property, and the future of open innovation. Amanda shares insights on how OpenUK advocates for open technology frameworks, the importance of sustainable funding for open-source projects, and the growing influence of standards in shaping global AI policy. She also discusses the UK's unique position between the EU's regulatory-heavy approach and the US's innovation-driven model—offering a glimpse into how policymakers grapple with the balance between governance and technological progress.

With the episode airing on International Women's Day, we also dive into the state of diversity in tech. While open source has historically been a gateway for talent from all backgrounds, Amanda highlights the challenges ahead as many organizations scale back DEI initiatives. She shares how OpenUK fosters inclusivity through events like State of Open Con and why open source remains a critical entry point for aspiring developers worldwide.

We also discuss the AI Action Summit and its impact on global AI policy, the rise of open-weight AI models, and why the phrase "tools, not rules" could define the future of AI governance.

Amanda discusses her work on AI openness at AI for Good in Geneva and sustainability initiatives at COP30 in Brazil, and shares how businesses and policymakers can contribute to a more open, collaborative future.

How can open source drive innovation while ensuring accountability in AI? And what role will the UK play in shaping the global tech landscape? Tune in for a thought-provoking discussion with one of the leading voices in open technology.

[00:00:03] There's an old saying, if you are the smartest person in the room, you're probably in the wrong room. Because growth happens when you're surrounded by people who challenge, inspire, and expand your thinking. And that is exactly why I love being in the same room as today's guest. Her name is Amanda Brock. She is the CEO of OpenUK. And Amanda is not only a leading voice in open source technology,

[00:00:29] but she's also playing a critical role in shaping conversations around AI regulation, legal frameworks, and diversity in tech. And whether it's influencing policy at Westminster, advocating for open standards, or ensuring that open source remains a gateway for talent from all backgrounds, she always seems two steps ahead of the market, tracking trends before industries even realise what's coming.

[00:00:57] So I'm delighted to invite her back onto the podcast today. So we can unpack the state of open source, the evolving AI regulatory landscape, and we'll also explore how AI governance is shifting from rigid rules to practical tools. And why open source continues to be a battleground for innovation, collaboration, and sometimes, yep, political manoeuvring. We're seeing a lot of that right now. So what does the future of open technology look like?

[00:01:27] How can business, regulators, and developers navigate the complex world of AI standards and global governance? Well, let's find out now with the always brilliant Amanda Brock. So a massive warm welcome back to the show, Amanda. We've been trying to get you back on for some time now. Both of our schedules have been so busy. But for anyone that missed our first chat, can you just remind everyone listening with a little about who you are and what you do? Yeah, I'm Amanda Brock.

[00:01:56] I'm the CEO of an organisation called Open UK. I was a lawyer mainly in the tech sector for about 25 years and then stopped all of that. I reformed six years ago. And for the last five, I've been the CEO of Open UK. And Open UK is the industry body for open technology in the UK. We bring the community together. We do a lot of advocacy and policy work and help to develop skills. And as I said, it's been two years since we last spoke. It doesn't seem that long. I know.

[00:02:25] But I mean, looking back, the last time we spoke was about the future of open source. There's been a lot of distractions around AI since then, of course. But as regards to open source, what have you seen? What new challenges and opportunities have emerged around that? I think it's super interesting because I would have said we spoke six months ago. Yeah. But I think that the work we do at Open UK generally tracks around two years ahead of the market.

[00:02:54] And the conversation we had two years ago, there's all sorts of reasons for that. But the conversation we had two years ago is actually the stuff that's coming to fruition now. And we're seeing everybody start to go around those same topics that we covered. So looking at how the maintenance and the maintainers and the payment for open source can be improved.

[00:03:18] Otherwise, how and why can we expect people to continue to provide this gift of technology that they give to society? And I think there's actually a load of stuff around geopolitics that we talked about that really is playing out. I don't know if you saw yesterday, Keir Starmer was in the White House with, almost forgot his name. How could I? Donald Trump. And they were talking about an agreement that's going to be put in place swiftly between the UK and US on technology.

[00:03:46] And they were also talking about the gap that is happening between the EU and the US. So I think some of the stuff that we absolutely hit on is actually playing out now and we're seeing it. And the geopolitical piece and the consequence of that is obviously absolutely massive. But then there's this practical piece as well about what is open source going to look like in the future? Will there be multiple different kinds of things like open source that aren't quite open source

[00:04:15] because it becomes more of a business model or becomes more, it's not a business model, but it becomes more a software distribution method associated with certain business models? Or will it really shift and become public interest or public good technology? And that was something I know we'll come to it later, something we really saw coming out of the AI Action Summit. And I think there's also a piece where AI isn't just distracting because we're looking at what does openness mean?

[00:04:44] What does open source mean in the AI context? But also because we're seeing input and impact of people using AI tools in software development and contributing to open source projects. Yeah, and there's so much I wanted to talk with you about on that. I mean, most recently, of course, there was the incident where obviously OpenAI made a lot of their success out of hoovering up the data across the internet. And yet when somebody else did that, it was deemed this could be a bit of a problem.

[00:05:13] And we start talking about IP and AI all of a sudden without any hint of irony there. And I think when balancing regulation and innovation, there are those old stereotypes that the US innovates, the EU regulates, and China maybe imitates, as has been said before. And I know that you recently participated in discussions at Westminster about regulatory approaches,

particularly how the UK is positioned between the EU's prescriptive stance and the US's more liberal approach. But what role do you see these standards and open source playing in shaping a more effective and ultimately innovation-friendly regulation? With every word you say, a different answer is going through my head. You pack so much into that sentence. That's difficult. I think, again, going back to the absolutely most recent, which is yesterday's conversation with Starmer and Trump,

we saw them talking about the EU on the basis that it's overly prescriptive. We didn't see any reporting about China, but their conversations will undoubtedly have included that. And we are seeing, for the first time, I think, since Labour came into power,

Starmer actually, at least for the tech sector, starting to look like he might deliver on a conversation that everybody has constantly. I was at an all-party parliamentary group two days ago in Westminster, and they had John Edwards, who's the Information Commissioner, and Dame Melanie Dawes,

who is the head of Ofcom, the communications regulator, talking, and they kept coming back to standards. So I think there's a couple of things that are big that you've picked up on in that question. One of them is around geopolitics, around the role that the UK might play, and how we're now finally seeing some steps that could take us into fulfilling our potential.

[00:07:19] And our potential, don't take this as me being a fan of Brexit, but we have to live with what we've got. Our potential now is that we can leverage the position we have post-Brexit to sit as an almost independent player who has a long-standing and strong relationship with the US, particularly if this agreement comes into place, which sets out collaborative moving forwards.

[00:07:44] And at the Action Summit, we saw the UK and the US not sign the declaration at the end, and then the next day we saw the UK change its AI Safety Institute to an AI Security Institute, following very much with the US. Now, we've been waiting for the AI bill, which I was being told will be any day now in December. It was, of course, the King announced it in his speech in September that there would be an AI bill.

There was no draft at that time. The draft, we know, was almost ready to go out in December, yet we haven't seen it. And there's all sorts of mutterings based on yesterday's press conference that we're not going to see it and that the UK won't regulate and will continue to push the AI regulation out to the different regulated sectors. So if you're in healthcare or finance or communications, one of the sectors that is regulated,

[00:08:39] your regulator may have some discrete regulation, but the UK probably won't have general regulation. And what those regulators were talking about in Parliament two days ago was this concept of standards. And of course, standards for open source come with challenges. Standards, people tend to think that a standard is something that's free. A standard is something that is easy to participate in. And that a standard is something created by an NGO or a not-for-profit.

[00:09:07] And the reality is that standard bodies are for-profit entities, that to be part of them, you pay a membership fee. To participate costs, you need expertise and experts who can attend meetings. And that those meetings are understandably all over the world bringing people together in the same way as open source has its conferences. But it means that if you want to participate, you need to have money and skills.

[00:09:31] And it's very difficult to see where the open source community would have that representation in the process. But also, some of what they do, because it has a cost, creates friction in the supply chain and breaks the free flow that open source relies on in its licensing. We also have issues with standard essential patents, which are often licensed on a FRAND (fair, reasonable and non-discriminatory) basis, which means everybody pays the same, but you still pay.

[00:10:00] So there's still a transfer of money and a license, which again interrupts that free flow and introduces friction. So standards aren't a magic wand. There are all sorts of challenges around how standards are created and implemented.

[00:10:14] And there's also an opportunity for some of big tech to almost dictate and dominate that and be given the opportunity to effectively create secondary regulation if governments rely too heavily on standards to do what policymakers ought to do. So there's a lot of complexity in getting that right.

[00:10:35] And I think we would really be advocating that all standards that are mandatory, that apply in AI, that apply in tech are going to be open standards and not encumbered with standard essential patents. And to achieve that, we're going to need our lawmakers, policymakers, governments to invest and collaborate. And one of the key things here, I think, is how we manage global governance and cross-border governance for AI, because it has to be cross-border. It's not going to work otherwise.

[00:11:06] So many great points in that. And of course, I think we should also highlight that this episode is going live on International Women's Day. And I know you've been a vocal advocate for diversity in the tech industry. And your work with State of Open Con 24 continues to push for better representation. And the question I've got to ask, I mean, two years since we last spoke, are we making meaningful progress in this space? Or is the pace of change still too slow? That's a really difficult question. I don't have any statistics.

[00:11:35] So what I'm about to say is based entirely on a sense and a feeling as opposed to being statistically and factually based. No, I don't think we're making meaningful change. I think we're about to enter a very difficult time. I think we actually talked about pendulum swings and how Open UK is always focused on belonging rather than just issues around gender or a particular characteristic that would make you diverse.

[00:12:04] And we've tried to do events that focus on those, but also bring everyone together. And we have run State of Open Con 25. So we've done three years now. Our female, or non-male, speakers are over 40%, and our audience is around 50% non-white, which is pretty much unheard of in tech.

[00:12:27] And it's just about being open to the right people being there and not being exclusionary. And I think it's about an attitude and an approach, which doesn't mean that there's any positive discrimination going on. There's just a very level playing field, which is something we don't see enough of the time.

[00:12:47] I suspect we're about to enter an extremely hard time for EDI, where a few years ago, some of my sponsors were saying to me, we can't fund anybody who can't show EDI. And with what we're seeing coming out of the US, it's rapidly going out of fashion.

[00:13:05] And I think perhaps a pendulum has swung too far on some levels to make people not feel that they belong and that there is an exclusion, which is why that pendulum swing backwards is being backed so heavily by some. And that ultimately we need to find the balance in the middle. But it's going to be harder to do that as we move forward. I don't think it's about to get any easier. Yeah, I would agree, especially with so many organisations cutting back on the DEI initiatives.

[00:13:34] And one of the saddest aspects of that is open source has always been a gateway for talent from all backgrounds and a gateway to enter the tech industry as well. So what more do you think could be done to foster inclusivity and ensure that open source remains that accessible entry point for aspiring developers? Yeah, I actually think that open source is one of the solutions to this. Despite the fact that at points in our history, we've not had a great reputation.

[00:14:02] I think because of that, we shifted very clearly to structures that are based around codes and treating people equally, whoever you are, but based on talent and skills. Part of the reason we don't have more women is that it's a shift that takes time to flow through. But it is tough.

[00:15:43] And one of the things that we are very keen to work on is bringing people together with the open source community, the contributing community, people who've been taught code but haven't yet had the opportunity to work in tech companies and to enable them to see that doorway into the tech sector and build skills.

[00:16:03] And that means you start by doing things like fixing typos, gradually becoming part of a community, gradually getting more involved in conversations and being taught how to contribute, building skills through open source that you would normally only get hands-on on the job.

[00:21:32] It creates a lot of governance conversations. Suddenly, you've got China doing something, which actually they've been doing in the open and they've talked about publicly for five years. And if you were following it, you would know it was coming. But they've suddenly shifted the market. And we'll see, and I think increasingly quickly, more and more of these market shifts, which is part of the reason that I've never been keen on trying to define a sector and why I think open source is a bit of a misnomer.

[00:21:58] Because the Gen AI we knew and loved in ChatGPT-4 in March 2023, if I get the year right, is not going to look anything like R2, which is probably going to come out in May from DeepSeek if the rumors are to be believed. So we're seeing a constant shift and change in technology, which legislation can't keep up with. And super interesting at the Action Summit, America and China were entirely aligned in what they were saying.

[00:22:28] Which you wouldn't necessarily expect, right? They're coming from very different places, but they're both saying we're going to innovate. Don't really care about regulation. We're going to innovate and then we're going to ask for forgiveness. Yeah, exactly. So if you are not going to follow that route and you're going to innovate with huge amounts of bureaucracy and regulation around it, you are not going to be able to keep up.

[00:22:51] And J.D. Vance, when he spoke at the Action Summit, said that the U.S. was number one and planned to stay number one in that marketplace. There were no holds barred in the way he described that. The only way that you are going to be engaged with the U.S. is to partner with them. You're not going to compete with them on a standalone basis. So the partnership offer was also added to that speech, but you are going to have to be in there in the partnership.

[00:23:17] And hopefully for the U.K., Starmer this week, going back to the conversation with Trump, is doing the right things to make that happen. But if you don't do that and if you overregulate, you are going to be left behind. And I do worry for the EU. And you mentioned the AI Action Summit there, and it was called one of the most important AI events of the year. Were there any other messages that you were trying to champion there?

[00:23:41] And anything else that you learned about why that event was so critical and what did you walk away from that event thinking about and reflecting on? It was so interesting that the U.K. had set the event in motion with the first summit, the safety summit at Bletchley Park. And then you go from summit to summit: Summit 2 was pretty quiet in Korea. Summit 3, obviously, is this one, the AI Action Summit, which was on the 10th and 11th of February.

[00:24:06] And it's moved so far in that 14, 15-month period. You go from 100 people in a room in Bletchley Park on a very closed, concerned basis to I think there were 1,000 people invited to different parts of the summit between the two days. And it was very celebratory. You had Macron walking amongst the innovators, shaking hands with them. He's a great showman. You couldn't take that away from him.

[00:24:35] And being able to announce 109 billion of investment. When I was watching him deliver this speech, he's talking about 109 billion of investment for France, which, in light of recent investments, he was comparing to Stargate in the U.S., where they've announced 500 billion, although only 100 billion is actually committed from OpenAI and SoftBank and others. You see Macron saying that's about right, a fifth of what the U.S. is doing is about right. And that's the kind of level you're going to have to be at.

[00:25:03] So I got the sense that within Europe, France is really taking leadership. And that leadership is coming partly from Macron understanding that openness is going to be the way forward. There are all sorts of stories that it's Yann LeCun from Meta who's influenced him to believe that. But what we saw at the summit were two major announcements. One of those is on a new foundation called Current AI, run by a chap called Martin Tisné. And Martin is a long-term data person.

He was on the board at the Open Data Institute a few years ago. And Martin is driving Current AI with 400 million of initial investment that's expected to go to 2.5 billion in five years. He's driving it to look at the data aspects. And when we look at that disaggregated AI that I've been talking about, the biggest concern, the biggest space that we don't understand is inputs, outputs, intellectual property, the rights you're talking about.

[00:26:01] And liabilities and whether those align, whether you can have ownership rights, but not liability, et cetera, and outputs, whether AI itself should have rights. All of that is going to be discussed in the data work that they're doing, and that's going to be critical. And Current AI has been set up as public interest AI. So that's a really interesting focus that we haven't seen from any other summit backed by real money. And then there was a second piece that I felt really privileged to be part of.

I hadn't understood, when I'd signed up to be at the launch of ROOST, that's R-O-O-S-T, that there were only going to be 20 or 30 of us in the room. And we had Eric Schmidt and Yann LeCun. And they were there as two of the three people with Camille François, who's a Columbia University professor running ROOST. And ROOST is super interesting.

[00:26:53] ROOST is open source software tools that can be used to manage and govern AI. And what Camille was saying was tools, not rules. So instead of building regulation and governance through legislation, what you look at doing is creating dev tools.

[00:27:14] So a bit like a GitOps for regulation, putting the tools that the engineers will actually use day to day and which will achieve your goals rather than giving them a document they won't read or will struggle to pay people to create compliance for. So it actually works in the way the tech sector works. And I've always been a big advocate that when you have a technology problem, the solution is more likely to be a technology one than a legal one. So when you have a challenge, you fix it through technology itself.
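To make the "tools, not rules" idea concrete, here is a minimal, hypothetical sketch of a governance check built as a dev tool: a function that lints an AI model card the way CI lints code, so compliance lives in the engineering workflow rather than in a document. The required fields and the `check_model_card` helper are purely illustrative assumptions, not part of ROOST or any real toolchain.

```python
# Hypothetical "policy as a dev tool" sketch: a CI-style lint for AI model
# cards. The field list below is an invented example, not a real standard.
REQUIRED_FIELDS = {"model_name", "license", "training_data_summary", "known_risks"}

def check_model_card(card: dict) -> list[str]:
    """Return a list of governance findings for a model card dict."""
    # Flag any required metadata the card is missing.
    findings = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - card.keys())]
    # Flag cards that don't state their licensing terms explicitly.
    if card.get("license", "").lower() in {"", "proprietary-unspecified"}:
        findings.append("license must be stated explicitly")
    return findings

# An incomplete card fails the check, just as broken code fails CI.
card = {"model_name": "demo", "license": "apache-2.0"}
print(check_model_card(card))
```

A check like this could run on every pull request, which is the sense in which it resembles GitOps: the governance goal is enforced by tooling engineers already use, not by a policy document they may never read.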

[00:27:44] And I think that this kind of governance, even more than standards, I suspect that if this is well managed, it will jump ahead of standards. And what you're seeing is the big techs donating the internal tools that they use to create an ecosystem that has a certain level that is a de facto standard using these free open source tools and to have the kind of people backing it that they had.

[00:28:11] And I think it's 40 million of funding, it could be 30, backed already from day one to ensure that that foundation has got the money to go and do the work it needs to do and to now engage with the open source communities and build that out as well as engaging with AI companies.

[00:28:28] To me, the answer to regulation is building infrastructure within the technology that becomes a norm, becomes a de facto standard, doesn't require people to have money to comply with, doesn't require people to be massively financially supported or to have massive skills to engage with from the start. I think that's the future of how we manage the AI security that we're looking at now.

[00:28:56] And I think you'll see it almost act like certification, saying it's risk-compliant over time by using those tools, and become something free to the ecosystem. So when you go back to the Action Summit, for me, those were the two biggest things.

[00:29:10] And they are very clear manifestations of a practical approach coming out of France, enabling more and more openness around AI and open source, which is why we see France having pipped the UK last September to be number one in Europe in open source AI. I don't know how that was defined in the study that Tortoise did, but they have them as number one ahead of the UK.

[00:29:35] And when we look at GitHub, we see France as the fastest growing open source software contributor. The UK is still by far number one, so we're not going to panic, but we're seeing France pushing harder than anyone. And I think that the policy decisions, the practical decisions that Macron is making are what's enabling that. And that's what his Action Summit delivered. And I love that line, tools, not rules. In fact, I think I need to get that on a T-shirt. I mean, you could rock the event circuit in our geek T-shirts.

But we have to attribute Camille François because it's her line, not ours. Oh, brilliant. And obviously, I know you're big on the event scene there and there's so much happening at the moment. So I've got to ask, for you personally, what's next for you? What's next for Open UK in 2025? Any major initiatives, events, collaborations or anything that you're particularly excited about right now? I'm always excited about lots.

And don't tell my team that because when we do social media, nobody's allowed to say we're excited. But I am excited. I'm excited about 2025 and a lot of opportunity coming down the line. I think a couple of pieces. Obviously, I want to engage with ROOST and with Current AI because I think the work they're about to do is fabulous. Then there's the AI for Good Summit taking place in Geneva on the 7th to 9th of July, which is run by the ITU. Last year it was so oversubscribed, people were queuing out of the door.

[00:31:02] They had Princess Beatrice and Sam Altman as part of the content last year. It will be that kind of level again. But I'm currently in fundraising mode to see if we can bring in enough money to have a decent space for AI openness within the exhibition space there. So focusing on that and really bringing some of the open projects and tooling into that space and the AI will be really exciting. So I'm hoping that we can do that.

[00:31:29] And then we're also still working on sustainability and picking up on the work that we did for COP26 when we had an open technology for sustainability day and delivered a data center project. We have an ongoing project that we hope to take to COP30 in Brazil in November. And I'm hoping that we'll be engaged with the local open source community in Brazil and have something interesting on the AI and data center and infrastructure side to share at that.

[00:31:57] So I think right now, those are my big things. Of course, we've always got Open UK's awards and we have our thought leadership days probably in Edinburgh in September and Cambridge in May. So there's always interesting stuff going on and all the reporting we do. But those two big events, the AI for Good Summit in Geneva, which was actually called out in the press release of the Action Summit as one of the key events this year. I think for us, that's one of the biggies.

[00:32:24] Wow. And you've mentioned everything from the UK to Geneva to Brazil and you are on the road a lot. So if anybody listening wants to find out more information about anything that you're doing, maybe help with the fundraising or just find out more information and work with you or meet you at one of these events. Where would you like to point everyone listening? Yeah, please do help us with the fundraising. That would make my life so much easier. I would send them to openuk.uk or amandabrock.com if they want me.

[00:32:49] And LinkedIn has increasingly become a space that we use, on the Open UK page and my own page. But also, I think the conversation is shifting now to Bluesky, so you'll also find us on Bluesky. Awesome. Well, hopefully we can meet in person the next time that we speak. We'll both rock up in our IT geek t-shirts and maybe record something live. And we won't leave it two years again before we speak. Thanks so much. Brilliant. Thank you very much.

[00:33:19] What a conversation. Every time I speak with Amanda, I walk away with a new perspective and even more questions. From AI regulation and geopolitics to the vital role of open source in fostering innovation and inclusivity, there's no doubt that these discussions are shaping the future of technology. And if there's one thing that stands out, I think it's that the battle for the future of AI and open source isn't about technology.

If you strip that back and zoom out for a moment, it's actually about governance, collaboration, and making sure the right people are in the room. And that's why I love speaking with Amanda. She's not only in the room. She's influencing the conversation and driving positive change. But what do you think? Should AI regulation be industry-specific rather than one-size-fits-all? And will open source remain that freely accessible gateway for talent?

[00:34:16] Or are we seeing a shift towards more controlled models? And do you believe standards are the key to responsible AI governance? Or will tools like ROOST play a big part in redefining how we regulate AI? Let me know your thoughts. Email me at techblogwriter@outlook.com, or find me on LinkedIn, X, Instagram, just at Neil C. Hughes. I'd love to hear your thoughts on this one and to bring it full circle.

[00:34:44] Let's make sure we all keep finding ourselves in rooms filled with people smarter than ourselves. Because that's where the real learning happens. Speak with you all again tomorrow. Bye for now.