What does sales leadership actually look like once the AI experimentation phase is over and real results are the only thing that matters?
In this episode of Tech Talks Daily, I sit down with Jason Ambrose, CEO of the ICONIQ-backed AI data platform People.ai, to unpack why the era of pilots, proofs of concept, and AI theater is fading fast. Jason brings a grounded view from the front lines of enterprise sales, where leaders are no longer impressed by clever demos. They want measurable outcomes, better forecasts, and fewer hours lost to CRM busywork. This conversation goes straight to the tension many organizations are feeling right now: the gap between AI potential and AI performance.
We talk openly about why sales teams are drowning in activity data yet still starved of answers. Emails, meetings, call transcripts, dashboards, and dashboards about dashboards have created fatigue rather than clarity.

Jason explains how turning raw activity into crisp, trusted answers changes how sellers operate day to day, pulling them back into customer conversations rather than into internal reporting loops. The discussion challenges the long-held assumption that better selling comes from more fields, more workflows, and more dashboards, arguing instead that AI should absorb the complexity so humans can focus on judgment, timing, and relationships.
The conversation also explores how tools like ChatGPT and Claude are quietly dismantling the walls that enterprise software spent years building. Sales leaders increasingly want answers delivered in natural language rather than another system to log into, and Jason shares why this shift is creating tension for legacy platforms built around walled gardens and locked-down APIs.
We examine what this means for architectural decisions, why openness is becoming a strategic advantage, and how customers are rethinking who they trust to sit at the center of their agentic strategies.
Drawing on work with companies such as AMD, Verizon, NVIDIA, and Okta, Jason shares what top-performing revenue organizations have in common.
Rather than chasing sameness, scripts, and averages, they lean into curiosity, variation, and context. They look for where growth behaves differently across markets, segments, or products, and use AI to surface those differences rather than flatten them out. It is a subtle shift, but one with big implications for how sales teams compete.
We also look ahead to 2026 and beyond, including how pricing models may evolve as token consumption becomes a unit of value rather than seats or licenses.
Jason explains why this shift could catch enterprises off guard, what governance will matter, and why AI costs may soon feel as visible as cloud spend did a decade ago. The episode closes with a thoughtful challenge to one of the biggest myths in the industry, the belief that selling itself can be fully automated, and why the last mile of persuasion, trust, and judgment remains deeply human.
If you are responsible for revenue, sales operations, or AI strategy, this episode offers a clear-eyed look at what changes when AI stops being an experiment and starts being held accountable. So what assumptions about sales and AI are you still holding onto, and are they helping or quietly holding you back?
Useful Links
Follow Jason Ambrose on LinkedIn
Learn more about people.ai
Thanks to our sponsors, Denodo and Alcor, for supporting the show.
[00:00:04] What does selling actually look like when AI stops being just another science experiment and starts being judged on real results? Because sales teams, they've spent years buried in dashboards, CRM updates, call notes and endless internal reviews, all in the name of forecasting accuracy. And yet many leaders quietly admit that the tools that were meant to help them have slowly become part of the problem.
[00:00:31] And by that, I mean more systems, more data, more fatigue, and still not enough clarity on what is actually going to move a deal forward. And my guest today is Jason Ambrose. He's the CEO of People.ai. They're focused on turning raw sales activity into something far more useful, real answers that sales teams can act on.
[00:00:54] And he will argue that the era of AI experimentation is effectively over. And how pilots that never reach production are a liability, not a learning phase. Let's bust that myth first of all. And how sales leaders now need technology that delivers measurable impact. And whether that is better forecasting, clearer deal signals or fewer hours lost to CRM busywork, we will cover it all.
[00:01:22] And we'll also talk about how AI can synthesize emails, meetings and call transcripts, turning them into context-rich insights. But most importantly, why sellers want answers rather than another system to log into. And how tools like ChatGPT and Claude, they're quietly tearing down the walls that enterprise software has spent years building.
[00:01:46] But what does all this mean? And what are some of the biggest top performing companies doing, like AMD, NVIDIA, Verizon and Okta? What are they doing differently right now? These are a few things we're going to find out today. So buckle up and hold on tight, because I'm going to give you a grounded, honest look at where sales technology is heading and where hype is giving way to accountability. Before I bring today's guest on, I just want to give a massive thank you to my friends at Denodo.
[00:02:14] Because after visiting over 25 different events in 2025, one of the phrases I keep hearing is no data, no AI. And agentic AI simply needs better data. Now, agentic AI is here, but it only works when the data behind it is complete, governed and in real time. And this is one of the areas where Denodo helps.
[00:02:37] Because Denodo gives you a logical data foundation that accelerates AI, boosts lakehouse performance and turns your information into reusable data products. And for every team, visit denodo.com and start making your data work harder. But enough from me. Let me introduce you to today's guest now. So a massive warm welcome to the show, Jason. Thank you for joining me today.
[00:03:05] For anyone listening, hearing about you for the first time, can you tell them a little about who you are and what you do? Yeah, absolutely, Neil. Happy to be here and thanks for having me. So I'm Jason Ambrose. I'm the CEO of a company called People.ai. And what we do is we use our AI to find answers about what's happening in your field and in your sales processes. So in your accounts, you know, with your contacts and your deals.
[00:03:30] And we do that by looking at the actual activity of the communication between your field and the customers. So emails, Zoom transcripts, calendar meetings. But the key is turning that into answers that are actually actionable, right? So getting specific on here are the things that are blocking this deal. Here are the things that are concerns in this account. Looking for trends across when you select across those accounts. That's what we specialize in.
[00:04:00] We think that's important in the world of both humans and agents looking for those answers. And I love what you said there. We're not just talking about AI here. We're talking about searching for actionable answers. And many companies are still talking about AI pilots or struggling to get out of pilot phase into production. And when they get there, they're struggling to scale. And you've said that the experimentation phase is over now.
[00:04:27] So from your perspective, what is it that separates teams generating measurable revenue and measurable impact from those that are still stuck just kind of playing with tools in a pilot phase? Yeah, I think there's a couple of things, right? One is I think there's a sense of commitment from the organization, right? So I look at Red Hat, one of our top customers. And Matt Hicks, the CEO, is using it every day.
[00:04:53] And he committed that everybody needs to be thinking about how they're using AI to change the way they work and at an individual level. So nobody's coming with a program to help you figure it out. His mandate was you need to figure it out yourself because you know best how to do your job. So I think that there's this commitment to get over this fear of what if we get it wrong? What if things don't quite work?
[00:05:20] Because that's really the point of experimentation is to learn and not expect results so that then you can decide what you're going to do, that it's going to generate the results. And I think that that gets to the second piece is things came so fast and are changing so fast, it's difficult for organizations to move through the learning curve when we keep changing the curve to then figure out how I'm going to go generate results.
[00:05:46] But the ones that lean in and accept, hey, we're going to experiment as we go and we're not going to place undue expectations on what we generate and we're not looking for perfect. We're looking for better. I think those are the ones that are creating the culture in the space to actually generate some real impact.
[00:06:05] And then I think a third piece is just being more precise on understanding what AI does and doesn't do and purposeful in how you consider to use AI and what you apply it toward and how that's going to generate change that generates impact. And if we were to imagine ourselves in an organization of any size in any industry, one department that we would expect to see there is the sales department and sales teams.
[00:06:35] Traditionally, they've been drowning in activity data, emails, meetings. Now we've got call transcripts, CRM updates. So where do you see AI genuinely reducing friction today? And where does it risk adding just another layer of noise there? Yeah, we have this ethos within people AI that sort of ties to our brand, which is that we believe people should work with people and AI should do the rest.
[00:07:00] Right. So that drowning that you're talking about, and I've talked about this in a couple of my posts, is it's pulling people back from the field internally to exchange information and try to vet that information and try to get to what you think is a true answer. Right. So we ask sales reps to enter information into an opportunity or build an account plan or take notes on a call. Then somebody calls them and says, hey, is this really accurate?
[00:07:28] Some other people get on a call and they review this information, either in aggregate or in that specific example to say, OK, so what does this mean? What should we do differently? We try to find trends. And so all of these humans are spending all this time capturing and processing this information, but it's self-reported and it's biased and we don't trust it. So then that takes more human time and it takes people out of the field and talking to customers.
[00:07:55] But what AI can do is look across all of that and synthesize it. That's one of the things that AI is very good at doing, and it can give you those insights. What's important is the depth and the precision of analyzing the context to turn those into actionable answers. Right. So I can pull up an LLM, you know, Claude, ChatGPT, whatever, and say, what should I do in this account? And it'll tell me something like, you know, build trust with stakeholders.
[00:08:25] Well, what do I do with that? Right. But, you know, with tools like ours, you can say, look, you've got this: the security team has concerns about, you know, your GDPR compliance. And here's some content to get them over the hump on that particular issue. You need to give them this content and give them these five other examples of customers who have gotten us through this issue.
[00:08:51] Right. That's an actionable answer that comes from analyzing in depth both what you know about your products and your company and the specifics of what's happening in an account. So circling all the way back to summarize your question, I think we're going to be spending less time entering information and analyzing information. We'll let the AI do that, and we'll have sellers who are spending more time out in the field, taking the answers that AI is generating and acting on them.
[00:09:20] And tools like ChatGPT and Claude that you mentioned a few moments ago, they're partly responsible for changing expectations now. And how are you seeing these interfaces breaking down the walls that enterprise software spent years building, especially in sales organizations? And also tell me a bit more about yourselves at people. What makes you different as well from some of the names that we hear a lot?
[00:09:45] It's a fascinating time right now and it's moving so quickly. You know, I think to frame this, right, we've had this world of, you know, whether it's CRM platforms or, you know, sales tech, martech. From the sources of data, we're trying to build this sort of vertical stack of here's the information, all the way up to a specific UX.
[00:10:09] Again, for humans who need to find, enter and caretake this data. Right. What's changing now with the LLMs and agents in general is these tools are naturally cross-functional across system. Right. So we're asking them to not just process information, but to go do things about it. And typically when you do that, you have to go across systems.
[00:10:33] So to be able to do that, you're inherently enabling these agents to cut across these systems. And these agents don't need the user experience because that's not how they interact. Right. They're working through the APIs. So in terms of how these accounts get value, right, how customers get value.
[00:10:54] Whether it's human or otherwise, we're now cutting across these systems and we're intentionally sort of deconstructing this idea of I need 40 tabs in 40 different systems. And I, as a human, have to process across that. Right. I think some customers also have consciously recognized this is their way to break out of the walled garden idea and say, hey, I can control my destiny in terms of what happens with data and how I get these things done.
[00:11:21] And that's causing some panic in some of the legacy systems that sort of built this quote unquote moat from data all the way up through the user experiences. And so we see customers being more aggressive about saying, hey, I need to pull some of this activity out of these systems, give it to these agents.
[00:11:43] And I need to create a different set of experiences that cuts across this, whether it's through ChatGPT as a front end or a set of agents. And they're moving very quickly on this idea that humans aren't the ones who are sitting in the interface, dealing with the data, finding out what to do and then going in to do that. And we're also seeing reaction from some of these players who are now trying to put toll gates on the APIs. You know, there's been a lot of press about things that Salesforce is doing and others.
[00:12:15] I think what's going to be interesting is, does that work or does that accelerate customers feeling like they need to be less dependent on these monolithic vendors? Right. And I think that undermines the ability, you know, if Salesforce is coming to me and continues to charge 20, 30 percent more on renewals and I don't feel like I can get out, why would I give them stewardship of my agentic strategy?
[00:12:45] So set aside whether or not Agentforce works, you know, now not only do they have that data, but now they have the agents. Are they going to act the same way in this world? And I think we're going to see a lot this year in terms of how that is shaking the snow globe of technology choices and footprints in enterprise customers. Because from what I see with some of our customers, they're not responding well to that.
[00:13:11] And they're embracing this idea of, hey, I may have a bit more best of breed on this. And there's different buy-build choices that customers are making. Yeah, I completely agree. We've kind of been here before. We've seen it with cloud, when cloud first arrived and everyone was moving their data centers out into the cloud because of incredibly cheap costs. Get everybody hooked on the technology. And then once everybody's on board, then the prices start sneaking up and becoming unaffordable. You find yourself in a tricky situation.
[00:13:40] And you also mentioned at the beginning of our conversation, you work with Red Hat. But you also work with companies such as AMD, Verizon, NVIDIA, and Okta to name but a few. So I'm curious, when you look at all this work you're doing with these big performers, what do top performers consistently do differently? And why has that insight been so hard to capture with traditional CRM systems over the years that they couldn't do then but they can do now?
[00:14:07] Yeah, I love the question because we're all trying to learn about what the best ones are doing. I think it's difficult to find a universal pattern. You know, like NVIDIA, I don't know many businesses that can look at what they do and draw parallels, right? And, you know, just look at the spectrum of those customers. Verizon's business versus AMD versus Okta, right? Having said that, I'll give it a go.
[00:14:40] You know, I think what I see is active curiosity about trying to figure out and learn what is happening in their field organizations and how to do something about that, right? And so what I mean is it ties back to, you know, in the world of SaaS, we've talked about predictable revenue, right?
[00:15:03] And that has created this idea in the ZIRP era that, like, if we just get sameness across our field, right? If they follow the same scripts, if they follow the same first call decks, right? If they follow the same objection handling, that will generate sales, right? And my thesis is that just coincided with an era where we had a lot of money and everybody was buying software, right?
[00:15:32] So what we thought made good selling may not actually be the case. Maybe it does need to be more personalized to the account, to the seller, and to some of these other things. So some of these larger organizations, they're a lot more thoughtful about how their growth differs in the different pockets of their markets, right?
[00:15:56] So whether it's a segment, whether it's a product family, you know, whether it's a geography, they're constantly looking at how is it different in all of these locations. And I think if you try to play to averages and you try to roll things up too much, that may be a symptom of average or underperforming sales and growth organizations.
[00:16:17] But the ones who are really leaning in to try to understand where do I actually get that growth and how do I see the differentiation in how my business is performing? Those are the ones who are doing better. And I think those are the ones who are leaning in more aggressively to the technologies and the ways to use AI to help them find those answers, to take what's working and bring it to other parts of the sales organization, and to stop doing the things that are not working.
[00:16:45] And I think when it comes to talking around CRMs, I think we could all agree that CRM fatigue is very real. Many sales leaders say they want answers, not just another dashboard. We've all seen enough of them throughout our careers. So how does that shift change with what a sales platform needs to deliver in the next two years? Because another dashboard is not going to cut it, is it? It's absolutely not.
[00:17:12] And I think with the idea of adding more complexity in these systems, the realization that organizations are having is like, I can add more fields and I can add more workflows and I could add more dashboards. But that's more of the same and it's not really changing anything. Yeah.
[00:17:32] I think it comes back to opening this up and being simpler but having more focus and follow through on what it is that's really important and what moves the needle and doing that well. And getting people back out with customers, right? So we've asked them to spend a lot of time in technology, which has taken them away from customers. And there's more that they can do.
[00:18:01] They should be out, right? It's just not natural for us to spend a bunch of time in a spreadsheet or a form poking around with data. Those are the things that AI should do. From an architecture and landscape perspective, I think this is going to open up to say this idea of tech consolidation hasn't really served us well, right?
[00:18:25] So, you know, we put a lot in the hands of businesses that are charging, you know, big margin on top of that. And because innovation is happening so fast in so many different places, you know, maybe we do need architectures that are a bit closer to the best-of-breed end of the spectrum and have people that specialize in doing some things very well. And then think about who you want to use to aggregate those answers and act on them, both for agents and for humans, right?
[00:18:54] And there may be different choices about the human side of that experience, right? Some people may want ChatGPT. Some people may need the structure of a form that shows them their opportunity objects, right? Some people may need things that are more simplified. That part, I think, is going to play itself out over the next couple of years.
[00:19:12] But I do think the landscape is going to become more open and interoperable, and has to embrace people specializing and doing certain things very well, because the pace of innovation is just too fast on the things that are out there. I've partnered with Alcor, and if expanding engineering operations beyond your home market feels overwhelming, you're not alone.
[00:19:36] Because if you've ever wrestled with local laws, slow response times and partners who treat each country as separate rather than part of a wider strategy, you might want to check out Alcor. They approach expansion completely differently. They specialize in building tech teams across Eastern Europe and Latin America, and they combine employer of record services with recruiting. So you get one singular coordinated process.
[00:20:03] They help you choose the right jurisdiction based on your needs, run proper evaluation of candidates and onboard teams quickly. And their model is also refreshingly transparent. Most of your contribution will go straight to your engineers and their fee shrinks as your team grows. And there is no cost to exit if you move the team in-house at a later date.
[00:20:26] And I think that kind of clarity is why so many high growth companies in Silicon Valley are working with them right now. So you can find out more details at alcor.com slash podcast or simply use the link in the show notes. Scrolling through any of our news feeds and listening to some of the conversations at tech conferences, people are increasingly talking about autonomous AI agents in sales. But autonomy does require context and trust.
[00:20:55] If you don't get that right, then it can cause more harm than good. So what does it actually take to give AI enough structured understanding to act without creating risk? I would imagine it can be quite a balance. I think this gets back to what we were just saying about the specialization because it turns out it's a hard problem, right? Yeah. We've spent 10 years on this. We've processed billions of dollars in transactions. And building an AI specifically to do this is difficult, right?
[00:21:24] Like how do you think about a given email at a large organization, right? You know, let's say Microsoft selling to Verizon, right? There might be five, six, seven different topics in an email, say, to somebody in procurement or to a CFO. You know, how do you parse that and say this is associated with that opportunity or this doesn't have anything to do with that? This is signal, this is noise.
[00:21:51] How do we put that in the context of all of the people who are involved in a buying and selling process, you know, when you get into larger enterprise? It's a difficult problem. And that's why we want to focus on this. You know, then you get into what does it mean for an answer to be actionable, right? To your point, like what's good enough, right?
[00:22:13] There are answers that are effectively useless because they're just inaccurate, but you don't need absolute precision, especially when you're talking about qualitative answers, right? So one thing we distinguish when we talk to customers is they'll say, listen, the answer needs to be perfect. And so yes, like when it comes to numbers, like if you want to know, you know, what was the total dollars on this deal, right?
[00:22:43] Or how long did it take to close this deal? You're asking for a specific number. That should be 100% answer, right? But questions like what are the blockers to this deal or what's the mindset of this key decision maker? That's a more nuanced answer, right? So how do you decide what is perfect, right? You can try to say what your confidence is in that answer, which again is a hard question, right?
[00:23:09] But shaping an answer that is actionable and good enough, to your point, that takes a lot of work, right? And it takes looking at a lot of data. And so that's why we specialize in that piece. And we understand that our job is to provide those answers to humans and agents where they're needed, right? So we're not trying to build a full stack and ask you to do everything on a platform.
[00:23:31] We believe that in this world that I'm describing, there needs to be an open architecture because different agents, different systems, different humans are going to need access to those answers. And we both mentioned that we expect price models to change as this technology evolves. And I think it's also safe to predict that value will increasingly be tied to token consumption rather than those traditional seats or licenses. So what does this mean for buyers?
[00:24:00] And where could it catch enterprises off guard, if not this year, next year? Yeah. You know, what's wild is I was talking to somebody who was telling me this story, you know, and I've heard a couple of versions of this. They're still anecdotal, but it's interesting to me. Where, you know, they were selling a solution to a customer who came to them and said, hey, look, you know, we've got more OpenAI tokens than we're going to use.
[00:24:26] So we'd like to use that here as essentially a barter, to say, we're not going to pay you cash, but I want to give you tokens because I have more than I am going to use. Yeah. And that was an interesting trigger for me, because in a certain sense, it is becoming a currency where we all understand the value, because both vendors and customers are using these tokens and the LLMs are driving the sense of understanding of what a token means. Right.
[00:24:55] Now, how this actually shows up, I don't know. Right. But it's interesting that what is an esoteric unit of work has now kind of become standardized, even if we don't really understand it. You know, it's sort of similar to the dollar in the sense of I don't understand really why a dollar is worth a dollar, but I understand the worth of a dollar. And so how does that start to show up in commercial transactions?
[00:25:22] My suspicion is we will move to more usage-based pricing, right, where, you know, vendors like us will say, you know, here's our version of a token. Here's what we do to enhance the ones that we use from the LLMs. Here's how we're better at this. But when you ask for an answer, here's what it's worth in our version of tokens. Right. Yeah. I think to your question of, you know, what does that mean for buyers? You know, we have a little bit of an AWS issue again. Right.
[00:25:50] So when AWS came out and they did their spot pricing, essentially their version of consumption pricing, they made it so easy to spin up these environments. You know, seven, eight years ago, I remember hearing from a lot of customers who would say, our AWS costs were running away because these devs are spinning up environments and they're doing a bunch of stuff and we have no idea, you know, what's happening and where. So there is going to need to be some governance around this.
[00:26:18] There's going to need to be some controls of who's consuming what. And it's probably most important for customers to get a handle on how they govern the usage of these LLMs. Also, I think they're going to start charging more because they have to, right? To your point of, like, get them hooked, they've been subsidizing it for a period of time. There's a lot of money that they spent that they've got to make back. Right.
[00:26:41] So the cost of this is going to become a more material element of evaluating your AI solutions and strategies. So all of this, you know, taken together says it feels like we're going to be in a landscape of thinking about tokens as a unit of work, as a unit of value and as an economic unit for the commercial relationships.
[00:27:07] And if we look further ahead, is there an assumption that sales leaders still cling to about AI that you think will quietly disappear over the next few years? Is there anything that springs to mind now? Anything you're going to see disappear? Whether it be just you hoping it will disappear, or you actually think it will. I think there's a question of how much automation is going to solve. Yeah. You know, that's the first one.
[00:27:36] I think humans in the selling process and the specifics of a given deal aren't going to change. And it's important to get more personalized, not less. Right. So I think trying to take the same ideas from predictable revenue, of let me enforce this sameness, right, and use AI to do that as we generate proposals or whatever else. That's what some say we should be using AI for. I would argue the opposite.
[00:28:06] And I think that that's what will emerge: the more sameness we have in the sales experience, the more you lose your differentiation. And it's hard for a customer, even if a couple of words are different, to understand how you're different from, you know, vendors two, three, four and five, if it's all sort of the same feel, the same motion, the same content, whatever else. So I think that will start to fade as the place for value. What will emerge is more uniqueness instead of sameness. Right.
[00:28:36] So how do I really personalize that moment using AI going deeper into the context of what's happening at this point in time in the market with the account, with these particular buyers, with the seller to know what exactly should happen in this moment. That's where I think AI is going to be super powerful. And that's a different pivot from where we're headed right now, which is try to automate and create the sameness everywhere.
[00:29:02] And I always try and give my guests an opportunity to step on my virtual soapbox and finally lay to rest any myths or misconceptions that they see online. But I've got to ask you, what is it that you see people misunderstanding most about your industry? Are there any myths or misconceptions about your job, field of expertise or industry that we can lay to rest? If you could choose something, what would it be?
[00:29:31] So this is a personal belief. And I kind of hope I'm right as somebody who's been in the world of sales for a long period of time. But I don't believe sales is programmable. Right. And I've heard this from different viewpoints that say, look, at some point it's just going to be, you know, an AI bot that does the selling process. I just don't believe that. It's fundamentally a human activity.
[00:30:00] We are so good at working with incomplete information, reading patterns. It's still relationship-based when you're doing true selling. Right. When you're trying to persuade and build trust and get somebody to see what you think is valuable to them, that's a human activity, and it's just too hard to see AI getting to that space. Now, processing orders, right.
[00:30:28] Once you've decided to buy, that's a different thing. Right. That can be automated. So understand the distinction between selling and order taking. I think we kind of conflate the two, and we think that because it's clear how we can automate order taking, we can do the same for selling. I think there's an art and a craft to it that's very human. And technology is just not going to go all the way to that last mile. And I think that is a powerful moment to end on today.
[00:30:58] But for anyone listening wanting to find out more information, connect with you or your team, find out more about announcements at People AI and everything that you're doing. Where would you like to point everyone? Yeah. Just come straight to our domain. www.people.ai. You can find us on LinkedIn. I'm pretty active there as well. So if you want to hear more of the viewpoints, come find me there and follow me. And, you know, give us a call if you have questions. Awesome.
[00:31:24] Well, I will add links to everything that you mentioned there. I love what you're doing here. There's so much talk at the moment around AI, and there's a lot of hype and failed projects. But what I love more than anything about what you're doing is you've built it not on the technology but around this one mission: to give revenue leaders more than raw data. Give them the answers that they need to take action and make a measurable impact and difference there. So keep doing what you're doing.
[00:31:52] I'd love to stay in touch with you, get you back on later in the year, see how this space is evolving. And I think it will move incredibly fast, even more so than last year. So it'd be great to get you back on. But thank you for sharing your story today. Thanks again. I'd love to do it. Thanks, Neil. Thanks for having me. Yeah, I appreciate the opportunity. I think there was a real moment in this conversation where Jason makes something very clear. Sales is not programmable. And that idea alone cuts through a lot of noise.
[00:32:21] Because yes, AI can process vast amounts of activity, spot patterns humans would miss and surface insights at speed. But persuasion, trust and judgment, these things still sit firmly in human hands. And what changes is where people spend their time. Less manual reporting, less internal validation, more time acting on what matters in the field. And we also talked about CRM fatigue, why another dashboard is rarely the answer.
[00:32:48] And how open architectures and agent-driven workflows, these things are starting to reshape enterprise software choices. And when you throw in token-based pricing models and usage governance instead of licenses, I think it's clear that buyers will need a much sharper understanding of how AI value is measured in the years ahead. But over to you. In a world full of data, dashboards and agents, what would it take for your sales team to finally get the answers that they need,
[00:33:16] rather than just more noise and fatigue? Let me know. So, techtalksnetwork.com, socials just at Neil C. Hughes, wherever you hang out, you'll find me there. But that's it for today. So, let me know your thoughts on this one. Big thank you to my guest today and an even bigger thank you to each and every one of you for tuning in. I'll return again into your podcast feed bright and early tomorrow. Speak with you all then. Bye for now.

