Tungsten Automation: Why AI ROI Starts With Boring AI And Real Workflows
Tech Talks Daily · February 14, 2026
27:19 · 25 MB

What happens when the noise around AI starts to drown out the actual business value it is meant to deliver?

In this episode of Tech Talks Daily, I sat down with Adam Field, Chief AI and Product Officer at Tungsten Automation, fresh from the conversations unfolding at Davos.

While headlines continue to celebrate agentic AI and sweeping automation claims, Adam offered a grounded perspective shaped by decades of experience turning AI pilots into measurable, ROI-driven deployments. His view is simple. The hype cycle may be accelerating, but many organizations still struggle with the fundamentals.

Adam described a common boardroom dynamic. "What do we want? AI. What do we want it to do? We're not sure." That pressure to move fast often collides with a deeper reality. Software has shifted from deterministic to probabilistic. Leaders who grew up expecting the same inputs to always produce the same outputs now face systems that behave differently by design. Measuring value in that environment requires a different mindset.

One of the most compelling ideas in our conversation was Adam's concept of "boring AI." While splashy announcements about replacing hundreds of employees grab attention, he argues that real returns often come from quieter use cases. At Tungsten Automation, that means intelligent document processing, extracting trusted, AI-ready data from the 80 percent of enterprise information that is unstructured. Contracts, invoices, transcripts, compliance paperwork. The work may not trend on social media, but it saves time, improves accuracy, and fits directly into daily workflows.

We also explored accountability. AI can compress output, but it concentrates responsibility. When generative tools make architectural or compliance decisions, the liability does not shift to the model. Organizations remain accountable for privacy, ethics, and customer trust. Adam shared his own experience rebuilding a legacy application in days using AI code generation, only to discover licensing and compliance nuances that required human judgment. The lesson was clear. AI amplifies capability, yet human oversight remains essential.

For leaders searching for signals that an AI strategy will actually deliver long-term returns, Adam pointed to two patterns from the small percentage of projects that succeed. First, integration into daily workflows drives adoption. Second, partnering with trusted vendors often reduces risk compared to attempting everything in-house. In a world flooded with open-source experiments and "X is dead" headlines, discipline and focus still matter.

Tungsten Automation, previously known as Kofax, has spent four decades evolving alongside automation technologies. Today, the company applies large language models and agentic workflows to transform unstructured data into decision-ready insights across finance, logistics, banking, and insurance. It is a reminder that the future of AI may be less about replacing people and more about removing friction so humans can do the work they were actually hired to do.

So as AI investment continues to grow and pressure for returns intensifies, the question becomes harder to ignore. Are we chasing the headlines, or are we building systems that quietly deliver value where it counts?

[00:00:03] Is the AI conversation starting to feel louder, but not necessarily clearer? Well, coming out of Davos in January, the buzz around AI was predictably everywhere still. But so was the frustration around stalled pilots and fuzzy returns. And today's guest has seen enough hype cycles in their career to know when it's time for a reset of sorts.

[00:00:28] His name is Adam Field. He's the Chief AI and Product Officer at Tungsten Automation. And he's spent decades taking AI out of the lab and into production, and knows all about where it actually earns its keep. So we'll talk about why a back to basics mindset is long overdue. What real AI ROI looks like when the noise dies down.

[00:00:53] And why many organisations are still over-complicating problems that just need clearer thinking. Not bigger models. So if you want a perspective from someone who has been quietly delivering results, while the rest of the market keeps arguing about potential, then this conversation is for you. And on that note, I will officially introduce you to Adam now. So thank you for joining me on the podcast today.

[00:01:21] Can you tell everyone listening a little about who you are and what you do? Sure, Neil. Thanks for having me. My name's Adam Field, and I'm the Chief AI Officer at Tungsten Automation. Tungsten Automation is a worldwide leader in what's known as Intelligent Document Processing, which is document intelligence software that helps organisations make sense of a whole bunch of information inside of their organisation that's really difficult to make sense of without some advanced technology.

[00:01:51] Awesome. Well, thank you so much for sitting down with me today. We've got a lot to get through. I mean, last year there was that famous stat around AI, and I think it was 90 or 95% of AI projects were failing or were failing to provide a return on investment. Here we are in 2026. In January in Davos, it was made clear that there is mounting pressure to start turning those large AI investments into financial returns.

[00:02:16] The big focus this year is all on ROI as well as launching AI agents everywhere. But from your vantage point, why are so many organisations still struggling to convert that AI spend into a measurable business outcome? Yeah, I think first of all, the hype train's not necessarily helping, right? You know, we hear a lot in boardrooms, and I'm obviously paraphrasing, but it's kind of one of those things where, what do we want? AI.

[00:02:46] What do we want it to do? We're not sure. When do we want it? Now! Right? And so, like, the hype is kind of getting ahead of the understanding of what's possible. And if I were to boil it down, I think it's this: software in our careers has moved from being very deterministic to now rather probabilistic. And that's really difficult when you've grown up in a world where even the great technologies like these machine learning models,

[00:03:12] while they were built on probabilistic capabilities, at the end of the day, the same inputs with the same test data would bring back the same output. So, regulators began to appreciate it and understand it. Customers were accepting of it. These technologies work extremely differently. So I think it can be quite difficult to pinpoint how you should use it, what use cases you should apply it to, and how you should measure the returns, frankly.

[00:03:41] And I sometimes think I'm guilty of living in a bubble because, what, we're recording this in February, and I've been to two tech conferences in different parts of the world, and it's all AI, it's all agentic AI, predictably. But recent CEO research shows a different story: confidence in near-term growth is at a multi-year low, even as AI budgets rise. So I'm curious, from what you're seeing here, all the conversations with customers, etc.,

[00:04:07] what disconnect are you seeing between those executive expectations and how AI is actually being deployed inside enterprises? Yeah, I think the first thing, Neil, and I didn't make this up, it's been said about a lot of technology revolutions, is in the short term we tend to overestimate the capabilities and in the long term we underestimate them. So I think in the short term, again, going back to hype, it's like we want to do it fast and we want to do it now.

[00:04:33] And I think what people are realizing, and I've come to the realization, is large language models and Gen AI are much more of a, what I often refer to as a technological art than a pure science. Understanding how to apply it, how to get consistent behavior out of these things that are just relatively inconsistent by design and by nature. And then I think the other thing, I've been in the software automation space for over 20 years, and all we've really focused on to date is efficiency.

[00:05:02] It's all about KPIs wrapped around, how do we do it faster? How do we do more with less? Sadly, sometimes, how do we replace people with software? What we're not talking enough about is proficiency. So, like, I think these new capabilities, and I'll give you a personal example. I have a computer science background, but I have not written a line of code in 20 years that anyone would want to deploy to production. But these new tools have made me code again.

[00:05:29] It's been fun, and I was able to create some of Tungsten's first prototypes myself while I was in meetings and doing other things. And it made me more proficient in my job. How do you measure that? How do you put a measurement around how much more proficient I am and how much more I like my job? And so sometimes I think these things can be difficult to measure. And I'm a firm believer, even if you are in management and you move away and you lose your technical abilities, it's always in you. Has it unlocked something in you again? Has it unleashed something?

[00:05:59] Are your developers a little worried now that you're back in the game? Well, I don't know if they should be worried, but I'll tell you a story. I downloaded one of the AI code generation tools, and I said I wanted to play again. So I took an old app that I had been writing in many decades-old technology for a really long time, and I recreated it in three days. Oh, wow. And I added new functionality that now I've deployed to my little server in my home, to my wife's chagrin. You know, I keep telling her about it, and she rolls her eyes. But I got really excited.

[00:06:29] I can't believe I recreated this. But it wasn't without its own pitfalls. You know, I noticed a couple of things where I had the background to question why it did certain things. And it said, oh, yeah, you're right, I shouldn't have done it that way, and it fixed it. Or there was one instance where it recommended a certain library that it downloaded, which was perfectly legal for me to use in development. But if I had taken that to production, I would have been out of compliance, and I had no idea until I did a quick Google and realized that. So, you know, it's interesting.

[00:06:59] With obviously great power comes great responsibility, as we all know. So while I'm really excited again, it gets me thinking about, you know, what the future of the makeup of these teams look like. Yeah, and I just love the passion and excitement that clearly comes from you there, bringing this back to life. It's phenomenal. You have warned, though, that the AI hype cycle does need a reset back to some of the fundamentals, some of the belts and braces side of IT.

[00:07:26] So for people listening, what does that back to basics really mean for leaders who still feel that pressure to move faster, deliver faster? Yeah, I think what everyone's looking for is the above-the-fold headline, right? We see a lot of them, you know, such-and-such company, I won't name names, has used AI agents to replace hundreds of call center reps. And these are the things that make headlines, and everyone retweets it.

[00:07:50] But if you follow up months later, quite often they backpedal, you know, just ever so slightly, and maybe some of the results weren't what they said, but they got the headlines and good on them. But I've been talking a lot about, quote-unquote, boring AI. I was giving a presentation, and I was trying to think how to be somewhat provocative, and I named my presentation around boring AI.

[00:08:10] And what I meant was, I mean, we're a very exciting company, and we're doing very exciting things, but the stuff that we've seen get the return on investment are use cases that maybe no one's going to write a big article about. It might not make the headlines, but in our world it's document intelligence.

[00:08:25] You know, it's banks and insurance companies and logistics companies not having to have human beings look through piles of forms and getting accurate information and getting insights into a whole bunch of data inside of their organization in the workflows that their people are doing every single day, making them better at their jobs, saving them time. So, yeah, you might consider that a little boring and unsexy, but that's the stuff that's working really well right now. Love it.

[00:08:53] And there is a growing narrative that software development is becoming obsolete because AI can now write code, and every developer hearing me say that out loud is shaking their fist in the air, and I completely understand and agree with them on the frustrations around narratives like this. But based on your own hands-on experience, and you provided a great example a few moments ago, why do you see that belief as risky rather than reassuring? Just for any non-technical leader that might be reading headlines like that and thinking, oh, we don't need developers anymore.

[00:09:24] Yeah. Well, lest digital archaeologists dig this up in 25 years, you know, I don't want to sound like I'm a dinosaur or nothing. Of course it's going to change, right? The role of the developer, what the developer does is going to change. But here's what concerns me is I have a son in college right now. He's in his first year, and he talks to me all the time about what he should focus on and what his future looks like.

[00:09:48] And I think while development teams before, you know, might have been large, they're obviously going to be much smaller. But I worry that not enough people are going to focus on the full stack. You know, so I think instead of specialists that just do UI really well or just code, you know, certain things really well or just understand security, you're going to need these full stack experts that know how to question the AI, that know the design patterns, that understand the implications of certain decisions.

[00:10:17] Those people, I think, are going to be more valuable than ever. Now, it may just be them and one or two other people on a project as opposed to 25 people. So it will certainly change. But let's, you know, not forget that computers are only as good as what we continue to train them to do. And if there are not enough people that understand how it works, well, we won't progress. And you gave a great example a few moments ago of how you brought an old legacy project back to life.

[00:10:44] You turned it around in three days, but there were a few oversights in there. Fortunately, you've got the technical know-how, insights and experience to know that, and a combination of your skills and AI skills to get it over the line. But not everybody has that.

[00:10:58] So when AI-generated code begins to make architectural or compliance decisions that might also look fine on the surface, the question I've got to ask is who should ultimately be responsible and accountable for questioning some of those assumptions and catching those second-order risks? Sure. Well, obviously, these AIs are going to get a lot better. Like, we know that. However, to directly answer your question, we are. I mean, the companies responsible for putting it out are. A regulator doesn't care who wrote the code.

[00:11:28] If you expose, you know, someone's personally identifiable information accidentally, then you're going to be held liable. Your customers vote with their wallet, right? So, I mean, we're responsible for making sure that what we're putting out is usable, is safe and secure, is ethical, provides the great customer experience that it's meant to create. So, ultimately, at the end of the day, of course, it's us.

[00:11:55] Before you came on the podcast, I was doing a little research on you, and you used a line that I absolutely loved. It was, AI compresses output, but concentrates responsibility. So, how does that change the profile of engineers and consultants that enterprises will maybe depend on over the next few months and even years? Yeah. I come back to, I think the future is people that are well-rounded. I can learn so many things so quickly.

[00:12:21] So, just focusing on a certain niche that could get disrupted when you wake up in the morning and look at your X feed and realize there's something new that's just been deployed, and suddenly you look and say, all of my 10,000 hours that I just spent becoming the expert in this has now been rendered obsolete. I think you have to have multiple skills. I, you know, I ran product management for a long time.

[00:12:43] I run Tungsten's AI office now, and I can't imagine in just a few years hiring a product manager who just knows technology. I think they've got to understand product marketing. They've got to understand go-to-market and the business side of it. They have to understand really good product design and, you know, even really want to dive into the user experience. Those used to be very discrete degrees you'd get in school. They used to be very discrete job descriptions. I don't think that's going to exist in the very near future.

[00:13:13] You're going to have to bring all those skills to the table and then utilize the technologies to learn more and, you know, help automate some of that. And I also love how you described boring AI, the unsexy work, where the real value gets created. So what would you say separates that dependable production-ready AI from the more glamorous experiments that dominate the headlines? Yeah.

[00:13:39] You know, what was funny was when I was writing some of this, I went back in the Internet Wayback Machine to about the 2010 era, and I found some articles and headlines about these RPA deployments. So for the listeners, robotic process automation was hot for a very long time, and it still plays a very important role. And that was all about automating clicks and research and, you know, augmenting people.

[00:14:04] I did a presentation and I put these articles on the screen and I redacted some of the technologies, like the words RPA or agent, and they were indistinguishable from articles and press releases that were written five minutes before. Right? It looked like we're going to augment humans and they're going to make decisions and all of this stuff.

[00:14:23] I kind of want to go back to the problems that RPA was meant to solve, just doing it the right way with these, you know, wonderfully, amazingly capable agentic technologies and just focusing on those things that you never hired human beings to do in the first place. You didn't hire empathetic, smart people to fight with software. You hired them to answer the phones and talk to your customers if they were in a really bad situation.

[00:14:47] You hired them to be creative and smart and innovative, and the technology over time got in the way. So just going back to those core things when people raise their hand and say, I don't want to do this. I was never hired to do this. Tech didn't help. Tech got in the way. Let's get it out of the way and let people do what they're meant to do. And that's what I mean when I say boring AI. Those are sometimes just some of the things that machines are really good at. Let them do it, and let the humans do the rest. I love that. I'd love to see a copy of that presentation.

[00:15:15] All the words redacted there, and it says the same thing. Yeah, sure. And of course, we're already racing through 2026 and the conversation continues to shift from what's possible to what works. So on that side of things, for leaders looking for takeaways from the conversation today, what signals should they, and investors, be watching for to identify AI strategies that will really start delivering those long-term returns on those expensive tech projects? Sure.

[00:15:45] Great. You referenced earlier the MIT study where only 5% of these projects achieve the stated ROI. So when I read the report, I tried to turn it on its head and look past the headline that 95% fail.

[00:16:01] I tried to look at the 5% that succeeded to find some patterns and I used my AI tools and said, find some patterns in these studies and things like that. And what was really interesting was there's a couple of just very core basic things. One was you have to integrate it into your workers' daily workflows.

[00:16:19] Whenever we're bombarded with a new tool, if it doesn't just meet how I live every single day, if it's not in the apps that I'm currently using, if you just want to staple some new thing off to the side and it's not seamless, it tends to not get the adoption and people don't like it and they're not going to use it. Therefore, you're not going to get the return.

[00:16:38] One of the other things that bubbled up to the surface was partnering with the right tech vendors over DIY for everything. So there's this belief out there that I can just make everything work with AI. I can just vibe code everything. And we see it at the largest organizations in the world. And frankly, that's what we compete against a lot now. I can just vibe code it. Why do I need you?

[00:16:58] And I think when reality hits, they realize that working with a trusted partner brings the intangibles of trust into play: when you have a partner that focuses on something, that shifts the risk away from you as an organization. Like, I'll give you an example. We sell software for accounts payable and accounts receivable. It's invoice automation. Why would a big bank or an insurance company or an airline or anyone want to have experts on staff who know how to do compliant invoicing?

[00:17:28] In 150 countries around the world. Leave that stuff to Tungsten, right? And then we use these powerful AI capabilities to allow you to do it quickly and compliantly, and you focus on coding those things that are going to differentiate you from your competitors. So you're doing DIY when it makes sense, but partnering with the right vendors when it makes sense. Those are part of the 5% that were really successful.

[00:17:53] Love that. And something else I always try and do with my guests is now and again, I bring out a virtual soapbox. So when you are scrolling down your X feed, Reddit, LinkedIn, all the usual suspects, you probably see a few myths and misconceptions that frustrate you. You probably end up venting to your wife again while she rolls her eyes at you, or something there triggers you. But are there any myths or misconceptions, or things that people just generally misunderstand about your industry or your work or technology today? What would they be? Let's see if we can lay them to rest today.

[00:18:24] I think the biggest one. I wake up every morning, I look at my X feed, and it's "insert something here is dead." You know, those are the ones that get all the clicks, and, look, directionally they're probably correct in some way, but "SaaS is dead" is the latest one, right?

[00:18:41] It's been going for a few months. Are we really going to, like, vibe code the next ERP tomorrow? I just don't think so. Now, again, I won't call out names of companies, but I do have a colleague that vibe coded a productivity tool and canceled a subscription, because there was no intellectual property there. There was no transfer of risk. So to some degree these articles aren't wrong, but a lot of the time it's clickbait, and "the blank is dead" just drives me bananas.

[00:19:12] I love it. If there was a Room 101, I would put that in there right now, trust me. But obviously, everyone's been going crazy around AI for three years now, but you've been around AI and automation for a lot longer. And I think the Tungsten Automation website states automation has been in your DNA for 40-plus years. So for anyone listening and hearing about you guys for the first time, would you like to tell everyone about how you're helping people and the technology you're using?

[00:19:41] Yeah, I'm happy to, Neil. And thank you for giving me the opportunity to do that. If anyone's hearing about Tungsten Automation for the first time, you may have known us for 38 of our 40 years as Kofax. We've been doing this, we invented the intelligent document processing space, and we rebranded a few years ago. So we help organizations take all of their unstructured information. Think about a document like a tax form: that's highly structured, that's easy.

[00:20:09] Unstructured things like a thousand page contract that has paragraphs and pictures. These are things like transcripts from phone calls inside of your contact center. These are social media posts. These are news stories. Over 80% of information inside of organizations right now is considered to be unstructured.

[00:20:28] We take all of that. We run it through our system. We extract it and we spit out trusted AI ready data for agentic workflows. We orchestrate them with powerful AI agents. We bring humans in the loop when necessary. We automate what can be automated. And it's amazing now these new use cases that we can begin tackling with these large language models and generative AI that weren't possible before.

[00:20:54] Like finding trends in unstructured data. It's been possible for a few decades to find trends in structured information, but being able to find trends in words that live all over your organization is really exciting. And we've evolved along with the technology in the industry. And I think we're going to go for another 40 years, hopefully.

[00:21:15] I was going to say, if you look at your clients from five years ago, before the whole AI thing hit, and the kind of thing that they were coming and asking for your help with, compared to now and what they're looking to do this year and beyond, what are the big changes? Are there any trends in the things that they're coming to you and saying, hey, can you help us with this? A hundred, a hundred percent. So, you know, going back, what it was prior was, we've got some documents.

[00:21:40] We'll give them to Tungsten, Tungsten gives us the information off of them, and then we'll go process it as we see fit in our processes, maybe through Tungsten, maybe through another tool that they already had in place, or feed some downstream system. Now it's about these data insights. They're realizing that if over 80% of the data, and growing, is unstructured and we can't make sense of it,

[00:22:03] how do we understand our customers better? How do we do work more effectively? How do we increase the decision velocity? And some of it's regulatory driven, you know, KYC processes. You sometimes have to look through mountains of paperwork, things that were not easy for a machine to figure out until just a few years ago.

[00:22:28] And now there are even these new technologies that, you know, hopefully you've seen some of from Tungsten, and we're continuing to innovate on it, which help pull relationships together that the human mind just naturally figures out. But machines for a long time struggled to, because it was words. And now you can create these relationships amongst words, and it's almost mind-bending,

[00:22:52] but it's allowing these organizations to do things and make decisions on information that they would have either had to hire an army of people to do before, or quite frankly, just wasn't possible for them to do before. Yeah. And is it overwhelming to many leaders out there? Because I think there will be people listening that might be reassured to hear, okay, we're not on our own in being quite slow getting out there and making these changes. It can feel so overwhelming, the pace of it all, but is that common too?

[00:23:20] Neil, let's be honest. I want to be honest with all the listeners. I wake up every morning and it's overwhelming to me, and this is what I do for a living. Like, let's not lie to anybody. Again, I go to my X feed, and every morning I think my business has been completely disrupted by some new open source project that someone put on GitHub and they're all touting as the next thing. And then I step back, and I take a deep breath for a minute, and I realize that, you know, these really huge businesses, we sell to the Fortune 1000 companies in the world.

[00:23:49] They don't just pivot overnight. But I understand how, if you're sitting in a boardroom of one of those companies, it can be quite overwhelming to figure out what do I want to do next and how do I apply this technology. And that's where I think 40 years of trust with Tungsten comes in. We've done this, and we're experts in many different industries in helping them solve the problems that we know they need to solve.

[00:24:15] We try to cut through the hype and not, you know, overpromise what this technology is capable of doing, but show them real world examples and real outcomes that their peers have gotten. And, you know, I think that's why organizations stay with us, and we've got customers that have been with us for the 40 years of our history. And that's what keeps me going. I love that. And I think that's a powerful moment to end on. But before I do let you go, we've covered a lot of information there today.

[00:24:43] There might be people listening who want to carry on the conversation we started today or just keep up to speed with some of the developments coming out of Tungsten and the kind of thing that you share online too. Where's the best place to find more information? Where would you like to send everyone? Sure. Well, of course, you can go to tungstenautomation.com. And then we would love for anybody who's interested, we do the Tungsten Summits all around the world. I think we're going to 12 or 15 cities all around the world this year, starting next month, and I'll be at most of them.

[00:25:11] So you can find us there, and you can find the list on our website. Of course, personally, I do most of my stuff on LinkedIn. So search Adam Field, Chief AI Officer at Tungsten, and you can find me on LinkedIn. Excellent. Well, I'll add links to everything that you said there. And thank you for coming on here today, not just sharing your story, but also some of the insights and discussions coming out of Davos last month and why the AI hype cycle could use a back to basics reset of sorts.

[00:25:40] But this is not just you guys jumping on the bandwagon. As I said, you've been in this space for 40 years as a company, and you yourself have spent decades turning AI pilots into real ROI-driving deployments. So getting those insights and that expertise out of your head and into a podcast today is phenomenal. So thank you for sitting down with me. It's my pleasure, Neil, and thank you for having me on. And thank you to all the listeners for listening in.

[00:26:05] I think one of the things that stood out for me there was Adam's calm insistence that AI progress does not come from chasing the next headline or chasing rainbows. It comes from discipline, focus, solving problems that people actually have. And in a moment where ambition is high and expectations are even higher, I think his experience offers a useful reality check here.

[00:26:28] And whether you yourself are leading an AI strategy, funding pilots or simply trying to separate signal from noise, I think there was plenty here to reflect on. But that's just my takeaway. I'd love to hear what resonated with you most from this conversation today with Adam. Are we finally ready for a more grown-up AI conversation? Or are we still distracted by the shiny next big thing? Let me know your thoughts as always. TechTalksNetwork.com. Love to hear your thoughts on this one.

[00:26:58] But that is it for today. Time for me to get out of here now. I'll be back again in your podcast feeds tomorrow. And hopefully, I'll see you there. Speak to you soon.