What happens when companies rush into AI without fixing the fundamentals that actually make data useful?
In this episode of Business Tech Perspectives, I sit down with Rajan Sethuraman, CEO of LatentView Analytics, a global data engineering and analytics company that helps businesses harness the power of data to excel in the digital world. We dig into their refreshingly pragmatic approach to AI adoption, one that many organizations are overlooking.

Rajan brings a rare blend of leadership experience to the table. Before becoming CEO, he spent more than two decades at Accenture, including a role leading talent and people strategy. He later joined LatentView, eventually guiding the company through its IPO and its first acquisition while expanding its work with more than 50 Fortune 500 clients.
Our conversation begins with an idea Rajan describes as AI minimalism. At a time when many executives feel pressure to experiment with every new generative AI capability, Rajan argues that the real challenge is not adopting more technology. Instead, organizations need to simplify their data ecosystems and create trusted foundations before scaling AI initiatives. Without that clarity, companies often end up with multiple data pipelines, conflicting metrics, and competing versions of the truth.
We also discuss the hidden friction in many AI projects. Rajan explains that technology is rarely the real barrier. Culture and clarity often determine whether a transformation succeeds. If organizations cannot agree on how key metrics are defined or where their source of truth lives, even the most advanced AI models will struggle to deliver meaningful results.
Rajan also shares what he has learned working with global enterprises across industries such as financial services, retail, healthcare, and technology. From governance and data lineage to embedding analytics into everyday decision making, he outlines the patterns that separate organizations that claim to be data-driven from those that actually operate that way.
One of the most valuable moments in the conversation comes when Rajan offers practical advice for CEOs under pressure to accelerate AI adoption. His recommendation is surprisingly simple. Start by defining the metrics that matter most to the business. Then work backwards from those metrics to identify the data, systems, and decision processes that influence them. Only after that foundation exists should organizations decide which AI capabilities to deploy.
We also explore how LatentView is helping enterprises apply emerging technologies such as generative and agentic AI to improve efficiency, effectiveness, and speed across business operations. Rajan explains why partnerships, experimentation, and ecosystem collaboration are becoming essential as the AI landscape evolves.
If you are trying to cut through the noise surrounding AI and focus on what actually drives measurable outcomes, this episode offers a thoughtful and practical perspective. Are organizations moving too fast in the race to adopt AI, and could a simpler, more disciplined approach actually create stronger results?
Useful Links
Learn More About LatentView Analytics
[00:00:01] What does it take for a business to become AI-driven and do it in a way that really works? Well, there's a lot of noise in the market right now. Every company seems to be chasing the next AI announcement, the next tool or the next promise of instant transformation. But my guest today brings a far more measured and valuable perspective.
[00:00:24] He's the CEO of a company called LatentView, and he argues that real progress does not come from piling on more technology, but actually from clarity, culture and making sure the data foundations are strong enough to support better decision-making. So today we're going to learn more about why AI minimalism could be exactly what many enterprises need right now,
[00:00:49] and why leaders must focus less on hype and more on the metrics, behaviours and decision-making habits that actually move a business forward. So how do you build AI capability that creates real impact without adding even more complexity? Well, hopefully these are just a few of the things that my guest can answer and share his insights with. So, enough from me. Let me introduce you to him now.
[00:01:19] So, a massive warm welcome to the show. Can you tell everyone listening a little about who you are and what you do? So, Neil, my name is Rajan Sethuraman. I'm the CEO of LatentView Analytics. LatentView Analytics is a data analytics company, and our mission is to help the world's leading businesses use data analytics to transform and survive in a digital economy, in a digital world.
[00:01:46] We do a lot of work in the technology space, in consumer and retail goods, in industrials, and in banking and financial services. Personally, I've been with LatentView for the last 10 years. I had a 20-year career with Accenture before I came on board at LatentView. I was initially the chief people officer for the first three years, and I took over as CEO about seven years back. So it's been a very interesting ride so far.
[00:02:14] Yeah, you've had an incredible ride. Leading the talent at Accenture, stepping up as CEO at LatentView, taking the company public, and driving its first acquisition. So I've got to ask: looking back at your career, how has your background in people and culture shaped the way that you approach AI and data-led transformation? I bet there are quite a few synergies there. And I would argue it's more important than ever, because we keep hearing about the importance of the human in the loop.
[00:02:44] But tell me about those skills and how they come in so handy in the tech world. No, absolutely. I believe that the people skills are paramount. I remember from my Accenture days, we used to have this business integration methodology, which had strategy, people, process, and technology. And the people element was always a very important component of any consulting exercise that we did, whether it was entry, growth strategy, or even whether it was supply chain, cost reduction, and performance improvement.
[00:03:12] I see the same theme play out even when it comes to large digital AI-led transformation initiatives. You need to have a fair degree of change management in the mix in any of these initiatives, because obviously people are used to a certain way of decision making. Data analytics goes to the heart of decision making and optimization.
[00:03:34] And when you challenge people's traditional beliefs on decision making, whether it is based on data or whether it's gut feel and experience, you need to handhold them in a significant way. So I feel that the whole cultural aspect of this is something super important. And most companies only realize it a little too late when they embark on their initiatives.
[00:03:55] So I feel that my background in working with people, earlier in my Accenture days and also early on at LatentView, has been very, very important for me in defining how we progress as LatentView, but more importantly, how we handhold our clients in accomplishing their transformational objectives. You set off my tech spidey senses when I was reading how you talk about AI minimalism.
[00:04:21] And you do this at a time when many leaders are chasing every new gen AI release. So what does AI minimalism actually mean in practical terms for enterprises that are trying to stay competitive right now? Right. Most organizations over the course of their time, they accumulate a lot of what we call technical debt because you do things which are expedient, right?
[00:04:47] Companies want to address specific use cases or problem statements in as fast a way as possible. And they end up from a data analytics perspective, you know, building data pipelines or platforms or analytics models or AI models, for example, right? And there is a proliferation of these, right?
[00:05:07] Over a period of time, providing different views and versions of the same data, looking at the data from different perspectives and then resulting in multiple answers even for the same questions. So what I mean by AI minimalism is first an attempt by an organization to take a look at what is the plethora of mechanisms that they have in place today? What are the common threads?
[00:05:32] And how do you put in place the common foundations and fundamentals of data, for example, that will bring that degree of confidence in the source of truth? Common definitions of KPIs and terms, and the belief that this is the right source of data for going after a particular modeling exercise or answering a particular question. And if organizations are able to do that and put in place that common minimum foundation, then a lot of the initiatives become easier.
[00:06:02] So that's really what I mean by AI minimalism. Don't chase too many things and then add to the proliferation, but instead simplify first. Create mechanisms that build confidence in the underlying data ecosystem so that any problem that you want to go and solve, there is faith in the data, there is faith in the approach that's being used to solve the problem.
[00:06:24] And at LatentView, you work with over 50 of the Fortune 500 companies across a wide variety of sectors, from financial services and retail to healthcare and technology. And I've got to ask, what are the most common data foundation mistakes that you see holding organizations back from scaling AI effectively? I'd love to hear more about the kind of trends that you're seeing here.
[00:06:49] I would say that the biggest challenge often is the common definitions that I talked about earlier. When it comes to analytics and decision-making and optimization, a common understanding of what the KPIs are and how they are defined, that becomes really important. If you miss out on that, then organizations struggle and talk at cross-purposes. I mean, people might be saying the same term, but they might be meaning very different things, right? So that's number one.
[00:07:16] Second is the mechanisms that are needed to define a single source of truth, the governance and the lineage that accompanies that. Meaning that I know that if I have to model for a particular problem or if I'm looking to build a dashboard that answers a particular query that I have on what happened or why something happened,
[00:07:36] that I know exactly which data I can go after with confidence. And the third thing I would say is the orchestration of the end results of these analytics into the decision-making processes of an organization.
[00:08:00] Unless you take the analytics and embed them into the decision-making process, they just tend to remain exercises that you have done in an ivory tower, and people still struggle with how to implement them in their processes. I think these are the three things that I see. And if you solve for those, there is a lot more traction that organizations are able to build.
[00:08:20] And I also think there's a lot of pressure right now for organizations to show quick wins for their AI projects and prove ROI on those expensive tech investments. So how do you balance delivering those short-term wins, those impacts for clients, while also looking at the bigger picture, building long-term sustainable capability? A bit of a balancing act, I would imagine. Absolutely. I think it is a balancing act and one that is very much necessary.
[00:08:50] In today's fast-changing, dynamic tech ecosystem, it wouldn't be appropriate for an organization not to do the necessary amount of experimentation either. You cannot wait for somebody else to solve the problem, or for a technology or approach to become well-established, and only then say, now I will jump on the bandwagon. It might be too late by then. But that doesn't mean that you only keep doing experiments.
[00:09:16] You also need to look at what are the most important metrics that I want to impact as an organization. These could be metrics related to your revenue and growth, or it could be related to your cost control and your margins. It could be other operational metrics like customer satisfaction and other scores.
[00:09:33] Once there is clarity on what the key metrics are, then you define the use cases that can really impact those metrics and direct the amount of experimentation on those use cases and those metrics. So that way, the experimentation loop and cycle as well becomes a self-reinforcing one. And then it leads to the creation of capabilities and assets and data monetization that can actually help with that stuff.
[00:10:00] So I would say that the experimentation has to dovetail with the most important use cases and the metrics that an organization would like to impact. And on a personal level, since taking LatentView public and indeed leading its first acquisition, what have you learned about scaling responsibly, but also without losing cultural clarity or focus? And again, I would imagine your people and culture skills will come into play here. But tell me what else you've learned here.
[00:10:28] I believe that the vision and the direction setting that needs to come from the top of an organization is super important. For example, when we started our journey, right? I mean, when I came on board, one very important aspect for me was the shift from being an analytics execution partner to an analytics thought partner and a consulting partner.
[00:10:52] Meaning that we don't just show up with skills and capabilities and throw them on the table and then ask the client, what do you want us to build? But instead, we have a perspective on what are the most pertinent pain points and opportunities that are important within a client's context and that we have an approach for solving them, right? And we have an asset or an accelerator or a solution that addresses that. So that shift was very important. And that shift can happen only if that clarity is there in terms of what is the vision and where do we want to take the company?
[00:11:22] So for any large-scale transformation, I would say that defining what the goals and the measures are, and bringing that clarity right at the top, saying that these are the biggest initiatives that we have and that we want to go after, I think that is super important. The second thing I would say is that data analytics, and analysis in general, has to become a way of decision-making.
[00:11:46] I mean, I've seen companies that do a lot of data analytics, but they don't find utilization and application when it comes to day-to-day decision-making. And then there are other companies that we work with where every meeting is run using the power of data analytics. Whether it is the smallest of decision or whether it is a strategic acquisition decision or an investment decision that they are making, they really look at how does the data support this, right? And what does the data tell us? And I think that orientation is important as well.
[00:12:15] So scaling becomes a lot easier if there is alignment, right? In some sense, it's like you're trying to push a big bus that has broken down, right? Unless everybody is pushing it in the same direction, it's not going to move where you want it to go. So I think that's the biggest lever that's available to organizations: bringing that clarity and vision to it.
[00:12:42] And if we look at our news feeds, keynotes at tech conferences, they often focus on technology, sometimes the problems that they're going out to solve. But one of the things I love about what you're doing here is you also emphasize clarity, curiosity, and culture. So in your experience, which of those is most often missing when AI initiatives stall and how can leaders better rebuild that foundation? Because I think too often we focus on, hey, this technology is going to fix all these problems.
[00:13:10] But we don't talk about the adoption, the culture change, and everything that's required from people to make that project a success. But what do you often see missing here? I would say that the clarity and the culture are the most important. Not that curiosity isn't important. I do see a great deal of curiosity coming to the table these days with organizations and doing that experimentation that we referred to earlier.
[00:13:35] Of course, you will probably see that people who are born in the digital age and who are more AI native and digital native will probably bring a greater deal of that curiosity. But in general, in my conversations, I don't see any dearth of curiosity. Clarity and culture are where organizations suffer, right? We talked about it. The cultural alignment is super important. And the clarity and the vision need to come from the top.
[00:14:01] And that tone on what we are going after and how we intend to utilize any of the new technologies that are emerging, that has to be there. Only then will the innovation, the ideation, the curiosity that is brought to the table find its mark. And there will be many organizations out there that still claim to be data-driven, yet behind the scenes, decision-making often remains intuition-led at the top or left to the loudest voice in the meeting room.
[00:14:29] So how can leaders genuinely begin embedding analytics into executive decision processes without overwhelming teams? So they can be data-driven in every aspect. I believe so. In fact, I was talking about the two types of organizations, right, in response to the earlier question. I have seen companies where all decision-making is driven by data.
[00:14:56] It doesn't mean that you have to sit and analyze till the cows come home or there are reams of reports that are brought to the table for every decision. But simple analysis and logical explanation for where we want to go and why we want to go there, I think these are really important.
[00:15:13] And when organizations are able to embed that as a part of their culture, as a way of decision-making, where you say that every meeting, if somebody is going to make a point and they want to express a point of view, you need to back it up with data. If that becomes the culture, then that shift starts happening. Of course, it is a lot easier said than done.
[00:15:35] You do need to have the necessary mechanisms in place, whether it's the data foundations, whether it is the common definitions of the KPIs that we talked about. You need to bring it all together, right? So there is a process of enabling it and making it happen. Some of it is structural and strategy, right, in terms of how all of this is defined, the direction that's provided by the senior management. But a lot of it is also about the people that you bring in, the processes and the mechanisms that you put in place.
[00:16:02] You need to bring it all together, right, in order to make that happen. Which is where I think organizations that have a clearer sense of the most important initiatives they want to go after and the biggest metrics they want to impact will benefit a lot more, because it's easier to start with those and create a bit of a snowball effect as you move those initiatives forward. And I always try to give everybody listening a valuable takeaway.
[00:16:29] So if you were advising a CEO that is listening to our conversation today who's feeling a little overwhelmed, maybe they feel pressured to accelerate their AI adoption this year, what are those disciplined steps that you'd recommend to ensure that they really do drive measurable impact and make a real difference, rather than just adding more technology, more complexity to the organization without solving any problems or making a tangible difference? What steps would you offer there?
[00:16:57] I would say that the first and most important step would be to define the metrics that you believe are the most important, right, that matter. I mean, this would depend on what stage of evolution and maturity and industry context that organization is experiencing. So, for example, in a very high competitive intensity segment or space where market share is the most important metric,
[00:17:23] it could be that I want to figure out how data analytics can help drive market share within this geography or for this region. So defining the metrics that matter, that will be the most important first step. The second step, I would say, is to define a full end-to-end approach, meaning that if I know that this is the metric that matters and here are the decisions that I make around that metric, work all the way back. When we do work with clients, for example, we set up what are called issue trees and hypothesis trees,
[00:17:51] which say that if this is the metric that I'm targeting and this is my hypothesis on what moves that metric, how do I set up an issue tree, break it down, and then collect all the data that is needed to do that analysis? So you've got to go and backward integrate into all the data that is needed, establish those sources of truth, and find the definitions that everybody can agree on. So those are the two important steps.
[00:18:14] Once you put them in place, the technology, in terms of what kind of AI or ML or agentic capability you use in order to do the modeling and the work, I think there is plenty of choice available. I wouldn't worry too much about that. Most organizations will have those skills, or they'll be able to find those skills, if not by themselves, then at least by partnering with service providers like us, for example.
[00:18:39] But figuring out the most important metrics and the use cases, and then making sure that there is a connect back to the data within their ecosystem, those will be the two most important steps that I would say they need to focus on. So data engineering and creating that platform, the platform that has the data that everybody can believe in, that becomes a very important step in the process. Fantastic advice.
[00:19:04] And at LatentView, you guys are indeed helping pave the way for businesses and helping them along this journey. Anything you can share about what you're working on this year, what your focus is, or what excites you and how you're helping your customers at LatentView? Yeah. The conversation obviously is shifting a lot more to applying all the new evolving AI capabilities that we are hearing about. We have been doing a lot of work with traditional AI in the past,
[00:19:34] but the last two years, and in particular the last six months to a year, have all been about generative and agentic AI, right? How do you bring those capabilities to the table? Everybody is talking about impacting not just the efficiency with which work is done, but also the effectiveness of the models, the effectiveness of the approach, right? In terms of their real impact on the metrics, and also what we are calling velocity, right? How quickly can you get this done? Are you able to compress elapsed times for getting work done?
[00:20:04] So many of the initiatives that we are focused on are really around that, whether it's our partnership with Databricks, and Databricks has been really pushing the boundaries in terms of building their architecture, right? And bringing in AI and agentic offerings on their platform, or whether it is our focus on what we call our top diamond accounts, right? Where we believe that there is a lot of opportunity for those accounts to apply the power of data analytics to drive the transformation that we talked about, or whether it is our AI center of excellence.
[00:20:33] All of these are focused today around how do you bring the power of artificial intelligence and generative and agentic capabilities to create impact on those three types of metrics, efficiency, effectiveness, as well as velocity metrics. So that's what we are focused on at this time. There is obviously a lot of experimentation that we are doing. One way by which we are accomplishing this is we realize that we cannot build all of the solutions that our clients might need.
[00:20:59] There is also a huge explosion happening, right? In the space in terms of the number of startups in this ecosystem, the number of companies that are building solutions. So we are taking an approach where we will build a few solutions, but we'll also be partnering and we'll be investing in smaller companies that are making substantial shifts, right? On solving particular problems. So we want to bring that ecosystem to the table when we are having conversations. So that's really what is exciting because our people are benefiting
[00:21:27] by interacting with that ecosystem as well, right? And that's something that is driving a lot of interest both within the company and also with our clients and prospects. And for anybody listening who's inspired by your story and your approach, especially that focus on clarity, culture and people, and who wants to find out more about LatentView and how you might be able to help them. You're already helping 50 of the Fortune 500, but anyone wanting to find out more, where should they go?
[00:21:56] Where would you like me to point everyone? Right. No, as you said, we are a publicly listed company in India. We did our IPO in November 2021. So the best place to go would be our website, latentview.com. There's a whole lot of information available there about the industries that we serve, the types of business problems that we go after, and the approaches, assets and solutions that we bring to the table. And to connect with me personally, please do reach out on LinkedIn. That's the best place.
[00:22:24] I mean, I monitor that quite a bit and I'll be very happy to kind of connect back with you. Awesome. Well, I'll add the link to the website and your LinkedIn. And I, for one, just love your rare point of view on AI minimalism there. And that focus on why leaders should be prioritizing clarity, curiosity and culture over just chasing every new piece of AI or emerging technology. But also how to scale AI responsibly, build data foundations,
[00:22:52] lead real transformation that delivers that measurable difference, real impact inside an organization. A real breath of fresh air for me. So I encourage people listening to check you guys out. But more than anything, thank you for sharing your story today and your insights. Thank you, Neil, for having us on the show. Appreciate you spending the time. Excited to keep the conversation going. So much to take away from that conversation, especially that reminder that successful AI adoption starts long before any model is deployed.
[00:23:23] Whether it be defining the metrics that matter or building trust in data, or even creating a culture where decisions are backed by evidence. I think everything we talked about today was part of a timely discussion for any business leader that is currently feeling the pressure to move faster with AI. And I think my guest offered a welcome sense of perspective here,
[00:23:49] because growth in this space is seldom about doing everything at once. It's about knowing what matters most and then building it from there. So if today's conversation gave you something to think about, head over to latentview.com to learn more. You can also reach me at techtalksnetwork.com. And as always, let me know your thoughts. Are businesses making AI harder than it needs to be
[00:24:15] by chasing the next big shiny tool or adding complexity instead of clarity? Let me know your thoughts. So thank you for listening today. I'll be back again real soon with another guest. And hopefully I will get to speak with you all then. Bye for now.

