What if enterprise AI could move beyond answering questions and start explaining why things are happening in your business?
In this episode of Business Tech Perspectives, I sat down with Alberto Pan, Chief Technology Officer at Denodo, to explore how AI is shifting from surface-level responses to deeper, reasoning-driven insights. As organizations wrestle with fragmented data, governance challenges, and growing expectations around AI, this conversation gets to the heart of what meaningful progress actually looks like.

At the center of our discussion is Denodo’s DeepQuery, an AI reasoning agent designed to perform complex, open-ended research across an organization’s data landscape. Alberto explains how it goes far beyond traditional approaches like retrieval-augmented generation (RAG) by creating research plans, analyzing patterns, and even refining its own process along the way. The result is not just faster answers, but a more complete understanding of what is really happening beneath the surface.
We also unpack what this means for business intelligence teams. Rather than manually building dashboards and reports, analysts are stepping into a new role as guides, working alongside AI systems that can gather, analyze, and present insights in minutes. It raises an interesting question about how skills, roles, and expectations will evolve as these tools become more widely adopted.
A big part of the conversation focuses on data itself. Alberto shares how Denodo’s logical data layer allows organizations to access and govern data across multiple systems in real time, without creating new silos. That foundation becomes even more important as AI adoption accelerates, especially when accuracy, context, and explainability are all under increasing scrutiny.
We also touch on the growing importance of transparency in AI. With concerns around black box decision making continuing to rise, Alberto explains how DeepQuery provides full traceability, showing exactly how insights are generated and where the underlying data comes from. It is a practical step toward building trust in AI systems at scale.
Looking ahead, this episode offers a clear view into how research-driven AI could reshape decision making across industries. From finance to healthcare, the ability to move from static reports to dynamic, AI-assisted investigation has the potential to change how organizations operate on a daily basis.
So as AI becomes more embedded in business workflows, are you still asking your data what happened, or are you ready to understand why it happened and what to do next?
[00:00:01] Traditional Gen AI tools, yes they can give quick answers, but they rarely provide the full story. But Deep Query is aiming to change that by combining AI reasoning with Denodo's logical data layer. And this is going to give organisations the ability to perform open-ended research directly across live governed data.
[00:00:23] So it's not just another chatbot or analytics dashboard, we've seen far too many of them. This is more of an intelligent assistant that can execute multi-step analysis, identify hidden patterns and build a comprehensive report that explains real business outcomes.
[00:00:40] So today Alberto will share how this approach could redefine the role of analysts, make advanced research accessible to anyone in the business, and also discuss why DeepQuery is different from RAG, or Retrieval-Augmented Generation, and the data lakehouses that many other vendors are selling right now. And I also want to learn how it ensures transparency and auditability through detailed query logs, the kind of guardrails that are put in place.
[00:01:08] And what this means for the future of explainable AI. So from real-world use cases like a central bank using DeepQuery to map economic trends, to the rise of the Model Context Protocol, or MCP, for AI-to-data interactions. It's a fascinating look, this one, into how reasoning AI is shaping enterprise intelligence.
[00:01:32] So what happens when business intelligence becomes self-guided, transparent, and truly AI native? Well, let's find out as I get my guest back onto the show. So a massive warm welcome back to the show. But for anyone that missed our last conversation, which was just around a year ago now, almost to the day, can you remind everyone listening a little about who you are and what you do?
[00:02:00] Yes. Hello, Neil. It's great to be back. Yes, my name is Alberto Pan. I am the Chief Technology Officer at Denodo. And I have led Denodo's research and development organization since the company started. Well, Denodo is a data integration and data management company. We now have offices in more than 25 countries. But I am still based in our very first office in A Coruña, a small city in the northwest of Spain.
[00:02:28] Fantastic. Well, it's a pleasure to have you back on here with me. And so much has changed in the last year. I think agentic AI has gone crazy this year. But I think traditional Gen AI tools often stop at providing surface-level answers. So how does DeepQuery at Denodo move from answering what happened to uncovering why it happened? Because that distinction feels like something quite special, quite a big moment there.
[00:02:58] Yes. Well, Denodo DeepQuery is an AI reasoning agent, right? That can leverage all the data in the internal systems of the organization to perform open-ended research tasks, right? So let me give you an example. For instance, you can ask the agent, analyze our sales for product X based on, I don't know, seasonality, region, customer segmentation, and also suggest other factors that might be relevant.
[00:03:26] Then maybe the agent will ask some follow-up questions to the user to clarify the goals. And after that, it will automatically create a plan to do the research. And the plan will typically start by querying the internal data systems of the organization to collect the data that is needed for the research. Then the agent will analyze the results to find patterns. The agent might also decide to add new steps to the plan. For instance, if in our example it sees something strange in the sales of one region,
[00:03:55] it can decide to collect additional data to study it. And so this process goes on for some minutes. And at the end, the agent generates a complete research report, showing the data, analyzing the key factors, illustrating the main points with new dashboards that are also generated automatically, and even proposing new questions and recommendations for deeper analysis. And when I was doing a little research on you guys,
[00:04:21] I read that you described Deep Query as bringing cognitive reasoning to enterprise AI, which sounds incredibly exciting. But what does this mean in practice for business leaders and decision makers that could be listening to this podcast today anywhere in the world, whether they're in finance, sales, or even HR? What does this mean for them? Well, I think that these systems will completely change the way business intelligence is done today.
[00:04:48] I think the role of the BI analyst will change a lot. It will no longer be about manually creating reports. Instead, I think the analyst will become a guide for an automated research assistant. The assistant will actively collect data, perform analysis, and propose insights, while the human focuses on guiding that process and interpreting the results. And it also means that the analysis process will be democratized,
[00:05:16] because creating dashboards in BI tools today is no doubt easier than it used to be, but it still requires expertise. But with this new way of doing business intelligence, that is no longer needed. So a very significant change, I think. 100% with you there. And I think many organizations this year have been struggling with fragmented data and governance challenges.
[00:05:39] So how does DeepQuery connect to live governed data across multiple systems without creating unnecessary new silos? Because fragmented data and data silos are a big enough problem on their own. So how do you get around that? Well, DeepQuery works on top of Denodo. And Denodo, as I said before, is a data integration and data management software that enables organizations to create a unified data access layer across multiple data sources
[00:06:07] without forcing you to first consolidate all the data in a single system. So you can leave the data where it lives and access it in real time through this layer, while still enforcing the same security and governance policies as if all the data were in a single place. And we often call this unified layer a semantic layer, because it allows expressing the data in the language of the business.
[00:06:33] So it provides the business context that is needed not only by business users, but also by AI agents, for instance, to avoid ambiguity and to avoid hallucinations. And in addition, this unified data layer can be created much faster than with traditional data integration methods, because with traditional methods, you first need to consolidate all data in a central system, which is typically costly. So DeepQuery can use the internal data of the organization
[00:07:02] because Denodo first creates this AI-ready logical data layer. And for the techies listening, what is it that sets DeepQuery apart from other common approaches out there like retrieval-augmented generation, or RAG, or data lakehouse queries? This is what other vendors seem to be promoting at the moment. So what makes DeepQuery a little bit different here? Well, the RAG pattern, and also the different variations of it,
[00:07:30] like query RAG and so on, are oriented to simple question-answer patterns, right? So the user asks a specific question. This question typically can be mapped to exactly one search or one query to your internal systems. And then the user gets the results of that query. But DeepQuery goes beyond that. As I said, it can perform complex, open-ended tasks that might require executing many queries over multiple systems,
[00:07:58] analyzing the results, adding new steps on the fly. So it's more complex than a question-answer paradigm. And with respect to lakehouses, they have some limitations for AI. First, typically, the lakehouse only contains a subset of the data that is needed, so you need some way to access the rest of the data. And second, by definition, the data in the lakehouse is never 100% fresh. There is always some latency in the replication process.
[00:08:26] But many AI agents actually need a full, real-time view of the data. And third, lakehouses typically offer limited or no support for specifying the semantics of the data, the business context of the data. And this is really, really important. For instance, if a user asks, what is the customer lifetime value in this region? You need a way to explain to the agent, to indicate to the agent, how this customer lifetime value metric is computed in your organization
[00:08:55] and how it maps to the data in your internal system. If you don't know that, not even a human could solve that. So this is really, really crucial for the accuracy of AI applications inside the organization. If anyone is interested in knowing more about how Denodo's logical data layer complements lakehouses, there is actually a recent report that I think is very interesting from the analyst firm Vector8. And this report is available on the Denodo website
[00:09:24] that compares several large organizations across several industries and provides, I think, very interesting data about the gains in ROI and time to data that can be achieved this way. And to bring to life what we're talking about here today, especially for business leaders listening, are you able to share maybe an example of just how a company might use DeepQuery to answer one of those complex cross-functional questions
[00:09:51] that would normally take analyst days to resolve? Because it's quite a time saver, isn't it? But it'd be great to bring that to life with a real story. Yes, absolutely. Many examples. For instance, one that I like is the Central Bank of a European country used DeepQuery to analyze the economic performance and also the geographical distribution of holdings in the country to identify trends in profitability, earnings,
[00:10:20] and other financial indicators. So as you can see, quite a specific need, very oriented to specialists. DeepQuery analyzed the requested factors, suggested new ones that were not in the initial definition, like, for instance, the impact of COVID on the trends, and finally created a report showing all the data and identifying the main trends. So that is a cool, real example. That said, don't get me wrong,
[00:10:48] I am not saying that the reports created by DeepQuery are always final responses to the problems. You should think about DeepQuery as an assistant that does some research for you, then you look at it, then make some changes, then maybe ask the assistant to do some additional research, and then after several iterations, you have the final report. But this is still much faster, and the quality is much better than with the traditional approach.
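The research loop Alberto describes in this answer — plan the queries, run them, analyze the results, add follow-up steps when something looks off, then compile a report — can be sketched in a few lines of Python. This is a minimal illustration over invented data, not Denodo's DeepQuery implementation; `run_query`, `looks_anomalous`, and the sales figures are all hypothetical stand-ins.

```python
# Minimal sketch of a plan-execute-analyze-extend research loop.
# All data and helpers are hypothetical stand-ins, not Denodo APIs.

SALES = {  # stand-in for live, governed data behind a logical layer
    ("north", "Q1"): 120, ("north", "Q2"): 45,   # Q2 north looks anomalous
    ("south", "Q1"): 110, ("south", "Q2"): 115,
}

def run_query(region, quarter):
    """Stand-in for one query against the internal data systems."""
    return SALES[(region, quarter)]

def looks_anomalous(value, baseline):
    """Crude anomaly check: value dropped more than 50% vs baseline."""
    return value < 0.5 * baseline

def research(regions, quarters):
    """Execute an initial plan, and extend it when results look strange."""
    plan = [(r, q) for r in regions for q in quarters]  # initial research plan
    findings, follow_ups, baseline = [], [], {}
    for region, quarter in plan:
        value = run_query(region, quarter)
        findings.append((region, quarter, value))
        prev = baseline.get(region)
        if prev is not None and looks_anomalous(value, prev):
            # The agent decides to add a new step to study the anomaly
            follow_ups.append(f"investigate {region} {quarter} drop")
        baseline[region] = value
    return {"findings": findings, "follow_ups": follow_ups}

report = research(["north", "south"], ["Q1", "Q2"])
```

In this toy run the loop flags the north-region Q2 drop and queues a follow-up step, mirroring the "collect additional data to study it" behavior Alberto mentions.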
[00:11:15] And recently, I think there's been a lot of frustration down to what many people call black-box AI, and not knowing how the AI came to a conclusion. And the concept of explainable AI is gaining momentum, I'm noticing, as well. So how does DeepQuery ensure that its insights are transparent, cited, and auditable for enterprise users? Because again, this is something that seems to be quietly gaining momentum, isn't it? Yes, absolutely.
[00:11:44] It's a very valid concern. Well, in the case of DeepQuery, the generated reports include an appendix that details how all the data for the analysis was obtained, right? Because all the data that is used by DeepQuery for the analysis comes directly from the internal data systems of the organization. The agent decides what are the queries, what are the searches that it needs to do to obtain the data, and you have an appendix in the report
[00:12:12] where you have exactly how all the data was obtained, what were the exact queries that were executed in your systems. Well, obviously, you have this both in the report, so the user can check this, and also you have that in technical logs, right? So also the administrators of the system and so on can see exactly what happened. So you have full traceability and auditability of the data that is used.
[00:12:41] And I also should highlight that I was reading that Denodo has introduced DeepQuery alongside support for the Model Context Protocol, or MCP. So how will that open standard shape maybe the next generation of agentic and enterprise AI applications? Anything that you see where this road will take us? Yes. Well, MCP is a standard initially proposed by Anthropic,
[00:13:09] but now it's being adopted by the main players in the industry to specify how an AI agent can interact with external tools and with other agents, right? So, for instance, if your system is MCP compliant, that means that AI agents can potentially use it, can potentially leverage your system to perform their tasks. Denodo supports MCP, so it's fully MCP-enabled.
[00:13:37] So agents and AI applications can interact with it very easily. But it's important to understand that MCP by itself does not solve the problem of unifying the data from multiple data sources, right? It's only a protocol for allowing the AI agent to access data and tools from external applications. But it does not solve by itself the problem of unifying data from multiple data sources. It does not solve the semantics problem.
[00:14:05] It does not solve the unified governance problem. So MCP alone is great, but it's not enough. This layer that makes the data AI-ready is still needed. And looking ahead, what does the rise of deep research mean for the future of Gen AI, do you think? And again, how do you see Denodo's AI Accelerator program helping partners and clients get there first?
[00:14:34] Because again, another big topic right now. Well, as I said before, I think that DeepQuery and research agents in general will completely change the way that business intelligence is done. It will be faster, it will be better, and it will be accessible to more people. I really think that in a few years, when we look back, we will see a lot of changes in that area. And Denodo's AI Accelerator program basically tries to help our customers and partners to get there as soon as possible.
[00:15:02] And of course, as I said before, the key for that is the data foundation, creating this data layer with strong semantics, which, I cannot stress enough, is crucial for accuracy. So with the AI Accelerator program, we basically put our experience to work to help our customers and partners get to that vision as soon as possible.
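The MCP pattern discussed above — a standard way for an agent to discover what tools a system exposes and then invoke one by name — can be illustrated with a toy registry. To be clear, this is not the MCP SDK or wire format; the tool name `query_sales` and its data are invented, and, as Alberto stresses, a mechanism like this says nothing by itself about unifying semantics or governance.

```python
# Toy illustration of the pattern MCP standardizes: an agent discovers
# tools exposed by a server, then calls one by name with arguments.
# NOT the real MCP SDK or protocol; names and data are hypothetical.

TOOLS = {}

def tool(name, description):
    """Register a function as a discoverable tool with metadata."""
    def wrap(fn):
        TOOLS[name] = {"description": description, "fn": fn}
        return fn
    return wrap

@tool("query_sales", "Query sales totals for a region from the data layer")
def query_sales(region):
    data = {"north": 120, "south": 110}  # stand-in for governed data
    return data[region]

def list_tools():
    """Analogous to MCP tool discovery: what can this server do?"""
    return {name: t["description"] for name, t in TOOLS.items()}

def call_tool(name, **kwargs):
    """Analogous to MCP tool invocation: call a tool by name."""
    return TOOLS[name]["fn"](**kwargs)

available = list_tools()
result = call_tool("query_sales", region="north")
```

The value of standardizing this shape is that any compliant agent can use any compliant server without bespoke integration code — which is exactly why the layer behind the tools still has to supply the semantics and governance.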
[00:15:33] And you've done a brilliant job of explaining this in a language that everybody can understand. And obviously, you're right in the heart of this space. So just to really bring home what we're talking about here and how big a deal it is, anything that you could share about just how excited you are about where all this is heading and your work? What makes you want to jump out of bed in the morning? What excites you about this work? Well, I think the AI world now is in a really, really exciting state.
[00:16:03] As you say, almost every day, I am a very technical guy, so almost every day, I have new exciting papers to read, new versions or new evolutions of the frameworks that allow you to create agents. So the ecosystem is changing so fast, right? And in many cases, these new things are really, really exciting. So I think it's a great, great moment to be in tech, a great, great moment to be in data.
[00:16:32] I think from both sides, both from the side of the vendors and the developers and the researchers creating it, but also from the user side, right? That get the chance to be pioneers in using these tools that are really changing drastically the way that certain things are done. Well, thank you so much for sitting back down with me today and shining a light on all this stuff.
[00:17:01] But before I let you go, as you know, we have a tradition here on the podcast. I ask my guests to leave either a book that means something to them that we could add to our Amazon wishlist, or a song that we can add to our Spotify playlist. I don't mind what it is, but what's that one final gift you're going to leave us with? Yes, well, I have been following the wishlist of the podcast since I came on for the first time. I have to say it's great. And I noticed that there are many science fiction books there,
[00:17:30] which I guess is not surprising. But I did find it surprising that Ted Chiang, one of the best sci-fi writers of all time, is not represented yet. So I would like to fill that gap by recommending a book called Stories of Your Life and Others by Ted Chiang, which is a collection of short stories and several of them are actually connected to AI. For instance, there is one story there called The Evolution of Human Science that explores what happens
[00:18:00] when research is no longer done by humans, but instead by superintelligent beings, and humans are no longer able to understand their discoveries. So quite timely, I would say. We are still not there, but quite related to the topic today. Well, we need to solve that problem straight away. Let's get that straight on the wishlist. I'm going to be checking that out myself. I'm incredibly intrigued. And for anybody listening wanting to find out more information
[00:18:28] about everything else we talked about today and to get to know DeepQuery, etc., where would you like to send everyone? Well, O'Reilly has recently released a book called The Rise of Logical Data Management, written by Christopher Garner, that does a great job going into the details of this concept of a unified logical data layer that we have been discussing and how it applies to AI. Actually, you can download a free PDF copy of the book on the Denodo website.
[00:18:58] And of course, in addition to that, anyone interested can connect with me on LinkedIn or also check out the Denodo website at denodo.com. Awesome. Well, I will add links to everything you mentioned there. Make it easy for everyone to find you. I just loved chatting with you today, learning more about your approach at Denodo and how AI's true potential in the enterprise is not just in generating responses, but in getting to understand the full context behind them.
[00:19:28] I think it was your CEO who said that recently, and a great point. And I think it's somewhere we would all like to head to as we progress forward. But thank you for coming back on today. Thank you. Thank you for having me again. So how often do we ask AI systems to explain themselves? I think that's the question that this conversation leaves behind. Alberto's explanation of DeepQuery shows that accuracy in AI depends not just on the models,
[00:19:55] but on data unity, semantics, and trust. And the idea of a research assistant that can justify every query it runs while keeping data live and governed, I think, paints a refreshing picture of how AI might evolve inside real organisations. And I think it's also easy to see why Denodo's approach resonates with so many enterprises, because in a world where data lives everywhere, that ability to reason across it transparently
[00:20:25] could help define the next generation of business intelligence. And as Alberto reminded us there, this isn't science fiction. This is already happening right now. So ask yourselves, what will happen when your own organisation can finally ask why and get a real answer, an answer it can trust? Food for thought indeed. You can reach me at techblogwriter@outlook.com, or techtalksnetwork.com if you want to leave me an audio message, or LinkedIn, Instagram,
[00:20:53] it's just at Neil C. Hughes. And a big thank you for all the emails coming in from a few concerned listeners saying, Neil, are you okay? Your voice sounds different. Have you got a new microphone? No, I've just got a good old-fashioned case of man flu. So my voice has been going, which is not ideal for a podcast host, but it will be back again soon. The show must go on. Thanks for listening.

