I recently had the pleasure of sitting down with Justin Borgman, CEO and co-founder of Starburst, for an episode of the Age of AI podcast. It turned out to be one of the most enlightening conversations I’ve had about the intersection of data and AI. Justin’s insights into why most enterprises are stumbling with their AI initiatives hit right at the heart of what I’m seeing across the industry.
Watch the full episode here:
The Uncomfortable Truth About Enterprise AI
Here’s something that might surprise you: it’s not the algorithms or the latest language models that are holding back enterprise AI success. It’s something far more fundamental—and far more fixable. As Justin put it so perfectly during our conversation, “AI is fundamentally a data problem that needs to be solved.”
Think about it. While everyone’s rushing to implement the latest AI tools and chatbots, they’re completely overlooking the foundation that makes any of this possible: clean, accessible, governed data. And frankly, most enterprise data architectures just aren’t ready for what AI demands.
The Data Fragmentation Challenge We Can’t Ignore
We kicked off our conversation talking about data, because the state of an organization’s data has everything to do with its potential for success in AI.
The research we conducted together last year in our study, The State of Data Management and Its Impact on AI Development, revealed some sobering statistics. Our survey respondents shared that, while there was a strong intent to adopt AI within the next 12 months, 52% of organizations reported struggling with structured data for machine learning, and 50% faced difficulties in organizing unstructured data for AI purposes. The biggest barriers aren’t technical — they’re centered on data privacy, security concerns, and simply managing the massive volumes of data that enterprises deal with daily.
Source: The State of Data Management and Its Impact on AI Development
What struck me most about my conversation with Justin was his refreshingly pragmatic approach to this challenge. Instead of promoting yet another “move everything to the cloud” solution, Starburst is taking a different path.
The Lakeside AI Revolution
A few weeks ago, Justin and the team at Starburst introduced a concept that I believe will reshape how we think about enterprise data architecture: Lakeside AI. This isn’t just another buzzword—it’s a fundamental shift in bringing AI closer to where data actually lives, rather than forcing data to move to where AI wants it.
“We think of the lake as sort of a center of gravity, but not the entirety of all the data that you have,” Justin explained. This resonated deeply with me because it acknowledges a reality that too many vendors ignore: you’re never going to centralize all your data, and that’s okay.
What Starburst has built around its Trino query engine is particularly impressive. They’ve created a platform that can federate across multiple data sources, allowing you to join tables that live in completely different systems. For AI applications, this means users can finally access all the data they need without the painful ETL processes that slow everything down.
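To make that concrete: a federated query in Trino reads like ordinary SQL, with each table qualified by the catalog it lives in. The catalog, schema, and table names below (a `hive` catalog over lake storage and a `postgresql` catalog over an operational database) are illustrative assumptions on my part, not details from the episode — a sketch of the pattern, not Starburst’s actual setup:

```sql
-- Hypothetical federated join: clickstream events in a data lake
-- joined to account records in PostgreSQL, in a single Trino query,
-- with no ETL pipeline moving data between the two systems.
SELECT a.account_name,
       COUNT(*) AS event_count
FROM hive.events.clickstream AS e      -- table in object storage
JOIN postgresql.crm.accounts AS a     -- table in an operational DB
  ON e.account_id = a.account_id
WHERE e.event_date >= DATE '2024-01-01'
GROUP BY a.account_name;
```

The point is that the join happens at query time, across systems, which is exactly what makes the data reachable for AI workloads without centralizing it first.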
Why the Economics of AI Matter More Than You Think
One theme that kept emerging in our conversation was cost, and Justin is right to keep hammering this point. As he bluntly stated, “You don’t want your AI strategy to bankrupt you.”
I’m seeing this concern everywhere; it comes up in nearly every conversation about enterprise AI. Organizations are realizing that cloud-only strategies can become cost-prohibitive, especially when you’re dealing with AI workloads that require massive amounts of data processing. This is driving a fascinating trend back toward hybrid and even on-premises deployments.
Starburst’s Icehouse architecture, built on the Iceberg open format, addresses this head-on. By using inexpensive object storage and open formats, enterprises can own their data without vendor lock-in while keeping costs manageable. It’s the kind of practical approach that makes me optimistic about the future of enterprise AI.
The Agent-to-Agent Future Is Closer Than You Think
Perhaps the most exciting part of our conversation was Justin’s vision for specialized AI agents that communicate with each other, much like how we have specialized human roles in organizations. Starburst wants to be the data agent that other agents communicate with to get governed access to enterprise data.
This isn’t science fiction. It’s happening now, and it’s going to reshape how we think about both SaaS applications and enterprise software more broadly.
The Path Forward
If there’s one takeaway from my conversation with Justin, it’s this: to achieve success with AI, stop chasing the latest AI trends and focus on building a solid data foundation. The enterprises that get this right, the ones that invest in governed, accessible, cost-effective data architectures, will be the ones that actually realize the benefits of AI’s transformative potential.
The future belongs to those who understand that AI success isn’t about having the fanciest models. It’s about having the right data, in the right place, at the right time, with the right governance. And that’s exactly what Starburst is helping enterprises achieve.
If you’re interested in reading The State of Data Management and Its Impact on AI Development, you can grab it here.
This article was originally published on LinkedIn.
Read more of my coverage here:
Zscaler Unveils Cutting-Edge AI Innovations to Securely Enable Business Transformation
Securing Generative AI: The New Threat Landscape Demands a New Approach — Insights from IBM and AWS