
Other recent research confirms this. In an October Cisco survey of over 8,000 AI leaders, only 35% of companies had clean, centralized data with real-time integration for AI agents. And by 2027, according to IDC, companies that don’t prioritize high-quality, AI-ready data will struggle to scale gen AI and agentic solutions, resulting in a 15% productivity loss.
“Every enterprise is talking about AI, but most aren’t AI ready because their data is fragmented and poorly cataloged,” says Brad Gastwirth, global head of research and market intelligence at Circular Technology, a supply chain consultancy. “If Everpure can help turn storage into a structured, intelligent data foundation, that could materially shorten the path from proof of concept to production AI.”
It’s not an easy process. It could take years to shift from being viewed primarily as a storage hardware company to a data platform company, Gastwirth says. “There is product integration to get right, but there is also a commercial shift. Sales teams need to sell differently, customers need to budget differently, and the market needs proof points.”
And there are many companies in the race to be the data platform for AI. “The difference is where it sits in the stack,” he says. “If Everpure can bake more intelligence directly into the core storage layer instead of layering tools on top, that can actually simplify things.”
Putting the control layer closer to the data can be helpful as companies deploy agentic AI. AI agents need good access to data to function well, whether as part of their training, through retrieval-augmented generation (RAG) embeddings, or via MCP (Model Context Protocol) servers. But ensuring that agents access only the data they’re supposed to is a challenge.
“The shift to agentic AI is a big reason why you’d want to have your data intelligence tied to your data infrastructure,” says Zeus Kerravala, founder and principal analyst at ZK Research.
