The promise of AI remains immense, yet one factor could be limiting progress. “The infrastructure that powers AI today won’t sustain tomorrow’s demands,” an industry warning reads. “CIOs must rethink how to scale smarter – not just bigger – or risk falling behind.”
CrateDB shares that concern and is positioning itself as a solution by acting as a “unified data layer for analytics, search, and AI.” The company argues that a single, fast data platform can reduce friction between operational systems and AI workloads, letting organizations extract value from live data at scale.
Stephane Castellani, SVP of marketing, says the core issue is that many IT environments were designed around batch or asynchronous pipelines, so the gap between when data is produced and when it is consumed has grown too large. He adds that CrateDB can surface the right data quickly, handling large volumes and complex formats in milliseconds, a change that shifts what analytics and models can do in real time.
A recent CrateDB blog describes a four-step path for linking operational data with AI systems: ingestion, real-time aggregation and insight, serving data into AI pipelines, and creating feedback loops between models and source data. The company frames that sequence as the connective tissue between day-to-day operations and model-driven decision making.
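The four-step loop can be sketched in a few lines of Python. This is an illustrative toy, not CrateDB’s implementation: the function names, the `temp_c` field, and the in-memory buffer are all invented stand-ins for the ingestion, aggregation, serving, and feedback stages the blog describes.

```python
# Hypothetical sketch of the four-step loop: ingest raw operational events,
# aggregate them in near real time, serve features to an AI pipeline, and
# feed the model's decision back toward the source data.
from collections import deque
from statistics import mean

events = deque()  # stands in for the ingestion buffer


def ingest(reading: dict) -> None:
    """Step 1: append a raw operational event."""
    events.append(reading)


def aggregate(window: int = 5) -> float:
    """Step 2: real-time aggregation over the most recent window."""
    recent = list(events)[-window:]
    return mean(r["temp_c"] for r in recent)


def serve_features() -> dict:
    """Step 3: package current aggregates for the AI pipeline."""
    return {"avg_temp_c": aggregate()}


def feedback(prediction: str) -> dict:
    """Step 4: record the model's decision alongside the source data."""
    note = {"action": prediction, "based_on": serve_features()}
    events.append({"temp_c": events[-1]["temp_c"], "annotation": note})
    return note


for t in (61.0, 63.5, 70.2):
    ingest({"temp_c": t})
result = feedback("schedule_inspection")
print(result["action"])  # schedule_inspection
```

In a real deployment each step would be a database write, a windowed query, a feature-serving call, and a write-back, but the control flow is the same closed loop.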
The platform’s speed and its tolerance for varied formats matter. Castellani points to query-time drops from minutes to milliseconds as an example. In factory settings, machines can stream telemetry continuously, producing high-velocity datasets that feed predictive maintenance models and improve failure forecasts with much faster learning cycles.
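The predictive-maintenance pattern above can be illustrated with a minimal rolling-statistics detector. The window size, z-score threshold, and machine IDs below are assumptions for the sketch, not anything from CrateDB’s product.

```python
# Minimal sketch of streaming telemetry feeding an anomaly flag: keep a
# rolling window of recent readings per machine and flag values that
# deviate strongly from the window's mean.
from collections import defaultdict, deque
from statistics import mean, pstdev

WINDOW = 20
history = defaultdict(lambda: deque(maxlen=WINDOW))


def flag_anomaly(machine_id: str, vibration: float, z_limit: float = 3.0) -> bool:
    """Return True when a reading deviates strongly from the rolling window."""
    window = history[machine_id]
    is_anomaly = False
    if len(window) >= 5:  # need a few readings before judging
        mu, sigma = mean(window), pstdev(window)
        if sigma > 0 and abs(vibration - mu) / sigma > z_limit:
            is_anomaly = True
    window.append(vibration)
    return is_anomaly


# Steady readings pass quietly; a sudden spike is flagged.
for v in [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.02]:
    flag_anomaly("press-7", v)
print(flag_anomaly("press-7", 9.0))  # True
```

A production version would push flagged readings to a maintenance model rather than just returning a boolean, and the faster the query side, the tighter that learning cycle becomes.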
There is a practical support use case inside plants, Castellani says. “Some also use CrateDB in the factory for knowledge assistance,” he explains. He describes an operational scenario where a machine shows a specific error and an operator asks, “I’m not an expert with this machine; what does it mean and how can I fix it?” The staffer can query a knowledge assistant built on vector search, backed by CrateDB, which fetches the correct manual pages, task steps and troubleshooting instructions in real time, helping teams respond on the shop floor.
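The retrieval step behind such an assistant can be shown with a toy example. The three-dimensional “embeddings” and manual snippets below are invented for illustration; in practice the vectors would come from an embedding model and be stored in a vector column, with the database doing the similarity search.

```python
# Toy vector search: rank stored manual snippets by cosine similarity
# to the operator's query embedding and return the closest matches.
from math import sqrt


def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))


# (embedding, manual snippet) pairs -- contents are invented examples
manual = [
    ([0.9, 0.1, 0.0], "E42: spindle overheating -- check coolant flow."),
    ([0.1, 0.9, 0.0], "E17: conveyor jam -- clear belt and reset guard."),
    ([0.0, 0.2, 0.9], "Routine lubrication schedule for press units."),
]


def answer(query_embedding: list[float], k: int = 1) -> list[str]:
    """Return the k manual snippets closest to the query embedding."""
    ranked = sorted(manual, key=lambda item: cosine(query_embedding, item[0]),
                    reverse=True)
    return [text for _, text in ranked[:k]]


# A question about an overheating error embeds near the E42 snippet.
print(answer([0.85, 0.15, 0.05])[0])
```

The same nearest-neighbour ranking, run against thousands of manual pages, is what lets the assistant surface the right troubleshooting steps in real time.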
AI’s pace of change keeps teams on their toes. “[W]e don’t know what [it] is going to look like in a few months, or even a few weeks,” notes Castellani. Organizations are exploring more agentic AI workflows with greater autonomy, and recent PYMNTS Intelligence research finds that manufacturing, which sits inside the wider goods and services sector, trails other industries on adoption.
To accelerate that shift, CrateDB has partnered with Tech Mahindra to build agentic AI solutions aimed at automotive, manufacturing and smart factory environments. The tie-up covers integration work and joint efforts to move prototypes into production at scale.
Castellani says he is enthusiastic about the Model Context Protocol (MCP), a standard that seeks to normalize how applications supply context to large language models (LLMs). He compares the move to the enterprise API surge roughly 12 years ago. CrateDB’s MCP Server is still experimental but is meant to act as a bridge linking LLM tooling with an analytics database. “When we talk about MCP it’s pretty much the same approach [as APIs] but for LLMs,” he explains.
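MCP messages are JSON-RPC 2.0, which is what makes the API comparison apt: a client asks a server to run a named tool with structured arguments, much as it would call an HTTP endpoint. The sketch below builds the general shape of a `tools/call` request; the tool name `query_sql` and its arguments are hypothetical, not the CrateDB MCP Server’s actual tool surface.

```python
# Build a JSON-RPC 2.0 request of the kind an MCP client sends to a
# server exposing a database query tool (tool name here is hypothetical).
import json


def tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize a JSON-RPC 2.0 request invoking an MCP tool."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })


msg = tool_call(1, "query_sql",
                {"query": "SELECT machine_id, avg(temp_c) FROM telemetry "
                          "GROUP BY machine_id"})
print(msg)
```

Because any MCP-aware LLM client speaks this same envelope, a database behind an MCP server becomes reachable from many tools at once, the same network effect APIs created a decade ago.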
Partnerships will sit alongside continued product work. “We keep focusing on our basics,” Castellani adds. “Performance, scalability… investing into our capacity to ingest data from more and more data sources, and always minimizing the latency, both on the ingestion and query side.”

