The promise of AI remains immense, but one factor may be holding it back. "The infrastructure that powers AI today won’t sustain tomorrow’s demands," read a recent article. "CIOs must rethink how to scale smarter – not just bigger – or risk falling behind."
CrateDB has adopted that diagnosis and is pitching its database as a "unified data layer for analytics, search, and AI." Stephane Castellani, SVP of marketing, argues that most IT stacks were built around batch or asynchronous pipelines, and that the gap between producing and consuming data now has to shrink: "The challenge is that most IT systems are relying, or have been built, around batch pipeline or asynchronous pipeline, and now you need to reduce the time between the production and the consumption of the data," he explains. He says CrateDB can surface the right records at scale and process complex formats within milliseconds.
A company blog sets out a four-step route for CrateDB to act as connective tissue between operational data and AI systems: ingestion, real-time aggregation and insight, serving data into AI pipelines, and building feedback loops between models and datasets.
Those stages let operational systems feed models with fresher, varied inputs and let analysts run low-latency queries. With telemetry, logs, text and images arriving in mixed formats, a shared data layer reduces the need for custom connectors and cuts integration time. That change can shorten the cycle for retraining models and move detection of faults closer to real-time intervention on the shop floor. Demand is rising.
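The four stages can be sketched in miniature. The Python below is an illustrative mock, not CrateDB's API: the in-memory list standing in for the database, the machine names, and the 80 °C alert threshold are all invented for the example.

```python
from collections import defaultdict
from statistics import mean

# 1. Ingestion: mixed-format events land in one shared store (a list here).
store = []
def ingest(event: dict) -> None:
    store.append(event)

# 2. Real-time aggregation: summarize fresh telemetry per machine.
def aggregate() -> dict:
    temps = defaultdict(list)
    for e in store:
        if "temp_c" in e:
            temps[e["machine"]].append(e["temp_c"])
    return {m: mean(v) for m, v in temps.items()}

# 3. Serving: hand the aggregates to a (stub) predictive model.
def predict(features: dict) -> dict:
    return {m: ("alert" if t > 80 else "ok") for m, t in features.items()}

# 4. Feedback loop: write predictions back so future training sees them.
def feedback(predictions: dict) -> None:
    for m, label in predictions.items():
        ingest({"machine": m, "label": label})

ingest({"machine": "press-1", "temp_c": 85.0})
ingest({"machine": "press-1", "temp_c": 91.0})
ingest({"machine": "lathe-2", "temp_c": 60.0})
labels = predict(aggregate())
feedback(labels)
print(labels)  # → {'press-1': 'alert', 'lathe-2': 'ok'}
```

The point of the loop is the last step: once predictions are written back into the same layer they were read from, the next training run sees them without a separate export pipeline.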
Speed and format variety matter, Castellani points out, with query latency falling from minutes down to milliseconds in some deployments. In factories, machine telemetry streamed continuously feeds predictive maintenance models and raises the frequency of learning cycles.
He adds a further factory use case around knowledge assistance. If an operator sees an error and asks "I’m not an expert with this machine, what does it mean and how can I fix it?", a knowledge assistant that uses CrateDB vectors can pull the correct manual and step-by-step instructions for immediate action.
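The retrieval step behind such an assistant reduces to nearest-neighbour search over embedded manual chunks. The sketch below shows the idea with cosine similarity over hand-made three-dimensional vectors; the error codes, manual text, and embeddings are fabricated for illustration, and a real deployment would use a proper embedding model and CrateDB's vector search rather than this in-process scan.

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product over the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Toy "manual chunks" with hand-made embeddings (real systems embed with a model).
manuals = [
    ("E42: coolant pressure low — check pump seal, then restart.", [0.9, 0.1, 0.0]),
    ("E17: spindle overheat — reduce feed rate and inspect bearings.", [0.1, 0.9, 0.2]),
]

def nearest(query_vec):
    # Return the manual chunk whose embedding is closest to the query.
    return max(manuals, key=lambda m: cosine(query_vec, m[1]))[0]

# The operator's question, embedded close to the coolant entry.
print(nearest([0.85, 0.15, 0.05]))  # → the E42 coolant instructions
```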
AI moves fast and "we don’t know what [it] is going to look like in a few months, or even a few weeks", notes Castellani. Organizations are beginning to shift toward fully agentic AI workflows with more autonomy, yet recent PYMNTS Intelligence research finds manufacturing, within the broader goods and services sector, lagging. To address that gap, CrateDB has teamed with Tech Mahindra to build agentic AI solutions for automotive, manufacturing, and smart factories.
Castellani is enthusiastic about the Model Context Protocol (MCP), which standardizes how applications supply context to large language models. He compares MCP to the enterprise API trend from roughly 12 years ago and says CrateDB’s MCP Server, currently experimental, functions as a bridge between AI tooling and the analytics database. "When we talk about MCP it’s pretty much the same approach [as APIs] but for LLMs," he explains.
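MCP messages are JSON-RPC 2.0, which is what makes the API comparison apt: a tool call is just a structured request an LLM client can emit against any conforming server. The shape below is a minimal sketch; the tool name `query_sql` and its arguments are placeholders, not the actual interface of CrateDB's experimental MCP Server.

```python
import json

# A minimal MCP-style tool invocation. MCP transports JSON-RPC 2.0 messages;
# "tools/call" is the standard method for invoking a server-exposed tool.
# The tool name and SQL statement here are illustrative placeholders.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_sql",
        "arguments": {
            "statement": "SELECT machine, avg(temp_c) FROM telemetry GROUP BY machine"
        },
    },
}
print(json.dumps(request, indent=2))
```

Because the envelope is standardized, the same client can target a database-backed server today and a different tool server tomorrow, which is the interoperability bet Castellani is describing.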
The Tech Mahindra tie-up is one of several partnerships CrateDB highlights as it focuses on core performance and scale. Castellani says the company is "focusing on our basics" and then spells out priorities: "Performance, scalability… investing into our capacity to ingest data from more and more data sources, and always minimis[ing] the latency, both on the ingestion and query side."

