Real-time monitoring is an important capability for many edge cloud applications. The secret to ensuring monitoring systems can keep up with data flowing at tens of gigabits per second is not just network speed or processing power; it’s persistent storage.
Want to learn more about a practical implementation of this concept? Obinna Amalu, Senior Director of IT Operations & Infrastructure at Slalom, a business and technology consulting company, shared his insights at our booth at Google Cloud Next ’25. Read on to discover what he had to say.
Titled “Write fast, read now: The role of persistent storage in real-time monitoring,” Amalu’s case study stemmed from a real-world deployment involving Google Distributed Cloud (GDC) connected and a real-time monitoring platform for a manufacturing company. The challenge was formidable: one edge server was generating 12 gigabytes of data per second, and the cloud-based monitoring processing time was coming in at around 35 to 40 seconds—far too slow for real-time use.
To tackle this, Amalu’s team integrated Rakuten Cloud-Native Storage into the GDC connected Kubernetes stack. This edge-cloud solution was essential because, as Amalu explained, “GDC connected does not understand the concept of persistent storage. You can only use the local disk.” By introducing Rakuten Cloud software-defined storage into the environment, they dramatically reduced latency. “On initial deployment… It just simply took the previous 35 seconds and crunched it down to about nine to ten seconds,” he said.
But the most significant breakthrough came when the team enabled low-latency mode within Rakuten Cloud-Native Storage. “When we deployed the patch, it took us from nine seconds to 0.8 seconds,” Amalu said. “That was almost 85% to 90% reduction in latency.”
The backend stack for this deployment was a robust blend of Kafka, Prometheus, InfluxDB, and Grafana, which supported ingestion, visualization, and KPI tracking—all in sub-second timeframes.
Amalu said that this performance wasn’t just an engineering feat. It was a strategic rethinking of how persistent storage should be used in edge environments where latency, reliability, and data sovereignty are all top concerns. In high-precision manufacturing sectors—such as semiconductors—there is simply no room for delay. “You want to use the same platform, the same storage, and everything with all the data in it to be able to detect faults in real time,” Amalu said.
For this case study, the team used a six-node server appliance running GDC connected. Each of those nodes included storage. GDC connected is a fully managed, hybrid cloud solution that extends Google Cloud's infrastructure and services to on-premises locations and edge environments. It allows organizations to run workloads, including containerized and virtual machine applications, closer to their data sources and users, while maintaining a connection to the central Google Cloud.
The only way to get persistent storage for Kubernetes-based GDC connected is by using Rakuten Cloud-Native Storage. The team deployed Rakuten Cloud-Native Storage on three nodes, using Rakuten Cloud’s storage classes to implement a three-way replication model that ensures data durability and high availability. “So, in case node one goes down, you still have your data in node two and node three. Once you bring this back up, everything synchronizes,” Amalu said.
A critical part of the success lay in how well Rakuten Cloud-Native Storage integrated into Kubernetes. “It deploys the Kubernetes Custom Resource Definitions (CRD) for Rakuten Cloud-Native Storage. It’s a seamless integration. You just have to enable the storage classes and make it your default storage class,” Amalu explained.
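Setting a default storage class, as Amalu describes, is a standard Kubernetes operation. A minimal sketch of those two steps, assuming a replicated storage class named `rakuten-replicated` (a placeholder for illustration; the actual class name in a GDC connected deployment may differ):

```shell
# List the storage classes registered by the Rakuten Cloud-Native Storage CRDs
kubectl get storageclass

# Mark the replicated class as the cluster-wide default, so PersistentVolumeClaims
# that omit storageClassName are provisioned against it.
# NOTE: "rakuten-replicated" is a placeholder name for illustration.
kubectl patch storageclass rakuten-replicated \
  -p '{"metadata":{"annotations":{"storageclass.kubernetes.io/is-default-class":"true"}}}'
```

Once the default is set, existing monitoring workloads can request durable volumes through ordinary PersistentVolumeClaims without any application changes, which is why Amalu could describe the integration as seamless.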
While the use case originated in manufacturing, Amalu highlighted broader implications for sectors like retail, telecom, and government. He cited McDonald’s, which was part of the keynote at the show, as an example where persistent edge storage could mitigate service disruptions: “If you have an edge node that has persistent storage that can withstand intermittent data fluctuations... once connectivity comes back, it syncs all the data to the cloud.”
One of the often-overlooked benefits of edge platforms like GDC connected is air-gapping—a capability that allows sensitive environments to remain operationally independent of a public cloud service.
While sub-second response times already clear today’s bar, Amalu believes performance will continue to improve. “I look at real-time like anything less than a second. To me, it’s considered real-time. That’s the reality we have to face until something else comes out and says no, that is no longer the benchmark,” he concluded.
His team’s work with Rakuten Cloud and Google proves that sub-second monitoring is not only achievable—it’s now essential. For companies operating at the edge of innovation, the mantra is clear: write fast, read now—or fall behind.
The full session is available now to watch at: https://youtu.be/DHbElQvwQhs?si=GihXyBBVWoj0fHVV