Quick facts
- Company: Eneco
- Industry: Energy (Renewable Integration & Grid Management)
- Date: 2019 - Present
- Our Involvement: Enterprise Kafka platform deployment with self-service governance, plus ongoing operational support for mission-critical energy data streams
About Eneco
Eneco is one of the Netherlands' largest energy companies, supplying electricity and gas to over 2 million business and residential customers. The company operates renewable energy assets including wind farms and solar parks, while managing complex energy trading operations and smart grid infrastructure across multiple European markets.
Goals & context
By 2019, Eneco faced a fundamental architectural challenge. The energy transition was no longer theoretical. Renewable energy sources created unpredictable generation patterns. Smart meters generated rapidly growing data volumes. Customers expected real-time insight into their energy usage. Energy trading required millisecond-precision data for grid-balancing decisions.
Eneco's existing batch-oriented architecture couldn't support these requirements. Data moved too slowly. Systems operated in silos. The gap between operational technology (SCADA, smart meters, grid sensors) and IT systems (customer platforms, trading systems, analytics) was becoming a business constraint.
The organization needed event streaming. They understood Apache Kafka was the foundation. But here's where most energy companies stumble: they assume deploying Kafka solves the problem. It doesn't. Kafka without governance becomes another silo. Kafka without proper authorization puts critical infrastructure at risk. Kafka managed by a central team creates bottlenecks that negate the speed benefits.
Eneco needed streaming infrastructure that could handle mission-critical energy operations while enabling teams to move fast. These requirements aren't easily reconciled with DIY Kafka implementations.
Strategic approach
Hypothesis: Event streaming could accelerate energy transition initiatives, but only if it provided both operational autonomy for development teams and governance controls required for critical infrastructure.
Core Principles:
- Real-time data processing as the foundation for grid management and customer experience
- Self-service capabilities to eliminate IT bottlenecks while maintaining security standards
- Open source foundation to avoid vendor lock-in while gaining enterprise governance
- Seamless integration between operational technology and IT systems
Operating Model: Axual's platform would provide the Kafka foundation with built-in governance, allowing energy domain teams to create and manage their own data flows within defined security boundaries. Conclusion would provide the operational expertise for running streaming infrastructure in an energy environment where downtime isn't an option.
200%
Faster data processing
30%
Energy savings
1M+
Messages sent per day
Implementation: From operational control to customer experience
Eneco faced two interconnected challenges that exposed the limitations of batch processing. Their district heating networks distributed heat to thousands of buildings based on weather forecasts and historical patterns, creating energy waste during over-supply and comfort issues during under-supply. Simultaneously, customers expected real-time insight into their energy usage, but service representatives could only offer data that was up to 24 hours old. The insight: you cannot optimize heat distribution with yesterday's data, and a customer asking "why is my bill higher?" wants current information, not yesterday's snapshot. Both problems required the same solution: real-time event processing across operational and customer-facing systems.
The approach was to deploy event streaming infrastructure connecting operational technology (temperature sensors, flow meters, SCADA systems) with IT systems (customer platforms, billing, service management), processing these streams to enable both dynamic operational optimization and immediate customer data access.
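To make this concrete, the sketch below shows what the OT side of such a bridge can look like in plain Java: a Kafka producer publishing a single temperature reading. The broker address, topic name, sensor ID, and JSON payload are all illustrative assumptions; the case study does not publish Eneco's actual schemas or naming conventions.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SensorEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.internal:9092"); // hypothetical broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // acks=all: wait for all in-sync replicas, trading a little latency for durability,
        // a sensible default when the stream feeds control decisions
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Key by sensor ID so readings from one sensor stay ordered on one partition
            String sensorId = "district-7/substation-12/temp-3";           // hypothetical ID scheme
            String reading = "{\"celsius\": 71.4, \"timestamp\": \"2023-01-15T08:30:00Z\"}";
            producer.send(new ProducerRecord<>("heating.sensor.temperature", sensorId, reading),
                    (metadata, e) -> {
                        if (e != null) {
                            e.printStackTrace(); // in production: route to monitoring/alerting
                        }
                    });
        } // try-with-resources flushes and closes the producer
    }
}
```

Keying by sensor ID is the standard way to preserve per-sensor ordering while still spreading load across partitions.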
The heating operations team used Axual's self-service environment to define topics for sensor data streams, set retention policies appropriate for operational data, and establish access controls ensuring only authorized systems could modify heating parameters. Integration with existing SCADA systems happened through Kafka Connect, eliminating custom integration code. Customer platform teams created parallel event streams capturing metering events, billing calculations, and service changes, feeding real-time dashboards and customer service tools. The self-service model meant both teams could iterate independently without infrastructure bottlenecks.
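In Axual's platform those steps happen through the self-service interface, but underneath they map onto standard Kafka administration. A minimal sketch with Kafka's Admin API follows, assuming hypothetical topic names, a seven-day retention window, and a single authorized principal; none of these values are published in the case study.

```java
import java.util.List;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.config.TopicConfig;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;

public class TopicProvisioning {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.internal:9092"); // hypothetical broker

        try (Admin admin = Admin.create(props)) {
            // Topic with a 7-day retention window -- a plausible policy for
            // operational sensor data; the real retention values are not published
            NewTopic topic = new NewTopic("heating.sensor.temperature", 12, (short) 3)
                    .configs(Map.of(TopicConfig.RETENTION_MS_CONFIG,
                            String.valueOf(7L * 24 * 60 * 60 * 1000)));
            admin.createTopics(List.of(topic)).all().get();

            // Only the heating-control principal may write heating parameters;
            // with an allow-list authorizer, every other principal is denied by default
            AclBinding writeAcl = new AclBinding(
                    new ResourcePattern(ResourceType.TOPIC, "heating.control.parameters", PatternType.LITERAL),
                    new AccessControlEntry("User:heating-control-service", "*",
                            AclOperation.WRITE, AclPermissionType.ALLOW));
            admin.createAcls(List.of(writeAcl)).all().get();
        }
    }
}
```

The governance layer's job is to put operations like these behind approval workflows and naming conventions, so domain teams get self-service without bypassing security.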
The platform processes sensor data and customer events with sub-second latency. Control systems receive updated parameters before conditions change significantly. Customer queries are answered from data less than one minute old. The system handles over 1 million messages per day with consistent performance during peak periods (cold weather, price changes), and it tolerates component failures gracefully: data is replicated across brokers with automated failover.
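As an illustration of the real-time aggregation behind those customer-facing figures, a Kafka Streams topology along the following lines could roll metering events up per customer per minute. It counts events rather than summing kWh, since the actual aggregation logic is not published, and every topic and application name here is a placeholder.

```java
import java.time.Duration;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.TimeWindows;

public class UsageAggregator {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "customer-usage-aggregator"); // hypothetical
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.internal:9092");    // hypothetical

        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("customer.metering.events", Consumed.with(Serdes.String(), Serdes.String()))
                // Events are keyed by customer ID, so grouping by key gives a per-customer stream
                .groupByKey()
                // One-minute tumbling windows keep dashboard figures under a minute old
                .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(1)))
                .count()
                .toStream((windowedKey, count) -> windowedKey.key()) // drop the window, keep the customer ID
                .to("customer.usage.per.minute", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Because the aggregate lives in a replicated, fault-tolerant state store, a customer-service dashboard reading the output topic keeps working through individual broker failures.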
Operational foundation: Why standard approaches failed
Most energy companies attempting similar transformations choose either DIY Kafka (which leads to operational complexity, topic sprawl, and security inconsistencies within six months) or managed cloud services (which fail on data residency requirements, latency for real-time control systems, and unpredictable costs). Eneco needed a third option: enterprise governance and operational simplicity on infrastructure they control. The partnership between Axual and Conclusion provided exactly that, with operational expertise specific to energy environments where system failures affect grid stability, not just revenue.
Results
- 200% faster data processing compared to legacy batch systems
- 30% energy savings in district heating operations through real-time optimization
- 1M+ messages per day processed with consistent sub-second latency
- Zero unplanned downtime affecting energy operations since deployment
- Self-service adoption: Multiple business units now manage their own data flows independently
"By having access to this real-time data, Eneco can use its data flexibly to help the organization and its customers even better."
Iem Smid | Project Manager, Program and Portfolio Management, Eneco
What this actually means
Eneco built event streaming infrastructure that handles mission-critical operations while enabling organizational agility. The real achievement isn't the technology deployment but the operating model: domain teams move fast within governance boundaries. Real-time data processing enabled decisions that were previously impossible. You cannot optimize heat distribution with yesterday's data.
For energy companies facing renewable integration, grid complexity, and rising customer expectations, the challenge isn't adopting event streaming. It's doing so in a way that serves both business agility and operational reliability.
Next step
If your organization faces similar challenges with real-time data processing in critical infrastructure, discuss your specific architecture requirements with our technical team. We focus on understanding your environment before proposing solutions.
{{tenbtn}}



