July 26, 2024

TenneT Kafka for grid monitoring | use case

TenneT, a leading European electricity Transmission System Operator, uses Axual's Kafka platform for real-time data streaming. This helps them monitor the grid more closely and make faster operational decisions. Axual's platform makes it easier for TenneT to handle large volumes of data, modernize their technology, and work more efficiently.


TenneT is a leading European electricity Transmission System Operator (TSO) in the Energy market with activities in the Netherlands and Germany. With around 21,000 kilometers of high-voltage lines, TenneT ensures a reliable and uninterrupted supply of electricity in their high-voltage grid, connecting the 41 million end-users in the markets they serve.

TenneT’s first and most prominent responsibility is ensuring power delivery to all. This guarantee can only be met if the grid is continuously monitored and balanced. TenneT’s assets generate vast amounts of data through the sensors and systems spread across the high-voltage network. TenneT uses that information for many purposes, from maintaining grid stability to performing advanced analytics, and from monitoring power quality to detecting potential issues for real-time decision support. Because their initial setup was batch-driven, TenneT wasn’t able to respond to changes in real time.

To meet these challenges, TenneT wanted to build real-time situational awareness together with AI-supported analytics and control capabilities. TenneT sees Data Platforms and Digitalization as the pillars to build these capabilities upon. The event streaming paradigm is foundational to architectures that can handle large volumes of data arriving at high speed, processed continuously, and acted upon in real time.
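To make the contrast with the old batch-driven setup concrete, the sketch below shows what publishing a single grid measurement as an event could look like using the standard Apache Kafka Java client. The broker address, topic name, and message layout are illustrative assumptions for this article, not TenneT's actual configuration.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class GridMeasurementProducer {
    public static void main(String[] args) {
        // Illustrative connection settings; a real deployment would use the
        // platform's secured bootstrap endpoint and credentials.
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.example.internal:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical substation reading, keyed by asset ID so that all
            // events for one asset land on the same partition, in order.
            String assetId = "substation-042";
            String measurement = "{\"voltageKv\": 380.2, \"timestamp\": \"2024-07-26T10:15:00Z\"}";
            producer.send(new ProducerRecord<>("grid-measurements", assetId, measurement));
        }
    }
}
```

Because each measurement is published the moment it is taken, downstream consumers can react to it continuously instead of waiting for the next batch run.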

To acquire such a streaming platform, TenneT initiated a European tender process in 2021. Beyond the technology itself, TenneT defined a number of enterprise architecture goals the platform should contribute to:

Data as a Product: Make data easily accessible and reusable, synchronize data stored in multiple enterprise information systems, and create a data platform that connects multiple sources into new data products.

Enable Advanced Analytics and Real-Time Decision Support: Shift from batch processing to an architecture that enables AI and real-time decision support.

Event Processing Architecture: Build systems that are loosely coupled, scalable, extensible, observable, and resilient, centered around the concept of business events.

Cloud Transformation: TenneT aims to transition from classic monolithic applications to modern cloud-native application architectures.

DevOps and Agile Transformation: Organize business-facing DevOps teams into independent cross-functional teams and offload generic platform capabilities to platform teams.

High Availability, Resilience, and Security: TenneT wants to ensure business continuity with a strategy allowing active-active deployment of components across data centers.

Infrastructure and DevOps Automation: Deploy infrastructure and applications via infrastructure and DevOps automation using a Continuous Delivery process and GitOps way-of-working.

Open-Source Strategy: Consume and contribute to open-source software, enabling shared innovation and open-source development.

Axual’s bid came out as the most economically advantageous tender (Dutch: EMVI). Axual’s engineers took responsibility for designing and installing the platform in TenneT’s data centers.

Using the Axual Platform, TenneT set up a streaming platform that keeps the grid controllable and scalable, makes bottlenecks easier to find, and keeps both development and maintenance costs manageable. Axual provides exactly that: a reliable and scalable solution around Kafka that doesn’t require deep technical knowledge, so teams can focus on building solutions. The platform lets teams assign roles and visualize data flows, so real-time data is efficiently collected, processed, and delivered to the teams that need it for their use case.

To help TenneT really get going with Kafka, Axual supported the implementation and provided training to the teams to speed up their adoption path. One of our integration specialists helped set up the infrastructure and connections to make sure that data produced by the grid monitoring systems would be available to the teams requesting it.

This implementation of the Axual Platform has resulted in an ongoing collaboration between TenneT and Axual. TenneT has scaled their proof of concept to an organization-wide adoption of the Axual platform.

Axual offers an out-of-the-box Kafka solution. Our platform combines the real-time data streaming capabilities of Strimzi, an open-source framework for running Kafka on Kubernetes, with a user-centric interface that gives large organizations enterprise features such as access control, data governance, and visual insight into where data is flowing. The goal of our platform is to take the complexity out of Kafka so organizations such as TenneT can focus on building out their use cases instead of maintaining Apache Kafka.
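As a rough illustration of what focusing on the use case means in practice, the sketch below shows a minimal client application a team might write once a topic and credentials have been provisioned for them. The bootstrap address, consumer group, and topic name are assumptions made up for this example; a governed platform would supply its own secured endpoints and certificates.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class GridMeasurementConsumer {
    public static void main(String[] args) {
        // Illustrative settings; in a governed cluster the endpoint, TLS
        // material, and consumer group would come from the self-service setup.
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.example.internal:9093");
        props.put("group.id", "grid-analytics-team");
        props.put("security.protocol", "SSL"); // mutual TLS is typical for enterprise clusters
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("grid-measurements"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Hand the measurement over to the team's own processing logic.
                    System.out.printf("asset=%s payload=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```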


Answers to your questions about Axual’s All-in-one Kafka Platform

Are you curious about our All-in-one Kafka platform? Dive into our FAQs for all the details you need, and find the answers to your burning questions.

What is a grid monitoring system?

Grid monitoring systems are essential for ensuring the reliable operation of electricity distribution networks. These advanced systems continuously track the flow of electricity, allowing utilities to quickly identify and resolve potential issues such as equipment failures, grid overloads, and voltage fluctuations. By leveraging real-time data and analytics, grid monitoring systems enhance operational efficiency, improve response times, and ultimately ensure a stable and reliable power supply for consumers. With the increasing complexity of modern energy grids, the importance of effective grid monitoring cannot be overstated.
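As a minimal sketch of the kind of rule such a system evaluates, the example below flags voltage readings that drift outside an assumed tolerance band. The nominal voltage and tolerance are illustrative values only; real operating limits are defined by the grid operator.

```java
/** Flags voltage readings that drift outside an assumed tolerance band. */
public class VoltageFluctuationCheck {

    // Illustrative limits for a 380 kV transmission line; actual operating
    // bands are set by the grid operator, not by this example.
    private static final double NOMINAL_KV = 380.0;
    private static final double TOLERANCE_KV = 38.0; // +/- 10 percent

    static boolean isFluctuation(double measuredKv) {
        return Math.abs(measuredKv - NOMINAL_KV) > TOLERANCE_KV;
    }

    public static void main(String[] args) {
        double[] readings = {379.8, 381.1, 352.4, 420.9}; // sample values
        for (double kv : readings) {
            System.out.printf("%.1f kV -> %s%n", kv, isFluctuation(kv) ? "ALERT" : "ok");
        }
    }
}
```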

What is grid management?

Grid management is the optimization of electricity generation, transmission, distribution, and usage for all stakeholders involved in the electrical grid. An electrical grid is a complex network that facilitates electricity generation, distribution, storage, and control. Effective grid management ensures a reliable and efficient flow of electricity, balancing supply and demand while enhancing overall grid performance. By integrating advanced technologies and data analytics, grid management systems can improve energy efficiency, reduce outages, and support the integration of renewable energy sources. This approach is essential for maintaining a sustainable and resilient energy infrastructure that meets the needs of consumers and businesses alike.

How is TenneT leveraging real-time data for grid management?

TenneT collects vast amounts of data from sensors and systems across its high-voltage network to maintain grid stability and monitor power quality. By transitioning from batch-driven processes to a real-time event streaming architecture with the help of the Axual platform, TenneT can now achieve real-time situational awareness, advanced analytics, and AI-supported decision-making, enabling them to respond promptly to changes and optimize their operations.

Rachel van Egmond
Senior content lead
