Sustainability is an essential topic in the energy sector. Eneco’s mission is “Sustainable energy for everyone”, and the company wants to help its customers take the next step in their energy transition. This requires knowledge in two areas: the market and the customer. That knowledge comes from data. The volume of data grows every year, and it must be available and processed as quickly as possible. Because real-time data plays an increasingly important role in the energy transition, Eneco decided in 2019 to set up an Event Streaming Platform. The project was awarded to the partnership of Conclusion Mission Critical and Axual.
Answers to your questions about Axual’s All-in-one Kafka Platform
Are you curious about our All-in-one Kafka platform? Dive into our FAQs for all the details you need, and find the answers to your burning questions.
What is an event streaming platform?
An event streaming platform is a system that allows for the real-time processing and management of event data as it flows through a network. It enables organizations to capture, store, process, and analyze streams of events or data changes, facilitating the creation of data-driven applications and enabling real-time analytics.
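For readers who like to see the idea in code: below is a minimal sketch of publishing a single event to a Kafka topic with the Java producer API. The broker address, topic name, and meter-reading payload are illustrative assumptions, not details from Eneco’s platform.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventProducer {
    public static void main(String[] args) {
        // Assumed broker address; point this at your own cluster.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // One record = one event. The key (a hypothetical meter id) groups
            // related events; the value carries the payload.
            producer.send(new ProducerRecord<>("meter-readings", "meter-42", "{\"kwh\":1.37}"));
            producer.flush();
        }
    }
}
```

Consumers subscribe to the same topic and process events as they arrive, which is what enables the real-time analytics described above.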
Who is Eneco?
Eneco is a Dutch energy company focused on the supply and production of sustainable energy. Founded in 1995 and headquartered in Rotterdam, it operates primarily in the Netherlands and Belgium, with a strong commitment to renewable energy sources. Eneco supplies electricity, natural gas, and heat to residential and business customers, and aims to help them make the transition to renewable energy.
What is the Energy Transition?
The Energy Transition refers to the global shift from fossil fuel-based energy systems to more sustainable, low-carbon energy sources, particularly renewable energy. This transition is driven by the need to address climate change, reduce greenhouse gas emissions, and create a more resilient and sustainable energy infrastructure.
Related blogs
Apache Kafka has become a central component of modern data architectures, enabling real-time data streaming and integration across distributed systems. Within Kafka’s ecosystem, Kafka Connect plays a crucial role as a powerful framework designed for seamlessly moving data between Kafka and external systems. Kafka Connect provides a standardized, scalable approach to data integration, removing the need for complex custom scripts or applications. For architects, product owners, and senior engineers, Kafka Connect is essential to understand because it simplifies data pipelines and supports low-latency, fault-tolerant data flow across platforms. But what exactly is Kafka Connect, and how can it benefit your architecture?
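As a rough sketch of how little code data integration takes with Kafka Connect: connectors are registered declaratively through the Connect worker’s REST API (port 8083 by default). The example below uses the FileStreamSource connector that ships with Apache Kafka; the worker URL, connector name, file path, and topic are placeholder assumptions.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterConnector {
    public static void main(String[] args) throws Exception {
        // Connector definition: tail a file and publish each new line as a
        // record on the "file-events" topic. All values here are examples.
        String body = """
            {
              "name": "file-source-demo",
              "config": {
                "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
                "tasks.max": "1",
                "file": "/tmp/events.txt",
                "topic": "file-events"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8083/connectors"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```

Once the worker accepts the definition, it runs the connector’s tasks and manages offsets, retries, and scaling: exactly the plumbing that custom integration scripts would otherwise have to reimplement.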
Apache Kafka is a powerful platform for handling real-time data streaming, often used in systems that follow the Publish-Subscribe (Pub-Sub) model. In Pub-Sub, producers send messages (data) that consumers receive, enabling asynchronous communication between services. Kafka’s Pub-Sub model is designed for high throughput, reliability, and scalability, making it a preferred choice for applications needing to process massive volumes of data efficiently. Central to this functionality are topics and partitions—essential elements that organize and distribute messages across Kafka. But what exactly are topics and partitions, and why are they so important?
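A minimal sketch of the topic side of this, using Kafka’s Java AdminClient: it creates a topic with several partitions. The broker address, topic name, partition count, and replication factor are example assumptions.

```java
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address

        try (AdminClient admin = AdminClient.create(props)) {
            // Six partitions spread load across brokers and consumers; a
            // replication factor of 3 keeps copies on three brokers for
            // fault tolerance. Records with the same key always hash to the
            // same partition, so per-key ordering is preserved.
            NewTopic topic = new NewTopic("meter-readings", 6, (short) 3);
            admin.createTopics(List.of(topic)).all().get();
        }
    }
}
```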
Strimzi Kafka offers an efficient solution for deploying and managing Apache Kafka on Kubernetes, making it easier to handle Kafka clusters within a Kubernetes environment. In this article, we'll guide you through opening a shell on a Kafka broker pod in Kubernetes and listing all the topics in your Kafka cluster using an SSL-based connection.
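That article takes the CLI route from inside the broker pod; as a hedged alternative sketch, the same topic listing can be done from any machine that can reach the cluster, using the Java AdminClient over SSL. The bootstrap address (modelled on Strimzi’s <cluster>-kafka-bootstrap service naming), truststore path, and password below are placeholders.

```java
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;

public class ListTopicsSsl {
    public static void main(String[] args) throws Exception {
        // These SSL settings mirror what a CLI --command-config file would
        // contain; every hostname, path, and password here is a placeholder.
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "my-cluster-kafka-bootstrap:9093");
        props.put("security.protocol", "SSL");
        props.put("ssl.truststore.type", "PKCS12");
        props.put("ssl.truststore.location", "/opt/kafka/truststore.p12");
        props.put("ssl.truststore.password", "changeit");

        try (AdminClient admin = AdminClient.create(props)) {
            admin.listTopics().names().get().forEach(System.out::println);
        }
    }
}
```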