Resources Hub
Your one-stop destination for all the resources you need to kickstart your Kafka journey.

On-demand webinars
Watch and learn more about Axual

Hear about PTSB's journey, from their successful mainframe offloading proof of concept to launching a full payments engine in just 7 months.

The power of real-time data
Join us as we dive into how PostNL leverages Apache Kafka to track millions of parcels in real time, powering notifications and ensuring seamless customer experiences. In this session, we'll break down the architecture, key strategies, and practical takeaways from PostNL's journey with event streaming.
Key takeaways:
Unlock business value with real-time data: we discuss different data integration patterns and explore how real-time data can accelerate business growth.
How PostNL scaled Kafka through self-service, governance and security: discover how PostNL created a scalable event-driven architecture to track millions of parcels in real time.
See Axual's platform in action: experience a live demo of Axual's event streaming platform, showcasing practical solutions and best practices.
Whitepapers
Explore our whitepapers and product collateral

Learn how to scale Kafka across the enterprise
Customer Case Studies and Use Cases
Read use cases and success stories from our customers

Read how PTSB launched real-time payments on a governance-first Kafka platform, cutting latency from 10 seconds to under 1 second and setting the foundation for enterprise-wide event streaming.

Learn how Axual powers real-time energy trading with 99.9% uptime, sub-second latency, unified data streaming, reduced operational complexity, and scalable support for increasing renewable energy data volumes.

See how Enexis uses Axual to combine Strimzi's open-source framework with enterprise features to transform smart metering with real-time data streaming.

Learn how TenneT moved from batch to real-time processing, gained continuous grid visibility, and scaled secure, governed monitoring across its network.

Learn how Logius modernized Dutch government communications with a governed, on-prem Kafka platform delivering real-time messaging, data sovereignty, and full auditability.

Learn how PostNL modernized its parcel tracking by implementing a centralized Kafka platform that processes over 100 million messages per day.

Learn how Axual supports 17,000 healthcare organizations across Norway with secure event streaming, sub-100ms clinical latency, full audit coverage, and 99.999% reliability.

Learn how ASN Bank freed 3 DevOps teams, enabled 50+ dev teams, and replaced fragmented systems with one centralized multi-AZ Kafka platform.

Read how Rabobank became an event-driven bank and scaled Kafka across 300+ DevOps teams.

Learn how Eneco uses Axual’s Enterprise Kafka platform with self-service governance and ongoing operational support for its mission-critical energy data streams.
Tutorials
Watch and learn more about all things Kafka

In this video, you'll learn how to efficiently get data in and out of a streaming platform. We’ll cover key techniques for ingesting data from various sources, processing it in real-time, and exporting the processed results to external systems. Whether you're dealing with Apache Kafka, AWS Kinesis, or another platform, you'll discover essential strategies for managing data streams seamlessly and integrating them into your workflows.
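To give a flavour of what "data in, data out" looks like in practice, here is a minimal sketch using the plain Apache Kafka Java client: one producer writing an event and one consumer reading it back. The broker address localhost:9092, the topic name parcel-events, and the group ID notification-service are illustrative assumptions, not part of the video.

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class ParcelStreamSketch {

    public static void main(String[] args) {
        // Data in: produce a single parcel event to the (hypothetical) parcel-events topic.
        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>("parcel-events", "parcel-123", "{\"status\":\"SORTED\"}"));
            producer.flush();
        }

        // Data out: consume the event and hand it off to a downstream system (here, stdout).
        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "notification-service");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(Collections.singletonList("parcel-events"));
            consumer.poll(Duration.ofSeconds(5))
                    .forEach(record -> System.out.printf("%s -> %s%n", record.key(), record.value()));
        }
    }
}
```

In a production setup, the same pattern typically runs behind connectors or stream-processing jobs rather than hand-rolled loops, but the produce/consume building blocks remain the same.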
Video hub
Watch and learn more about all things Kafka

In this video, Erik Lindeman explains Alliander's road to the effective use of Kafka for all its users, and where to go from here. Alliander, a group of companies active in the utilities sector, has been using Kafka for about four years. It started with an on-premises SaaS application and no dedicated team to manage the problems, questions, wishes, and updates that arose. After several years, demand for connections to Kafka had grown to the point that a new solution was needed. That's why, two years ago, a migration to a new platform was started, supported by a new team dedicated solely to Kafka. Erik speaks about Alliander's journey and how the team shifted from setting up the platform and migrating to improving the "customer experience" for all users and all possible use cases.

We were thrilled to bring the Kafka enthusiast community together again, and it was a fantastic event! On October 8th, we hosted an incredible lineup of speakers covering the latest on the ZooKeeper-to-KRaft migration. In this video, Jakub Scholz explains the latest ins and outs.

In this video, Daniël adds practice to the theory and performs a live migration of a Strimzi setup from ZooKeeper to KRaft, without downtime for a Kafka client connected to the cluster. You will learn about the steps required, the necessary configuration, and some of the observability aspects of such a migration, and you will come away with a better understanding of how simple the migration can actually be. Perhaps, after watching this talk, you'll feel confident enough to do the migration yourself.