Resources Hub
Your one-stop destination for all the resources you need to kickstart your Kafka journey.
On-demand webinars
Watch and learn more about Axual
About the event: The power of real-time data

Join us as we dive into how PostNL leverages Apache Kafka to track millions of parcels in real time, powering notifications and ensuring seamless customer experiences. In this session, we'll break down the architecture, key strategies, and practical takeaways from PostNL's journey with event streaming.

Key takeaways:
- Unlock business value with real-time data: we will discuss different data integration patterns and explore how real-time data can accelerate business growth.
- How PostNL scaled Kafka through self-service, governance, and security: discover how PostNL created a scalable event-driven architecture to track millions of parcels in real time.
- See Axual's platform in action: experience a live demo of Axual's event streaming platform, showcasing practical solutions and best practices.
Whitepapers
Explore our whitepapers and product collateral
Learn how to scale Kafka across the enterprise
Customer Case Studies and Use Cases
Read use cases and success stories from our customers
Becoming data-driven and customer-centric in a highly regulated environment
Read how Rabobank became an event-driven bank and scaled Kafka across 150+ DevOps teams
Read how Eneco is using Axual’s event streaming platform as the next step in their energy transition
Tutorials
Watch and learn more about all things Kafka
In this video, you'll learn how to efficiently get data in and out of a streaming platform. We’ll cover key techniques for ingesting data from various sources, processing it in real-time, and exporting the processed results to external systems. Whether you're dealing with Apache Kafka, AWS Kinesis, or another platform, you'll discover essential strategies for managing data streams seamlessly and integrating them into your workflows.
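The ingest-process-export flow described above can be sketched in a few lines of Python. This is a minimal sketch, not the tutorial's actual code: it assumes the confluent-kafka client library, and the broker address and topic name are hypothetical placeholders.

```python
import json

# Hypothetical broker address and topic name -- adjust for your environment.
BOOTSTRAP_SERVERS = "localhost:9092"
TOPIC = "parcel-events"


def encode_event(event: dict) -> bytes:
    """Serialize an event to JSON bytes before producing it."""
    return json.dumps(event).encode("utf-8")


def decode_event(raw: bytes) -> dict:
    """Deserialize JSON bytes received from a consumer."""
    return json.loads(raw.decode("utf-8"))


def run(bootstrap_servers: str = BOOTSTRAP_SERVERS) -> None:
    """Produce one event and read it back -- requires a reachable broker."""
    # Requires the confluent-kafka package (pip install confluent-kafka).
    from confluent_kafka import Producer, Consumer

    # Ingest: write an event into the stream.
    producer = Producer({"bootstrap.servers": bootstrap_servers})
    producer.produce(TOPIC, encode_event({"parcel_id": "1Z999", "status": "sorted"}))
    producer.flush()

    # Export: read the event back out for downstream systems.
    consumer = Consumer({
        "bootstrap.servers": bootstrap_servers,
        "group.id": "demo-group",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe([TOPIC])
    msg = consumer.poll(timeout=10.0)
    if msg is not None and msg.error() is None:
        print(decode_event(msg.value()))
    consumer.close()

# run() would be invoked from your application entry point once a broker is available.
```

The same produce/encode and consume/decode pattern applies whether the platform is Apache Kafka, AWS Kinesis, or another streaming system; only the client library changes.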
In this video, I'll show you how to request access to a stream in Axual, and how to approve or deny that request.
In this third episode, Abhinav shows us how to produce and consume data with a data streaming application on the Axual Platform.
Video hub
Watch and learn more about all things Kafka
🎉 The first Kafka Meetup 2024 Utrecht was a blast; we can't wait to share the full experience with you! On October 8th, we brought together a fantastic lineup of Kafka experts, including Jakub Scholz from Red Hat, who gave an insightful talk on the much-anticipated change in Kafka: KRaft. We also had fantastic demos and discussions with Daniel Mulder (Axual) and Erik Lindeman (Alliander). This teaser is just a taste of what's to come. Stay tuned for the entire event video, which is coming soon!
The rise of Apache Kafka as the de-facto standard for event streaming has coincided with the rise of Kubernetes for cloud-native applications. While Kubernetes is a great choice for any distributed system, that doesn't mean it is easy to deploy and maintain a Kafka cluster running on it.
At Axual, we love helping DevOps teams apply Kafka topic security more easily. That's why we are proud to bring SASL support to our Kafka event streaming platform. SASL support is implemented at multiple levels of the platform, from the Kafka broker to the Self-Service UI. As soon as platform operators or tenant admins have enabled SASL support on the broker, they can configure it as an available authentication method in the Self-Service UI. Then they can gradually roll out support per instance by marking SASL as an enabled authentication method.
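Once SASL is enabled on the broker, a client needs matching connection settings. The following is a minimal sketch of what a client-side SASL configuration might look like for the confluent-kafka Python client; the endpoint, credentials, and SCRAM mechanism below are hypothetical examples, not values from the Axual platform itself.

```python
# A minimal sketch of a client-side SASL configuration for the
# confluent-kafka Python client. The broker address, credentials, and
# mechanism here are hypothetical -- use the values your platform
# operator exposes in the Self-Service UI.
def sasl_client_config(username: str, password: str) -> dict:
    return {
        "bootstrap.servers": "kafka.example.com:9093",  # hypothetical endpoint
        "security.protocol": "SASL_SSL",                # SASL over TLS
        "sasl.mechanism": "SCRAM-SHA-512",              # one common SASL mechanism
        "sasl.username": username,
        "sasl.password": password,
    }


config = sasl_client_config("demo-user", "demo-secret")
```

The resulting dictionary can be passed directly to a `Producer` or `Consumer` constructor, so the same credentials work for both sides of a stream.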