On-demand Webinars

Your one-stop destination for all the resources you need to kickstart your Kafka journey.

View all our Kafka webinars on-demand

Webinar - How PostNL scaled to 100M+ messages daily with Kafka - a webinar by Appythings & Axual

About the event: The power of real-time data

Join us as we dive into how PostNL leverages Apache Kafka to track millions of parcels in real time, powering notifications and ensuring seamless customer experiences. In this session, we'll break down the architecture, key strategies, and practical takeaways from PostNL's journey with event streaming.

Key takeaways:

- Unlock business value with real-time data: we will discuss different data integration patterns and explore how real-time data can accelerate business growth.
- How PostNL scaled Kafka through self-service, governance, and security: discover how PostNL created a scalable event-driven architecture to track millions of parcels in real time.
- See Axual's platform in action: experience a live demo of Axual's event streaming platform, showcasing practical solutions and best practices.
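To make the integration pattern concrete before you watch, here is a minimal sketch (not taken from the webinar) of the kind of Kafka consumer that could power parcel notifications, written with the plain Apache Kafka Java client. The topic name parcel-events, the consumer group, and the bootstrap address are illustrative assumptions, not details from PostNL's setup.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ParcelEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Illustrative connection settings; replace with your cluster's address.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "parcel-notifications");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // "parcel-events" is a hypothetical topic name for this sketch.
            consumer.subscribe(List.of("parcel-events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Key = parcel id, value = scan event; a real service would
                    // trigger a customer notification here.
                    System.out.printf("parcel %s: %s%n", record.key(), record.value());
                }
            }
        }
    }
}
```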

Webinar part 3 - See Axual’s platform in action - Mendelt Luinenburg, DevOps at Axual

See Axual’s platform in action: experience a live demo of Axual's event streaming platform, showcasing practical solutions and best practices.

Webinar part 2 - How PostNL scaled to 100M+ messages daily with Kafka - Guy Hagemans, Appythings

Discover how PostNL created a scalable event-driven architecture to track millions of parcels in real time.

Webinar part 1 - Unlock Business Value with Real-Time Data - Jeroen van Disseldorp, CEO of Axual

Unlock business value with real-time data: we will discuss different data integration patterns and explore how real-time data can accelerate business growth.

Apache Kafka | Scaling Across 200+ DevOps Teams - Self-Service & Data Governance

Event streaming empowers companies to serve their customers faster and better and to accelerate their business processes, so it is no surprise that streaming engines such as Kafka have gained popularity. In this talk, we will go through the five stages of event-driven architecture and share insights from our journey scaling Kafka at Rabobank. How can you make the adoption of Kafka a success in your organization? And how can you scale data governance and security to stay in control?
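The talk focuses on organizational patterns rather than code, but to give a flavour of what programmatic governance can look like, here is a hedged sketch that uses Kafka's Admin API to grant one team read access to one topic. The principal User:team-payments and the topic name payments are hypothetical examples, not details from the Rabobank setup.

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;

public class GrantTopicRead {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Illustrative address; a governed cluster would also need auth settings.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (Admin admin = Admin.create(props)) {
            // Hypothetical grant: allow team-payments to read the "payments" topic.
            AclBinding binding = new AclBinding(
                new ResourcePattern(ResourceType.TOPIC, "payments", PatternType.LITERAL),
                new AccessControlEntry("User:team-payments", "*",
                    AclOperation.READ, AclPermissionType.ALLOW));
            admin.createAcls(List.of(binding)).all().get();
        }
    }
}
```

Self-service platforms typically hide this behind an approval workflow, so topic owners decide who reads their data rather than cluster administrators.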

Unlock the power of legacy systems with Apache Kafka

Legacy systems are the backbone of many organizations, encapsulating years of critical data. Despite the emergence of modern technologies, these systems remain indispensable, and migrating away from them is often not an option due to the complexity involved. Instead, the focus should be on unlocking the value trapped within these systems. That’s where Apache Kafka comes in.

Join our webinar to learn:

- Data unlocking techniques: discover how to break down data silos and make legacy system data accessible to your teams without disrupting existing operations. We'll cover technologies like Apache Kafka and how they can act as a central hub for data streams.
- Implementing data governance and security: once data is accessible, maintaining its integrity and security is crucial. Learn best practices for data governance and security to ensure that your data remains compliant.
- Integrating data back into legacy systems: understand the importance of integrating data changes back into legacy systems. We’ll explore how to set up topics and special integrations to process and apply requested changes efficiently.

Timestamps:

00:00 Introduction
01:25 How IT landscapes build up over time
02:52 Who uses legacy systems?
05:46 Why are legacy systems still used?
08:53 When do legacy systems become troublesome?
09:47 When do systems become legacy?
13:10 The risk of keeping legacy systems and applications
16:00 Strategies for modernizing legacy systems
18:52 The two mechanisms to encapsulate legacy systems
19:17 Events vs APIs
24:48 When to prefer APIs, and when to prefer events
26:42 Examples of legacy system API calls
27:44 Examples of legacy system events
28:22 How to emit events from legacy systems
29:17 Expose your business events in 7 steps
35:05 Out-of-the-box connectors supported by Axual
35:25 KSML
36:22 How Axual allows organizations to scale around Kafka
37:05 Reference case: Eneco
38:33 Reference case: Permanent TSB
39:19 Reference case: PostNL
40:28 Looking to modernize your IT landscape?
41:59 Question 1: Can you build this with open-source Kafka?
42:10 Question 2: Can Kafka improve the performance of my legacy systems?
44:02 Question 3: The Axual platform is Strimzi-based; have you worked with environments that were already running open-source Strimzi?
45:57 Question 4: Who decides who gets access to which data?
49:03 Question 5: How long, on average, does it take to get the first data into our systems?
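The recording covers the "expose your business events in 7 steps" approach at the architectural level. As a minimal, hypothetical sketch of the end result, here is a small Java producer that publishes a business event extracted from a legacy system onto a Kafka topic; the topic name legacy.business-events, the key, and the JSON payload are illustrative assumptions, not examples from the webinar.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class LegacyEventEmitter {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // acks=all: wait for the full in-sync replica set before confirming,
        // so a legacy change event is not silently lost.
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical business event read from the legacy system, keyed by
            // entity id so all events for one customer stay in order.
            String key = "customer-42";
            String value = "{\"event\":\"AddressChanged\",\"customerId\":\"customer-42\"}";
            producer.send(new ProducerRecord<>("legacy.business-events", key, value),
                (metadata, exception) -> {
                    if (exception != null) {
                        exception.printStackTrace();
                    } else {
                        System.out.printf("published to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                    }
                });
            producer.flush();
        }
    }
}
```

In practice, the extraction side is often handled by an off-the-shelf change-data-capture connector rather than hand-written polling code; the webinar discusses the connectors Axual supports out of the box.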