On-demand Webinars

Your one-stop destination for all the resources you need to kickstart your Kafka journey.

View all our Kafka webinars on-demand

Apache Kafka | Scaling Across 200+ DevOps Teams - Self-Service & Data Governance

Event streaming empowers companies to serve their customers faster and better and to accelerate their business processes, so it is no surprise that streaming engines such as Kafka have gained popularity. In this talk, we go through the 5 stages of event-driven architecture and share insights from our journey scaling Kafka at Rabobank. How can you make the adoption of Kafka a success in your organization? And how can you scale data governance and security to stay in control?

Unlock the power of legacy systems with Apache Kafka

Legacy systems are the backbone of many organizations, encapsulating years of critical data. Despite the emergence of modern technologies, these systems remain indispensable, and migrating away from them is often not an option due to their complexity. Instead, the focus should be on unlocking the value trapped within these systems. That's where Apache Kafka comes in. Join our webinar to learn:

- Data unlocking techniques: discover how to break down data silos and make legacy system data accessible to your teams without disrupting existing operations. We'll cover technologies like Apache Kafka and how they can act as a central hub for data streams.
- Implementing data governance and security: once data is accessible, maintaining its integrity and security is crucial. Learn best practices for data governance and security to ensure that your data remains compliant.
- Integrating data back into legacy systems: understand the importance of integrating data changes back into legacy systems. We'll explore how to set up topics and special integrations to process and apply requested changes efficiently.

00:00 Introduction
01:25 How IT landscapes build up over time
02:52 Who uses legacy systems?
05:46 Why are legacy systems still used?
08:53 When are legacy systems becoming troublesome?
09:47 When do systems become legacy?
13:10 The risk of keeping legacy systems and applications
16:00 Strategies for modernizing legacy systems
18:52 The two mechanisms to encapsulate legacy systems
19:17 Events vs APIs
24:48 When to prefer APIs, and when to prefer events
26:42 Examples of legacy systems API calls
27:44 Examples of legacy systems events
28:22 How to emit events from legacy systems
29:17 Expose your business events in 7 steps
35:05 Out of the box connectors supported by Axual
35:25 KSML
36:22 How Axual allows organizations to scale around Kafka
37:05 Reference case: Eneco
38:33 Reference case: Permanent TSB
39:19 Reference case: PostNL
40:28 Looking to modernize your IT landscape
41:59 Question 1: Can you build this with open source Kafka?
42:10 Question 2: Can Kafka improve the performance of my legacy systems?
44:02 Question 3: Axual platform is Strimzi based, have you worked with environments that were running open source Strimzi before?
45:57 Question 4: Who decides who gets access to which data?
49:03 Question 5: How long on average does it take to get some first data to our systems?
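The webinar's chapter on emitting events from legacy systems can be sketched in code. The snippet below is a hypothetical illustration, not the approach demonstrated in the webinar: the envelope fields, the `legacy-erp` source name, and the topic name are all assumptions, and the actual produce call (shown commented out) would use a Kafka client library such as kafka-python.

```python
import json
import datetime

def change_to_event(table: str, operation: str, row: dict) -> bytes:
    """Wrap a legacy-database row change in a small business-event envelope."""
    event = {
        "source": "legacy-erp",   # assumed identifier for the legacy system
        "table": table,
        "operation": operation,   # e.g. "insert", "update", "delete"
        "payload": row,
        "occurred_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    return json.dumps(event).encode("utf-8")

# Producing the event to a topic (requires a running broker):
# from kafka import KafkaProducer
# producer = KafkaProducer(bootstrap_servers="localhost:9092")
# producer.send("legacy-erp.customer-changes",
#               change_to_event("customer", "update", {"id": 42}))
```

Keeping the envelope small and self-describing lets consuming teams subscribe to the change stream without needing access to the legacy system itself.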

Introducing Axual Governance - turn Kafka into a competitive advantage

Virtual launch event of Axual Governance. With Axual Governance, organizations can scale their development teams around a central Kafka cluster. The platform comes with built-in self-service and governance functionality to unlock the full potential of Kafka for your development teams.

Next Generation Control Rooms | Using Apache Kafka to manage energy grid IT infrastructure

In this webinar, Jeroen van Disseldorp (Founder & CEO Axual) and Laurent Schmitt (President of Digital4Grids) delve into the following key topics:

1. Evolution from SOA to Event Streaming: Discover why it's essential to transition from traditional Service-Oriented Architectures to the agility and scalability of event streaming frameworks.
2. Adoption of Open-Source Platforms: Explore the advantages of embracing open-source platforms like Apache Kafka for seamless and effective data streaming in the energy sector.
3. TSOs and DSOs as Technology Integrators: Understand the critical role of TSOs and DSOs as integrators of cutting-edge technologies, driving innovation and efficiency in the industry.

00:00 Introduction
01:32 Historic and current paradigms of energy grids
05:48 Key use cases for grid expansion
08:36 Integration beyond traditional control room boundaries
09:48 Enabling interoperability through standardized data structures
12:10 Why should you move to new OT architectures?
16:57 What is Kafka4Power?
21:30 Kafka4Power deployment for TSO-DSO coordination
25:49 Reference case TenneT
27:21 Reference case EDDIE
30:10 Reference case Eneco
32:21 How to get started with Kafka4Power
36:17 Question 1 "Will there be a standardized data payload for all use cases?"
38:32 Question 2 "Are digital twins a typical use case for Apache Kafka?"
40:23 Question 3 "We already use Kafka, can we still use the Kafka4Power environment?"
41:26 Question 4 "Who are the competitors within the Kafka energy space?"
42:32 Question 5 "How are you aligning open-source with Kafka4Power?"
43:49 Question 6 "CIM data is heavy compared to SCADA, do you believe Kafka can handle it from a scalability and performance perspective?"
46:42 Question 7 "How do you decouple and standardize between different parties?"
49:17 Question 8 "What are the other references beyond the three mentioned cases?"
50:47 Question 9 "Is this only for TSOs to share data internally?"
51:41 Question 10 "What is the average time to deploy such a system?"
54:03 Question 11 "Is it possible to build enterprise dashboards using Kafka4Power?"
55:04 Question 12 "Is Kafka fast enough for real-time dashboards, does it need to consume directly from topics?"
56:35 Question 13 "Are there any technical limitations that you see regarding Kafka?"

Running Kafka in Kubernetes: a practical guide

The rise of Apache Kafka as the de facto standard for event streaming has coincided with the rise of Kubernetes for cloud-native applications. While Kubernetes is a great fit for distributed systems, that doesn't mean it is easy to deploy and maintain a Kafka cluster on it. At IBM we have hands-on experience running Kafka in Kubernetes, and in this session I will share our top tips for a smooth ride. I will walk through an example deployment of Kafka on Kubernetes and explain the common pitfalls and how to avoid them, including which Kubernetes objects to use, resource considerations, and connecting applications to the cluster. Finally, I will discuss useful Kafka metrics to include in Kubernetes liveness and readiness probes.
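To illustrate the idea of feeding Kafka metrics into a probe, here is a minimal sketch of a readiness check. It assumes broker metrics are exposed in Prometheus text format (for example via the JMX exporter) and that the metric name below is available; both the metric name and the script's role as an exec probe are assumptions for illustration, not the session's specific recommendation.

```python
# Readiness sketch: a broker that still has under-replicated partitions
# should not receive traffic, so the probe fails until the count reaches 0.

def is_ready(metrics_text: str,
             metric: str = "kafka_server_replicamanager_underreplicatedpartitions") -> bool:
    """Return True when the exposed metric reports zero under-replicated partitions."""
    for line in metrics_text.splitlines():
        line = line.strip()
        if line.startswith(metric):
            # Prometheus text format: "<name>{optional labels} <value>"
            value = float(line.rsplit(" ", 1)[-1])
            return value == 0.0
    return False  # metric missing: treat the broker as not ready

if __name__ == "__main__":
    import sys
    # Read a scraped metrics dump from stdin; exit code drives the exec probe.
    sys.exit(0 if is_ready(sys.stdin.read()) else 1)
```

Wired up as a Kubernetes exec readiness probe, the script's exit code tells the kubelet whether to route traffic to the pod; a liveness probe would typically use a cheaper signal, since restarting a broker over transient under-replication is counterproductive.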

5 stages of event-driven architecture (EDA) with Kafka for the finance industry

A lot of financial services companies want to move towards an event-driven architecture (EDA) and real-time data integration. But, where do you start? Watch this webinar as we discuss:

- The 5 stages of event-driven architecture and how to go from experiments to centralized event streaming
- How Rabobank and Volksbank, two leading banks in the Netherlands, scaled Kafka across 200+ teams with self-service
- Live 15-minute demo of Axual's all-in-one Kafka platform for enterprises