Resources Hub

Your one-stop destination for all the resources you need to kickstart your Kafka journey.

On-demand webinars

Watch and learn more about Axual

Apache Kafka | Scaling Across 200+ DevOps Teams - Self-Service & Data Governance

Event streaming empowers companies to serve their customers faster and better, and to accelerate their business processes. It is no surprise that streaming engines such as Kafka have gained popularity. In this talk, we will go through the five stages of event-driven architecture and share insights from our journey scaling Kafka at Rabobank. How can you make the adoption of Kafka a success in your organization? And how can you scale data governance and security to stay in control?

Unlock the power of legacy systems with Apache Kafka

Legacy systems are the backbone of many organizations, encapsulating years of critical data. Despite the emergence of modern technologies, these systems remain indispensable. Migrating away from them is often not an option due to their complexity. Instead, the focus should be on unlocking the value trapped within these systems. That’s where Apache Kafka comes in. Join our webinar to learn:

- Data unlocking techniques: discover how to break down data silos and make legacy system data accessible to your teams without disrupting existing operations. We'll cover technologies like Apache Kafka and how they can act as a central hub for data streams.
- Implementing data governance and security: once data is accessible, maintaining its integrity and security is crucial. Learn best practices for data governance and security to ensure that your data remains compliant.
- Integrating data back into legacy systems: understand the importance of integrating data changes back into legacy systems. We’ll explore how to set up topics and special integrations to process and apply requested changes efficiently.

Chapters:
00:00 Introduction
01:25 How IT landscapes build up over time
02:52 Who uses legacy systems?
05:46 Why are legacy systems still used?
08:53 When are legacy systems becoming troublesome?
09:47 When do systems become legacy?
13:10 The risk of keeping legacy systems and applications
16:00 Strategies for modernizing legacy systems
18:52 The two mechanisms to encapsulate legacy systems
19:17 Events vs APIs
24:48 When to prefer APIs, and when to prefer events
26:42 Examples of legacy system API calls
27:44 Examples of legacy system events
28:22 How to emit events from legacy systems
29:17 Expose your business events in 7 steps
35:05 Out-of-the-box connectors supported by Axual
35:25 KSML
36:22 How Axual allows organizations to scale around Kafka
37:05 Reference case: Eneco
38:33 Reference case: Permanent TSB
39:19 Reference case: PostNL
40:28 Looking to modernize your IT landscape
41:59 Question 1: Can you build this with open source Kafka?
42:10 Question 2: Can Kafka improve the performance of my legacy systems?
44:02 Question 3: The Axual platform is Strimzi-based; have you worked with environments that were running open source Strimzi before?
45:57 Question 4: Who decides who gets access to which data?
49:03 Question 5: How long on average does it take to get some first data to our systems?

Introducing Axual Governance - turn Kafka into a competitive advantage

The virtual launch event of Axual Governance. With Axual Governance, organizations can scale their development teams around a central Kafka. Our platform comes with built-in self-service and governance functionalities to unlock the full potential of Kafka for your development teams.


Whitepapers

Explore our whitepapers and product collateral

5 stages of event-driven architecture with Kafka

Learn how to scale Kafka across the enterprise

Data Governance & the Road to Compliance with Apache Kafka

Learn how to scale Kafka across the enterprise

Future Control Room Architectures for Grid and Market Integrated Operation

Learn how to scale Kafka across the enterprise


Transform your Kafka ecosystem with Axual’s Strimzi Incident Support services


Get the support you need

Customer Case Studies and Use Cases

Read use cases and success stories from our customers

De Volksbank

Becoming data-driven and customer-centric in a highly regulated environment

Rabobank

Read how Rabobank became an event-driven bank and scaled Kafka across 150+ DevOps teams

Eneco

Read how Eneco is using Axual’s event streaming platform as the next step in their energy transition

Tutorials

Watch and learn more about all things Kafka

How to get data in and out of a streaming platform?

In this video, you'll learn how to efficiently get data in and out of a streaming platform. We’ll cover key techniques for ingesting data from various sources, processing it in real-time, and exporting the processed results to external systems. Whether you're dealing with Apache Kafka, AWS Kinesis, or another platform, you'll discover essential strategies for managing data streams seamlessly and integrating them into your workflows.
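The in-and-out flow described above can be sketched in a few lines of Python. This is an illustrative sketch only, using the kafka-python client; the broker address (localhost:9092), the topic name ("events"), and the sample record are all placeholder assumptions, not part of the video.

```python
# Illustrative sketch: getting data into and out of Kafka with kafka-python.
# Assumes a broker on localhost:9092 and a topic named "events" (placeholders).
import json


def serialize(record: dict) -> bytes:
    """Encode a record as UTF-8 JSON bytes for the Kafka wire format."""
    return json.dumps(record).encode("utf-8")


def deserialize(raw: bytes) -> dict:
    """Decode bytes received from Kafka back into a dict."""
    return json.loads(raw.decode("utf-8"))


def main():
    from kafka import KafkaProducer, KafkaConsumer  # pip install kafka-python

    # Data in: publish a record to the "events" topic.
    producer = KafkaProducer(bootstrap_servers="localhost:9092",
                             value_serializer=serialize)
    producer.send("events", {"order_id": 42, "status": "shipped"})
    producer.flush()

    # Data out: read records back and hand them to a downstream system.
    consumer = KafkaConsumer("events",
                             bootstrap_servers="localhost:9092",
                             value_deserializer=deserialize,
                             auto_offset_reset="earliest")
    for message in consumer:
        print(message.value)
        break


if __name__ == "__main__":
    main()
```

In a real integration, the consumer loop would typically forward each record to an external system (a database, an API, or another topic) rather than printing it.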

Self service tutorial - request approval (S1E3)

In this video, I'll show you how to request approval for access to a stream in Axual, and how to approve or deny that request.

How to Produce and Consume Data from your Data Streaming Application?

In this third episode, Abhinav will show us how to produce and consume data from a data streaming application in the Axual Platform.


Video hub

Watch and learn more about all things Kafka

Running Kafka in Kubernetes: a practical guide

The rise of Apache Kafka as the de-facto standard for event streaming has coincided with the rise of Kubernetes for cloud-native applications. While Kubernetes is a great choice for any distributed system, that doesn't mean it is easy to deploy and maintain a Kafka cluster running on it.

Axual feature showcase - SASL support for Kafka

At Axual we love to help DevOps teams apply Kafka topic security more easily. That’s why we are proud to bring SASL support to our Kafka event streaming platform. SASL support is implemented at multiple levels in the platform, from the Kafka broker to the Self-Service UI. As soon as platform operators or tenant admins have enabled SASL support on the broker, they can configure it as an available authentication method in the Self-Service UI. Then they can gradually add support per instance by marking SASL as an enabled authentication method.
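From the client side, enabling SASL comes down to a handful of extra connection settings. The sketch below shows what those settings look like for a kafka-python client; the broker address, mechanism, and credentials are placeholder assumptions, not values from the platform.

```python
# Illustrative sketch: client-side settings for a SASL-enabled Kafka listener,
# shown with kafka-python. All concrete values below are placeholders.


def sasl_client_config(username: str, password: str,
                       mechanism: str = "SCRAM-SHA-512") -> dict:
    """Build the connection settings a SASL-over-TLS listener expects."""
    return {
        "bootstrap_servers": "broker.example.com:9093",  # placeholder address
        "security_protocol": "SASL_SSL",  # SASL authentication over TLS
        "sasl_mechanism": mechanism,
        "sasl_plain_username": username,
        "sasl_plain_password": password,
    }


def main():
    from kafka import KafkaProducer  # pip install kafka-python

    # The same settings dict works for producers and consumers alike.
    producer = KafkaProducer(**sasl_client_config("team-user", "secret"))
    producer.send("demo-topic", b"authenticated hello")
    producer.flush()


if __name__ == "__main__":
    main()
```

The mechanism is configurable because brokers can offer several (for example SCRAM-SHA-256 or SCRAM-SHA-512); the client must pick one the broker has enabled.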

Product Showcase - Self service - Part 3
