Blog

Discover articles about Apache Kafka, event streaming, real-time data processing, Kafka use cases, and Strimzi that inform, inspire, and share the latest insights.
Start your Apache Kafka journey here, and don't miss out on expert advice.

Latest Apache Kafka Blogs

Release Blog 2026.1 - The Spring Release

The Axual 2026.1 release builds on the improvements in governance, observability, and self-service introduced in 2025.4, and takes things a step further. This release adds audit event coverage across platform resources, giving teams more visibility and control over what’s happening in the platform. We’ve also extended OAuth support to all data plane components, making security more consistent end to end. On top of that, updates to Connector management and the Overview Graph make the platform easier to use and give clearer insight into platform activity.

Axual Product
MCP with Kafka

In this blog, we explore how the Model Context Protocol (MCP) enables AI applications to move beyond text generation and interact with systems like Kafka through structured actions. We will look at how natural language intent can be translated into real operations such as managing topics, schemas, and streaming applications, while also touching on governance, security, and the role of declarative approaches like KSML.

Apache Kafka
Release Blog 2025.4 - The Winter Release

Axual 2025.4, the Winter Release, expands on the governance and self-service foundations of 2025.3 with improved KSML monitoring and state management, an enhanced Schema Catalog, and usability improvements across Self-Service and the platform.

Axual Product

Browse by category:

Read more about Real Time Data & Event Streaming

Richard Bosch
November 29, 2024
Deep Dive into Kafka Connect Clusters: Structure, Scaling, and Task Management

This blog dives deep into Kafka Connect clusters, unraveling their structure, scaling strategies, and task management processes. Whether you're designing a high-availability system, troubleshooting task distribution, or scaling your pipeline for performance, this article provides a comprehensive look at how Kafka Connect clusters operate.

Apache Kafka Use Cases
Daniel Mulder
November 28, 2024
Introduction: ZooKeeper to KRaft

For years, ZooKeeper has been integral to Kafka deployments as a reliable metadata management system. However, with its limitations and Kafka's evolution through KIP-500, the shift to KRaft, a self-managed metadata quorum, marks a new era. This transition is critical as ZooKeeper's deprecation accelerates, with its removal planned for Kafka 4.0. Adapting now ensures your Kafka clusters remain future-ready and efficient.
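To give a rough sense of what the migration changes, a KRaft-mode broker drops the `zookeeper.connect` setting in favor of a controller quorum. A minimal `server.properties` sketch (node IDs, hostnames, and ports below are placeholders, not from the article):

```properties
# KRaft combined mode: this node acts as both broker and controller
process.roles=broker,controller
node.id=1
# Controller quorum voters (placeholder host/port)
controller.quorum.voters=1@kafka-1:9093
listeners=PLAINTEXT://:9092,CONTROLLER://:9093
controller.listener.names=CONTROLLER
# Note: no zookeeper.connect -- metadata lives in Kafka's internal quorum
```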

Apache Kafka
Rachel van Egmond
November 19, 2024
Optimizing Healthcare Integration with Kafka at NHN | Use case

Norsk Helsenett (NHN) is revolutionizing Norway's fragmented healthcare landscape with a scalable Kafka ecosystem. By bridging 17,000 organizations, it ensures secure, efficient communication across hospitals, municipalities, and care providers.

Apache Kafka Use Cases
Richard Bosch
November 12, 2024
Understanding Kafka Connect

Apache Kafka has become a central component of modern data architectures, enabling real-time data streaming and integration across distributed systems. Within Kafka’s ecosystem, Kafka Connect plays a crucial role as a powerful framework designed for seamlessly moving data between Kafka and external systems. Kafka Connect provides a standardized, scalable approach to data integration, removing the need for complex custom scripts or applications. For architects, product owners, and senior engineers, Kafka Connect is essential to understand because it simplifies data pipelines and supports low-latency, fault-tolerant data flow across platforms. But what exactly is Kafka Connect, and how can it benefit your architecture?
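As a quick sketch of what "no custom scripts" means in practice: a connector is declared as JSON and submitted to the Connect worker's REST API. The connector name, file path, and topic below are illustrative placeholders; the `FileStreamSourceConnector` class ships with Apache Kafka as a simple example connector.

```shell
# Submit a file-source connector to a Connect worker
# (assumes a worker listening on localhost:8083; names are placeholders)
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "demo-file-source",
    "config": {
      "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
      "tasks.max": "1",
      "file": "/tmp/input.txt",
      "topic": "demo-topic"
    }
  }'
```

The worker takes care of scheduling tasks, tracking offsets, and restarting on failure, which is exactly the boilerplate a hand-rolled integration script would otherwise have to implement.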

Apache Kafka
Richard Bosch
November 1, 2024
Kafka Topics and Partitions - The building blocks of Real Time Data Streaming

Apache Kafka is a powerful platform for handling real-time data streaming, often used in systems that follow the Publish-Subscribe (Pub-Sub) model. In Pub-Sub, producers send messages (data) that consumers receive, enabling asynchronous communication between services. Kafka’s Pub-Sub model is designed for high throughput, reliability, and scalability, making it a preferred choice for applications needing to process massive volumes of data efficiently. Central to this functionality are topics and partitions—essential elements that organize and distribute messages across Kafka. But what exactly are topics and partitions, and why are they so important?
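A minimal sketch of topics and partitions in action, using the `kafka-topics.sh` tool that ships with Kafka (broker address and topic name are placeholders):

```shell
# Create a topic with 3 partitions so messages can be spread across brokers
bin/kafka-topics.sh --bootstrap-server localhost:9092 \
  --create --topic orders --partitions 3 --replication-factor 1

# Inspect how the partitions and their leaders are distributed
bin/kafka-topics.sh --bootstrap-server localhost:9092 \
  --describe --topic orders
```

More partitions allow more consumers in a group to read in parallel, which is where Kafka's throughput and scalability come from.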

Event Streaming
Jimmy Kusters
October 31, 2024
How to use Strimzi Kafka: Opening a Kubernetes shell on a broker pod and listing all topics

Strimzi Kafka offers an efficient solution for deploying and managing Apache Kafka on Kubernetes, making it easier to handle Kafka clusters within a Kubernetes environment. In this article, we'll guide you through opening a shell on a Kafka broker pod in Kubernetes and listing all the topics in your Kafka cluster using an SSL-based connection.
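The two steps the article walks through can be sketched as follows. The cluster name `my-cluster` and namespace `kafka` are placeholder assumptions following Strimzi's common naming pattern (`<cluster>-kafka-<n>` for broker pods); the article itself uses an SSL listener, which would additionally require `--command-config` with the TLS client properties.

```shell
# Step 1: open a shell on the first broker pod of a Strimzi-managed cluster
# (cluster name "my-cluster" and namespace "kafka" are placeholders)
kubectl exec -it my-cluster-kafka-0 -n kafka -- /bin/bash

# Step 2: from inside the pod, list all topics
# (plaintext listener shown; for SSL, point at port 9093 and pass
#  --command-config with the client TLS properties, as in the article)
bin/kafka-topics.sh --bootstrap-server localhost:9092 --list
```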

Strimzi Kafka