December 24, 2021

Axual Release Update 2021.3

Axual 2021.3 is here!

2021 is almost a wrap, but not before we announce our last major release of the year: Axual Platform 2021.3. Read all about it in this blog.

log4j2, the vulnerability that scared us all

Of course we should address the elephant in the room. On December 10th the whole IT industry was facing a challenge with the just discovered vulnerability in log4j2, better known as Log4Shell. If you haven’t read about it yet, you can do that here. Now, what about Axual? The good news is that we were not actively using this library in our platform’s components. However, the vulnerable library was transitively loaded, or loadable, on the application classpath, e.g. through Spring Boot, a framework we rely on heavily for building our platform microservices. As part of the 2021.3 platform release, we have made sure that every affected platform component explicitly excludes the dependency on the vulnerable library, to completely rule out its use. For customers using the 2021.2 version of the platform we have released security patches.
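
For illustration, ruling out the vulnerable artifact comes down to a few lines of build configuration. The sketch below assumes a Gradle build using the Kotlin DSL; the exact coordinates and mechanism differ per build tool and are not taken from the actual Axual build files.

```kotlin
// build.gradle.kts -- illustrative sketch only.
// Exclude the vulnerable log4j-core artifact from every configuration,
// no matter which dependency would otherwise pull it in transitively.
configurations.all {
    exclude(group = "org.apache.logging.log4j", module = "log4j-core")
}
```

Excluding the artifact everywhere, rather than pinning a patched version in each place, guarantees that the library simply cannot end up on the classpath.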

SELF-SERVICE FOR KAFKA JUST GOT BETTER

In the 2021.3 release we are again introducing a lot of usability improvements for Self-service. In the following paragraphs we go into a bit more detail.

1. Human-readable headers

You might be using Kafka headers in your messages, e.g. to enrich them with valuable business or technical metadata or lineage information. In Self-service, we are now also showing Kafka headers in a human-readable format when browsing or searching messages on Kafka.
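
As a refresher, headers travel with every record as simple key/byte-array pairs, set by the producing application. The sketch below shows this with the standard Kafka clients; the topic name, header names and connection settings are made up for the example.

```kotlin
import org.apache.kafka.clients.producer.KafkaProducer
import org.apache.kafka.clients.producer.ProducerRecord
import java.util.Properties

fun main() {
    // Hypothetical connection and serializer settings.
    val props = Properties().apply {
        put("bootstrap.servers", "localhost:9092")
        put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    }

    KafkaProducer<String, String>(props).use { producer ->
        val record = ProducerRecord("payments", "order-42", """{"amount": 10.0}""")
        // Headers carry metadata alongside the payload, e.g. lineage information.
        record.headers()
            .add("source-system", "webshop".toByteArray())
            .add("trace-id", "3f2a9c".toByteArray())
        producer.send(record)
    }
}
```

It is exactly these key/value pairs that Self-service now renders in a readable form when you browse or search messages.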

2. Kafka Connect Updates

Managing connectors using our Self-service interface has become a bit easier. We have introduced the possibility to set and unset individual connector properties without having to delete the entire connector configuration. This makes it easy to experiment with connector settings.
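
For context, Kafka Connect itself allows a connector’s configuration to be replaced in place through its REST API, without removing the connector; Self-service handles this for you. A rough sketch of such a call is shown below, with a made-up connector name, properties and Connect URL.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Hypothetical connector configuration; adjust a property and submit it again
    // to update the running connector without deleting and recreating it.
    val config = """
        {
          "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
          "tasks.max": "1",
          "file": "/tmp/orders.txt",
          "topic": "orders"
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:8083/connectors/file-source-demo/config"))
        .header("Content-Type", "application/json")
        .PUT(HttpRequest.BodyPublishers.ofString(config))
        .build()

    val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
    println("${response.statusCode()} ${response.body()}")
}
```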

Also, to better understand which connector plugin you are dealing with, we are now showing the connector plugin version on the Application page. For an example, see the screenshot to the right.

3. Easier configuration of topics and applications

Release 2021.3 also brings an exciting new feature that makes it a lot easier to handle your application and topic configurations, especially when you are dealing with a lot of environments. Where you previously had to scroll through a large list of environments, you will now first see the environments for which the application or topic has been configured, followed by the ones that have not been configured yet. Of course, the search function allows you to search in both collections.

Use the new environment switcher for topics and applications

4. Force-apply an incompatible Avro schema

This sounds quite dangerous, doesn’t it? Well, when you are developing an application, e.g. to produce messages to a new topic, and you are using Avro to define the schemas for your topic, you might not get your schema right the first time. This means that new versions of your schema can introduce incompatibilities with previous versions. This is especially common in development environments, but you might run into these situations in production as well.

As of 2021.3 you have the option to force-apply these changes, of course *after* you have seen a big warning, to make sure you do this consciously.
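
To make the scenario concrete, the sketch below uses the Avro library’s own compatibility check on two versions of a made-up “Order” schema. Version 2 adds a required field without a default value, which breaks backward compatibility: records written with version 1 can no longer be read with version 2.

```kotlin
import org.apache.avro.Schema
import org.apache.avro.SchemaCompatibility

fun main() {
    // Version 1 of a hypothetical "Order" schema.
    val v1 = Schema.Parser().parse(
        """
        {"type": "record", "name": "Order", "fields": [
          {"name": "id", "type": "string"},
          {"name": "amount", "type": "double"}
        ]}
        """
    )

    // Version 2 adds a required "currency" field without a default value.
    val v2 = Schema.Parser().parse(
        """
        {"type": "record", "name": "Order", "fields": [
          {"name": "id", "type": "string"},
          {"name": "amount", "type": "double"},
          {"name": "currency", "type": "string"}
        ]}
        """
    )

    // Can data written with v1 be read with v2? No: prints INCOMPATIBLE.
    val result = SchemaCompatibility.checkReaderWriterCompatibility(v2, v1)
    println(result.type)
}
```

Giving the new field a default value would make the change backward compatible again; the force-apply option is there for the cases where you consciously decide not to.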

5. Human-friendly error messages

The Self-service interface now shows more user-friendly error messages when you upload certificates for your application or schemas for your topic. This makes it a lot easier to debug issues with them.

And many more…!

Many more usability improvements have made it to the 2021.3 release. Read about all updates in the release notes.

UNDER THE HOOD: A LOT OF SECURITY UPDATES, SUPPORT FOR MYSQL

The things we work on under the hood are often less sexy to talk about, but they do matter for organizations that want to work with Kafka securely and with proper governance. As part of the latest platform release we have updated all of the platform’s base Docker images to the most recent versions, as well as the libraries we use in our platform components. With this almost invisible but important update we have laid the groundwork for a feature coming in the 2022.1 release: support for other CPU architectures, such as ARM64.

Also, we now fully support MySQL as the backing database for our Self-service interface; MariaDB was already supported.

WHAT’S NEXT?

Thanks for diving into the 2021.3 release. You can find more details about the release in our release notes, part of the documentation here. But the best thing is to try it out for yourself. We invite you to request a trial; you can do that at the bottom of this page.


Rachel van Egmond
Senior content lead
