Since our last release we have worked on many improvements in all layers of the platform. In this article we will go through them one by one.
Up-to-date platform components
We have been updating a number of dependencies throughout our platform over the last couple of months. The two biggest were the upgrades of Strimzi and Keycloak. Axual Platform 2022.2 brings Keycloak 18 to users of our platform.
Strimzi upgrade
Axual Platform is based on the Strimzi Operator. For the 2022.2 release we have upgraded to Strimzi 0.29.0, which allows for an upgrade to the 3.2.0 release of Apache Kafka. The operator instructions contain upgrade steps for Axual Operator 0.6.3 and 0.7.0, which upgrade to Kafka 3.0.0 and Kafka 3.2.0 respectively. You can find the operations documentation here.

Application/stream owner insights
Nowadays, businesses more than ever need to interact in real time with their customers and partners. As you all know, (event) streaming has proven to be an excellent way to facilitate fast and seamless communication between applications. As applications rely more on a streaming data backbone, it becomes increasingly important to guarantee its availability. This brings a responsibility to platform operators, who instrument the platform to continuously monitor and log platform events and make sure they are alerted appropriately.
The responsibility, however, does not stop with the platform operators. DevOps engineers are responsible for their applications from DEV to PROD, and they are used to instrumenting their applications just as the operators do for the platform. Looking specifically at Kafka application developers, they can, among other options, instrument their consumer/producer or streams applications by enabling JMX and scraping the application's monitoring interfaces, e.g. by adding a JMX exporter to the application and configuring Prometheus to scrape it.
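As a sketch, a Prometheus scrape job for a Kafka application that runs the Prometheus JMX exporter as a Java agent might look like the fragment below. The job name, target host, and port 9404 are illustrative assumptions, not values prescribed by Axual or by the exporter:

```yaml
# prometheus.yml (fragment): scrape a Kafka client application that
# exposes JMX metrics via the JMX exporter Java agent.
scrape_configs:
  - job_name: "my-kafka-app"                 # hypothetical job name
    scrape_interval: 15s
    static_configs:
      - targets: ["my-kafka-app-host:9404"]  # hypothetical host and port
```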
But that is not really enough for producer/consumer applications interacting with a streaming platform. If you depend on a topic, either because you produce to it or because your use case depends on its data, you want to know how that topic is doing from the perspective of the platform. For example: how many messages per second are produced to a topic? How many messages in total are there on my topic? This introduces a challenge for the platform team, who do not necessarily want to expose all Kafka metrics in full detail, nor to offer and maintain this "observability interface" for teams connecting to the streaming platform.
This is where our newest kid on the block, Metrics Exposer, comes in. Metrics Exposer is an extension of Axual Platform that offers a REST API which can be queried by application and stream owners who want more insight into how their stream (topic) or application is doing.
Available metrics
With the introduction of this new API we are offering two metrics, and we aim to continuously expand the API with the metrics that bring the most value to its users. The following two metrics will be supported:
- Message Rate: the number of messages per second on a topic
- Partition Size: the number of messages on a topic partition
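Note that the two metrics are related: a message rate can be approximated from two Partition Size samples taken some time apart. The sketch below illustrates this; it is a back-of-the-envelope calculation, not the API's own implementation, and it can under-count if retention deletes messages between the two samples:

```python
def message_rate(count_t1: int, count_t2: int, t1: float, t2: float) -> float:
    """Approximate messages/second from two partition-size readings
    taken at timestamps t1 and t2 (seconds)."""
    if t2 <= t1:
        raise ValueError("t2 must be later than t1")
    return (count_t2 - count_t1) / (t2 - t1)

# Example: 1200 messages at t=0s, 1800 messages at t=60s.
rate = message_rate(1200, 1800, 0.0, 60.0)  # 10.0 messages/second
```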
OpenAPI specification, authentication
We have released an OpenAPI specification to help developers understand what the API offers, and to speed up the development of a client or the configuration of any observability tool that understands REST. As described in the public API documentation, the Metrics Exposer API is secured by the OAuth2 protocol. All requests to the API should provide a valid JWT token via the Authorization header.
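As a minimal sketch of what a client call could look like: the snippet below builds an authenticated GET request using only the Python standard library. The base URL, endpoint path, query parameter, and token value are hypothetical; consult the OpenAPI specification for the actual schema, and obtain the JWT from your OAuth2 provider:

```python
import urllib.request

BASE_URL = "https://metrics.example.com"  # hypothetical deployment URL
JWT = "eyJ..."                            # token from your OAuth2 flow

def build_request(metric: str, topic: str) -> urllib.request.Request:
    """Build a GET request for a topic metric, carrying the JWT
    in the Authorization header as the API requires."""
    url = f"{BASE_URL}/metrics/{metric}?topic={topic}"
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {JWT}"},
    )

req = build_request("message-rate", "payments")
# urllib.request.urlopen(req) would perform the actual call.
```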
Operator instructions
Instructions for operators on how to make this API available to end users have been provided as part of the “Upgrading to 2022.2” docs here.
Beta state
It is important to know that the API is currently in a beta state and is likely to change based on user feedback and as new metrics and functionality are introduced. We will always communicate any (breaking) change in the changelog of the API documentation.
Sneak peek: revamped UI
We were getting a bit too used to the current UI look and feel, and we think it is time to freshen it up. That's why we have started revamping the Self-Service UI for Kafka, which not only offers a more pleasant look and feel but also introduces a lot of usability improvements at the same time. The aim is to introduce this revamped UI in the 2022.3 release; for now we are just sharing a sneak preview in the form of a couple of screenshots.



Other improvements
As with every product release, we have upgraded many dependencies in the platform components and fixed minor bugs that were affecting the operator or developer experience. Lastly, we have improved the Getting Started section to better guide users through their initial steps on our platform. You can expect many more initiatives around the onboarding experience in future releases.
For now, I can only invite you to try out the new platform features by requesting a trial or asking your stream team to upgrade your Axual Platform installation as soon as possible :-).
Happy streaming!
The Axual Team