As promised, here is the second scheduled release update from Axual: 2020.2, and it is an exciting one! Continue reading the blog or check out the release video for a detailed overview.
We are open sourcing
We are excited to announce that we will start open-sourcing more components of our platform, making them publicly available.
Several components of our platform rely on the work of a great open-source community, so contributing back is a natural way to show our appreciation and reciprocate that effort.
Thanks to this choice, users of the Axual platform will be able to debug issues more easily, report them, and even contribute their own fixes.
Open sourcing: Java and .NET Client Libraries
The first components we will open-source under the Apache 2.0 license are the Java and .NET client libraries. These allow you to focus on the business value your streaming application should deliver, reducing the complexity and effort of bringing those applications into production.
Both libraries are available on GitLab, along with useful examples ready for you to try out yourself. This is also where you will find further components as we open-source them. In the future, you can expect more libraries and components to follow, so stay tuned!
Now, let’s dive into the other Axual 2020.2 updates!
Connect: Easily Integrate External Systems to Axual
The value you derive from your streaming platform is largely determined by the number of streaming data sources you manage to onboard.
This is where Connect comes in: Connect allows you to easily integrate Axual with external systems, loading data either into or out of the streaming platform. Compared to a custom-developed application, it offers more advantages than just saving development time:
- You can easily scale the number of workers or tasks to accommodate the load.
- You can perform standard operations and transformations on messages, such as filtering, masking or routing.
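To illustrate both points, here is a minimal sketch of a Kafka Connect connector configuration, as it could be submitted to a Connect cluster. The connector class, topic, file path, and field name are hypothetical placeholders, not an Axual-specific API; `tasks.max`, `MaskField`, and `RegexRouter` are standard Kafka Connect settings and single message transforms:

```json
{
  "name": "demo-source",
  "config": {
    "connector.class": "com.example.DemoSourceConnector",
    "tasks.max": "2",
    "topic": "demo-payments",
    "transforms": "mask,route",
    "transforms.mask.type": "org.apache.kafka.connect.transforms.MaskField$Value",
    "transforms.mask.fields": "cardNumber",
    "transforms.route.type": "org.apache.kafka.connect.transforms.RegexRouter",
    "transforms.route.regex": "demo-(.*)",
    "transforms.route.replacement": "masked-$1"
  }
}
```

In this sketch, `tasks.max` covers the scaling point, while the two transforms mask a sensitive field in each record's value (note that `MaskField` expects structured record values) and route the records to a renamed topic, all declaratively and without custom code.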
The way we look at connectors is not very different from how we look at a regular application. To us, an application is just a piece of software that either produces or consumes streaming data. This is why we believe our Self Service concept for applications can be applied to connectors as well: the ownership of connectors is registered, and as a DevOps team you can manage the entire connector lifecycle yourself.
Managing connectors in a self-service interface
As of release 2020.2, it is possible to manage connectors in the Self Service interface by creating an app in the Axual platform. For a step-by-step practical demonstration, have a look at our video.
Conclusion
Now you’ve seen how you can hook up external data sources and sinks to Axual without writing any code. In the video demo we showed two connector plugins, but we plan to continuously add support for more; the list of currently supported plugins is available in our online documentation.
Other updates
Of course, the latest release contains many more updates to improve the usability, security and stability of the product. You can read more about these in our online documentation.
Answers to your questions about Axual’s All-in-one Kafka Platform
Are you curious about our All-in-one Kafka platform? Dive into our FAQs for all the details you need, and find the answers to your burning questions.