As promised, here is the second scheduled release update from Axual, 2020.2, and it is an exciting one! Continue reading the blog or check out the release video for a detailed overview.
We are open sourcing
We are excited to announce that we will start open-sourcing more components of our platform, making them publicly available.
We have been relying on the help of a great open-source community for a number of components of our platform, so we think that contributing back is a great way to show our appreciation and reciprocate that communal effort.
Thanks to this open-source approach, users of the Axual platform will be able to debug issues more easily, report them, and even contribute their own fixes.
Open sourcing: Java and .NET Client Libraries
The first components we will open-source under the Apache 2.0 license are the Java and .NET client libraries. These allow you to focus on the business value your streaming application should deliver, reducing the complexity and effort of bringing those applications into production.
Both libraries are available on GitLab, along with useful examples ready for you to try out yourself. This is also where you will find the components we open-source going forward. You can expect more libraries and components to be open-sourced in the future, so stay tuned!
Now, let’s dive into the other Axual 2020.2 updates!
Connect: Easily Integrate External Systems to Axual
The value you derive from your streaming platform is largely defined by the number of streaming data sources you manage to onboard.
This is where Connect comes in: Connect allows you to easily integrate Axual with external systems, to either load data into or extract data from the streaming platform. It offers more advantages than just saving time compared to a custom-developed application:
- You can easily scale the number of workers or tasks, to accommodate the load.
- You can perform standard operations and transformations on messages, such as filtering, masking or routing.
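To give an impression of how little is involved compared to a custom-developed application, a plain Kafka Connect source connector is typically defined by a short JSON document. The sketch below uses the stock FileStreamSourceConnector that ships with Apache Kafka; the connector name, file path, and topic name are placeholders, and the exact set of supported connector classes depends on your installation:

```json
{
  "name": "file-source-example",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "tasks.max": "1",
    "file": "/tmp/orders.txt",
    "topic": "orders"
  }
}
```

Submitting a document like this to the Connect REST API (`POST /connectors`) starts the connector; for connectors that support parallelism, scaling out is then a matter of raising `tasks.max` rather than changing application code.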
The way we look at connectors is not very different from a regular application. To us, an application is just a piece of software that either produces or consumes streaming data. This is why we believe our Self Service concept for applications can be applied to connectors as well. This means the ownership of connectors is registered and as a DevOps team, you can manage the entire connector lifecycle yourself.
Managing connectors in a self-service interface
As of release 2020.2, you can manage connectors in the Self Service interface by registering them as applications in the Axual platform. For a step-by-step practical demonstration, have a look at our video.
Conclusion
Now you’ve seen how you can hook up external data sources and sinks to Axual without writing any code. In the video demo we showed two connector plugins, but we plan to continuously add support for more; the list of plugins we currently support is available in our online documentation.
Other updates
Of course, the latest release contains many more updates to improve the usability, security and stability of the product. You can read more about this in our online documentation.