April 4, 2020

Axual Release Update 2020.1

Here it is, the latest version of Axual: the 2020.1 release. The new version introduces features that speed up development, testing and incident resolution with “Stream Browse & Search”, let you hook up external systems to the Axual REST proxy, and help you maintain schemas for your streams using Schema Upload.

Also, we have adopted a new version scheme. We are moving from an arbitrary version number (the most recent being 5.5.1) to one that carries meaning: the year of the release in 4 digits, followed by a sequence number. With 4 releases planned yearly, you can expect versions 2020.1 through 2020.4 this year. The new version number is easier to understand and remember, and it lets you see at a glance how far you are behind the latest release. Now, let’s explore the wonders of Axual 2020.1.

NEW IMPROVEMENTS AND LATEST FEATURES OF AXUAL PLATFORM

STREAM BROWSE & SEARCH

In the latest release of our platform, we have added a Stream Browse & Search feature. So why did we introduce it?

Why Introduce Stream Browse & Search?

When you are building an application, whether it is an API or just another business application, you move your app through the Software Development Life Cycle, or SDLC for short. In this lifecycle, your DevOps team is responsible for testing the app in various environments, moving it from DEV to PROD. In each environment, your app needs to pass essential tests before you move on to the next. Now, when you are testing an API, you might run some automated REST calls against your app to verify it works in a particular environment. If you are building a streaming application, however, this is a different story.

To put this in perspective, let’s take the example of Team Rocky, a development team working on a streaming use case. Team “Rocky” is building a producer app that produces alert events to a stream. Consuming the messages from the stream would give them confidence that what was produced actually ended up on the platform. But building a consumer app just to verify this requires a lot of effort. Wouldn’t it be great if Team Rocky could verify this themselves, without building anything? We think so, just like you do. That’s why we introduced the Stream Browse & Search feature, which can be found on the “Stream detail” page.

How does Stream Browse & Search Work?

Stream Browse & Search allows you to browse or search for messages on a stream in a given time frame, using a keyword you specify. The search results are shown in a paginated list. Clicking a single record expands it to show a more detailed view of the message contents. You will also find header information there, which can be useful for debugging purposes: any header that you insert using your client application will be visible here. So, any team can now easily determine whether the messages they produced have reached the platform by using Stream Browse & Search.
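To make the header part concrete, here is a minimal sketch of a producer that attaches headers, written against the plain Apache Kafka Java client. The topic name, header keys and broker address are made up for the example; an Axual deployment would use its own client configuration.

    import java.nio.charset.StandardCharsets;
    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class HeaderProduceSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder address
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                ProducerRecord<String, String> record =
                        new ProducerRecord<>("alerts", "alert-1", "disk almost full");
                // Headers attached here show up in the expanded record view
                // of Stream Browse & Search, which helps when debugging.
                record.headers().add("correlation-id", "abc-123".getBytes(StandardCharsets.UTF_8));
                record.headers().add("origin", "team-rocky".getBytes(StandardCharsets.UTF_8));
                producer.send(record);
            }
        }
    }

A correlation id like this makes it much easier to tie a record shown in the detailed view back to the request that produced it.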

Access to the messages of a stream is granted based on the following permissions:

  • Stream owner: you can see messages on streams you own
  • Tenant admin: you can see messages on any stream in any environment
  • Application owner of a connected application: you can see messages on any stream, in authorized environments

We have highlighted the advantage of this feature for a producer, but a consumer benefits from it as well. First of all, before they even start building their consumer app, they can get a sense of what data to expect by inspecting the stream’s schema, or some actual messages. Second, as soon as their consumer app is live and crashes because of an unexpected message, they can use Stream Browse & Search to easily find the message causing the problem by narrowing the time window.

SCHEMA UPLOAD

If you are using a streaming platform, it is advisable to use a schema as a contract for your data stream. Multiple formats can do this job, with Avro currently the most popular and widely used. Axual has supported Avro schemas for data streams for a very long time: when creating a stream, you select the schema to be used, and depending on the environment you can even select a different version of this schema.

Until now, those schemas came from a collection, a .jar file, which had to be uploaded to Self Service using a REST call. That was a bit cumbersome. Axual 2020.1 solves this.

The Way Forward is our New Upload Functionality

The new upload functionality in Axual 2020.1 makes life easy; all you need to do is follow these few steps:

  • You click “Schema upload” in the main menu.
  • You select the Avro source file, usually with extension .avsc.
  • The schema is then shown in a convenient viewer.

In the background, the Avro syntax is checked and a compatibility check is done to make sure the new schema is backwards compatible with the previous one.

  • Using the viewer, you verify that the contents of the schema match your expectations.
  • You enter a description of the schema, which helps users understand what payload the schema describes.
  • You enter the version of this particular schema and click “Upload” to upload it.

From this moment, the schema is available for registration. When you navigate back to the stream configuration, you will see that the newly uploaded schema for “AlertEvent” is available for use on a stream; it is displayed in the dropdown. With this method, you can upload not only entirely new schemas but also new versions of existing schemas. And with the automatic compatibility check, you can be sure that consumers keep working whenever you decide to change the schema version for a particular stream.
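To give a feel for what such a backwards compatibility check involves, here is a small sketch using the Apache Avro Java library. The two “AlertEvent” schema versions are hypothetical, and the check Self Service performs internally may differ in its details.

    import org.apache.avro.Schema;
    import org.apache.avro.SchemaCompatibility;
    import org.apache.avro.SchemaCompatibility.SchemaCompatibilityType;

    public class CompatibilityCheckSketch {
        public static void main(String[] args) {
            // Version 1 of a hypothetical AlertEvent schema.
            Schema v1 = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"AlertEvent\",\"fields\":["
              + "{\"name\":\"message\",\"type\":\"string\"}]}");
            // Version 2 adds a field with a default value.
            Schema v2 = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"AlertEvent\",\"fields\":["
              + "{\"name\":\"message\",\"type\":\"string\"},"
              + "{\"name\":\"severity\",\"type\":\"string\",\"default\":\"INFO\"}]}");

            // Backwards compatible means the new schema (reader) can decode
            // data written with the previous schema (writer).
            SchemaCompatibilityType result = SchemaCompatibility
                    .checkReaderWriterCompatibility(v2, v1)
                    .getType();
            System.out.println("v2 can read v1 data: " + result); // COMPATIBLE
        }
    }

The rule of thumb this demonstrates: adding a field with a default keeps a new version backwards compatible, because existing data without that field can still be read.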


PRODUCING AND CONSUMING VIA HTTP: REST PROXY

There are many ways to get data in and out of your streaming platform. How you do this depends on many things, one of which is whether custom development is supported at all. In some cases, there is no way to add custom code or components to an existing system, which makes it hard to let that system produce data to a streaming platform. Think of handheld devices, lightweight IoT hardware or legacy mainframe systems. To communicate with the web, such systems use plain HTTP, and implementing applications on them is often limited by hardware specifications beyond the developer’s reach. Now, the brokers in an Apache Kafka cluster do not support posting data via HTTP directly; if you want to do this, you need another component that performs the translation. This is where the REST proxy comes in.

With the REST proxy, you can produce and consume a stream’s messages, and register or fetch schemas for a particular stream, in any environment. Authentication and authorization of REST clients is no different from regular, custom apps that connect to Axual directly. It is based on mutual TLS, so clients should use a certificate to connect to the REST proxy, and trust the issuer of the REST proxy certificate.

The REST proxy has full support for Avro as well; you can read more about this in our updated documentation. Another point worth noting is that, compared to existing REST interfaces for Kafka, you can create the producer and send the payload in just one request. This makes the REST proxy much easier to use.
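As a rough illustration of the idea, the sketch below sends a message over HTTP with mutual TLS using plain Java. The endpoint path, payload shape, keystore file and password are placeholders rather than the documented REST proxy API, so consult the documentation for the actual request format.

    import javax.net.ssl.KeyManagerFactory;
    import javax.net.ssl.SSLContext;
    import java.io.FileInputStream;
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.security.KeyStore;

    public class RestProxyProduceSketch {
        public static void main(String[] args) throws Exception {
            // Load the client certificate used for mutual TLS (placeholder path and password).
            char[] password = "changeit".toCharArray();
            KeyStore keyStore = KeyStore.getInstance("PKCS12");
            try (FileInputStream in = new FileInputStream("client-keystore.p12")) {
                keyStore.load(in, password);
            }
            KeyManagerFactory kmf =
                    KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
            kmf.init(keyStore, password);
            SSLContext sslContext = SSLContext.getInstance("TLS");
            // Null trust managers fall back to the JVM default trust store,
            // which must contain the issuer of the REST proxy certificate.
            sslContext.init(kmf.getKeyManagers(), null, null);

            HttpClient client = HttpClient.newBuilder().sslContext(sslContext).build();

            // One request creates the producer and delivers the payload.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://rest-proxy.example.com/stream/alerts/produce"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(
                            "{\"key\":\"alert-1\",\"value\":{\"message\":\"disk almost full\"}}"))
                    .build();

            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }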

CONCLUSION

There you go! Welcome to Axual 2020.1, with the fantastic new features discussed in this article. It is the result of great teamwork. Even more has been added; you can learn all about it in our online documentation.

All of these features should really help you develop your streaming use cases and, more importantly, create business value from streaming data. Everything you have seen here is already available to Axual Cloud customers, and we would love to welcome you on board as well. You can learn more about Axual by subscribing to our YouTube channel or by contacting us. Together, we can give your business a competitive edge by getting value from your streaming data.

Joris Meijer
Security Officer, Customer Success
