April 4, 2020

Axual Release Update 2020.1

Here it is, the latest version of Axual, the 2020.1 release. The new version introduces new features that allow you to speed up development, do testing and incident resolution with “Stream Browse & Search”, hook up external systems to Axual REST proxy and maintain schemas for your streams using Schema Upload.


Also, we have adopted a new version scheme. We are moving from an arbitrary version number (5.5.1 being the latest release) to a more meaningful one: our version numbers now contain the four-digit year of the release, followed by a sequence number. With four releases planned yearly, you can expect versions 2020.1 to 2020.4 this year. This new version number is easier to understand and remember, and it lets you quickly see how far you are behind the latest release. Now, let’s explore the wonders of Axual 2020.1.

NEW IMPROVEMENTS AND LATEST FEATURES OF AXUAL PLATFORM

STREAM BROWSE & SEARCH

In the latest release of our platform, we have added a Stream Browse & Search feature. Why did we introduce it?

Why Introduce Stream Browse & Search?

When you are building an application, whether it is an API or just another business application, you move your app through the Software Development Life Cycle, or SDLC for short. In this lifecycle, your DevOps team is responsible for testing the app in various environments, moving your app from DEV to PROD. In each environment, your app needs to pass essential tests before you move to the next one. Now, when you are testing an API, you might do some automated REST calls to your app to verify it is working in a particular environment. However, if you are building a streaming application, this is a different story.

To put this in perspective, let’s take the example of Team Rocky, a development team working on a streaming use case. Team Rocky is building a producer app, producing alert events to a stream. Consuming the messages from the stream would give them confidence that what was produced ended up on the platform. But building a consumer app just to verify this requires a lot of effort. Wouldn’t it be great if Team Rocky could verify this themselves, without building anything? We think so, just like you do. That’s why we introduced the Stream Browse & Search feature, which can be found on the “Stream detail” page.

How does Stream Browse & Search Work?

Stream Browse & Search allows you to browse or search for messages on a stream within a given time frame, using a keyword you specify. The search results are shown in a paginated list. Clicking a single record expands it to show a more detailed view of the message contents. You will also find header information there, which can be useful for debugging purposes: any header that your client application inserts will be visible here. So, any team can now easily determine whether the messages they produced have reached the platform by using Stream Browse & Search.
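Conceptually, the search boils down to filtering messages by a time window and a keyword. The sketch below (plain Python with invented message data; not Axual’s actual implementation) illustrates the idea:

```python
from datetime import datetime, timezone

# Hypothetical messages as (timestamp, headers, value) tuples,
# similar to what a Kafka consumer would receive.
messages = [
    (datetime(2020, 4, 1, 9, 0, tzinfo=timezone.utc),
     {"trace-id": "abc-1"}, '{"alert": "disk full", "host": "srv-01"}'),
    (datetime(2020, 4, 1, 9, 5, tzinfo=timezone.utc),
     {"trace-id": "abc-2"}, '{"alert": "cpu high", "host": "srv-02"}'),
    (datetime(2020, 4, 1, 10, 0, tzinfo=timezone.utc),
     {"trace-id": "abc-3"}, '{"alert": "disk full", "host": "srv-03"}'),
]

def browse_and_search(msgs, start, end, keyword):
    """Return messages within [start, end] whose value contains keyword."""
    return [m for m in msgs
            if start <= m[0] <= end and keyword in m[2]]

hits = browse_and_search(
    messages,
    start=datetime(2020, 4, 1, 8, 0, tzinfo=timezone.utc),
    end=datetime(2020, 4, 1, 9, 30, tzinfo=timezone.utc),
    keyword="disk full",
)
print(len(hits))  # only the 09:00 message matches both window and keyword
```

Narrowing the time window, as mentioned above, shrinks the candidate set before the keyword match is applied, which is what makes finding a single problematic message practical.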

Access to the messages of a stream is granted based on the following permissions:

  • Stream owner: you can see messages on streams you own
  • Tenant admin: you can see messages on any stream in any environment
  • Application owner of a connected application: you can see messages on any stream, in authorized environments

We have highlighted the advantage of this feature for producers, but consumers benefit from it as well. First of all, before they even start building their consumer app, they can get a sense of what data to expect by inspecting the stream’s schema or some actual messages. Second, once they are live with their consumer app and the app crashes because of an unexpected message, they can use Stream Browse & Search to easily find the message causing the problem by narrowing the time window.

SCHEMA UPLOAD

If you are using a streaming platform, it is advisable to use a schema as a contract for your data stream. Multiple formats are available, Avro currently being the most popular and widely used. Axual has long provided support for Avro schemas on your data streams. When creating a stream, you can select the schema to be used, and depending on the environment you can even select a different version of this schema.
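As an illustration of such a contract, here is what an Avro schema for an alert event might look like (the field names and namespace are invented for this example; an .avsc file is simply JSON, shown here embedded in Python so it can be parsed and inspected):

```python
import json

# A hypothetical .avsc file describing an "AlertEvent" record.
alert_event_avsc = """
{
  "type": "record",
  "name": "AlertEvent",
  "namespace": "io.axual.example",
  "fields": [
    {"name": "alertId", "type": "string"},
    {"name": "severity", "type": "string"},
    {"name": "timestamp", "type": "long"}
  ]
}
"""

schema = json.loads(alert_event_avsc)
print(schema["name"])                              # AlertEvent
print([f["name"] for f in schema["fields"]])       # field names of the record
```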

Until now, those schemas came bundled in a collection (a .jar file) that had to be uploaded to Self Service using a REST call, which was a bit cumbersome. Axual 2020.1 has solved this.

The Way Forward is our New Upload Functionality

The new upload functionality in Axual 2020.1 makes life easy; all you need to do is follow these steps:

  • You click “Schema upload” in the main menu
  • You select the Avro source file, usually with extension .avsc.
  • The schema is then shown in a convenient viewer.

In the background, the Avro syntax is checked and a compatibility check is done to make sure the new schema is backwards compatible with the previous one.

  • Using the viewer, you verify that the contents of the schema match your expectations
  • You enter a description of the schema, which helps users understand what payload the schema describes
  • You enter the version of the schema and click “Upload” to upload it.

From this moment on, the schema is available for registration. When you navigate back to the stream configuration, you will see that the newly uploaded schema for “AlertEvent” is available for use on a stream; it is displayed in the dropdown. With this method, you can upload not only entirely new schemas, but also new versions of existing schemas. Thanks to the automatic compatibility check, you can be sure that consumers keep working whenever you decide to change a schema version for a particular stream.
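Backward compatibility in Avro roughly means that data written with the old schema can still be read under the new one; for example, any field added in a new version must carry a default value so readers can fill it in for old records. The sketch below (plain Python; a deliberately simplified illustration of that one rule, not the registry’s actual algorithm) shows the idea:

```python
import json

old_schema = json.loads(
    '{"type": "record", "name": "AlertEvent", "fields": ['
    '{"name": "alertId", "type": "string"}]}')
new_schema = json.loads(
    '{"type": "record", "name": "AlertEvent", "fields": ['
    '{"name": "alertId", "type": "string"},'
    '{"name": "severity", "type": "string", "default": "INFO"}]}')

def added_fields_have_defaults(old, new):
    """One backward-compatibility rule: fields added in the new schema
    must carry a default, so records written with the old schema can
    still be decoded under the new one."""
    old_names = {f["name"] for f in old["fields"]}
    return all("default" in f
               for f in new["fields"] if f["name"] not in old_names)

print(added_fields_have_defaults(old_schema, new_schema))  # True
```

If the added `severity` field lacked a default, the check would fail, and an upload like that would be rejected by the compatibility check described above.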


PRODUCING AND CONSUMING VIA HTTP: REST PROXY

There are many ways to get data in and out of your streaming platform. How you do this depends on many things, one of which is whether there is support for custom development. In some cases, there is no possibility to add custom code or components to an existing system, which makes it hard to let that system produce data to a streaming platform. Think of handheld devices, lightweight IoT devices, or legacy mainframe systems. Such systems communicate with the web using plain HTTP, and implementing applications on them is often limited by hardware specifications beyond the developer’s reach. Now, the brokers in an Apache Kafka cluster do not support posting data via HTTP directly; if you want to do this, you need another component that performs the translation. This is where the REST proxy comes in.

With REST proxy, you can produce and consume a stream’s messages, and register or fetch schemas for a particular stream, in any environment. Authentication and authorization of REST clients is no different from regular, custom apps that connect to Axual directly: it is based on mutual TLS, so clients should use a certificate to connect to the REST proxy and trust the issuer of the REST proxy certificate.

REST proxy has full support for Avro as well; you can read more about this in our updated documentation. Another point worth noting is that, compared to existing REST interfaces for Kafka, you can create the producer and send the payload in just one request. This makes REST proxy much easier to use.
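To give a feel for what producing over HTTP looks like, the sketch below builds such a request with only the Python standard library. The endpoint URL and payload shape are hypothetical; consult the Axual REST proxy documentation for the actual URL structure and message format. The request is constructed but not sent:

```python
import base64
import json
import urllib.request

# Hypothetical endpoint -- the real URL structure is in the Axual docs.
url = "https://restproxy.example.com/stream/alerts/produce"

# Hypothetical payload shape: binary key and value, base64-encoded
# so they survive the JSON envelope.
payload = {
    "key": base64.b64encode(b"srv-01").decode(),
    "value": base64.b64encode(b'{"alert": "disk full"}').decode(),
}

request = urllib.request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Actually sending this would also require an ssl.SSLContext configured
# for mutual TLS: a client certificate plus trust of the proxy's issuer.
print(request.get_method(), request.full_url)
```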

CONCLUSION

There you go! Welcome to Axual 2020.1 and the fantastic new features discussed in this article. It is the result of great teamwork. More has been added as well; you can learn about it in our online documentation.

All of these features should really help you develop your streaming use cases and, more importantly, help you create business value from streaming data. Everything you have seen here is already available to Axual Cloud customers, and we would love to welcome you on board as well. You can learn more about Axual by subscribing to our YouTube channel or contacting us. Together, we can give your business a competitive edge by getting value from your streaming data.


Joris Meijer
Security Officer, Customer Success
