July 19, 2023

Release blog 2023.2 – the summer release

Another quarter, another fresh Axual Platform release. AsyncAPI support, a Terraform Provider update and, last but not least, connector logging are the highlights of the 2023.2 release of the Axual Platform. Keep reading to find out how these features enable you to manage your Apache Kafka and Kafka Connect cluster(s) effectively.


AsyncAPI – an API spec for your Kafka topics

If you expose an API, whether it is a REST API or a Kafka topic, you want the users of your API to understand how they can interact with it, both from a technical and a functional perspective. Let’s take an example: you expose a hotel room booking API. UI developers in your organization building an interface for hotel room booking need to understand:

  • Whether the API suggested to them will actually help them build the hotel booking interface
  • The endpoint to interact with, like hostname, port and protocol
  • The payload they need to send in order to create a booking for the hotel room
  • The response they can expect in the happy and not-so-happy flows
  • Who to contact whenever they have detailed questions about the API

In a larger organization with many software teams building and exposing APIs, it is important to have this information available centrally, to reduce any delay in software design and development. For RESTful APIs, the OpenAPI initiative is widely adopted as a means to standardize how APIs are described. OpenAPI is extremely useful for describing APIs where synchronous, one-to-one interaction is taking place, as in the UI ←→ API example above or when one API calls another API. However, when a Kafka topic is your API, the interaction is asynchronous, and the one API (topic) you are exposing has many consumers.

The AsyncAPI Initiative is the industry standard for describing asynchronous APIs and has gained a lot of attention in recent years. Its goal is to make working with Event-Driven Architectures (EDAs) as easy as working with REST APIs.

AsyncAPI specifications contain the technical and functional details users of the API need to get going. A specification is typically a .yaml document with a couple of standard elements, such as “servers” (technical connection information) and “channels” (e.g. topic, queue).

As of Axual Platform 2023.2 we are adding AsyncAPI support. This means that you can view and download an AsyncAPI spec for a topic in an environment. How this works can be seen in the video below.

Using the AsyncAPI spec, a developer can more easily create an application to interact with the topic.
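
To make that concrete, here is a minimal sketch of what such an application could look like in Python: it reads the connection details from a downloaded AsyncAPI spec and starts consuming from the described topic. It assumes the confluent-kafka and PyYAML packages; the file name, consumer group and security settings are illustrative placeholders rather than Axual-specific values.

```python
# A minimal sketch, assuming the confluent-kafka and PyYAML packages are installed.
# The spec file name, consumer group and security settings below are placeholders
# for illustration, not values prescribed by Axual.
import yaml
from confluent_kafka import Consumer

with open("hotel-booking-asyncapi.yaml") as f:   # the spec downloaded from Self-Service
    spec = yaml.safe_load(f)

# "servers" holds the technical connection information, "channels" the topic(s).
server = next(iter(spec["servers"].values()))
channel = next(iter(spec["channels"]))           # first channel name, i.e. the topic

consumer = Consumer({
    "bootstrap.servers": server["url"],
    "security.protocol": "SSL",                  # assumption: TLS towards the cluster
    "group.id": "hotel-booking-ui",              # placeholder consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe([channel])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            raise RuntimeError(msg.error())
        print(f"{msg.topic()} [{msg.partition()}] {msg.value()}")
finally:
    consumer.close()
```

Because the bootstrap servers, protocol and topic name come straight from the spec’s “servers” and “channels” sections, the developer no longer has to chase that information down across wikis or chat channels.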

Connector logging – what is my connector doing?

If you are integrating Apache Kafka with an external system, such as a database, a message queue or similar, Kafka Connect is the tool you need. Using Self-Service, you can easily create and configure connectors and hook them up to the topics you want to read from (SINK) or write to (SOURCE).

Until now, you could only see the status of a particular connector in the Self-Service interface, including a stack trace when a connector crashed unexpectedly. As part of Axual Platform 2023.2, we improve the observability of your connector applications by providing Connector Logging. It has become an integral part of Self-Service and can be found on the “Logs” tab of the connector application detail page.

“Eat your own dog food” is a statement often heard in the IT industry. Well, Axual is eating its own dog food by writing the connector log entries to a Kafka topic. This brings additional benefits to you as a user of our platform. First of all, as an application owner you can easily define the retention time of the logging. The default retention is 1 week, but you can easily change that to whatever suits your needs. The second and more important benefit is that you can offload the logging to a system of your choice, e.g. by creating another (SINK) connector, as sketched below.
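
As an illustration of that offloading idea, the sketch below registers a sink connector against a plain Kafka Connect REST API. In an Axual environment you would normally create such a connector through Self-Service; the REST endpoint, log topic name and FileStream sink used here are assumptions made purely for the example.

```python
# A hedged sketch of offloading a connector-log topic with another sink connector,
# talking to a plain Kafka Connect REST API with the requests package. In Axual you
# would normally configure the connector through Self-Service; the endpoint, topic
# name and file path below are placeholders, not Axual defaults.
import requests

connector = {
    "name": "connector-log-offload",
    "config": {
        # FileStreamSink ships with Apache Kafka and keeps the example simple;
        # an Elasticsearch or S3 sink would be a more realistic offloading target.
        "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
        "tasks.max": "1",
        "topics": "my-connector-logs",       # placeholder name of the log topic
        "file": "/tmp/connector-logs.txt",
    },
}

response = requests.post(
    "http://localhost:8083/connectors",      # placeholder Connect REST endpoint
    json=connector,
    timeout=10,
)
response.raise_for_status()
print(response.json())
```

Swapping the FileStream sink for, say, an Elasticsearch or S3 sink connector turns the same pattern into a proper log-shipping pipeline.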

Browsing and searching the connector logs is as easy as browsing data on your topics. Enter a search term (or search without one), select an interval and the environment to search for logging.

Topic graph improvements

In the last two platform releases we revamped and improved the topic graph, and in this release we improve its usability even more. When you search for an entity in the topic graph, both topics and applications are returned. Clicking a topic or an application now highlights the selection and shows its details along with the connected applications or topics. This helps you find out immediately whether that is the topic or application you are looking for.

Terraform provider: topic approval request support

We have recently released the latest version of the Terraform Provider for Axual, which introduces support for:

  • Managing public environments: you are now able to manage public environments from Terraform files
  • Creating, approving/denying and deleting application access requests: create an authorization request for your application to produce to or consume from a topic.

The latest version of the Terraform Provider for Axual is 1.1.0 and it is available in the Terraform Registry. Be sure to check out the examples, especially the application_access_grant and application_access_grant_approval documentation pages, which should help you understand how to create and approve requests. Did you find any issues while using the provider? Please let us know.

Start your Kafka journey with Axual

Did reading this post get you excited? Are you struggling with Kafka in an enterprise setting? We are here to help. Here are a couple of next steps you could take:

– Start a free trial

– Request a demo and talk to our experts

– View other content and customer case studies



Joris Meijer
Security Officer, Customer Success
