March 31, 2023

How are you managing Kafka? How the Axual ownership model makes your job easier

Apache Kafka has become the default solution when it comes to enabling event streaming. Operating an Apache Kafka deployment and managing connections and content can interrupt development processes and slow down delivery. In this blog, we explain how the Axual approach to ownership helps speed up the development of new topics and applications.


Axual offers two solutions for enterprise event streaming: Axual Platform and Axual Governance.

  • Axual Platform provides a self-service governance component for managing topics, schemas and applications on top of a full streaming stack, including Kafka clusters, Connect, Schema Registry, multi-region data replication and more.
  • Axual Governance provides the same self-service governance component on top of an existing Kafka deployment.

Both solutions provide organizations with extensive governance features for their streaming platform. One of the most important features is how the Axual solutions handle ownership.

What is ownership, and why do we need it?

Ownership can have many different meanings when applied to different fields.
When it comes to data and application governance, ownership refers to the person or group responsible for the creation, maintenance, accuracy, and security of data and the applications that use that data within an organization.
This includes defining data policies and standards, ensuring compliance with relevant regulations and standards, and ensuring that applications are secure, reliable, and scalable.

The Axual model for resources and ownership

Within Axual, groups are assigned as owners of a resource instead of individual users. A single user owning a resource can become a bottleneck when someone needs access to the resource and that user is unavailable.
Assigning a group as owner makes it possible for more than one person to share the responsibilities of owning the resource.
Axual currently has three types of resources that require explicit ownership: Topic, Application and Environment.

Environment

Environment is an abstraction that allows topic and application owners to specify configurations and access for different purposes. An environment is mapped to one or more Kafka clusters, and resources configured for an environment are applied to the relevant clusters as soon as a configuration changes.
The most common approach is to define environments along the Development, Test, Acceptance and Production (DTAP) model, where each development team can have its own environment to develop and test applications in isolation.
Resources on the Kafka clusters use a namespacing policy to prevent name collisions, which allows multiple environments to share the same Kafka cluster.
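
As an illustration of such a namespacing policy, here is a minimal sketch of how a logical topic name could be resolved to a cluster-level name per environment. The pattern, class name and example values are assumptions made for this sketch, not Axual's actual resolving logic.

```java
// Hypothetical namespacing resolver; the "{tenant}-{instance}-{environment}-{topic}"
// pattern is an assumption for illustration only.
public final class TopicNameResolver {

    private final String pattern;

    public TopicNameResolver(String pattern) {
        this.pattern = pattern;
    }

    /** Resolves a logical topic name to the physical name used on the Kafka cluster. */
    public String resolve(String tenant, String instance, String environment, String topic) {
        return pattern
                .replace("{tenant}", tenant)
                .replace("{instance}", instance)
                .replace("{environment}", environment)
                .replace("{topic}", topic);
    }

    public static void main(String[] args) {
        TopicNameResolver resolver =
                new TopicNameResolver("{tenant}-{instance}-{environment}-{topic}");
        // The same logical topic resolves to a different physical topic per environment,
        // so multiple environments can safely share one Kafka cluster.
        System.out.println(resolver.resolve("acme", "main", "dev", "orders"));  // acme-main-dev-orders
        System.out.println(resolver.resolve("acme", "main", "prod", "orders")); // acme-main-prod-orders
    }
}
```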

An environment owner is responsible for the following:

  • Determining on which Kafka clusters the environment should be available
  • Determining if automatic approval of access requests is enabled
  • Determining if the environment can be seen and used by users outside of the owners group

Multi-environment support is coming to Axual Governance soon. Read more about Axual Governance in the launch blog or start a free, 30-day trial with your own Kafka cluster.

Topic

Topic is the resource that defines global attributes of a topic that are the same regardless of which environment is used (this resource is called Stream in Axual Platform).
These global attributes contain the owner details, a description of the topic, and the schemas used for keys and values. The topic owner can also add freeform key-value fields for additional information, such as the AIC classification.
Environment-specific settings include the retention time, the number of partitions and the specific schema versions.
Creating or updating a configuration for an Environment will create or update the topic on all Kafka clusters attached to the environment.
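
To make this concrete, the sketch below shows what creating such a topic boils down to at the Kafka level, using the standard Kafka AdminClient rather than Axual's own tooling. The bootstrap address, topic name, partition count and retention value are placeholders; in an Axual setup this step is performed by the platform itself when the environment configuration is saved.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.config.TopicConfig;

import java.util.List;
import java.util.Map;
import java.util.Properties;

public class CreateEnvironmentTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder bootstrap address for a cluster attached to the environment.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-dev:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Environment-specific settings: partition count and retention time.
            NewTopic topic = new NewTopic("acme-main-dev-orders", 6, (short) 3)
                    .configs(Map.of(TopicConfig.RETENTION_MS_CONFIG, "604800000")); // 7 days
            admin.createTopics(List.of(topic)).all().get();
        }
    }
}
```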

A topic owner is responsible for the following things:

  • The correctness of the contents of the topic
  • The schema definitions used for the events published on the topic
  • Performing compatibility checks when new schema versions are created (see the sketch after this list)
  • Making sure that the data is classified correctly according to the standards and policies of the organization
  • Approving or denying applications access to the topic on different environments
  • Communicating changes to the topic data to the application owners in a timely manner
  • Keeping application owners informed about any additional requirements that applications need to adhere to when using the topic data, for example using encrypted storage because the events on the topic contain private data
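
For the compatibility check mentioned in the list above, a schema registry that exposes a Confluent-compatible REST API can be asked whether a candidate schema is compatible with the latest registered version before it is put into use. The sketch below is an illustration only: the registry URL, subject name and schema are placeholders, and whether your registry exposes this exact endpoint depends on your deployment.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CompatibilityCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder registry URL and subject; adjust to your own deployment.
        String registry = "http://schema-registry:8081";
        String subject = "acme-main-dev-orders-value";

        // Candidate schema, JSON-escaped as the REST API expects.
        String body = "{\"schema\": \"{\\\"type\\\": \\\"string\\\"}\"}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(registry + "/compatibility/subjects/" + subject + "/versions/latest"))
                .header("Content-Type", "application/vnd.schemaregistry.v1+json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // Expected response: {"is_compatible": true} or {"is_compatible": false}
        System.out.println(response.body());
    }
}
```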

Application

Application is the resource that defines the global attributes of a consuming or producing application, regardless of which environment it is deployed on. These global attributes contain the application identifier, owner details and a description of the application.
The environment-specific settings consist of the identifying credentials and the access requests to produce and/or consume data from topics.
Creating or updating a configuration for an Environment will apply the proper access control lists on all Kafka clusters attached to the environment.
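
At the Kafka level, granting an application access ultimately translates into ACLs on the clusters attached to the environment. Below is a minimal sketch using the standard Kafka AdminClient; the principal, topic and consumer group names are hypothetical, and in an Axual setup the platform applies these bindings when an access request is approved.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;

import java.util.List;
import java.util.Properties;

public class GrantConsumeAccess {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-dev:9092"); // placeholder

        try (AdminClient admin = AdminClient.create(props)) {
            // Hypothetical application principal, e.g. the CN of its TLS certificate.
            AccessControlEntry allowRead = new AccessControlEntry(
                    "User:CN=order-processor", "*", AclOperation.READ, AclPermissionType.ALLOW);

            // Allow the application to read the topic and use its consumer group.
            AclBinding topicAcl = new AclBinding(
                    new ResourcePattern(ResourceType.TOPIC, "acme-main-dev-orders", PatternType.LITERAL),
                    allowRead);
            AclBinding groupAcl = new AclBinding(
                    new ResourcePattern(ResourceType.GROUP, "acme-main-dev-order-processor", PatternType.LITERAL),
                    allowRead);

            admin.createAcls(List.of(topicAcl, groupAcl)).all().get();
        }
    }
}
```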

An application owner is responsible for:

  • The correct processing of events
  • Making sure the data is processed according to the defined standards or guidelines
  • Generating and updating credentials
  • Requesting access to the topics

Advantages of the Axual ownership model

The Axual ownership model allows application developers and operators to prepare the Kafka environments for their applications and keep the application operating with only a minimal dependency on the topic owners.
The topic owners keep control of who is producing and consuming the data, and can update their schemas and other configuration settings without interfering with the applications.
The topic owner can still revoke access to an application when it is producing invalid data or is not adhering to the defined standards or guidelines.

The Axual solutions also provide separate roles that allow users to create new environments, streams or applications.


Answers to your questions about Axual’s All-in-one Kafka Platform

Are you curious about our All-in-one Kafka platform? Dive into our FAQs
for all the details you need, and find the answers to your burning questions.

What is Axual’s Ownership Model and why is it important?

Axual’s ownership model assigns groups, rather than individual users, as owners of key resources such as Topics, Applications, and Environments. This group-based ownership avoids bottlenecks caused by relying on a single user and ensures that multiple people can share responsibilities. Ownership in data governance is crucial because it defines who is responsible for maintaining the accuracy, security, and compliance of data and applications within an organization.

What responsibilities do environment owners have in Axual?

Environment owners in Axual are responsible for determining which Kafka clusters the environment should be available on, deciding whether access requests are approved automatically, and deciding whether the environment can be seen and used by users outside of the owning group. Environments typically follow a Development, Test, Acceptance and Production setup, and any configuration change for an environment is applied to all Kafka clusters attached to it.

How does Axual’s ownership model benefit organizations using Kafka?

Axual’s ownership model streamlines the management of Kafka environments by decentralizing control. Application owners can manage their environment and access with minimal dependency on Topic owners, while Topic owners maintain control over data access and compliance. This separation enhances collaboration, reduces dependencies, and ensures that applications and data comply with organizational policies, making the system more scalable and secure.

Richard Bosch
Developer Advocate
