
Kafka Trifecta: Secure Kafka Setup with Multiple Authentication Methods

🧠 What is Kafka?

Apache Kafka is a distributed event streaming platform for building real-time data pipelines and streaming applications. It lets you publish, subscribe to, store, and process streams of records in a fault-tolerant, scalable way.

Kafka works on a producer-consumer model:

  • Producers send data to Kafka topics
  • Consumers subscribe to those topics and process the data
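As a rough mental model (an in-memory analogy, not the Kafka API), a topic is an append-only log that producers write to and consumers read from by offset:

```python
from collections import defaultdict

# In-memory analogy of Kafka topics: one append-only log per topic name.
logs = defaultdict(list)

def produce(topic, message):
    """Append a record to the topic's log and return its offset."""
    logs[topic].append(message)
    return len(logs[topic]) - 1

def consume(topic, offset=0):
    """Read all records from the given offset onward."""
    return logs[topic][offset:]

produce("mock_json", '{"referenceId": "REF-111"}')
produce("mock_json", '{"referenceId": "REF-112"}')
```

Real Kafka adds partitioning, replication, and durable storage on top of this basic log abstraction.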

🐘 What is Zookeeper?

Kafka uses Zookeeper to manage:

  • Broker metadata
  • Leader election for partitions
  • Configuration synchronization across brokers

While modern Kafka versions support a KRaft (Kafka Raft) mode that eliminates the need for Zookeeper, Zookeeper is still commonly used in development and legacy setups.


πŸ“ Project Structure

This repository demonstrates how to set up and run Kafka locally with three different security configurations:

kafka-trifecta/
β”œβ”€β”€ auth/                # SASL/PLAIN Authentication
β”‚   β”œβ”€β”€ Auth-README.md
β”‚   β”œβ”€β”€ client-ui/
β”‚   β”‚   └── kafka_ui.py
β”‚   β”œβ”€β”€ config/
β”‚   β”‚   β”œβ”€β”€ client.properties
β”‚   β”‚   └── kafka_server_jaas.conf
β”‚   β”œβ”€β”€ docker-compose.yml
β”‚   β”œβ”€β”€ kafka_auth.py
β”‚   └── requirements.txt
β”‚
β”œβ”€β”€ no-auth/             # No Authentication
β”‚   β”œβ”€β”€ No-Auth-README.md
β”‚   β”œβ”€β”€ client-ui/
β”‚   β”‚   └── kafka_ui.py
β”‚   β”œβ”€β”€ docker-compose.yml
β”‚   β”œβ”€β”€ kafka_no_auth.py
β”‚   └── requirements.txt
β”‚
└── sasl_ssl_auth/       # SASL_SSL Authentication with Certificates
    β”œβ”€β”€ SSL-Auth-README.md
    β”œβ”€β”€ client-ui/
    β”‚   β”œβ”€β”€ kafka_streamlit_app.py
    β”‚   └── ui_start_command.txt
    β”œβ”€β”€ config/
    β”‚   β”œβ”€β”€ consumer.properties
    β”‚   β”œβ”€β”€ kafka.env
    β”‚   β”œβ”€β”€ kafka_server_jaas.conf
    β”‚   β”œβ”€β”€ producer.properties
    β”‚   └── server.properties
    β”œβ”€β”€ cert/
    β”‚   β”œβ”€β”€ ca-cert.pem
    β”‚   β”œβ”€β”€ kafka.p12
    β”‚   β”œβ”€β”€ keystore/
    β”‚   β”‚   └── kafka.keystore.jks
    β”‚   β”œβ”€β”€ script.sh
    β”‚   └── truststore/
    β”‚       β”œβ”€β”€ ca-key
    β”‚       └── kafka.truststore.jks
    β”œβ”€β”€ docker-compose.yml
    └── requirements.txt

πŸ” Kafka Trifecta Overview

This repository provides three security configurations, each with its own README:

  1. No Authentication - No-Auth-README.md
  2. Username/Password Authentication (SASL/PLAIN) - Auth-README.md
  3. SASL_SSL Authentication with Certificates - SSL-Auth-README.md

Each setup includes:

  • Docker Compose files for Kafka + Zookeeper
  • FastAPI producer endpoints
  • Streamlit UI for interactive Kafka messaging

πŸ§ͺ Setup Instructions

Prerequisites

  • Docker installed and running
  • Docker Compose installed (v2+ recommended)
  • Java JDK (for certificate generation)
  • Python 3.7+ with pip
  • OpenSSL (for certificate generation)

Step 1: Clone the Repository

git clone https://github.com/Manikandan-t/kafka-trifecta.git
cd kafka-trifecta

Step 2: Choose Your Configuration

Choose one of the three configurations to set up:

  1. No Authentication
  2. SASL/PLAIN Authentication
  3. SASL_SSL Authentication with Certificates

Let's go through each setup in detail.


πŸ” 1. SASL/PLAIN Authentication Setup

πŸ› οΈ Setup Instructions

1. Start Kafka with Authentication

docker compose -f auth/docker-compose.yml up -d

This will start:

  • Zookeeper on port 2181
  • Authenticated Kafka broker on port 9092

2. Create a Kafka Topic

kafka-topics.sh --bootstrap-server localhost:9092 \
  --topic mock_json --create \
  --partitions 1 --replication-factor 1 \
  --command-config auth/config/client.properties
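The exact contents of auth/config/client.properties live in the repo; a typical SASL/PLAIN client config looks like this (using the repo's default deploy/deploy@123 credentials noted in the Notes section below):

```properties
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="deploy" \
  password="deploy@123";
```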

3. Set Up Python Environment

python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r auth/requirements.txt

4. Run the Services

# Start FastAPI Kafka Auth API
python auth/kafka_auth.py

# Start Streamlit UI
streamlit run auth/client-ui/kafka_ui.py

πŸ“‘ API Endpoints

  • POST /produce: Send a message to Kafka

    POST /produce
    Content-Type: application/json
    {
      "topic": "mock_json",
      "message": "{\"referenceId\": \"AUTH-001\", \"documentId\": \"1234567890\"}"
    }
  • GET /consume: Read messages from Kafka

    GET /consume

    Returns:

    {
      "messages": [
        "{\"referenceId\": \"AUTH-001\", \"documentId\": \"1234567890\"}"
      ]
    }
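Note that message is itself a JSON-encoded string, so a client encodes the payload before posting and decodes it after consuming. A minimal sketch of that round trip (stdlib only, not the repo's client code):

```python
import json

# The payload we want to send through Kafka.
payload = {"referenceId": "AUTH-001", "documentId": "1234567890"}

# The /produce request body: the payload is JSON-encoded into "message".
request_body = {"topic": "mock_json", "message": json.dumps(payload)}

# What /consume hands back: a list of raw message strings.
response = {"messages": [request_body["message"]]}

# Decoding each consumed message recovers the original payload.
decoded = [json.loads(m) for m in response["messages"]]
```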

πŸ–₯️ Streamlit UI Features

  • Refresh messages with a button
  • Parse and display valid JSON messages
  • Deduplicate already shown messages
  • Display raw messages if JSON decoding fails
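The parse/deduplicate behaviour can be sketched like this (an illustrative stdlib version, not the actual kafka_ui.py logic):

```python
import json

seen = set()  # messages already shown in the UI

def render(raw_messages):
    """Split unseen messages into parsed-JSON and raw-string displays."""
    parsed, raw = [], []
    for msg in raw_messages:
        if msg in seen:          # deduplicate already shown messages
            continue
        seen.add(msg)
        try:
            parsed.append(json.loads(msg))  # display valid JSON as objects
        except json.JSONDecodeError:
            raw.append(msg)                 # fall back to the raw string
    return parsed, raw
```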

πŸ“¦ 2. No Authentication Setup

πŸ› οΈ Setup Instructions

1. Start Kafka Without Authentication

docker compose -f no-auth/docker-compose.yml up -d

This will start:

  • Zookeeper on port 2181
  • Kafka broker on port 9092 (external), 29092 (Docker), 9999 (JMX)

2. Create a Kafka Topic

kafka-topics --create --topic mock_json --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

3. Set Up Python Environment

python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r no-auth/requirements.txt

4. Run the Services

# Start FastAPI Kafka API
python no-auth/kafka_no_auth.py

# Start Streamlit UI
streamlit run no-auth/client-ui/kafka_ui.py

πŸ“‘ API Endpoints

  • POST /produce: Send a message to Kafka

    POST /produce
    Content-Type: application/json
    {
      "topic": "mock_json",
      "message": "{\"referenceId\": \"REF-111\", \"documentId\": \"5354356788\"}"
    }
  • GET /consume: Read messages from Kafka

    GET /consume

    Returns:

    {
      "messages": [
        "{\"referenceId\": \"REF-111\", \"documentId\": \"5354356788\"}"
      ]
    }

πŸ–₯️ Streamlit UI Features

  • Refresh messages with a button
  • Parse and display valid JSON messages
  • Deduplicate already shown messages
  • Display raw messages if JSON decoding fails

πŸ” 3. SASL_SSL Authentication with Certificates Setup

πŸ› οΈ Setup Instructions

1. Generate SSL Certificates

First, you need to generate the SSL certificates using the provided script:

cd sasl_ssl_auth/cert
chmod +x script.sh
./script.sh

This script will:

  • Create a CA certificate and private key
  • Generate a keystore with a key pair and self-signed certificate
  • Sign the certificate with the CA
  • Create a truststore containing the CA certificate

2. Create Docker Network

docker network create kafka

3. Start Kafka Using Docker Compose

docker compose -f sasl_ssl_auth/docker-compose.yml up -d

4. Verify Kafka Logs

Check the logs to ensure Kafka is running properly:

docker logs example.kafka.com

You should see messages indicating the broker is ready.

5. Add Host Mapping

Edit your /etc/hosts file and add:

172.20.0.2 example.kafka.com

6. Create a Kafka Topic

kafka-topics.sh \
  --create \
  --bootstrap-server example.kafka.com:9092 \
  --command-config sasl_ssl_auth/config/producer.properties \
  --replication-factor 1 \
  --partitions 1 \
  --topic mock_json_topic
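The repo ships the real producer.properties under sasl_ssl_auth/config/; a typical SASL_SSL client config looks like this (credentials are the repo's user/bitnami defaults noted below; the truststore password is a placeholder):

```properties
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="user" \
  password="bitnami";
ssl.truststore.location=sasl_ssl_auth/cert/truststore/kafka.truststore.jks
ssl.truststore.password=<truststore-password>
```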

7. Set Up Python Environment

python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r sasl_ssl_auth/requirements.txt

8. Run the Streamlit UI

export KAFKA_BROKER="example.kafka.com:9092"
export KAFKA_TOPIC="mock_json_topic"
export KAFKA_SASL_USERNAME="user"
export KAFKA_SASL_PASSWORD="bitnami"
export KAFKA_CA_LOCATION="../cert/ca-cert.pem"
streamlit run sasl_ssl_auth/client-ui/kafka_streamlit_app.py
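Under the hood, the UI presumably maps those environment variables onto a client config. A sketch of that mapping, assuming a confluent-kafka-style config dict (ssl.ca.location and sasl.mechanism are librdkafka keys; group.id is a hypothetical value):

```python
import os

# Defaults mirror the exports above; the environment overrides them.
conf = {
    "bootstrap.servers": os.environ.get("KAFKA_BROKER", "example.kafka.com:9092"),
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": os.environ.get("KAFKA_SASL_USERNAME", "user"),
    "sasl.password": os.environ.get("KAFKA_SASL_PASSWORD", "bitnami"),
    "ssl.ca.location": os.environ.get("KAFKA_CA_LOCATION", "../cert/ca-cert.pem"),
    "group.id": "streamlit-ui",        # hypothetical consumer group name
    "auto.offset.reset": "earliest",
}
topic = os.environ.get("KAFKA_TOPIC", "mock_json_topic")
# A confluent_kafka.Consumer(conf) would then subscribe([topic]); omitted
# here so the sketch stays dependency-free.
```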

πŸ“‘ API Endpoints

The FastAPI endpoints are not used in this configuration. Instead, you interact with Kafka directly through the Streamlit UI or CLI tools.

πŸ–₯️ Streamlit UI Features

  • Connect to Kafka broker with SSL authentication
  • Consume messages from a specified topic
  • View message details including timestamp, partition, offset, and value
  • Display raw JSON messages or plain text
  • Configure connection settings via sidebar

πŸ“‹ CLI Commands

Listing Topics

# For no-auth configuration
kafka-topics --list --bootstrap-server localhost:9092

# For auth configuration
kafka-topics.sh --list --bootstrap-server localhost:9092 --command-config auth/config/client.properties

# For SASL_SSL configuration
kafka-topics.sh --list --bootstrap-server example.kafka.com:9092 --command-config sasl_ssl_auth/config/consumer.properties

Publishing a Message

# For no-auth configuration
kafka-console-producer.sh --topic mock_json --bootstrap-server localhost:9092

# For auth configuration
kafka-console-producer.sh --bootstrap-server localhost:9092 --topic mock_json --producer.config auth/config/client.properties

# For SASL_SSL configuration
kafka-console-producer.sh --topic mock_json_topic --bootstrap-server example.kafka.com:9092 --producer.config sasl_ssl_auth/config/producer.properties

Consuming Messages

# For no-auth configuration
kafka-console-consumer.sh --topic mock_json --bootstrap-server localhost:9092 --from-beginning

# For auth configuration
kafka-console-consumer.sh --topic mock_json --bootstrap-server localhost:9092 --consumer.config auth/config/client.properties --from-beginning

# For SASL_SSL configuration
kafka-console-consumer.sh --topic mock_json_topic --bootstrap-server example.kafka.com:9092 --consumer.config sasl_ssl_auth/config/consumer.properties --from-beginning

πŸ§ͺ Cleanup

To stop and remove all containers:

# For no-auth configuration
docker compose -f no-auth/docker-compose.yml down

# For auth configuration
docker compose -f auth/docker-compose.yml down

# For SASL_SSL configuration
docker compose -f sasl_ssl_auth/docker-compose.yml down

πŸ“Œ Notes

  1. Certificate Management: For the SASL_SSL configuration, the ca-cert.pem file is required for the Streamlit UI to trust the CA. Ensure this file is accessible when running the UI.

  2. Docker Network: The SASL_SSL configuration requires a Docker network named kafka to be created before starting the containers.

  3. Host Mapping: The /etc/hosts entry is necessary for the SASL_SSL configuration to resolve the hostname correctly.

  4. Security Best Practices:

    • Never use the provided default credentials (deploy/deploy@123, user/bitnami) in production
    • Always rotate certificates and credentials regularly
    • Use separate certificates for different environments (development, staging, production)
  5. Troubleshooting Tips:

    • If you encounter connection issues, check Kafka logs with docker logs example.kafka.com
    • Verify that the ssl.ca.location is correctly pointing to the CA certificate
    • Ensure port 9092 is reachable; all three configurations expose the broker on 9092 (the no-auth setup additionally uses 29092 for Docker-internal traffic and 9999 for JMX)

πŸ™ Acknowledgements

This project was inspired by the need to have a simple, secure way to test and develop with Kafka locally.
