Exporting Data to a File with Kafka Connect

30 minutes
  • 2 Learning Objectives

About this Hands-on Lab

Kafka Connect provides several connectors to help you move data to and from a variety of external sources. In this lab, you will implement a basic connector to copy Kafka data to a file on the local disk. This simple exercise will give you some hands-on experience interacting with Kafka Connect, and this introduction will prepare you to work with more advanced use cases.

Learning Objectives

Successfully complete this lab by achieving the following learning objectives:

Create a Connector to Export Data from the `inventory_purchases` Topic to a File
  1. Create a FileStreamSinkConnector:

    curl -X POST http://localhost:8083/connectors -H "Content-Type: application/json" -d '{
      "name": "file_sink",
      "config": {
        "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
        "tasks.max": "1",
        "file": "/home/cloud_user/output/output.txt",
        "topics": "inventory_purchases",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "org.apache.kafka.connect.storage.StringConverter"
      }
    }'

  2. We should see data from the topic appearing in the output file:

    cat /home/cloud_user/output/output.txt
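If the output file is empty or missing, you can check on the connector through the Kafka Connect REST API (this status endpoint is standard in Kafka Connect; the worker is assumed to be listening on its default port, 8083, as in the command above):

```shell
# Show the current state of the file_sink connector and its tasks.
# A healthy connector reports "state": "RUNNING" for both.
curl http://localhost:8083/connectors/file_sink/status
```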

Publish a New Purchase of Plums to the `inventory_purchases` Topic
  1. Start up a console producer:

    kafka-console-producer --broker-list localhost:9092 --topic inventory_purchases
  2. Publish some data representing a purchase of plums:

  3. Check the file and verify that the purchase of plums shows up in the file data:

    cat /home/cloud_user/output/output.txt

    Note: It may take a few moments for the connector to process the new data.
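The lab does not prescribe a record format; since the topic holds plain string data, any line of text will do. As an alternative to typing into the interactive producer session, a record can be piped in directly (the record text here is illustrative, not a required format):

```shell
# Publish a single illustrative record to the topic non-interactively.
# The producer reads one record per line from stdin, then exits at EOF.
echo "product: plums, quantity: 3" | kafka-console-producer --broker-list localhost:9092 --topic inventory_purchases
```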

Additional Resources

Your supermarket company is using Kafka to manage some data. They want to export data from a topic to a data file on the disk for analysis. You have been asked to set up a connector to automatically export records from the `inventory_purchases` topic to a file located at `/home/cloud_user/output/output.txt`.

Use the following information as you implement a solution:

  • The connector class org.apache.kafka.connect.file.FileStreamSinkConnector can be used to export data to a file.
  • Set the number of tasks to 1.
  • The data in the topic is string data, so use org.apache.kafka.connect.storage.StringConverter for key.converter and value.converter.

Here is an example of a connector configuration for a FileStream Sink Connector:

"connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
"tasks.max": "1",
"file": "<file path>",
"topics": "<topic>",
"key.converter": "<key converter>",
"value.converter": "<value converter>"

Once you have set up the connector, publish a new record to the topic for a purchase of plums:

kafka-console-producer --broker-list localhost:9092 --topic inventory_purchases


Check the file to verify that the new record appears:

cat /home/cloud_user/output/output.txt
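Because the sink connector writes to the file in batches, the new record may take a few moments to appear. Rather than re-running `cat`, you can watch the file as it grows (`tail -f` is a standard utility, not something specific to this lab):

```shell
# Follow the output file and print new lines as the connector appends them.
# Press Ctrl+C to stop watching once the plums record shows up.
tail -f /home/cloud_user/output/output.txt
```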

If you get stuck, feel free to check out the solution video, or the detailed instructions under each objective. Good luck!

