
Importing Data from Kafka into SingleStore using Pipelines

Input Credentials
Define the BOOTSTRAP_SERVER, PORT, TOPIC, SASL_USERNAME, SASL_MECHANISM, SECURITY_PROTOCOL, and SASL_PASSWORD variables below for the integration, replacing the placeholder values with your own.
In [1]:
BOOTSTRAP_SERVER = 'bootstrap-server-url'
PORT = 'kafka-broker-port'  # e.g. 9092
TOPIC = 'kafka-topic'
SASL_USERNAME = 'username'
SASL_MECHANISM = 'sasl-mechanism'  # e.g. PLAIN or SCRAM-SHA-512
SECURITY_PROTOCOL = 'security-protocol'  # e.g. SASL_SSL
SASL_PASSWORD = 'password'
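Values defined in this Python cell can be referenced inside %%sql cells with {{variable}} templating, which is how the pipeline statement later in this notebook picks up these settings. As an optional, minimal sanity check, you can echo one of them back:
%%sql
SELECT '{{TOPIC}}' AS topic;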
This notebook demonstrates how to create a sample table in SingleStore, set up a pipeline to import data from a Kafka topic, and run queries on the imported data. It is designed for users who want to integrate Kafka data with SingleStore and explore the capabilities of pipelines for efficient data ingestion.
Pipeline Flow Illustration

Create a Table in SingleStore
Start by creating a table that will hold the data imported from Kafka.
In [2]:
%%sql
/* Feel free to change the table name and schema */

CREATE TABLE IF NOT EXISTS my_table (
    id INT,
    name VARCHAR(255),
    age INT,
    address TEXT
);
Create a Pipeline to Import Data from Kafka
You'll need to create a pipeline that pulls data from the Kafka topic into this table. This example assumes the messages in your Kafka topic are JSON.
Ensure that:
- You have access to the Kafka topic.
- Proper IAM roles or access keys are configured in SingleStore.
- The JSON messages have a structure that matches the table schema, as in the sample below.
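For reference, a message shaped like the following would map onto the table created above (an illustrative sample, not data from your topic):
{"id": 1, "name": "Alice", "age": 30, "address": "123 Main St"}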
Using these identifiers and keys, execute the following statement.
In [3]:
%%sql
CREATE OR REPLACE PIPELINE kafka_import_pipeline AS
LOAD DATA KAFKA '{{BOOTSTRAP_SERVER}}:{{PORT}}/{{TOPIC}}'
CONFIG '{
    "sasl.username": "{{SASL_USERNAME}}",
    "sasl.mechanism": "{{SASL_MECHANISM}}",
    "security.protocol": "{{SECURITY_PROTOCOL}}",
    "ssl.ca.location": "/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem"
}'
CREDENTIALS '{
    "sasl.password": "{{SASL_PASSWORD}}"
}'
INTO TABLE my_table
FORMAT JSON;
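Before starting the pipeline, you can optionally dry-run it. SingleStore's TEST PIPELINE statement consumes a batch and shows the rows it would load without writing them to the table (a minimal check; adjust the LIMIT as needed):
%%sql
TEST PIPELINE kafka_import_pipeline LIMIT 1;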
Start the Pipeline
To start the pipeline and begin importing the data from the Kafka topic:
In [4]:
%%sql
START PIPELINE kafka_import_pipeline;
You can check the status of your pipeline on the Pipelines page of the SingleStore UI, or query it directly with SQL as shown below.
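Pipeline state is exposed through the information_schema.PIPELINES view (view and column names per SingleStore's documentation; adjust if your version differs):
%%sql
SELECT PIPELINE_NAME, STATE
FROM information_schema.PIPELINES
WHERE PIPELINE_NAME = 'kafka_import_pipeline';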
Select Data from the Table
Once the data has been imported, you can run a query to select it:
In [5]:
%%sql
SELECT * FROM my_table LIMIT 10;
Check that all of the data has been loaded:
In [6]:
%%sql
SELECT COUNT(*) FROM my_table;
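If the count is lower than expected, it is worth checking for ingestion errors. A hedged example, assuming the information_schema.PIPELINES_ERRORS view as documented by SingleStore:
%%sql
SELECT * FROM information_schema.PIPELINES_ERRORS
WHERE PIPELINE_NAME = 'kafka_import_pipeline'
LIMIT 10;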
Conclusion
We have shown how to ingest data from a Kafka topic into SingleStoreDB using pipelines. These techniques should enable you to integrate your Kafka topics with SingleStoreDB.
Clean up
Remove the '#' to uncomment and execute the queries below to clean up the pipeline and table created.
Drop Pipeline
In [7]:
%%sql
#STOP PIPELINE kafka_import_pipeline;

#DROP PIPELINE kafka_import_pipeline;
Drop Table
In [8]:
%%sql
#DROP TABLE my_table;

Details
About this Template
This notebook demonstrates how to create a sample table in SingleStore and set up a pipeline to import data from a Kafka topic.
This Notebook can be run in Standard and Enterprise deployments.
License
This Notebook has been released under the Apache 2.0 open source license.