I’m able to create a pipeline that connects to a Kafka endpoint and loads the data. Now I’m trying to pass the Kafka details in dynamically. Is there any way to avoid hardcoding the Kafka parameters in the MemSQL pipeline?
CREATE PIPELINE mypipeline AS
LOAD DATA KAFKA '127.0.0.1/my-topic'
INTO TABLE my_table;
Not within MemSQL directly. You could use a bash script with environment variables and substitute them in programmatically; see the attached mini-example:
create_pipeline_template.sql
use db;
CREATE OR REPLACE PIPELINE test_pipeline
AS LOAD DATA KAFKA '127.0.0.1/KAFKA_TOPIC'
INTO TABLE my_table;
start pipeline test_pipeline;
update_pipeline.sh
#!/bin/bash
CREATE_PIPELINE_TEMPLATE_PATH=$(readlink -f create_pipeline_template.sql)
# use your logic here to set topic name
export KAFKA_TOPIC=mytopic
# Use perl rather than sed to avoid problems with special characters,
# e.g. '/', in replacement text.
#
# Echo the generated SQL to stderr (for debugging) before running it.
cat "$CREATE_PIPELINE_TEMPLATE_PATH" \
  | perl -pe 's/KAFKA_TOPIC/$ENV{KAFKA_TOPIC}/g;' \
  | tee /dev/stderr \
  | mysql -u root -h 0 -P 3306