Hi,
I tested the SELECT ... INTO KAFKA SQL statement, and the integration with Confluent Kafka seems to be working fine using:
SELECT to_json(telemtry_test.*)
FROM telemtry_test
INTO KAFKA '[broker-host-and-port]/[topic-name]'
CONFIG '{
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "…",
    "ssl.ca.location": "/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem"}'
CREDENTIALS '{
    "sasl.password": "…"}';
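For context, my understanding is that to_json(telemtry_test.*) serializes each row into a JSON object keyed by column name, so the value of each produced message looks roughly like this (made-up sample data):

{"sensor_id": "s-17", "timestamp": "2024-05-01 12:00:00", "value": 21.5}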
My question: how can I control, in the SQL, what gets used as the Kafka message key?
For example, my table above has three columns: sensor_id, timestamp, and value. I would like the sensor_id, with some hard-coded prefix prepended, to be used as the Kafka key for each message. How can I configure that? (See the sketch below for what I have in mind.)
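To make the goal concrete, here is a sketch of the kind of statement I am hoping is possible. The KEY clause is hypothetical (I have not found such syntax in the docs), and the 'sensor:' prefix is just an example:

SELECT to_json(telemtry_test.*)
FROM telemtry_test
INTO KAFKA '[broker-host-and-port]/[topic-name]'
-- hypothetical clause: derive the message key from a column plus a hard-coded prefix
KEY CONCAT('sensor:', sensor_id)
CONFIG '...'          -- same CONFIG as above
CREDENTIALS '...';    -- same CREDENTIALS as above

If nothing like this exists, is there another supported way to set the message key, for example via a CONFIG property?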