The requirement is that dictionary data from Kafka should be loaded into a table. If the dictionary is missing the key-value pair for a column, NULL should be inserted into that column. Currently, this fails with a column-mismatch error. Our tables are dynamic, so we can't define a column mapping with default values at pipeline-creation time. Is there a solution for this? We're fine with inserting into the table directly or using a stored procedure.
You could have the stored procedure receive the input as JSON, generate dynamic SQL for the INSERT, and run the result with EXECUTE IMMEDIATE. Alternatively, use an INSERT that names only the columns for which you provide values; the remaining columns will default to NULL. For example:
singlestore> create table t(a int, b int, c int);
Query OK, 0 rows affected (0.03 sec)
singlestore> insert into t(a,c) values(1,3);
Query OK, 1 row affected (0.23 sec)
singlestore> select * from t;
+------+------+------+
| a    | b    | c    |
+------+------+------+
|    1 | NULL |    3 |
+------+------+------+
1 row in set (0.03 sec)
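For the dynamic-SQL route, the statement-building step can be sketched outside the database as well. This is a minimal illustration in Python: `build_insert` is a hypothetical helper, not a SingleStore API, and in production the column names should be validated against a trusted schema before interpolation to avoid SQL injection.

```python
def build_insert(table, record):
    """Build a parameterized INSERT naming only the columns present in
    the record; columns absent from the record are simply not listed,
    so they fall back to NULL (or their column default) on insert."""
    cols = sorted(record)                      # deterministic column order
    col_list = ", ".join(cols)                 # e.g. "a, c"
    placeholders = ", ".join(["%s"] * len(cols))
    sql = f"INSERT INTO {table} ({col_list}) VALUES ({placeholders})"
    params = [record[c] for c in cols]         # values in the same order
    return sql, params

# A Kafka message missing key "b" produces an INSERT that omits column b:
sql, params = build_insert("t", {"a": 1, "c": 3})
# sql    -> "INSERT INTO t (a, c) VALUES (%s, %s)"
# params -> [1, 3]
```

A stored procedure doing the same thing inside SingleStore would assemble an equivalent string from the JSON payload's keys and pass it to EXECUTE IMMEDIATE.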