I could not find anything about this in the documentation. Is it possible with an FS or S3 pipeline?
No, pipelines can’t continuously ingest the contents of a single file that keeps being updated. An FS or S3 pipeline waits for complete files to arrive in its source folder and then loads them. Kafka pipelines behave similarly: they load whole messages at a time.
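For reference, a folder-based pipeline looks roughly like this. This is only a sketch issued through pymysql; the connection details, the `events_pipeline` and `events` names, and the `/data/pipeline-incoming` path are placeholders you would replace with your own:

```python
import pymysql

# Placeholder connection details, pipeline/table names, and folder path.
conn = pymysql.connect(host="localhost", port=3306,
                       user="root", password="", database="mydb")

try:
    with conn.cursor() as cur:
        # The FS extractor watches a folder glob and loads each complete
        # file it finds there, keeping track of files already loaded.
        cur.execute("""
            CREATE PIPELINE events_pipeline AS
            LOAD DATA FS '/data/pipeline-incoming/chunk-*.csv'
            INTO TABLE events
            FIELDS TERMINATED BY ','
        """)
        cur.execute("START PIPELINE events_pipeline")
finally:
    conn.close()
```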
If you can change the program that is appending to the file, you can make it write multiple files instead. Otherwise, consider writing a small program that monitors the file and splits the newly appended data into separate files in a different folder as it arrives, then use a pipeline to load from that folder.
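Here is a minimal sketch of that monitor-and-split approach, assuming the appended file is line-oriented. The source path, output folder, polling interval, and file naming are all made up for illustration; the output folder matches the glob used in the pipeline sketch above:

```python
import os
import time

# Hypothetical paths: the file being appended to, and the folder the
# FS pipeline loads from.
SOURCE_FILE = "/var/log/app/events.log"
PIPELINE_DIR = "/data/pipeline-incoming"
POLL_SECONDS = 5

def follow_and_split():
    """Poll the growing file and write each newly appended chunk as its
    own file, so a pipeline can pick the chunks up as they appear."""
    offset = 0
    chunk_no = 0
    while True:
        size = os.path.getsize(SOURCE_FILE)
        if size > offset:
            with open(SOURCE_FILE, "rb") as src:
                src.seek(offset)
                data = src.read(size - offset)
            # Only emit complete lines; leave any trailing partial line
            # for the next pass.
            last_newline = data.rfind(b"\n")
            if last_newline != -1:
                chunk = data[:last_newline + 1]
                tmp_path = os.path.join(PIPELINE_DIR, f".part-{chunk_no:08d}")
                final_path = os.path.join(PIPELINE_DIR, f"chunk-{chunk_no:08d}.csv")
                with open(tmp_path, "wb") as dst:
                    dst.write(chunk)
                # Rename so the pipeline never sees a half-written file.
                os.rename(tmp_path, final_path)
                offset += len(chunk)
                chunk_no += 1
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    follow_and_split()
```

Because each chunk is written under a temporary name and only renamed once complete, the pipeline only ever sees whole files, which fits the full-file loading behaviour described above.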
If your file is replaced wholesale rather than appended to, you can reset the pipeline’s offsets to earliest and restart it to reload it, but it’s a bit clunky.
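That reset looks roughly like the following sketch, again through pymysql with placeholder connection details and a hypothetical pipeline name, using STOP PIPELINE, ALTER PIPELINE ... SET OFFSETS EARLIEST, and START PIPELINE:

```python
import pymysql

# Placeholder connection details and pipeline name; adjust for your cluster.
conn = pymysql.connect(host="localhost", port=3306,
                       user="root", password="", database="mydb")

try:
    with conn.cursor() as cur:
        # Stop the pipeline, rewind it to the beginning of its source,
        # then start it again so it reloads everything.
        cur.execute("STOP PIPELINE my_pipeline")
        cur.execute("ALTER PIPELINE my_pipeline SET OFFSETS EARLIEST")
        cur.execute("START PIPELINE my_pipeline")
finally:
    conn.close()
```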