How to avoid partial data load in FS pipeline

Hi,
I am using memSQL 6.8.12. I have created an FS pipeline for ingesting a CSV file from a defined location.
Sometimes the file is fairly large (100 to 500 MB) and it takes a few seconds to a few minutes to finish writing it.
But the pipeline ingests the file as soon as it is created, so it picks up partial data before writing to the file is complete.

Is there any way to ensure that the file is only read once writing is complete?
One option is to create the file in a different location and move it to the final location after it is fully written, so the pipeline reads the complete file.
But I wanted to know: if the file is being written directly to the final location, is there any way for the pipeline to read the complete data?

Thanks

You can specify files by extension using this syntax:
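For example (a minimal sketch; the pipeline name, table name, and directory are placeholders):

```sql
-- Only files matching *.done are picked up; in-progress *.tmp files are ignored.
CREATE PIPELINE my_csv_pipeline
AS LOAD DATA FS '/data/incoming/*.done'
INTO TABLE my_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
```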

So specify *.done as the extension in the CREATE PIPELINE statement, write each file as *.tmp, and then rename it to *.done once it has been fully written.