I would like to store large binary files (mongodump BSON files) in a Google Cloud Storage bucket mounted on a Google Compute Engine instance through gcsfuse. Everything works well, except that gcsfuse stages every file in a temporary directory on local disk each time I write one. My use case is storing files of up to 4 TB on GCS to reduce our storage costs, but if I have to keep a disk large enough to hold the temporary copies, GCS doesn't actually reduce my costs, since I end up paying for both kinds of storage (persistent disk and GCS).
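For reference, this is roughly my setup (bucket name and paths are placeholders):

```shell
# Mount the bucket; gcsfuse stages files being written under --temp-dir
# (the system temp directory by default) before uploading them to GCS.
gcsfuse --temp-dir /mnt/disks/scratch my-backup-bucket /mnt/gcs

# Write the dump straight onto the mount; each BSON file is first
# copied in full to the temp dir, so the disk needs as much free
# space as the largest file being written.
mongodump --out /mnt/gcs/backups/latest
```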
Is there a way to write large files to a mounted GCS bucket without consuming all that temporary space on the instance?
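One workaround I'm evaluating is bypassing the mount entirely and streaming the dump to GCS with gsutil, which can upload from stdin without a full local copy (names are placeholders), but I'd prefer to keep using the mounted bucket if possible:

```shell
# Stream the dump as a single archive directly to GCS;
# `gsutil cp -` reads from stdin, so no temp file of the
# full dump size is created on the instance.
mongodump --archive | gsutil cp - gs://my-backup-bucket/backups/dump.archive
```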