
I would like to store large binary files (mongodump BSON files) in a Google Cloud Storage bucket mounted on a Google Compute Engine instance through gcsfuse. Everything works well, except that gcsfuse copies every file to a temporary folder on the instance whenever I write to the mount. My use case is storing files up to 4 TB on GCS to reduce our storage costs, but if I have to keep a disk large enough for the temporary files, GCS does not reduce my costs, since I have to pay for both kinds of storage (persistent disk and GCS).

Is there a way to write large files to a mounted GCS bucket without consuming all that temporary space on the Compute Engine instance?
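For reference, a mount along these lines (the bucket name, mount point, and staging path are assumptions, not taken from the question) shows the behaviour described: gcsfuse stages every written file in its temporary directory before uploading it to the bucket.

gcsfuse --temp-dir=/mnt/staging cold-backups /mnt/cold-backups

The --temp-dir flag only controls where that staging happens; it does not remove the need for local space equal to the size of the file being written.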

1 Answer


Here is a way to achieve the same result without mounting the GCS bucket at all. Use the streaming upload capability of gsutil to pipe the output of mongodump directly into the bucket:

mongodump -d DBNAME -c COLLECTIONNAME --out - | gsutil cp - gs://cold-backups/daily/COLLECTIONNAME.bson

See https://cloud.google.com/storage/docs/streaming for additional information.
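For completeness, the reverse direction also avoids local staging: assuming the same bucket path, a streaming download can be piped straight into mongorestore, which reads BSON from standard input when the dump path is a dash. Something like:

gsutil cp gs://cold-backups/daily/COLLECTIONNAME.bson - | mongorestore -d DBNAME -c COLLECTIONNAME -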