If I want to use an S3 clone to host something somewhat sensitive (probably DigitalOcean Spaces, since it's the cheapest and the quality is probably perfectly good), is it sensible to do it this way: https://s3clone.example.com/Ki3mCdA3eFrC5haIVvUkZaUOuceisGmv85ZXt4qgXoZuMwOL1IKiG21Cm6i4u83wMpQZNqEibh4CHNUr61s6rCwvAK0IUsCCwZO8MTZa4Lwbif6MosSXzG0XNQXvYBqS/file.pdf ?

The reason to do this instead of using the API for proper access control is that I can then cache these files with Cloudflare and save even more on bandwidth costs. The only concern I can think of is that the URLs might get stored somewhere inadvertently (e.g. browser history on a public computer), but maybe there's a way to prevent this? I also can't find any articles, old questions, etc. about this.
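For concreteness, here is a minimal sketch of how such a capability URL might be minted, assuming Python with boto3 pointed at DigitalOcean Spaces; the bucket name, region, credentials, and key layout are all placeholders:

```python
import secrets

import boto3

# Hypothetical Spaces endpoint and credentials; substitute your own.
client = boto3.session.Session().client(
    "s3",
    endpoint_url="https://nyc3.digitaloceanspaces.com",
    aws_access_key_id="SPACES_KEY",
    aws_secret_access_key="SPACES_SECRET",
)

# 96 random bytes -> a 128-character URL-safe token, the same length
# as the token in the example URL above.
token = secrets.token_urlsafe(96)
key = f"{token}/file.pdf"

# The object itself is public-read: secrecy of the URL, not the
# bucket ACL, is what gates access.
client.upload_file("file.pdf", "my-bucket", key, ExtraArgs={"ACL": "public-read"})

print(f"https://my-bucket.nyc3.digitaloceanspaces.com/{key}")
```

Here `secrets.token_urlsafe(96)` yields 128 URL-safe characters (roughly 768 bits of entropy), matching the token length in the example; even a much shorter token would be computationally unguessable.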

john01dav
  • I would much prefer signed URLs with a limited time duration (see the sketch after these comments). https://www.digitalocean.com/docs/spaces/resources/s3-sdk-examples/#generate-a-pre-signed-url-to-download-a-private-file – Z.T. Feb 05 '21 at 21:02
  • @Z.T. Are you suggesting a system that caches the pre-signed URL and reuses it to get Cloudflare's caching ability, or generating a separate URL each time and sacrificing caching? – john01dav Feb 05 '21 at 21:15
  • I'm suggesting that you sacrifice caching. If you need caching, and it's acceptable that the link never expires and can't be invalidated by disabling the access key, your solution would work. Disabling a link would then involve renaming the file and waiting for the cache to expire. – Z.T. Feb 05 '21 at 21:40
  • relevant to the size of the token: *F.Grieu '20 https://crypto.stackexchange.com/a/80441* – brynk Feb 07 '21 at 03:58
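For comparison, a minimal sketch of the pre-signed-URL approach Z.T. suggests, again assuming boto3 against DigitalOcean Spaces (bucket, region, credentials, and expiry are placeholders):

```python
import boto3

# Hypothetical Spaces endpoint and credentials; substitute your own.
client = boto3.session.Session().client(
    "s3",
    endpoint_url="https://nyc3.digitaloceanspaces.com",
    aws_access_key_id="SPACES_KEY",
    aws_secret_access_key="SPACES_SECRET",
)

# Sign a GET for a private object; the URL stops working after an hour,
# and all outstanding links die if the access key is revoked.
url = client.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-bucket", "Key": "file.pdf"},
    ExpiresIn=3600,
)
print(url)
```

Because the signature in the query string changes with every call, Cloudflare's default cache key treats each signed URL as a distinct object, which is exactly the caching trade-off discussed in the comments above.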

0 Answers