0

I have specialized software that I created that can generate gigabytes of random data in seconds. I want to put all of these large, randomized files in a location where they can be accessed by 3–4 different computers in my workplace. The files are for fuzzing, so the computers can take them and feed them directly into the program (it's okay if they need to download the files first, but that's not preferred). It could be a local server (e.g. a RAID array on the network) or a remote server (e.g. Dropbox). How would I go about setting up a local or remote server, and which one would be more efficient?

The actual, clear question: How can I go about setting up a local or remote server for mass file storage, and which would be easier for my team?

noodles
  • 83
  • 4

1 Answer

0

You can host an Apache server and have your generator write the random file into the /var/www/html folder under a name such as random. All the other machines can then fetch the file over HTTP with wget, like so:

wget http://random-machine-ip/random
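To make this concrete, here is a minimal sketch of the server side, assuming a Debian/Ubuntu host. The package commands, file size, and dd step are assumptions; in practice the asker's own generator would write the file into the web root instead of dd, and random-machine-ip stands in for the server's address.

```shell
# On the server: install Apache, which serves /var/www/html by default
# on Debian/Ubuntu.
sudo apt-get install -y apache2

# Stand-in for the custom generator: write 1 GiB of random data
# directly into the web root so clients can fetch it right away.
sudo dd if=/dev/urandom of=/var/www/html/random bs=1M count=1024

# On each fuzzing machine: pull the current corpus file.
wget -O random http://random-machine-ip/random
```

Writing the file straight into /var/www/html avoids a separate copy step, but note that clients still download the full file each time; a network mount (e.g. NFS) would let the machines read it in place instead.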
Aayush
  • 557
  • 6
  • 17