I have a wiki based on DokuWiki. I back it up by archiving all of its files, and I can automate a test that extracts the archive and lets me manually check that the website works. But what is the right way to verify that the whole backup is correct? My idea is this: while making the backup, a script visits every page it can reach, computes a hash of each retrieved page, and stores the hashes. When testing the backup, the script repeats the crawl against the restored copy and compares the hashes. Is that the right approach?
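The scheme described above can be sketched roughly like this. The base URL is a hypothetical placeholder, and the use of DokuWiki's `do=export_raw` action (which returns a page's raw wikitext rather than rendered HTML, so hashes aren't affected by theme or template changes) is an assumption about how you'd want to fetch pages:

```python
# Sketch of the proposed backup check: hash pages at backup time,
# re-hash them on the restored copy, and compare.
import hashlib
import json
import urllib.request

BASE_URL = "http://wiki.example.com"  # hypothetical wiki address


def fetch_page(page_id):
    """Fetch a page's raw wikitext via DokuWiki's do=export_raw action."""
    url = f"{BASE_URL}/doku.php?id={page_id}&do=export_raw"
    with urllib.request.urlopen(url) as resp:
        return resp.read()


def snapshot(page_ids, fetch=fetch_page):
    """Hash every reachable page at backup time; returns {page_id: sha256 hex}."""
    return {p: hashlib.sha256(fetch(p)).hexdigest() for p in page_ids}


def verify(manifest, fetch=fetch_page):
    """Re-hash the same pages on the restored wiki; return IDs whose hash differs."""
    return [p for p, digest in manifest.items()
            if hashlib.sha256(fetch(p)).hexdigest() != digest]
```

At backup time you would dump `snapshot(...)` to a JSON file next to the archive (e.g. with `json.dump`); at test time, load it and check that `verify(...)` returns an empty list. The `fetch` parameter is injectable mainly so the comparison logic can be exercised without a live wiki.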

Ishayahu
  • Theoretically you would need to hash whatever the wiki is running on, which is why Docker might be the best-case scenario, as opposed to hashing the filesystem of an entire native operating system. Hashing a Docker environment is more reliable, but using containerization such as Docker removes the need to hash that environment in the first place. Not that I am an advocate of containerization, but these constraints led me to recommend it. – Joseph Persie III Jan 14 '19 at 00:16
  • I don't need a backup of the whole filesystem, only the wiki's data – Ishayahu Jan 14 '19 at 04:27
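If only the wiki's data matters, an alternative to crawling pages over HTTP is to checksum the files themselves (DokuWiki stores pages as plain text under its `data/` directory) and compare the live tree against the restored one. A minimal sketch, assuming both trees are accessible on disk; the directory paths are examples:

```python
# Compare two directory trees (e.g. the live wiki's data/ dir and the
# restored backup's data/ dir) by per-file SHA-256 checksums.
import hashlib
import os


def dir_hashes(root):
    """Map each file's path (relative to root) to its SHA-256 hex digest."""
    hashes = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            rel = os.path.relpath(full, root)
            with open(full, "rb") as f:
                hashes[rel] = hashlib.sha256(f.read()).hexdigest()
    return hashes


def compare(live_root, restored_root):
    """Return sorted paths that are missing from one tree or differ in content."""
    live, restored = dir_hashes(live_root), dir_hashes(restored_root)
    missing = set(live) ^ set(restored)
    changed = {p for p in live.keys() & restored.keys() if live[p] != restored[p]}
    return sorted(missing | changed)
```

An empty result from `compare(...)` means every file in the backup matches the original byte for byte, which is a stronger guarantee than spot-checking rendered pages, though it doesn't prove the restored wiki actually serves correctly.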

0 Answers