
I have an app written in PHP that saves and reads sensitive information to the file /sensitive/sensitive.txt. The file that saves and reads the sensitive information is user.php. The whole app is in a separate folder.

Ideally, only the root user would be able to read the sensitive file. I want to make sure that even if there is some malware file inside the FTP directory, created by the apache or ftp user in the past, it won't be able to read the sensitive file through the file_get_contents() function.

The problem is that the file is created and read by user.php on an HTTP request, so the file's owner becomes the apache user. Even if I could change the owner of this file to root, the next problem is that user.php won't be able to read the file when an HTTP request is triggered, because it is the apache user that executes user.php, and thus the apache user that reads the sensitive file.
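Roughly, the situation looks like this (a sketch of the setup described above, not working code for a fix):

```php
<?php
// user.php, executed on an HTTP request: PHP runs as the apache user,
// so the file it creates ends up owned by apache.
file_put_contents('/sensitive/sensitive.txt', 'secret data');

// If the file were instead chowned to root with mode 0600, this call,
// still running as apache, would fail with a permission-denied warning.
$data = file_get_contents('/sensitive/sensitive.txt');
```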

Is there any solution to this problem? How could I let user.php read and write the sensitive file on an HTTP request while not allowing any other file to read it?

1 Answer


It is possible to set things up so that only certain web server processes are allowed to read the file. However, the setup is somewhat involved:

It requires running the nginx web server with PHP-FPM, using separate application pools with different Unix users to control access to resources, as sketched below.
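As a minimal sketch, assuming a dedicated Unix user and pool named secureapp for the sensitive script and a second pool webapp for the rest of the application (all names and paths are illustrative), the PHP-FPM pool configuration could look like this:

```ini
; /etc/php-fpm.d/secureapp.conf (illustrative path and names)
; Workers in this pool run as their own Unix user, so a file owned by
; secureapp with mode 0600 is readable only by scripts served from here.
[secureapp]
user = secureapp
group = secureapp
listen = /run/php-fpm/secureapp.sock
listen.owner = nginx
listen.group = nginx
listen.mode = 0660
pm = ondemand
pm.max_children = 5

; Separate pool for the rest of the application, under a different user,
; which therefore cannot read secureapp's 0600 files.
[webapp]
user = webapp
group = webapp
listen = /run/php-fpm/webapp.sock
listen.owner = nginx
listen.group = nginx
listen.mode = 0660
pm = ondemand
pm.max_children = 10
```

On the nginx side, a location block matching user.php would fastcgi_pass to the secureapp socket while all other PHP requests go to the webapp socket; sensitive.txt would then be owned by secureapp with mode 0600.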

The simpler approach is to store the data in a separate MySQL database and use MySQL access control to restrict access to the desired user. That is, create a new MySQL user that can access only that database, and use that MySQL user in your PHP script.
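As a rough sketch, assuming a database sensitive_db, a table secrets, and a MySQL account sensitive_user (all names illustrative), the restricted account and the PHP side could look like this:

```php
<?php
// Illustrative setup: as the MySQL root user, create a dedicated account
// that can reach nothing but this one database, for example:
//   CREATE USER 'sensitive_user'@'localhost' IDENTIFIED BY 'strong-password';
//   GRANT SELECT, INSERT, UPDATE ON sensitive_db.* TO 'sensitive_user'@'localhost';

// user.php then connects with that restricted account:
$pdo = new PDO(
    'mysql:host=localhost;dbname=sensitive_db;charset=utf8mb4',
    'sensitive_user',
    'strong-password',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

// Read the sensitive value from the database instead of from the file:
$stmt = $pdo->prepare('SELECT secret FROM secrets WHERE name = ?');
$stmt->execute(['api_key']);
$secret = $stmt->fetchColumn();
```

Note that this only narrows the exposure: as the comment below points out, the database credentials themselves still have to live somewhere the script can read.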

Tero Kilkanen
  • I already use a database as well; I split the string and save half of it into the file and half into the database. The problem with the database is that if somebody could read the files, they could also find the credentials for that database, which have to be stored somewhere if I want the PHP script to connect to it. Thanks for the tip about the setup. – John Clark Aug 04 '19 at 10:02