3

There is a requirement to set up an SFTP server. It will be used by some of our users' B2B agents to upload files.

This kind of setup always carries some risk in a production environment, since a user could upload malicious content. I am not sure exactly how such an attack would be carried out, but I'd rather not take any chances.

I need recommendations to take back to the product team and would appreciate advice on implementing this setup in a more secure way. What are the key pointers for achieving this?

EDIT:

OS: CentOS.

Shritam Bhowmick

2 Answers

3

Anywhere you provide the facility to upload content, there is a risk of uploading malware. Either manage the risk or don't allow uploads.

Regarding the protocol: unlike HTTP(S) and SMTP servers, I'm not aware of any SFTP server that lets you deal with the files as a synchronous operation triggered by the upload. However, it is possible to do it asynchronously on Linux with minimal latency using inotify (though there are some complications around handling transfers that take a while to complete). You can also poll the directories or monitor the SFTP log to trigger processing on the gateway.
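
As an example, here is a minimal sketch of the directory-polling option (the layout under /srv/sftp, the poll interval, and the size-stability check are illustrative assumptions, not requirements):

    import os
    import time

    UPLOAD_ROOT = "/srv/sftp"   # illustrative; one uploads/ directory per user underneath
    POLL_SECONDS = 5

    def snapshot():
        """Map every file currently in a user's uploads/ directory to its size."""
        sizes = {}
        for user in os.listdir(UPLOAD_ROOT):
            updir = os.path.join(UPLOAD_ROOT, user, "uploads")
            if not os.path.isdir(updir):
                continue
            for name in os.listdir(updir):
                path = os.path.join(updir, name)
                if os.path.isfile(path):
                    sizes[path] = os.path.getsize(path)
        return sizes

    def process(path):
        # Placeholder for the real gateway work: scan and validate the file,
        # then move it out of the upload area (see the clamdscan sketch below).
        print("would process", path)

    previous = {}
    while True:
        current = snapshot()
        for path, size in current.items():
            # Only handle files whose size is unchanged since the last poll -
            # a crude guard against picking up transfers still in progress.
            if previous.get(path) == size:
                process(path)
        previous = current
        time.sleep(POLL_SECONDS)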

Regardless of the method you use, keep uploads and downloads separate, keep each user's uploads separate, and do not make content available for download until you've checked it.
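
One possible layout that reflects this separation (the user name and paths are purely illustrative):

    /srv/sftp/alice/uploads/       write-only drop area for the user
    /srv/sftp/alice/downloads/     read-only, populated only after checks pass
    /srv/sftp/quarantine/alice/    gateway staging area, not exposed over SFTP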

As to the processing required to deem a file safe: using an anti-virus product is one solution, however:

  1. they are only effective against known exploits
  2. most configure themselves to run on-access, which is a nightmare for a gateway-type scenario where you want to control the dataflow programmatically
  3. the ones that can be configured to scan on-demand frequently do so as a single process - i.e. the program needs to load the entire virus definition file for each operation, which uses a lot of memory, disk I/O and time

(A notable exception to 2 and 3 is clamdscan, which acts as a frontend to a scanning daemon.)
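
A rough sketch of what the gateway's per-file check could look like with clamdscan (the directory names are illustrative; clamdscan exits 0 when no threat is found):

    import shutil
    import subprocess

    def scan_and_release(path, download_dir, quarantine_dir):
        """Scan one uploaded file with clamdscan, then move it out of the upload area."""
        result = subprocess.run(["clamdscan", path])
        if result.returncode == 0:
            # Clean according to the scanner: only now expose it for download.
            shutil.move(path, download_dir)
        else:
            # Infected (1) or scan error (2): keep it away from the download area.
            shutil.move(path, quarantine_dir)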

symcbean
  • The client wants to use our SFTP server to upload/download employee data and SAP data for HRMS/SAP integration. If this is the data involved, do you see the aforementioned as a feasible solution? – Shritam Bhowmick Apr 11 '16 at 16:43
  • Yes, the content doesn't really make a lot of difference; how it's encoded matters. I infer that this will be structured data - CSV or similar. But don't use a CSV parser to validate the content - these just abstract away the metadata when you should be checking it alongside the data structure. Time to brush up on your awk skills (hint: use igawk, and be rigorous about your file naming convention and method for handover of files). – symcbean Apr 11 '16 at 22:05
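
As a rough illustration of that last point - checking structure and encoding alongside the data rather than letting a parser hide it - here is a small validator (written in Python rather than gawk for brevity; the semicolon-separated field layout is an invented example, not the client's actual format):

    import re
    import sys

    # Hypothetical record layout for an employee feed: id;surname;given name;hire date.
    FIELD_PATTERNS = [
        re.compile(r"^\d{6}$"),               # employee id: exactly six digits
        re.compile(r"^[A-Za-z '-]{1,60}$"),   # surname
        re.compile(r"^[A-Za-z '-]{1,60}$"),   # given name
        re.compile(r"^\d{4}-\d{2}-\d{2}$"),   # hire date, ISO style
    ]

    def validate(path):
        """Return an error string for the first malformed line, or None if the file looks clean."""
        try:
            with open(path, encoding="ascii") as handle:   # anything non-ASCII is suspect here
                for lineno, line in enumerate(handle, 1):
                    fields = line.rstrip("\n").split(";")
                    if len(fields) != len(FIELD_PATTERNS):
                        return "line %d: wrong number of fields" % lineno
                    for pattern, value in zip(FIELD_PATTERNS, fields):
                        if not pattern.match(value):
                            return "line %d: unexpected value %r" % (lineno, value)
        except UnicodeDecodeError:
            return "file contains non-ASCII bytes"
        return None

    if __name__ == "__main__":
        error = validate(sys.argv[1])
        if error:
            print(error)
            sys.exit(1)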
0

While any type of external access given to third parties implies some level of security risk, the real danger arises when both SSH shell access (command execution) and SFTP (file transfer, especially upload) are enabled at the same time.

Basically, if you only allow SSH and the server is hardened properly (chroot, limited user privileges, and so on), then the risk is limited. Similarly, if you only allow SFTP (no shell, no command execution), the risk is also limited, because whatever users upload, they cannot run it anyway.

But when a user has both shell and SFTP access over SSH, you are taking the risk that they upload a potentially malicious file and then run it.

So the recommendation is to allow either shell access or SFTP on your SSH server, but do not allow both for any user.
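
With OpenSSH this is commonly done with a Match block that forces the internal SFTP subsystem and chroots the user; the group name and chroot path below are illustrative:

    # /etc/ssh/sshd_config -- SFTP-only accounts: no shell, no command execution
    Match Group sftponly
        ChrootDirectory /srv/sftp/%u
        ForceCommand internal-sftp
        AllowTcpForwarding no
        X11Forwarding no

Note that sshd requires the chroot directory itself to be owned by root and not writable by the user, so uploads typically go into a writable subdirectory underneath it.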

FjodrSo