Anywhere you provide the facility to upload content, there is a risk of uploading malware. Either manage the risk or don't allow uploads.
Regarding the protocol: unlike HTTP(S) and SMTP servers, I'm not aware of any SFTP server that lets you handle files synchronously, as an operation triggered by the upload itself. However, it is possible to do it asynchronously on Linux with minimal latency using inotify (though there are some complications around handling transfers that take a while to complete). Alternatively, you can poll the upload directories or monitor the SFTP log to trigger processing on the gateway.
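For the polling approach, the main wrinkle is the same one inotify has: a file that is still being transferred must not be handed off to the scanner. A minimal sketch (the directory layout and the `seen` bookkeeping dict are my assumptions, not anything an SFTP server provides) is to treat a file as complete only once its size is unchanged between two polls:

```python
import os

def stable_files(upload_dir, seen):
    """Return files whose size hasn't changed since the previous poll.

    `seen` is a dict mapping path -> size observed on the last poll;
    the caller keeps it alive between polls. A file is only reported
    once its size is stable, so in-progress transfers are skipped.
    """
    ready = []
    for name in os.listdir(upload_dir):
        path = os.path.join(upload_dir, name)
        if not os.path.isfile(path):
            continue
        size = os.path.getsize(path)
        if seen.get(path) == size:
            ready.append(path)  # unchanged since last poll: likely complete
        seen[path] = size
    return ready
```

A real gateway would call this from a loop with a sleep between polls, and would also want to remove entries from `seen` once a file has been processed. Note that "size unchanged" is a heuristic; a stalled transfer looks the same as a finished one, so you may also want an upload-area/quarantine split.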
Regardless of the method you use, you should keep uploads and downloads separate, and keep each user's uploads separate from everyone else's. And don't make content available for download until you've checked it.
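One way to enforce that separation is to give each user their own `upload` and `download` directories and only move a file into `download` after it has passed your checks. A sketch, assuming a hypothetical `/srv/gateway/<user>/upload` and `/srv/gateway/<user>/download` layout:

```python
import os
import shutil

def release_to_download(root, user, filename):
    """Move a checked file from the user's upload area to their download area.

    Call this only AFTER the file has passed whatever scanning the
    gateway applies; until then it stays in upload/, which download
    clients cannot see. The directory layout is an assumed convention.
    """
    src = os.path.join(root, user, "upload", filename)
    dst_dir = os.path.join(root, user, "download")
    os.makedirs(dst_dir, exist_ok=True)
    dst = os.path.join(dst_dir, filename)
    shutil.move(src, dst)
    return dst
```

If both directories are on the same filesystem, `shutil.move` is an atomic rename, so a download client never sees a half-moved file.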
As to the processing required to deem a file safe: using an anti-virus product is one solution, however:
1. they are only effective against known exploits;
2. most configure themselves to run on-access, which is a nightmare for a gateway-type scenario where you want to control the dataflow programmatically;
3. the ones that can be configured to scan on-demand frequently do so as a single, self-contained process, i.e. the program must load the entire virus-definition database for each invocation, which costs a lot of memory, disk I/O and time.

(A notable exception to 2 and 3 is clamdscan, which acts as a frontend to a long-running scanning daemon, so the definitions are loaded once.)
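Driving clamdscan from the gateway is then a one-call-per-file affair. ClamAV documents its exit codes as 0 for clean, 1 for infected and 2 for errors; the sketch below treats anything non-zero as "not safe" (the `cmd` parameter is there so the scanner binary can be swapped out, and is my addition):

```python
import subprocess

def scan_file(path, cmd=("clamdscan", "--no-summary")):
    """Return True only if the scanner reports the file clean.

    clamdscan exits 0 when no virus is found, 1 when one is found,
    and 2 on error; erring on the side of caution, an error is
    treated the same as a detection here.
    """
    result = subprocess.run([*cmd, path], capture_output=True)
    return result.returncode == 0
```

Because clamdscan only talks to the already-running clamd daemon over a socket, each call is cheap; this is what makes the on-demand, per-upload model practical compared with scanners that reload their definitions on every run.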