
We provide a freemium service to upload large files and download them later on.

Using ClamAV, we scan every file for viruses after it is uploaded and before it can be downloaded. If a virus is found, the file is deleted and an HTTP 404 is returned.
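A minimal sketch of that workflow, assuming `clamscan` is on the PATH (its documented exit codes are 0 = clean, 1 = infected, other = error; the function names here are illustrative, not from our actual code):

```python
import os
import subprocess

def scan_result_action(exit_code):
    """Map clamscan's documented exit codes to an action:
    0 = clean, 1 = virus found, anything else = scanner error."""
    if exit_code == 0:
        return "allow"
    if exit_code == 1:
        return "delete"       # serve HTTP 404 for this file from now on
    return "quarantine"       # scanner error: do not serve the file

def scan_upload(path):
    """Scan an uploaded file and delete it if it is infected."""
    result = subprocess.run(["clamscan", "--no-summary", path])
    action = scan_result_action(result.returncode)
    if action == "delete":
        os.remove(path)       # deleted file -> later requests get 404
    return action
```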

Still Google Safe Browsing keeps detecting malware on our site:

...the last time suspicious content was found on this site was on 2015-08-30...

(Which was yesterday as of writing).

While my logs show that viruses/malware are detected from time to time, I also see that these files are downloaded zero times and are deleted automatically, as expected.

I've read through "What tools does the Google safe browsing service rely on?" and the linked resources.

Still, I'm not sure why I cannot get my site to be "Google Safe Browsing clean". Maybe:

  • the virus scanner we use is not good enough and lets some viruses through?
  • I'm making fundamental mistakes due to my limited knowledge of security architecture?

My question:

Do you have any idea what to do in order to get a clean Google Safe Browsing track record?

(If this is possible at all; even Dropbox and Google itself are listed there.)

Update 1:

As schroeder says, the problem may be other content on our site, not the uploaded files. We also provide Windows and Mac clients. These files are all scanned and reported as virus-free.

I'm confused…

Update 2: (2015-09-24)

I've installed "Sophos Server Security" on the server in question and see lots and lots of malicious uploads being deleted by Sophos now.

So ClamAV's detection rate seems to be a lot lower than Sophos's.

Hopefully, with the help of this new anti-virus solution, my server will never again serve as a virus source.

Uwe Keim
  • The Google Safe Browsing report says that the malware is "being downloaded and installed without user consent", which seems to indicate that it isn't the files, but some other portion of your site. – schroeder Aug 31 '15 at 05:20
  • Have you tried using the Google Webmaster Tools links? – schroeder Aug 31 '15 at 05:21
  • VirusTotal reports your site as clean: https://www.virustotal.com/en/url/0e392d448292b32a5c4071f2d21986a5274c17ad7a10e98988cf6dcb12c1278d/analysis/1440998388/ – schroeder Aug 31 '15 at 05:22
  • @schroeder Thanks. Yes, I did use the GWT links. They pointed to 404 URLs. Still, I marked them in GWT as "resolved". – Uwe Keim Aug 31 '15 at 05:22
  • I would not rely on ClamAV alone to find malware. While commercial antivirus is definitely not foolproof, ClamAV is in my experience far behind when it comes to detecting new malware, i.e. malware which does not match existing signatures. – Steffen Ullrich Aug 31 '15 at 05:28
  • Thanks, @SteffenUllrich any recommendation on something affordable that also runs on Windows Server? – Uwe Keim Aug 31 '15 at 05:29
  • I cannot recommend something specific. I also don't know your budget, and this is not the place for product recommendations anyway. The best approach would be to combine several scanners, maybe interface with VirusTotal too (if possible), restrict the types of files you accept for upload, etc. – Steffen Ullrich Aug 31 '15 at 05:35

2 Answers


There are several scenarios you may want to think about. I am not going to be exhaustive, but I will briefly mention the main possibilities that come to mind:

  • Your website is effectively clean

    Let us suppose your website is clean, as you say. So why does Google still blacklist it?

    There may be several reasons for this situation:

    1. Reason #1

      Google did not actually find anything wrong with your website this time, but it still does not trust you, because it cannot know your intentions: you might be someone who wants to attack your website's visitors and is merely pretending to be harmless until Google whitelists you, at which point you return to your nefarious habits. Google therefore needs time to be sure the clean-up is genuine. (That said, I am not accusing you of attacking people through your website; I am only explaining why Google may still distrust a site even if it is currently innocuous.)

      1.1. Solution for Reason #1:

      You need to scan repeatedly, on separate days, using Google Webmaster Tools, until Google trusts that your change (clean-up) is permanent.

    2. Reason #2

      Google believes that you have cleaned your website of malicious content, but you did not really finish that task, because you never requested a malware review from Google.

      2.1. Solution for Reason #2:

      After you have cleaned your website, request a review from Google.

    3. Reason #3

      You cleaned the infected webpages of their malicious content and submitted a review request, but Google still does not trust you.

      3.1. Solution for Reason #3:

      You need to delete those pages entirely, not only their malicious content. Yes, it is a radical solution, but it is the only one that will make Google trust you in this particular case, because otherwise Google will think you intend to use those pages to attack your visitors and that is why you are keeping them there.

  • Your website is not as innocuous as you might think

    1. Your domain has served as a bridge to deliver malware

    Google says that "Over the past 90 days, zeta-uploader.com did not appear to function as an intermediary for the infection of any sites." But given what was discussed above, you should not take this statement at face value.

    1.1. Why did Google not detect my domain as a malicious intermediary?

    There are several reasons that could explain why Google failed to detect this:

    • attackers obfuscate links in your webpages to domains that perform drive-by download attacks (which is what Google reports concerning your website).
    • another reason is that Google detected the obfuscated link but cannot follow it and determine the malicious chain of intermediary domains.
    • Google found the obfuscated link to the malicious domain, but the attacker frequently replaces such links with new ones that may be difficult for Google to detect, so it cannot be sure what decision to take.

    1.2. What to do then?

    I do not have many ideas for this situation, apart from searching for such links manually yourself or, if possible, hashing your webpages and setting up file/directory modification notifications.

  • What is the situation about my real case now?

It is not that good. Under the section "What happened when Google visited this site?", Google says you have "Malicious software includes 96 trojan(s), 14 exploit(s), 5 backdoor(s)" and that "3 page(s) resulted in malicious software being downloaded and installed without user consent" (drive-by download attacks).

Only backdoors can explain what really happened to your website.

What's the solution then?

This is too broad to answer, because you are the owner of the website and you know its security architecture, so you are the only one who can assess it and take decisions depending on the different scenarios/discoveries you may encounter. You may start by checking this: Google Webmasters help for hacked sites, and see which method to use to discover, remove and mitigate backdoors on your domain... and do not forget to review your scripts related to file upload.
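The "hash your webpages" idea suggested earlier can be sketched as a simple integrity snapshot: record a digest of every served file and periodically report anything that changed. This is only a sketch; the function names are illustrative, and a real deployment would persist the baseline and run the check on a schedule:

```python
import hashlib
from pathlib import Path

def snapshot(paths):
    """Record a SHA-256 digest for each served file."""
    return {str(p): hashlib.sha256(Path(p).read_bytes()).hexdigest()
            for p in paths}

def changed_files(baseline, current):
    """Return files whose digest is new or differs from the baseline."""
    return sorted(p for p, h in current.items() if baseline.get(p) != h)
```

Any file that shows up in `changed_files` without a corresponding deployment is a candidate for an injected obfuscated link or backdoor.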

Good luck.

P.S.

The question you linked to is one I asked a year ago, when I was developing software for a web vulnerability scanner to detect drive-by download attacks and malicious content on websites.

Matthew

Why does Google Safe Browsing keep detecting malware on my website?

Probably because there is or was malware on this site.

We provide a freemium service to upload large files and download them later on.

Unfortunately, these kinds of services are easily abused by anybody who wants to spread malware. Such people look for services which are not (yet) on some kind of blacklist, so the chances are higher that their malware reaches the target. Services which allow larger files might be especially attractive, since lots of commercial firewall vendors severely limit the size of the files they scan and let everything else pass through; typical limits are only 1–15 MB.

Using ClamAV, we scan every file for viruses after it is uploaded and before it can be downloaded.

While ClamAV is free, its detection rate is not very good compared to the better commercial antivirus solutions. One reason is that malware authors can easily tune their malware to bypass the detection algorithms ClamAV uses, since these algorithms are publicly known (open-source software). Also, ClamAV relies on the community to develop the product and keep it up to date, and does not have the manpower or the access to new threats that commercial vendors have.

But commercial vendors miss a lot of new malware too. To detect most of the malware, you had better combine several engines and tune them so that they prefer more false positives over letting some malware through. You could also check files against VirusTotal.
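One way to do the VirusTotal cross-check is to look a file up by its SHA-256 hash. The sketch below uses the VirusTotal v3 REST API (which postdates this 2015 thread; older clients used the v2 API), and the API key is a placeholder you must supply from your own account:

```python
import hashlib
import urllib.request

VT_API = "https://www.virustotal.com/api/v3/files/"

def vt_lookup_url(sha256_hex):
    """Build the VirusTotal v3 lookup URL for a file hash."""
    return VT_API + sha256_hex

def vt_report(path, api_key):
    """Fetch the multi-engine report for a file. Raises HTTPError 404
    if VirusTotal has never seen this hash."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    req = urllib.request.Request(vt_lookup_url(digest),
                                 headers={"x-apikey": api_key})
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

Note that hash lookups only catch malware VirusTotal has already seen; uploading unknown files to VirusTotal may be inappropriate for a service handling user data, so check the terms and your privacy obligations first.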

You should also severely limit the types of files you allow for upload, and disallow any types which typically contain malware. This includes any kind of executable file, but also office documents and PDF files.

Since such limitations might easily make your service unsuitable for some users, you might at least try to limit the impact of such files by requiring a manual action from the user before downloading the file, i.e. inform the user of the possible dangers and require some kind of captcha or a click on a checkbox to continue. This way your service cannot be used for drive-by downloads, because the content cannot be downloaded directly from the link but needs an explicit action by the user. And if properly done, robots will not see any remaining malware, since manual action is required to get the file (but beware: some robots execute JavaScript and can click checkboxes).
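Both mitigations can be sketched as follows. The extension deny-list is illustrative and deliberately incomplete, and the interstitial logic is a hypothetical shape for a web handler, not a complete implementation:

```python
# Illustrative deny-list of file types that typically carry malware.
RISKY_EXTENSIONS = {".exe", ".dll", ".scr", ".js", ".vbs", ".bat",
                    ".docm", ".xlsm", ".pdf"}

def upload_allowed(filename):
    """Reject uploads whose extension is on the deny-list."""
    dot = filename.rfind(".")
    ext = filename[dot:].lower() if dot != -1 else ""
    return ext not in RISKY_EXTENSIONS

def download_response(file_id, user_confirmed):
    """Serve the file only after an explicit user action, so the raw
    link cannot be used for drive-by downloads."""
    if not user_confirmed:
        return ("interstitial", file_id)   # warning page + checkbox/captcha
    return ("serve", file_id)
```

Note the extension check is a first filter only; extensions are trivially renamed, so content-based detection (magic bytes, the virus scanners above) still has to run behind it.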

Steffen Ullrich