I recently noticed an unusual spike in my web server's traffic. The web stats showed that the small set of large binary files on my site had been downloaded in rapid succession by a group of seemingly related IP addresses. I used urlquery.net to find out who owns those IPs, and they all turned out to be Google's.
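In case it helps anyone checking their own logs: the usual way to confirm that an IP really belongs to Google's crawler is a reverse DNS lookup (the hostname should end in googlebot.com or google.com), followed by a forward lookup to make sure the name maps back to the same IP. Here's a rough Python sketch of that check; the IP addresses are placeholders, not the actual ones from my logs.

```python
import socket

# Placeholder IPs -- substitute the addresses that fetched the binaries
# from your own access log.
suspect_ips = ["66.249.66.1", "66.249.66.2"]

for ip in suspect_ips:
    try:
        # Reverse lookup: genuine Google crawlers resolve to hostnames
        # under googlebot.com or google.com.
        host, _, _ = socket.gethostbyaddr(ip)
        looks_like_google = host.endswith((".googlebot.com", ".google.com"))
        # Forward-confirm the hostname to guard against spoofed PTR records.
        forward_ips = socket.gethostbyname_ex(host)[2]
        confirmed = looks_like_google and ip in forward_ips
        print(ip, host, "confirmed Google" if confirmed else "NOT confirmed")
    except (socket.herror, socket.gaierror):
        print(ip, "no usable DNS entry")
```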
I came here looking for answers, and reading the other answers made me realize that Google may be scanning binaries for malware, or at least submitting them to a malware-detection service for scanning. We know that Google detects and flags malware on websites, so it's reasonable to assume that doing so involves downloading the files in question.
Google's 'If your site is infected' page says this: 'Use the Fetch as Google tool in Webmaster Tools to detect malware'.
Note also that the files in question do not appear in Google's search results, presumably because I use robots.txt to disallow crawling of those files. Assuming I'm right, when Google finds a binary file linked from a public web page, it will scan the file for malware regardless of robots.txt, but will only index it if robots.txt allows crawling. I think this is exactly what they should be doing, as long as the scanning is infrequent.
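To make the distinction concrete, here's a small Python sketch using the standard library's robots.txt parser; the `/downloads/` path and the rules are made up for illustration, not my real setup. It shows the kind of crawl rule that (presumably) keeps these files out of the index, even though the malware scan apparently fetches them anyway.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt along the lines of what I use; the
# /downloads/ path is invented for this example.
rules = [
    "User-agent: *",
    "Disallow: /downloads/",
]

parser = RobotFileParser()
parser.parse(rules)

# Googlebot won't crawl (and so won't index the contents of) anything
# under /downloads/, but the rest of the site remains fetchable.
print(parser.can_fetch("Googlebot", "http://example.com/downloads/tool.exe"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))          # True
```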
Update: Google seems to be doing this every ten days or so, which is going to eat into my bandwidth allowance.