13

I am using s3cmd to upload to S3:

# s3cmd put 1gb.bin s3://my-bucket/1gb.bin
1gb.bin -> s3://my-bucket/1gb.bin  [1 of 1]
  366706688 of 1073741824    34% in  371s   963.22 kB/s

I am uploading from Linode, which has an outgoing bandwidth cap of 50 Mb/s according to support (roughly 6 MB/s).

Why am I getting such slow upload speeds to S3, and how can I improve them?


Update:

Uploading the same file via SCP from my Linode to an m1.medium EC2 instance's EBS drive gives about 44 Mb/s according to iftop (so any compression done by the cipher is not a factor).


Traceroute: Here's a traceroute to the server it's uploading to (according to tcpdump).

# traceroute s3-1-w.amazonaws.com.
traceroute to s3-1-w.amazonaws.com. (72.21.194.32), 30 hops max, 60 byte packets
 1  207.99.1.13 (207.99.1.13)  0.635 ms  0.743 ms  0.723 ms
 2  207.99.53.41 (207.99.53.41)  0.683 ms  0.865 ms  0.915 ms
 3  vlan801.tbr1.mmu.nac.net (209.123.10.9)  0.397 ms  0.541 ms  0.527 ms
 4  0.e1-1.tbr1.tl9.nac.net (209.123.10.102)  1.400 ms  1.481 ms  1.508 ms
 5  0.gi-0-0-0.pr1.tl9.nac.net (209.123.11.62)  1.602 ms  1.677 ms  1.699 ms
 6  equinix02-iad2.amazon.com (206.223.115.35)  9.393 ms  8.925 ms  8.900 ms
 7  72.21.220.41 (72.21.220.41)  32.610 ms  9.812 ms  9.789 ms
 8  72.21.222.141 (72.21.222.141)  9.519 ms  9.439 ms  9.443 ms
 9  72.21.218.3 (72.21.218.3)  10.245 ms  10.202 ms  10.154 ms
10  * * *
11  * * *
12  * * *
13  * * *
14  * * *
15  * * *
16  * * *
17  * * *
18  * * *
19  * * *
20  * * *
21  * * *
22  * * *
23  * * *
24  * * *
25  * * *
26  * * *
27  * * *
28  * * *
29  * * *
30  * * *

The latency looks reasonable, at least until the later hops stop responding to the traceroute probes.

Tom Marthenal
  • Start troubleshooting. What does your CPU usage look like while you're uploading? What does a `traceroute` look like? Is the send queue on your side of the TCP connection nearly full or nearly empty? With a few very simple tests you can quickly narrow down the problem. – David Schwartz Apr 08 '12 at 21:53

3 Answers

11

Just in case anyone stumbles upon this....

I had an issue where uploads from an EC2 instance to an S3 bucket were really slow, and the cause turned out to be really simple: the region of the bucket. I was using EC2 instances in Northern California; with the bucket created as US Standard the transfers were really slow, but with the bucket in Northern California they were heaps faster.
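
If you want to check this quickly, here is a minimal sketch using the boto3 library (the bucket name is just an example) that asks S3 for the bucket's location constraint:

import boto3

s3 = boto3.client("s3")

# An empty LocationConstraint means the bucket lives in "US Standard" (us-east-1);
# any other value names the region the bucket was created in.
response = s3.get_bucket_location(Bucket="my-bucket")
print(response.get("LocationConstraint") or "us-east-1")

Keeping the bucket in the same region as your instances avoids the extra cross-region hop.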

rabs
10

Just because you can go up to 50 Mb/s does not mean you will always get 50 Mb/s; the network path to S3 and the latency along it also matter.

If you are able to use multi-part upload, you can break the file into multiple pieces and upload them with multiple threads, which may increase the upload speed.
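
For example, here is a minimal sketch of a parallel multi-part upload with the boto3 library (the file name, bucket name, and part size are placeholders); TransferConfig makes boto3 split the file into parts and push several of them at once on separate threads:

import boto3
from boto3.s3.transfer import TransferConfig

# Force multi-part for anything over 64 MB, use 64 MB parts,
# and upload up to 10 parts concurrently.
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,
    multipart_chunksize=64 * 1024 * 1024,
    max_concurrency=10,
)

s3 = boto3.client("s3")
s3.upload_file("1gb.bin", "my-bucket", "1gb.bin", Config=config)

With several parts in flight, a single slow TCP connection no longer limits the overall throughput.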

gekkz
  • 1
    Uploading the same file to an m1.medium EC2 instance via SCP from my Linode runs at about 44 Mb/s (according to `iftop`, so compression doesn't matter), which is a lot closer to the 50 Mb/s outgoing bandwidth cap. S3 uploads are about a fifth as fast. The latest `s3cmd` supports multi-part uploads, but it uploads the parts consecutively rather than concurrently. Is there a better utility to upload to S3? – Tom Marthenal Apr 08 '12 at 18:39
  • I've added speed statistics from SCP (my Linode -> EC2 instance) to my question. – Tom Marthenal Apr 08 '12 at 18:42
  • Have you tried other tools similar to s3cmd, to determine whether it is actually s3cmd that is slow? Also, in case s3cmd is uploading over HTTPS, you may want to switch that to HTTP. Another possibility is that S3 itself is simply slower to upload to, since you seem to have no problems with EC2. – gekkz Apr 08 '12 at 19:13
  • 4
    Try s3multiput. – EEAA Apr 08 '12 at 19:13
1

I have seen many forum threads about slow uploads to S3 using various clients, from the free command-line tools (written in Python, by the way) to commercial ones.

Although I have no hard evidence for you, switching the S3 client (e.g. to CloudBerry Explorer) might solve your problem. Try it! :)

PythonLearner