
When I try to upload a huge file (approx. 2 GB), the server's CPU usage goes really high. What should I do to fix this?

I just use a standard HTML form and PHP for the file upload.

I'm sorry if I posted on the wrong forum. Please point me in the right direction.

Here is the result of the "top" command while uploading 4 files (18 MB, 38 MB, 60 MB, 33 MB):

 1904 apache    20   0 33504 5740 1952 R 28.3  0.2   0:02.19 httpd
 1905 apache    20   0 33504 5740 1952 R 28.3  0.2   0:01.99 httpd
 1903 apache    20   0 33232 6968 3060 R 28.0  0.2   0:01.98 httpd
 1910 apache    20   0 33240 6020 2248 S 11.5  0.2   0:02.85 httpd
 2133 root      20   0  2656 1124  896 R  1.6  0.0   0:00.71 top
    1 root      20   0  2864 1404 1188 S  0.0  0.0   0:03.99 init

Here is the code for chunking. Even though I don't use this code (just a simple file upload), it still causes that high CPU usage:

        function sendRequest() {
            var fileInput = document.getElementById('fileToUpload');

            for (var i = 0; i < fileInput.files.length; i++) {
                var blob = fileInput.files[i];
                var originalFileName = blob.name;
                var filePart = 0;

                var BYTES_PER_CHUNK = 100 * 1024 * 1024; // 100 MB chunks
                var realFileSize = blob.size;

                var start = 0;
                var end = BYTES_PER_CHUNK;

                var totalChunks = Math.ceil(realFileSize / BYTES_PER_CHUNK);

                while (start < realFileSize) {
                    var chunk;
                    if (blob.slice) {
                        // standard API
                        chunk = blob.slice(start, end);
                    } else if (blob.webkitSlice) {
                        // older Google Chrome
                        chunk = blob.webkitSlice(start, end);
                    } else if (blob.mozSlice) {
                        // older Mozilla Firefox
                        chunk = blob.mozSlice(start, end);
                    }

                    uploadFile(chunk, originalFileName, filePart, totalChunks, i);

                    filePart++;
                    start = end;
                    end = start + BYTES_PER_CHUNK;
                }
            }
        }
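Separate from the CPU question: for uploads of this size, PHP's default limits will also reject the request outright, so each chunk has to fit under the configured caps. A minimal php.ini sketch (the directive names are real PHP settings; the values are illustrative and should be tuned for your server):

```ini
; Illustrative values only -- tune for your environment.
upload_max_filesize = 40M    ; each uploaded chunk must fit under this
post_max_size = 48M          ; must be larger than upload_max_filesize
max_execution_time = 300     ; allow slow uploads to finish
memory_limit = 128M
```

With 35 MB chunks as mentioned in the comments, 40M/48M leaves some headroom for the multipart form overhead.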
Frederik
bosiang
    I'm certainly not a PHP/HTML expert, but I'm gonna suggest that uploading a 2GB file (!!!!!) over a webform with PHP is what's causing your issues. It's not meant for that. A coding site (like stackoverflow) might have tips on how to code around that, but they might also suggest that you're insane for trying such a thing. Kinda like asking why the engine in my Toyota has performance problems when I get it up to 200 MPH. – HopelessN00b Jun 28 '12 at 19:19
  • Is the CPU usage high all the time? – Mircea Vutcovici Jun 28 '12 at 19:38
  • No, only while uploading a file. When I run the "top" command, it shows lines like 5170 apache 20 0 34052 6104 2272 R 15.0 0.2 1:26.41 httpd, and there are 6 such httpd processes, which sums up to about 90%. Six httpd processes is understandable since there are multiple files, but I don't understand why they take so much %CPU. – bosiang Jun 28 '12 at 19:43
  • It's no surprise; why on earth are you trying to upload a 2 GB file via POST?! – Ben Lessani Jun 28 '12 at 19:54
  • @sonassi I'm trying to create a site in the megaupload style, and using HTML5 I chunk that 2 GB file into smaller pieces (35 MB), but it still causes high usage. Any other solution? No Java or Flash allowed. (I have the site in Java already; I'm just trying to rewrite it.) – bosiang Jun 28 '12 at 19:58
  • http:// or https://? – d135-1r43 Jun 29 '12 at 10:25
  • @d135-1r43 http for now – bosiang Jun 29 '12 at 16:30
  • Can you check what is using the CPU? I mean the breakdown between System / User / IOWait. If it's IOWait, your CPU is blocked writing to disk. – Fredi Feb 09 '16 at 13:16
  • I wonder if you're running out of RAM and hitting swap. Can you post the output of `free -m` before, during, and after the transfer? – virtex Jan 12 '17 at 21:37
  • I would do local benchmarks outside of PHP first. Copy from RAM to disk to simulate a network upload, then multiple copies at the same time. If you don't see high CPU, then start profiling your PHP code. I've run sites that allowed uploading 20 GB+ files without issue, and that was decades ago; there is nothing unusual about this. So first rule out your storage, then look at the code. – Aaron Jun 19 '17 at 22:10
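Following up on the IOWait suggestions in the comments above: a quick way to check how much of an interval the CPU spent in iowait is to diff the aggregate counters in /proc/stat. This is a rough sketch, not a full accounting — it only sums the five main CPU fields (user, nice, system, idle, iowait) and ignores irq/softirq/steal:

```shell
#!/bin/sh
# Sample the aggregate "cpu" line of /proc/stat twice, one second apart,
# and report what share of the elapsed ticks were iowait (5th counter).
read -r cpu u1 n1 s1 i1 w1 rest < /proc/stat
sleep 1
read -r cpu u2 n2 s2 i2 w2 rest < /proc/stat

total=$(( (u2 + n2 + s2 + i2 + w2) - (u1 + n1 + s1 + i1 + w1) ))
wait_pct=$(( 100 * (w2 - w1) / total ))
echo "iowait: ${wait_pct}%"
```

If this number climbs during an upload while user/system stay low, the httpd processes are blocked on disk writes rather than burning CPU, which changes where you look for the fix (storage layout, write caching) instead of the PHP code.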

1 Answer


If you're doing many file operations, this makes sense. You should post some code of the file splitting operations.

MitziMeow
  • Even though the file is not split, it still causes that problem. Take a look at the "top" command result; I just edited my post. – bosiang Jun 28 '12 at 20:38
  • You wrote it's chunked into smaller pieces, right? If you'd show some code and explain why you are doing this, we might help (this is a coding related question IMHO though) – MitziMeow Jun 28 '12 at 20:40
  • I posted the code already, although I did try without chunking (I mean a simple file upload), and it still causes that high CPU on httpd. – bosiang Jun 28 '12 at 20:43
  • You tried using a simple file copy and the CPU is overloaded after the upload from the browser is finished? Or is it loaded _during_ the upload itself? – MitziMeow Jun 28 '12 at 20:45
  • It's during the upload itself. So if I try to upload 60 MB files, it will use high CPU during that time, and if someone else uploads a file from somewhere else at the same time, it burdens the server even more, to the point where I can't run a simple command like ls without a delay of several seconds. – bosiang Jun 28 '12 at 20:47
  • Also, if you use `strace -p PID` (e.g., 1904), you will get a clear picture of what Apache is doing during the file write and see whether it's a legitimate load or not. – MitziMeow Jun 28 '12 at 20:47
  • I can bet he has some kind of RAID + partition config issue which causes IOWait. Please read http://serverfault.com/questions/573396/centos6-and-long-wait-io-time-on-jbd2-dm-0-8 – JackTheKnife Jan 12 '17 at 21:57