
I have a highly used server (running Plesk).

I have some long scripts that take a while to process (huge MySQL database). I have found that in one browser, while I run a script and it is loading, I cannot view any other parts of the site until the script finishes. It seems that all the requests go off, but they don't get served until the initial script finishes.

I thought this might be a server-wide issue, but it is not. If I use another computer I can view the site fine, and even on the same computer with a different browser I can navigate fine while the script still loads. I think it must limit the number of requests per session.

Is this correct? Is there any way to configure this to allow 2-3 other requests per session?

It is really bad that when I am on the phone to a client, having just run a long report, I cannot use the site or follow what they are saying until the page has loaded.

Chris

2 Answers


Why don't you load that time-consuming SQL through an AJAX-style request after the page has loaded?

    <body onload="ajax_request_for_large_query();">

    <!-- the result of the long-running query is written into this div -->
    <div id="mydiv"></div>

    <script type="text/javascript">

    var http_request;

    function ajax_request_for_large_query() {

      document.getElementById('mydiv').innerHTML = "Loading....";

      if (window.XMLHttpRequest) { // Mozilla, Safari, IE7...
          http_request = new XMLHttpRequest();
      } else if (window.ActiveXObject) { // IE6 and older
          http_request = new ActiveXObject("Microsoft.XMLHTTP");
      }
      http_request.onreadystatechange = Contents;
      // 'test.html' stands in for the URL of the script that runs your huge query
      http_request.open('GET', 'test.html', true);
      http_request.send(null);

    }

    function Contents() {
      if (http_request.readyState == 4) { // request complete
          if (http_request.status == 200) {
               document.getElementById('mydiv').innerHTML = http_request.responseText;
          } else {
              alert('There was a problem with the request.');
          }
      }
    }

    </script>

This is only a rough sketch, but something like that.

The JavaScript function fetches the data from the requested URL, which performs your huge query and returns the result; a div in the HTML then gets updated, but only after the rest of the page has loaded.

  • I have a similar page that AJAXes the data in, but still, other pages have to wait until the AJAX finishes. –  Mar 24 '10 at 16:09
  • I'm sorry, I don't understand why other pages have to wait. Open it in a new tab. –  Mar 24 '10 at 16:36

This depends on the browser. While most modern browsers allow about six concurrent HTTP 1.1 connections per server, older browsers only allowed two. Please specify the browser you use.

  • If you use IE, see http://support.microsoft.com/kb/183110.
  • If you use Firefox, go to "about:config" and raise the value of "network.http.max-persistent-connections-per-server", or maybe even "network.http.max-connections-per-server" (a user.js sketch follows below).
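
For Firefox, those same values can also be set from a user.js file in the profile directory, so they survive profile resets. A minimal sketch, assuming the HTTP 1.1 per-server limit is the bottleneck (the values here are illustrative, not recommendations):

    // user.js in the Firefox profile directory -- read at browser startup.
    // Raise the per-server limit for persistent HTTP 1.1 connections
    // (the value 8 is just an example):
    user_pref("network.http.max-persistent-connections-per-server", 8);
    // Only matters for non-persistent (HTTP 1.0) connections; usually not needed:
    user_pref("network.http.max-connections-per-server", 16);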

If you can change the web page: Often it helps to reorder scripts and images. Here's a good link that explains the concept: http://code.google.com/speed/page-speed/docs/rtt.html#PutStylesBeforeScripts
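
To illustrate the "styles before scripts" rule from that link (the file names here are made up): putting stylesheet links above script tags lets the browser download them in parallel, whereas a script placed first can block the stylesheet download.

    <head>
      <!-- Stylesheets first: the browser can fetch them in parallel. -->
      <link rel="stylesheet" type="text/css" href="styles.css">
      <!-- Scripts after the styles; a script above a stylesheet can
           block the stylesheet download in older browsers. -->
      <script type="text/javascript" src="app.js"></script>
    </head>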

Chris Lercher
  • I don't think it is a browser issue. I have just tried doubling that number (from 15 to 30, which seems far more than needed); I think this may be a server-side restriction in the Apache configuration. –  Mar 24 '10 at 16:05
  • But you said earlier: "even on the same computer with a different browser i can navigate fine". So it's most likely a browser issue. Are you on Firefox? Which version? Usually you should tune the "network.http.max-persistent-connections-per-server" (not the other setting, which is about HTTP 1.0 connections). Setting it to >=6 should be enough. If it's already set to 6, then you should change the web page: Reorder the scripts (and of course, don't embed them in the page). If that's not enough, you can even consider switching to AJAX as Marcel recommends. – Chris Lercher Mar 24 '10 at 17:53