Basic proxy server config fails for some websites: TCP_MISS and TCP_MISS_ABORTED issues?

I have a basic installation of Squid working on Ubuntu 14.04. All I really want is to log the requests made; I don't care about ACLs or caching, perhaps later. I looked at the long squid.conf file, which is mostly comments, grepped out the non-commented lines into a new conf file, commented out what I thought was not needed, and added a few lines that seemed necessary. Here is my conf file without the comment lines:

http_access allow all
http_port 3128
coredump_dir /var/spool/squid3
cache deny all
dns_nameservers 202.148.202.4 202.148.200.3
positive_dns_ttl 72 hours
negative_dns_ttl 30 seconds
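
For reference, the comment-stripping step was roughly this one-liner (the path assumes the stock Ubuntu 14.04 squid3 package):

grep -vE '^[[:space:]]*(#|$)' /etc/squid3/squid.conf > squid.conf.trimmed

And since logging is the whole point, an explicit access_log directive can be added to the trimmed file; the location shown is the Ubuntu squid3 package default, so this line is optional:

access_log daemon:/var/log/squid3/access.log squid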

The problem is that clients of the proxy server have trouble with only some sites; most other sites work fine through it.

For example, http://locator.intel.in/find-reseller/

This one is really bizarre... The page displays the error message "Trying to get property of non-object", but from another client that does not use this proxy server (or any proxy server) the page displays properly.

The entry in the proxy "access.log" file is:

1430748230.547   2440 192.168.1.5 TCP_MISS/500 63048 GET http://locator.intel.in/find-reseller/ - HIER_DIRECT/198.175.66.130 text/html

Now, the TCP_MISS/500: the miss I understand, since there is no caching there can be no hit. Fine. But the 500 is an internal error from the HTTP server? How did going through the proxy trigger an error on the HTTP server? Was it a malformed HTTP request sent by the proxy?
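
One way to narrow this down is to fetch the same URL once through the proxy and once directly from the same client and compare the responses (a sketch; <proxy-host> stands for the Squid server's address, with port 3128 from the config above):

curl -v -o /dev/null -x http://<proxy-host>:3128 http://locator.intel.in/find-reseller/
curl -v -o /dev/null http://locator.intel.in/find-reseller/

If the direct fetch returns 200 and the proxied one 500, the verbose headers from the two runs should show what the origin objects to.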

For another link that fails to display, http://www.incentre.net/tech-support/other-support/ethernet-cable-color-coding-diagram/, the access.log entries are:

1430749180.834  60659 192.168.1.5 TCP_MISS/503 4111 GET http://www.incentre.net/tech-support/other-support/ethernet-cable-color-coding-diagram/ - HIER_DIRECT/206.75.231.199 text/html
1430749194.846  12853 192.168.1.5 TCP_MISS_ABORTED/000 0 GET http://www.incentre.net/favicon.ico - HIER_DIRECT/206.75.231.199 -
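
Both failing entries point at HIER_DIRECT/206.75.231.199, so it may help to check whether the Squid box itself can reach that origin at all (a sketch to run on the proxy host; the hostname and IP are taken from the log lines above):

curl -v --connect-timeout 10 -o /dev/null http://www.incentre.net/favicon.ico
nc -vz -w 10 206.75.231.199 80

If these hang or time out from the proxy host but work from a non-proxied client, the problem lies in the path between the Squid server and the origin rather than in Squid's configuration.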

What configuration among the 300+ Squid configuration directives am I missing?

Sunny

Posted 2015-05-04T14:30:06.267

Reputation: 331

This is an annoying issue. Start here: http://serverfault.com/questions/358754/how-to-make-squid-work-like-proxy-only-without-caching-anything

– Alex Atkinson – 2015-05-04T15:06:48.233

@Alex Atkinson Tried your suggestions. Did not help. What is indeed "annoying" is that some links (not the ones shown above) work when retried. Without the proxy, none of these problems exist. In fact, I have even disabled iptables for now, as this is a test system anyway, to isolate the problem and make analysis easier. – Sunny – 2015-05-04T15:44:33.000
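
A quick way to confirm the firewall really is out of the picture on the Squid host, assuming iptables was simply flushed rather than removed, is to list the chains; all of them should show policy ACCEPT and no rules:

sudo iptables -L -n -v
sudo iptables -t nat -L -n -v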

Oh Squid... The service was restarted after the conf changes? Other than that, you can get more information by attempting the same failing URLs from various browsers; that may provide another clue. Do you get the same behaviour when fetching the URLs from the command line with wget or curl? What do you get when running 'squidclient mgr:info' from the CLI? You can also use -h to retrieve the info remotely. – Alex Atkinson – 2015-05-04T17:05:23.847
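
Concretely, those checks look roughly like this (a sketch; <proxy-host> stands in for the Squid server's address):

squidclient mgr:info
squidclient -h <proxy-host> -p 3128 mgr:info
curl -v -x http://<proxy-host>:3128 http://locator.intel.in/find-reseller/
http_proxy=http://<proxy-host>:3128 wget -O /dev/null http://locator.intel.in/find-reseller/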

@AlexAtkinson Yes, very much so... I mean I restarted the service the crude way (sudo service squid3 restart) rather than just having it re-read the new config. I will try curl. I tried this through IE on Windows and Chrome on the Ubuntu desktop. Will look into the other suggestions too. I read somewhere that this could be an IPv6 issue, but the solution provided did not work out. The key is that these sites work OK if the proxy is turned off. – Sunny – 2015-05-04T17:09:35.243
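
For the record, a config-only reload that avoids the full restart is available; assuming the Ubuntu 14.04 squid3 binary name, it looks roughly like this:

sudo squid3 -k parse
sudo squid3 -k reconfigure

The first command only checks the file for syntax errors; the second tells the running instance to re-read squid.conf.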

@AlexAtkinson Also... the error in the browsers (Firefox and Chrome) is the same: The system returned: (110) Connection timed out – Sunny – 2015-05-04T17:24:55.560

Add 'dns_v4_first on' to the conf and 'service squid3 restart'. Do your clients have issues performing nslookup queries against those failing sites? Do run 'squidclient -h <server> -p 3128 mgr:info'. You could also run tcpdump on the squid server and see if you can sherlock anything from the output. – Alex Atkinson – 2015-05-04T17:28:12.920
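
A capture along these lines, run on the Squid host, would show whether the outbound connections ever complete; the interface name and the destination IP (taken from the log above) are assumptions to adjust:

sudo tcpdump -ni eth0 host 206.75.231.199 and tcp port 80
sudo tcpdump -ni eth0 -w /tmp/squid-fail.pcap host 206.75.231.199

The second form writes a pcap file that can be opened in Wireshark later.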

@AlexAtkinson Had tried setting dns_v4_first on earlier; it did not solve the problem. Tried squidclient -h with the -v option: it fails to connect after a successful DNS lookup. nslookup works fine and gets the IP address. So the problem is at the connection level... perhaps tcpdump or wireshark will help; I have not used them in a long time. Will try and see... If you think of anything else, do let me know. Some TCP-related options in squid perhaps? – Sunny – 2015-05-04T17:46:25.927
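
The TCP-related knobs in squid.conf are mainly timeouts. connect_timeout in particular defaults to one minute, which would line up with the roughly 60-second failure logged above, but raising these only helps if the connection eventually succeeds; the values below are purely for illustration:

connect_timeout 2 minutes
forward_timeout 4 minutes
read_timeout 15 minutes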

No answers