
Saw this in the journalctl for a service I have:

jul 29 12:39:05 ubuntu-18 node[796]: GET http://www.123cha.com/ 200 147.463 ms - 8485
jul 29 12:39:10 ubuntu-18 node[796]: GET http://www.rfa.org/english/ - - ms - -    
jul 29 12:39:10 ubuntu-18 node[796]: GET http://www.minghui.org/ - - ms - -     
jul 29 12:39:11 ubuntu-18 node[796]: GET http://www.wujieliulan.com/ - - ms - -    
jul 29 12:39:11 ubuntu-18 node[796]: GET http://www.epochtimes.com/ 200 133.357 ms - 8485    
jul 29 12:39:14 ubuntu-18 node[796]: GET http://boxun.com/ - - ms - -

These GET requests are not coming from any code I've written.

"Correct" entries look like this:

jul 29 12:41:46 ubuntu-18 node[796]: GET / 304 128.329 ms - -
jul 29 12:41:47 ubuntu-18 node[796]: GET /stylesheets/bootstrap.min.css 304 0.660 ms - -
jul 29 12:41:47 ubuntu-18 node[796]: GET /stylesheets/font-awesome-4.7.0/css/font-awesome.min.css 304 0.508 ms - -
jul 29 12:41:47 ubuntu-18 node[796]: GET /img/250x250/deciduous_tree_5.thumb.png 304 0.548 ms - -
jul 29 12:41:47 ubuntu-18 node[796]: GET /stylesheets/style.css 304 7.087 ms - -
jul 29 12:41:47 ubuntu-18 node[796]: GET /img/logos/250x250/brf_masthugget.250x250.jpg 200 0.876 ms - 9945

The server is a Node.js v8.10.0 instance running behind nginx v1.14.0 on an up-to-date Ubuntu Server 18.04.

The Ubuntu machine is a DigitalOcean droplet.

I've tried generating similar requests from a JavaScript console, but the browser blocks access over plain HTTP (it does not allow mixing HTTP and HTTPS); if I try HTTPS I get a cross-origin error, which is good :)

I'm puzzled: how are these GET requests being generated and sent?
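
For reference, a request of this shape can apparently be reproduced from the command line with curl, which is not subject to the browser's mixed-content and cross-origin restrictions. This is only a sketch; your-droplet.example is a placeholder for the server's hostname or IP:

# Ask curl to treat the server as an HTTP proxy. This puts the full
# URL on the request line ("GET http://www.123cha.com/ HTTP/1.1"),
# which is exactly the form that shows up in the journal.
curl -v -x http://your-droplet.example:80 http://www.123cha.com/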

minisaurus
  • We get a lot of "what do these logs mean?" requests and they are generally off topic. We're not a log analysis service, and if we were, we wouldn't be able to handle the potential question volume! – Conor Mancone Jul 29 '20 at 12:08
  • I would add one more important point, though, that you may be missing: the browser is just one of many possible HTTP clients, and it is by far the most limited one. If you are going to do some testing, don't use your browser. Use something like curl or Postman; then you will be able to make arbitrary requests, and you won't run into things like mixed-content or cross-origin errors, and that alone may allow you to answer your question. – Conor Mancone Jul 29 '20 at 12:10
  • In nginx, you can set up your site's server block with server_name and add an extra catch-all block `server { listen 80 default_server; server_name ""; return 444; }`. Requests without a hostname, or for a hostname that does not match any of your server blocks, will then be rejected and never reach the webapp (a fuller sketch follows these comments). The same applies to `listen 443 ssl` and SNI. – Z.T. Jul 29 '20 at 13:30
  • It looks like some of the requests are indeed successful: the 200 after epochtimes and 123cha indicates 200 OK. – john doe Jul 31 '20 at 21:12
    Does this answer your question? [My web server is hit by strange traffic, what is that?](https://security.stackexchange.com/questions/9185/my-web-server-is-hit-by-strange-traffic-what-is-that), [Weird request to NodeJS Webserver](https://security.stackexchange.com/questions/99743/weird-request-to-nodejs-webserver). – Steffen Ullrich Aug 01 '20 at 18:03
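
A minimal sketch of the catch-all server block suggested in the comments above (the named server blocks for the real site are assumed to exist elsewhere in the nginx configuration):

# Any request whose Host header does not match a server_name in one
# of the real server blocks lands here, and nginx closes the
# connection without sending a response (the non-standard 444 code).
server {
    listen 80 default_server;
    server_name "";
    return 444;
}
# The same idea works for HTTPS with "listen 443 ssl default_server",
# but that block also needs a (dummy) certificate so the TLS
# handshake can complete.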

2 Answers


This is the request format a client sends to an HTTP proxy.
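
To illustrate with a sketch reconstructed from the log lines above: a normal request to your own site carries only the path as the request target, while a client talking to a forward proxy puts the full URL there:

GET /stylesheets/style.css HTTP/1.1        <- origin-form: ordinary request to this site
Host: my.domain

GET http://www.123cha.com/ HTTP/1.1        <- absolute-form: what a client sends to a proxy
Host: www.123cha.com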

Unless your server really is misconfigured or hacked into acting as a proxy (you should work around the browser limitation and test this, for example with curl or telnet), there are a few possibilities:

  • Someone is scanning a wide range of IP addresses for open proxies.
  • Someone is testing the availability of an open proxy that existed at this IP address before you created this server.
  • Someone who happens to know your website is testing the keyword filter (note: not the only filter) between mainland China and the rest of the world.
  • Someone is trying to get your server blocked from a specific IP address in China for a few minutes. This is usually not a big problem, because as far as I know it can only get the sender's own IP address blocked. I've heard rumors that a malicious software maker used this method to block a potential overseas competitor (but I never saw a confirmed case). For that to work there would have to be many such requests from different IP addresses in China, which would already amount to a kind of DDoS, unless your users are a very small group who all already have the other software installed.
user23013
  • How can I test if my server is misconfigured or hacked to be used as a proxy? – minisaurus Aug 03 '20 at 15:24
  • @minisaurus One way is to send the request manually: `telnet your.domain.name 80`, then `GET http://.../ HTTP/1.1\r\nHost: your.domain.name\r\n\r\n`, and see what happens (a ready-to-paste version follows these comments). – user23013 Aug 03 '20 at 20:34
  • I did that: `telnet your.domain.name 80` and `GET http://www.123cha.com/ HTTP/1.1\r\nHost: your.domain.name\r\n\r\n` I received `HTTP/1.1 400 Bad Request Server: nginx/1.14.0 (Ubuntu)` I guess that's good? Or is it bad that I can telnet in? – minisaurus Aug 04 '20 at 18:45
  • Did you change the `\r\n` into newlines? If yes I think it's good. – user23013 Aug 04 '20 at 18:52
  • I hadn't done that, but have done now, and instead got `HTTP/1.1 301 Moved Permanently Server: nginx/1.14.0 (Ubuntu)` Still good? I know my site sends 301 on port 80 and redirects to 443 – minisaurus Aug 04 '20 at 18:57
  • I suppose it also returned `Location: a good url`, and not much useful content (such as content from the third-party website) after it. In that case I think it's good. – user23013 Aug 04 '20 at 19:02
  • @minisaurus Correction: the URL you tried is itself a redirect. So if your server redirects to `https://www.123cha.com/`, which is the target of the original URL, it may not be good. If it redirects to your own HTTPS address as always, it should probably be good. – user23013 Aug 04 '20 at 19:10
  • It returned `Location: https://my.domain`; it did not redirect to `https://www.123cha.com/`. The HTML body was just nginx's standard `301 Moved Permanently` page (nginx/1.14.0 (Ubuntu)). – minisaurus Aug 04 '20 at 19:23
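
For reference, the raw request discussed in the comments above can also be sent in one go, with the CRLFs written out explicitly (a sketch; your.domain.name is a placeholder):

# printf expands \r\n into real CR/LF bytes, so there is no need to
# enter them by hand as with telnet; nc delivers the request to port
# 80 and prints whatever comes back.
printf 'GET http://www.123cha.com/ HTTP/1.1\r\nHost: your.domain.name\r\nConnection: close\r\n\r\n' | nc your.domain.name 80

# A 301 pointing at your own https:// address (or a 400) suggests
# nginx is not proxying; content from www.123cha.com itself would be
# a bad sign.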

After some research, I also added a filter and jail to fail2ban (from this page: DigitalOcean's fail2ban on Ubuntu guide):

Filter (file nginx-noproxy.conf in /etc/fail2ban/filter.d):

[Definition]

failregex = ^<HOST> -.*GET http.*

ignoreregex =

Jail (in /etc/fail2ban/jail.local):

[nginx-noproxy]

enabled  = true
port     = http,https
filter   = nginx-noproxy
logpath  = /var/log/nginx/access.log
maxretry = 2
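
To check that the filter actually matches the proxy-style lines before relying on it, fail2ban's own test tool can be run against the access log (a sketch, using the paths configured above):

# Test the failregex against the real log; the summary reports how
# many lines matched.
fail2ban-regex /var/log/nginx/access.log /etc/fail2ban/filter.d/nginx-noproxy.conf

# After reloading fail2ban, the jail's current bans can be inspected:
fail2ban-client status nginx-noproxy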

Will see how it goes and update

minisaurus