How to set up a cheap Linux router/proxy to log all URLs requested

I have a bunch of kids/devices connecting to the wireless router in my house at any given time.

I know that there are many software/hardware solutions designed to filter out harmful content but I favor open communication with my kids over filtering out stuff. I told my family that I was going to start listing all the webpages visited by everyone in the house on a page that we could all look at.

I want to build a custom Linux device (maybe using a Raspberry Pi) that sits between the modem and the router, logs all the URLs requested by the various devices on the network, and then dumps them to a simple intranet page.
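One way to get most of this without an inline box: run a DNS server such as dnsmasq on the Pi with query logging enabled (`log-queries` in dnsmasq.conf) and point the wireless router's DNS at the Pi. You only see domain names, not full URLs, but it also works for HTTPS. A minimal sketch of parsing such a log, assuming dnsmasq's standard query-log line format (the sample lines and addresses below are made up for illustration):

```python
import re
from collections import Counter

# Illustrative sample lines in dnsmasq's query-log format
# (produced when `log-queries` is enabled in dnsmasq.conf).
SAMPLE_LOG = """\
Dec 27 14:42:01 dnsmasq[123]: query[A] www.netflix.com from 192.168.1.23
Dec 27 14:42:05 dnsmasq[123]: query[A] graph.facebook.com from 192.168.1.40
Dec 27 14:42:09 dnsmasq[123]: query[AAAA] www.netflix.com from 192.168.1.23
"""

# Matches "query[A] <domain> from <client-ip>"
QUERY_RE = re.compile(r"query\[\w+\] (\S+) from (\S+)")

def count_domains(log_text):
    """Count DNS lookups per (client, domain) pair."""
    counts = Counter()
    for line in log_text.splitlines():
        m = QUERY_RE.search(line)
        if m:
            domain, client = m.groups()
            counts[(client, domain)] += 1
    return counts

for (client, domain), n in sorted(count_domains(SAMPLE_LOG).items()):
    print(f"{client} looked up {domain} {n} time(s)")
```

A script like this could run from cron and write its tallies out as a static HTML page for the intranet view.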

Here are my questions for this scenario:

1) What is the minimum hardware I will need to do this simple logging of all URLs requested from the modem? Can this be done with a Raspberry Pi, or will I need a laptop, PC, or something bigger?

2) What size SD card (or hard drive) will I need to run Linux and Apache and store about a month of all URLs requested by 10 devices in a home environment?

3) Will this need two NICs to sit between the modem and the router, effectively becoming a router itself? If possible, I would like to avoid reconfiguring all the devices that already connect to the wireless router.

4) Is what I'm trying to do guaranteed to slow down my network?

Any advice from smart superusers would be very helpful before I start building this. Thanks.

DaveAlger

Posted 2014-12-27T14:42:42.357

Reputation: 131

You should be aware that if you only monitor network traffic, you will not be able to see which HTTPS webpages are visited unless you perform a man-in-the-middle attack. For HTTPS webpages, you will be able to see the IP address of the website, but both the URL and the content of the page will be encrypted. – tlng05 – 2014-12-27T15:25:43.577

So I won't be able to see how many times Facebook was requested, since it is HTTPS? – DaveAlger – 2014-12-27T15:29:10.753

IMHO you need to set up a Squid server (proxy) to get all URLs requested. Maybe you can look around for a specialized distro like Smoothwall, or something similar: http://www.squid-cache.org/Support/products.html

– maudam – 2014-12-27T15:31:21.783
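For reference, a minimal `squid.conf` fragment along those lines (a sketch only, not a hardened config; `192.168.1.0/24` is an assumed LAN subnet). Squid records every requested URL in its access log:

```
# /etc/squid/squid.conf -- minimal logging-proxy sketch
http_port 3128
acl localnet src 192.168.1.0/24   # adjust to your LAN subnet
http_access allow localnet
http_access deny all
access_log /var/log/squid/access.log squid
```

Clients would need to use the proxy (set explicitly on each device, or redirected at the router) for their requests to appear in the log.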

If your goal is merely to count the number of visits to Facebook, you can get a list of Facebook's IP address ranges and count how many connections are made to those addresses. – tlng05 – 2014-12-27T15:38:54.500
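A sketch of that counting idea using Python's standard `ipaddress` module. The two CIDR blocks below are examples of Facebook-registered ranges; you should fetch the current list yourself (e.g. by querying the route registry for Facebook's AS32934):

```python
import ipaddress

# Example ranges only -- look up Facebook's current announcements
# (AS32934) rather than trusting a hard-coded list.
FACEBOOK_NETS = [ipaddress.ip_network(n)
                 for n in ("31.13.24.0/21", "157.240.0.0/16")]

def is_facebook(ip):
    """Return True if the address falls inside any known Facebook range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in FACEBOOK_NETS)

print(is_facebook("157.240.1.35"))  # True: inside 157.240.0.0/16
print(is_facebook("8.8.8.8"))       # False: not a Facebook range
```

Feeding connection destination IPs (from a firewall log or packet capture) through a check like this gives a per-site visit count without touching the encrypted traffic.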

Bummer. I didn't expect this to be a hard thing to set up on a small home network. I figured Linux gurus everywhere were already doing this. – DaveAlger – 2014-12-27T15:39:43.673

Yeah, I don't care about content or even the full URL, as long as I can accurately determine the root domain name (Google, Facebook, Netflix, etc.). Is there a list of these IP ranges somewhere? – DaveAlger – 2014-12-27T15:42:18.200

Sure, just google "[website] ip addresses" and you should be able to find a list. If you really want to monitor HTTPS, you can go the man-in-the-middle route, but you will have to generate a certificate and import it into every browser to prevent big red warning messages from being shown. The only problem is that some tablets/smartphones may not have the option of importing certificates. – tlng05 – 2014-12-27T15:46:25.307

I checked out Spector 360 and Smoothwall, but those solutions require installing a client agent or just do way more than I need. I figured all the URLs requested on my network were already in a log file somewhere and I'd just need a small bit of code to parse and count them. I figured wrong. :( – DaveAlger – 2014-12-27T16:00:59.183
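Once a proxy like Squid is in place, the "small bit of code" is genuinely small. A sketch that parses Squid's native access-log format (seventh whitespace-separated field is the URL) and counts coarse root domains; the sample lines are made up, and the root-domain logic is naive (it mishandles multi-part TLDs like .co.uk):

```python
from collections import Counter
from urllib.parse import urlparse

# Illustrative lines in Squid's native access-log format:
# timestamp elapsed client action/code size method URL user hierarchy type
SAMPLE_LOG = """\
1419691362.123 45 192.168.1.23 TCP_MISS/200 5120 GET http://www.netflix.com/browse - HIER_DIRECT/1.2.3.4 text/html
1419691363.456 30 192.168.1.40 TCP_MISS/200 2048 CONNECT www.facebook.com:443 - HIER_DIRECT/5.6.7.8 -
1419691364.789 10 192.168.1.23 TCP_HIT/200 1024 GET http://www.netflix.com/title/1 - HIER_NONE/- text/html
"""

def root_domain(url_field):
    """Reduce a Squid URL field to a coarse root domain (last two labels).
    CONNECT lines carry a bare host:port instead of a full URL."""
    if "://" in url_field:
        host = urlparse(url_field).hostname
    else:
        host = url_field.split(":")[0]
    return ".".join(host.split(".")[-2:])

def count_sites(log_text):
    """Tally requests per root domain across the whole log."""
    counts = Counter()
    for line in log_text.splitlines():
        fields = line.split()
        if len(fields) >= 7:
            counts[root_domain(fields[6])] += 1
    return counts

for site, n in count_sites(SAMPLE_LOG).most_common():
    print(f"{site}: {n}")
```

The printed tallies could then be written into a static HTML page served by Apache for the family to browse.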

Some routers do already have URL logging capabilities built in. Check whether yours does. – tlng05 – 2014-12-27T16:12:25.913
