I recently "cut the cable" and downgraded from cable internet (40 Mb/s) to DSL (5 Mb/s). It's awful, but I'm stuck with it for a year. What I would like to do is pre-cache everything on pages I visit daily (onto my NAS), plus everything linked to from those pages. The front page of HN, for example. I'd like all devices on my network to access the same cache (so no browser add-on solutions, please). I would also like the cache to automatically clean out old content (age-based, cache size, whatever). I'm using Tomato on my router.
I'm sure I could figure out how to re-route requests in Tomato with a custom DNS entry, and it wouldn't be terribly hard to set up a Python job to cache the pages, but it would take me a full day or more.
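To make the "Python job" idea concrete, here is a minimal sketch of what I have in mind: fetch a starting page once a day, download every page it links to into a directory on the NAS, and prune anything older than a week. The paths, the start URL, and the cleanup threshold are all placeholders, and this leaves out the cron entry and the Tomato/DNS wiring entirely:

```python
#!/usr/bin/env python3
"""Rough sketch of a daily pre-caching job. Assumes the NAS is mounted
at CACHE_DIR; serving the cached files back to the LAN (proxy/DNS side)
is a separate problem not handled here."""
import os
import time
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

CACHE_DIR = "/mnt/nas/webcache"   # hypothetical NAS mount point
MAX_AGE_DAYS = 7                  # age-based cleanup threshold


class LinkCollector(HTMLParser):
    """Collect absolute href targets from <a> tags."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    parser = LinkCollector(base_url)
    parser.feed(html)
    return parser.links


def cache_path(url):
    """Map a URL to a flat filename inside the cache directory."""
    p = urlparse(url)
    safe = (p.netloc + p.path).replace("/", "_") or "index"
    return os.path.join(CACHE_DIR, safe)


def fetch_and_cache(url):
    """Download one URL into the cache (one request per page per run)."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        data = resp.read()
    with open(cache_path(url), "wb") as f:
        f.write(data)
    return data


def prune_old(cache_dir, max_age_days=MAX_AGE_DAYS):
    """Delete cached files older than max_age_days (age-based cleanup)."""
    cutoff = time.time() - max_age_days * 86400
    for name in os.listdir(cache_dir):
        path = os.path.join(cache_dir, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)


def main(start_url="https://news.ycombinator.com/"):
    os.makedirs(CACHE_DIR, exist_ok=True)
    front = fetch_and_cache(start_url).decode("utf-8", errors="replace")
    for link in extract_links(front, start_url):
        try:
            fetch_and_cache(link)
        except OSError:
            pass  # skip links that fail to download
    prune_old(CACHE_DIR)


if __name__ == "__main__":
    main()
```

Even this toy version shows why I'd rather not build it myself: it ignores CSS/JS/image dependencies, URL collisions in the flat filename scheme, and the whole question of how other devices actually get served from the cache.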
Others with slow internet must have worked out something similar. I'm just not finding much with the search terms I am using. Anyone know of a tutorial on how to set this up? Anyone have any experience doing something similar? Are there any turn-key solutions (commercial or not) out there?
I realize static pages are getting rarer these days, so maybe this is a fruitless endeavor. A better example might be to pre-cache the Imgur links from Reddit, or something like that.
This probably runs afoul of some sites' terms and conditions, but I'm only planning on making one request a day.