There are several convincing reasons why the answer is "no".
Some sites add content dynamically the moment you scroll down (e.g. Facebook), so these are probably not the cases you are after.
Other sites add content dynamically independent of user interaction (such as scrolling). In principle, these could be handled by preventing or throttling the responsible web connections once the page is full. However, trying to transparently re-enable them once you decide to go to the "next page" is probably doomed, as one may expect the processes to have given up because of timeouts.
Then again there are pages that simply output lots of page content, maybe even static content. The client might simply kill the connection once there is enough to render a screenful of data, but that means that to get to the "next page", you'd have to start loading the page afresh. With many pages, this would cause a lot of wasted bandwidth for useless reloads; it may also be undesirable in case of side effects (such as online orders). Alternatively, the client might throttle the TCP connection until you choose to go to the "next page". I'm afraid the web servers out there won't be happy with such a solution (and will drop your connection), because it wastes their resources. Moreover, if you want to be able to flip back through previous pages, you still have the problem of needing to keep the full content in RAM or cache.
Yet another alternative would be to download the page completely and simply display it pagewise. But that is equivalent to what you have right now: scrolling through a long page. You might only hope that most of the memory footprint ends up on disk instead of in RAM, so maybe this option is not too far-fetched.
All this does not take into account that the overall layout may require more or less the full page content to decide what the "pages" should look like or where to break them.
The only really feasible way I see is to make use of mechanisms that allow for paged display of the a priori unpaged content: to use @media print styling, which would essentially amount to downloading the page and making a print preview. You may already have observed that many pages display awfully in print or print preview, even though web developers have many styling options available to address paging specifically, so imagine what you would get.
However, if one were to employ this as a (e.g. Firefox) addon, the workflow would be that the page downloads and is then presented as in a print preview. Without going deeply into the bowels of the browser, this would still result in the full page being held in RAM, so not what you are after.
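For illustration, here is a minimal sketch of what such print styling could look like, assuming you can inject a user stylesheet (e.g. via userContent.css or an addon). The selectors (`article`, `.post`) are placeholders, not taken from any particular site:

```css
/* Sketch of a user stylesheet: force page-like breaks when the page
   is rendered for print or print preview. Selectors are hypothetical. */
@media print {
  /* break after each article/post so content falls on separate pages */
  article, .post {
    break-after: page;        /* modern CSS Fragmentation property */
    page-break-after: always; /* legacy fallback */
  }
  /* avoid splitting images and headings across page boundaries */
  img, h1, h2 {
    break-inside: avoid;
    page-break-inside: avoid;
  }
}
```

Whether this produces anything readable depends entirely on how well the site's own markup maps onto page-sized chunks, which is exactly the weak point described above.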
adblock the specific javascript that causes the infinite scrolling? – endolith – 2016-08-20T20:15:47.447
@endolith: There are different solutions for different websites, as each one implements infinite scroll differently. Please indicate which are the main website(s) which cause you problems. – harrymc – 2017-05-26T05:09:35.250
@harrymc I want to block all of them. An adblock filter list that blocks as many as possible, for instance. – endolith – 2017-05-26T14:49:39.817
@endolith: You will need multiple adblock rules for multiple sites. And in addition, what you really want is to convert the scroll into the normal Next/Previous buttons. This is available for some websites, like Wordpress. – harrymc – 2017-05-26T15:03:24.220
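(For reference, such a per-site rule in uBlock Origin/Adblock Plus filter syntax might look like the following; the domain and script path are placeholders, not real filters:)

```
! Block the script implementing infinite scroll on one specific site
! (domain and path below are hypothetical examples)
||example.com/js/infinite-scroll.js$script,domain=example.com
```
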
@harrymc Yes, that's fine. – endolith – 2017-05-27T01:27:17.467
You could disable JavaScript. But anything else would be impossible without the website itself supporting paging. – slhck – 2013-08-16T07:23:38.290