For privacy against tracking, I have my browser set up to refuse cookies by default. I only allow cookies from whitelisted domains. In general, this works OK for me. However, I now have at least one case where it has become inconvenient. I pay for a digital subscription to the Washington Post, because I like their journalism and want to support it, but I do not want them or their advertisers tracking me, so I never log in and don't accept cookies from them. This worked fine until recently. Within the last few days, they have done something new on their website, with the result that although I can view their home page, if I click through to a story, I get this message in Firefox:
The page isn’t redirecting properly. Firefox has detected that the server is redirecting the request for this address in a way that will never complete. This problem can sometimes be caused by disabling or refusing to accept cookies.
This is not them paywalling the site. In a browser set up to accept all cookies, I can access all their content without logging in, but any page click creates about 20 cookies from washingtonpost.com and about 20 cookies from their advertisers. If I go to the home page, clear the cookies, and then click on a link to a story, it works, but the cookies get recreated. So it appears that there is code on the pages I'm trying to view that attempts to create these cookies and then throws an error if creating the cookies fails.
Is there any good strategy for this type of situation that preserves my privacy? For example, I thought about writing a script that would run every 10 seconds on my machine and delete all cookies except those from whitelisted domains.
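To make the script idea concrete, here is a minimal sketch of the whitelist-purge approach. It edits Firefox's cookies.sqlite directly, keeping only cookies whose host matches a whitelisted domain. The whitelist contents and the exact profile path are placeholders for your own setup, and the `moz_cookies` schema can vary between Firefox versions, so check your own database first:

```python
#!/usr/bin/env python3
"""Sketch: delete all cookies except those from whitelisted domains,
by editing Firefox's cookies.sqlite directly.

Assumptions: the moz_cookies table has `id` and `host` columns (true of
recent Firefox versions, but verify against your profile), and Firefox
is not holding an exclusive lock on the database.
"""
import sqlite3

# Hypothetical whitelist -- substitute your own allowed domains.
WHITELIST = {"example.com"}

def is_whitelisted(host, whitelist):
    # Cookie hosts often carry a leading dot, e.g. ".washingtonpost.com".
    host = host.lstrip(".")
    return any(host == d or host.endswith("." + d) for d in whitelist)

def purge_cookies(db_path, whitelist):
    """Delete every cookie whose host is not covered by the whitelist.

    Returns the number of cookies removed.
    """
    con = sqlite3.connect(db_path)
    cur = con.execute("SELECT id, host FROM moz_cookies")
    doomed = [row_id for row_id, host in cur
              if not is_whitelisted(host, whitelist)]
    con.executemany("DELETE FROM moz_cookies WHERE id = ?",
                    [(row_id,) for row_id in doomed])
    con.commit()
    con.close()
    return len(doomed)

if __name__ == "__main__":
    # Placeholder path -- locate your actual profile under ~/.mozilla/firefox/.
    removed = purge_cookies(
        "/home/me/.mozilla/firefox/xxxxxxxx.default/cookies.sqlite",
        WHITELIST)
    print(f"removed {removed} cookies")
```

One caveat with running this on a timer while the browser is open: Firefox caches cookies in memory and writes them back to disk, so deletions made directly in the file may be ignored or overwritten during a session, and the cookies would still be sent to the site in the meantime. It may only be reliable between browser sessions.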
related: https://askubuntu.com/questions/368325/how-to-clear-browsers-cache-and-cookies-from-terminal