
I am running a couple of spiders in parallel via scrapyd 1.2. Each process significantly raises the buffer memory during its crawl, as seen in the chart. What is this value, and how can I reduce the footprint?

[chart: RAM usage over time, with a large and growing buffer segment]

merlin
  • 2,033
  • 11
  • 37
  • 72
  • Why do you want to reduce the footprint? The memory would go to waste otherwise and making memory free takes computational effort. – David Schwartz May 26 '20 at 20:18

1 Answer


Linux uses otherwise idle memory for various caches, mostly file related (page cache, dentries, inodes). Run the slabtop command to see details of the kernel's slab caches.
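For example, a quick way to inspect this on most Linux systems (assuming the standard procps `free` and `slabtop` tools are installed; slabtop typically needs root):

```shell
# Show memory with the reclaimable "buff/cache" column broken out
free -h

# Raw kernel counters behind that view
grep -E '^(MemFree|MemAvailable|Buffers|Cached):' /proc/meminfo

# Kernel slab caches (dentries, inodes, ...) sorted by size, once
sudo slabtop -o | head -n 15
```

The "buff/cache" number in `free` is what typically shows up as the "Buffer" segment in monitoring charts.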

how can I reduce the footprint?

You don't need to. These caches will be evicted quickly and automatically if applications need the memory.

Further, it is not yet a concern: 1 GB and change free on a 4 GB system is still a significantly sized chunk of unused RAM.
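To judge whether memory is actually tight, look at `MemAvailable` rather than `MemFree`: the kernel's estimate includes reclaimable cache, so it reflects what new workloads can really use. A minimal check:

```shell
# MemAvailable estimates memory usable without swapping -- it
# includes reclaimable page cache, unlike the stricter MemFree.
awk '/^(MemTotal|MemFree|MemAvailable):/ {printf "%s %.0f MB\n", $1, $2/1024}' /proc/meminfo
```

If `MemAvailable` stays comfortably above your spiders' working set, the large buffer/cache figure is harmless.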

https://www.linuxatemyram.com/

John Mahowald
  • 30,009
  • 1
  • 17
  • 32
  • Well, I updated the V-Box to 4GB from 2GB. Looking at the footprint I am asking myself if this was really needed. My question is, will it reduce the performance of the machine if I go back to 2GB since the blue bar is unclear to me. – merlin May 26 '20 at 22:35
  • 2 GB total, with a load similar to the 4 GB graph, would start reclaiming from cache as free decreases. Only testing can tell if the performance is adequate for your requirements. I predict it will be acceptable. – John Mahowald May 26 '20 at 23:24
  • Thank you. Good and helpful answer. – merlin May 27 '20 at 05:10