I have a Java application that runs on a Linux server with 12 GB of physical memory (RAM). Normal utilization over a period of time looks like this:

sys> free -h
              total        used        free      shared  buff/cache   available
Mem:            11G        7.8G        1.6G        9.0M        2.2G        3.5G
Swap:            0B          0B          0B

Recently, on increasing the load on the application, I could see that RAM utilization was almost full and the available memory was very low. I faced some slowness, but the application continued to work fine:

sys> free -h
              total        used        free      shared  buff/cache   available
Mem:            11G         11G        134M         17M        411M        240M
Swap:            0B          0B          0B
sys> free -h
              total        used        free      shared  buff/cache   available
Mem:            11G         11G        145M         25M        373M        204M
Swap:            0B          0B          0B

I referred to https://www.linuxatemyram.com/, which suggests the following:

Warning signs of a genuine low memory situation that you may want to look into:

  • available memory (or "free + buffers/cache") is close to zero
  • swap used increases or fluctuates.
  • dmesg | grep oom-killer shows the OutOfMemory-killer at work

From the above points, I don't see any OOM issue at the application level, and swap is disabled, so I am ruling out those two points. The one point that troubles me is the available memory being less than zero, on which I need clarification.
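
For reference, below is roughly how I sample that value from inside the JVM. This is a minimal sketch, assuming a kernel of 3.14 or later (where /proc/meminfo exposes MemAvailable, the same figure free -h reports in its "available" column); the class name is illustrative.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

// Minimal sketch: read the kernel's MemAvailable estimate from /proc/meminfo.
public class MemAvailableSampler {
    public static void main(String[] args) throws IOException {
        long availableKb = Files.lines(Paths.get("/proc/meminfo"))
                .filter(line -> line.startsWith("MemAvailable:"))
                // Strip everything but the digits, e.g. "MemAvailable:  240000 kB" -> 240000
                .mapToLong(line -> Long.parseLong(line.replaceAll("\\D+", "")))
                .findFirst()
                .orElseThrow(() -> new IllegalStateException("MemAvailable not found"));
        System.out.printf("MemAvailable: %d kB (%.2f GiB)%n",
                availableKb, availableKb / (1024.0 * 1024.0));
    }
}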

Questions:

  1. If available is close to 0, will it end up in a system crash?
  2. Does it mean I need to upgrade the RAM when the available memory runs low?
  3. On what basis should RAM be allocated/increased?
  4. Are there any official recommendations/guidelines to follow for RAM allocation?
  • Where do you see "available memory less than zero"? – Nikita Kipriyanov Sep 27 '21 at 09:02
  • Have you rebooted recently? Sometimes applications will just keep using memory, but if you restart they often go back to a lower usage. – Chopper3 Sep 27 '21 at 09:27
  • Java by default does not release heap memory to the OS in most cases, even if it is no longer using it. Before you go out to buy more RAM, profile the app to see how much memory it is really using. – Michael Hampton Sep 27 '21 at 10:31
  • @NikitaKipriyanov referred in this article https://www.linuxatemyram.com/ under the section "When should I start to worry?" – ragul rangarajan Sep 27 '21 at 11:09
  • @Chopper3 No reboot has been done. I have observed what you describe: memory usage goes low after a restart and keeps increasing over the following days. But if the application needs to serve its customers, restarting it will lead to an outage, which wouldn't be acceptable, right? – ragul rangarajan Sep 27 '21 at 11:34
  • @MichaelHampton I assume "profile the app" relates to the max heap allocation for the application, which is configured as 50% of RAM (6 GB); see the heap-usage sketch below. – ragul rangarajan Sep 27 '21 at 12:07
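
To make the profiling comment concrete: from the OS's point of view, the JVM's resident set can stay near the committed heap size even when most of the heap is garbage. Below is a minimal sketch using the standard java.lang.management API (the class name is illustrative) to compare what the heap actually uses against what the JVM has reserved:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

// Minimal sketch: compare "used" (live data plus garbage not yet collected),
// "committed" (memory the JVM has reserved from the OS), and "max" (the -Xmx ceiling).
public class HeapReport {
    public static void main(String[] args) {
        MemoryMXBean bean = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = bean.getHeapMemoryUsage();
        System.out.printf("heap used     : %,d MB%n", heap.getUsed() / (1024 * 1024));
        System.out.printf("heap committed: %,d MB%n", heap.getCommitted() / (1024 * 1024));
        System.out.printf("heap max      : %,d MB%n", heap.getMax() / (1024 * 1024));
    }
}

If used is far below committed, the extra RAM is being held by the JVM rather than by live application data, and a smaller -Xmx (or a collector that returns unused memory to the OS) may be worth testing before buying more RAM.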

1 Answer


I was able to find an answer to one of my questions.

If available is close to 0, will it end up in a system crash?

Testing on one of my servers, I loaded memory almost to full, as below:

sys> free -h
              total        used        free      shared  buff/cache   available
Mem:            11G         11G        135M         25M        187M         45M
Swap:            0B          0B          0B

I could see that my application alone (which consumed the most memory) was killed by the OOM killer, as shown in the kernel logs:

dmesg -e

[355623.918401] [21805] 553000 21805 69 21 2 0 0 rm
[355623.921381] Out of memory: Kill process 11465 (java) score 205 or sacrifice child
[355623.925379] Killed process 11465 (java), UID 553000, total-vm:6372028kB, anon-rss:2485580kB, file-rss:0kB, shmem-rss:0kB

https://www.kernel.org/doc/gorman/html/understand/understand016.html

The Out Of Memory Killer or OOM Killer is a process that the linux kernel employs when the system is critically low on memory. This situation occurs because the linux kernel has over allocated memory to its processes. ... This means that the running processes require more memory than is physically available.
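
As an aside, the "score 205" in the log above is the kernel's per-process badness score; the process with the highest score is killed first. Below is a minimal sketch of reading that score for the current process (the class name is illustrative; /proc/<pid>/oom_score is maintained by the kernel, and oom_score_adj can be used to bias it, e.g. to protect a critical service):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

// Minimal sketch: read the OOM killer's badness score for this process.
// oom_score_adj ranges from -1000 (never kill) to 1000 (kill first).
public class OomScore {
    public static void main(String[] args) throws IOException {
        String score = Files.readAllLines(Paths.get("/proc/self/oom_score")).get(0);
        String adj = Files.readAllLines(Paths.get("/proc/self/oom_score_adj")).get(0);
        System.out.println("oom_score     = " + score.trim());
        System.out.println("oom_score_adj = " + adj.trim());
    }
}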