RStudio Desktop and R memory allocation


I am very new to machine learning (using caret) and I am having trouble with RAM allocation in RStudio, but not in R.

I am trying to run a random forest model with cross-validation. After preparing my training data.table (149,717 observations, 5,392,816 bytes) I get an error message:

Error: cannot allocate vector of size 1.1 Gb
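To give an idea of the workflow, here is a minimal sketch of this kind of caret call; the outcome column name (target), the data.table name (train_dt), and the 5-fold CV settings are placeholders rather than my exact code:

    # Minimal sketch of a caret random forest with cross-validation.
    # "target", "train_dt", and the 5-fold CV setup are placeholders.
    library(caret)

    ctrl <- trainControl(method = "cv", number = 5)

    rf_fit <- train(
      target ~ .,            # outcome column (placeholder name)
      data      = train_dt,  # the full training data.table
      method    = "rf",      # random forest via caret
      trControl = ctrl
    )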

Meaning that I don't have enough RAM to allocate the object. So I decreased my training data.table systematically, and eventually, with only 5% of the original training data, the model runs without producing this error.
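A sketch of one way to draw such a 5% subsample with data.table (train_dt is again a placeholder name, and the seed is arbitrary):

    # Take a random 5% subsample of the training data.table.
    library(data.table)

    set.seed(123)
    train_small <- train_dt[sample(.N, size = floor(0.05 * .N))]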

However, this is not ideal, and I need to run my model on a larger sample than 5% of the original size. I decided to test this in R (not RStudio), and the model works just fine with no memory issues using the full training data.table.

Looking at Task Manager while R and RStudio were processing, R appears to be far more efficient with memory than RStudio: RStudio used 100% of my RAM while R used 65%, running the SAME test on the SAME data.
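As a cross-check from inside each session rather than from Task Manager, these base/utils functions report memory use directly (train_dt is the placeholder name from above):

    # Report memory used by the current R session and by the training object.
    gc()                                         # memory in use after garbage collection (Mb columns)
    format(object.size(train_dt), units = "Mb")  # size of the training data.table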

When I run memory.limit() in RStudio I get 1.759219e+13, but when I run the same function in R I get 16291.
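For reference, these are the Windows-only memory functions I am comparing (note that later R versions, 4.2 and up, no longer support them):

    # Windows-only memory reporting (no longer supported in R >= 4.2).
    memory.limit()            # maximum amount of memory R may use, in MB
    memory.size()             # memory currently in use by the session, in MB
    memory.size(max = TRUE)   # maximum memory obtained from the OS so far, in MB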

Laptop specs: 256 GB SSD, 16 GB RAM, Windows 10 Pro 64-bit, i7 with 8 logical cores.

My question is: why is memory allocated so differently in R vs RStudio? The memory limit in R looks right and lets me use all the RAM I have, but this is not the case in RStudio.

Any ideas what could be causing this issue, or advice on how to resolve this RAM allocation problem?

kjtheron

Posted 2019-07-30T14:28:47.527

Reputation: 1

Answers


For those interested, see this thread on the RStudio community forum:

https://community.rstudio.com/t/rstudio-and-r-memory-allocation/36477

The memory.limit() behaviour is a bug in RStudio. Work on a fix is underway.
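A quick way to check whether your own session is affected is to compare what the two front ends report; this is only a sketch, and RStudio.Version() is defined only inside an RStudio session:

    # Run in both plain R and RStudio and compare the output.
    memory.limit()             # in plain R this should roughly match installed RAM, in MB
    RStudio.Version()$version  # RStudio build number (only available inside RStudio)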

kjtheron

Posted 2019-07-30T14:28:47.527

Reputation: 1