
I consistently get this exception when trying to run my JUnit tests on my Mac:

java.lang.OutOfMemoryError: unable to create new native thread
        at java.lang.Thread.start0(Native Method)
        at java.lang.Thread.start(Thread.java:658)
        at java.util.concurrent.ThreadPoolExecutor.addIfUnderMaximumPoolSize(ThreadPoolExecutor.java:727)
        at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:657)
        at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:92)
        at com.google.appengine.tools.development.ApiProxyLocalImpl$PrivilegedApiAction.run(ApiProxyLocalImpl.java:197)
        at com.google.appengine.tools.development.ApiProxyLocalImpl$PrivilegedApiAction.run(ApiProxyLocalImpl.java:184)
        at java.security.AccessController.doPrivileged(Native Method)
        at com.google.appengine.tools.development.ApiProxyLocalImpl.doAsyncCall(ApiProxyLocalImpl.java:172)
        at com.google.appengine.tools.development.ApiProxyLocalImpl.makeAsyncCall(ApiProxyLocalImpl.java:138)

The same set of unit tests passes perfectly fine on Ubuntu and Windows.

Some information about my system resources on the Mac:

$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 1
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 266
virtual memory          (kbytes, -v) unlimited

$ java -version
java version "1.6.0_24"
Java(TM) SE Runtime Environment (build 1.6.0_24-b07-334-10M3326)
Java HotSpot(TM) 64-Bit Server VM (build 19.1-b02-334, mixed mode)

The reason I don't think this is an application issue is that the same tests pass in other environments. I have tried setting the heap to 1024m and 512m and the stack to 64k and 128k (in every combination) with no luck. My open files limit was originally 256, and I have bumped it to 1024.

I have been googling around for a bit, and all the posts say to decrease the heap size and increase the stack size, but that doesn't seem to help. Anyone have any more ideas?
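For reference, the combinations I tried look roughly like this (the test class and classpath below are placeholders, not my actual setup):

```shell
# Placeholder class/classpath; the -Xmx/-Xss values are the ones described above
java -Xmx512m -Xss128k -cp build:junit.jar org.junit.runner.JUnitCore com.example.SomeTest
```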

EDIT: Here is some environment information from my Ubuntu box:

$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 20
file size               (blocks, -f) unlimited
pending signals                 (-i) 16382
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) unlimited
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

$ java -version
java version "1.6.0_24"
Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
Java HotSpot(TM) 64-Bit Server VM (build 19.1-b02, mixed mode)

4 Answers


It could be the max user processes limit you're hitting. Try increasing it to something like 1024.

ulimit -u 1024

There are global limits on this as well, so check out sysctl.conf and look at the output of:

sysctl kern.maxprocperuid kern.maxproc

and adjust if needed.

  • Unfortunately, that didn't help... I run out of threads for the same set of tests. When I run those tests by themselves, they pass. – Brad May 19 '11 at 02:42

Allocate more memory to the JVM; usually this is the cause. If you have 1 GB, add another 0.5 GB or 1 GB. Depending on your RAM, keep 1-2 GB free for OS processes, and do not allocate more memory than you have.

As Mat said, it could also be an issue with too many open files, but in that case you should see a message to that effect. Search for it in the logs.
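One thing to keep in mind when sizing: native thread stacks are allocated outside the heap set by -Xmx, so both have to fit in the address space the OS will give the process. A rough budget sketch (all numbers here are purely illustrative):

```shell
# Native thread stacks live outside -Xmx; rough address-space budget
heap_mb=1024              # -Xmx1024m
threads=500               # assumed peak live threads (illustrative)
stack_kb=1024             # -Xss1m
echo "$(( heap_mb + threads * stack_kb / 1024 )) MB of address space needed"
```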


Another approach is to tune the JVM stack size, as mentioned here: Preventing "OutOfMemory: unable to create new thread" using Java's -Xss non-standard option (see also the tool docs)

This might be a feasible option whenever it's not in your power to control or reduce the number of threads (e.g. when the culprit is a third-party lib or framework you are not allowed to ditch).
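As a back-of-envelope illustration of why shrinking -Xss raises the thread ceiling: with a fixed slice of address space available for stacks, halving the per-thread stack roughly doubles how many threads fit (the budget figure below is assumed, not measured):

```shell
budget_mb=2048                      # address space assumed free for thread stacks
for stack_kb in 8192 1024 256; do   # the 8 MB default, then two smaller -Xss values
  echo "-Xss${stack_kb}k -> ~$(( budget_mb * 1024 / stack_kb )) threads"
done
```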

  • Thanks for the help but no luck. Something I noticed today is that the failing tests seem to be limited to the use of one third party lib. So, I may just begin mocking everything as I should have before. – Brad Jul 05 '11 at 16:52

stack size (kbytes, -s) 8192

That value is way too big for a large multithreaded application. If you are spinning up a lot of threads, try setting it to 1024 with ulimit -s or in /etc/security/limits.conf.

  • Could you add more info on why that setting is too much? – Jacob Mar 21 '13 at 00:07
  • Every thread is created with a stack that usually defaults to that value; 8 MB allocated per thread can rapidly eat through the memory on your machine when you spin up hundreds or thousands of threads. – Tyler Zale Mar 21 '13 at 00:12
  • More knowledgeable people than me can explain it better, but from my testing on large production machines with this problem: the JVM defaults to the system default, which comes from this limit, and changing it has fixed my problems. – Tyler Zale Mar 21 '13 at 00:14
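The arithmetic behind the comment above, as a quick sketch (the thread count is illustrative):

```shell
threads=2000              # illustrative thread count
stack_kb=8192             # the 8 MB default from `ulimit -s` above
echo "$(( threads * stack_kb / 1024 )) MB of stack address space reserved"
```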