Bash segfault caused by infinite function recursion


I just noticed that the bash I'm using (4.2.25(1)) isn't protected against infinite function recursion: the recursion ends in a segfault, and the bash process terminates. To check this in your own bash, just type:

$ bash
$ f() { f; }
$ f

(Starting a subshell first (the first line) gives us a candidate we can experiment with without endangering the shell of the terminal; without the first line, your terminal window would probably close so quickly that you couldn't see anything interesting.)
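In case you want to verify that the crash really is a segfault rather than some other abort: bash reports a child killed by a signal with an exit status of 128 plus the signal number, and SIGSEGV is signal 11, so you should see 139. A quick check along these lines (the exact wording of the crash message will vary by system):

$ bash -c 'f() { f; }; f'
Segmentation fault (core dumped)
$ echo $?
139

139 = 128 + 11, i.e. SIGSEGV.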

I think I understand the reason for this phenomenon: it is probably a stack overflow, which ends with bash trying to write to memory regions that are not assigned to its process.
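One crude experiment that is consistent with the stack-overflow guess (this is an assumption about how bash implements function calls internally, not something I have checked in its source): if you shrink the process stack limit in the throwaway subshell, the crash should arrive after correspondingly fewer nested calls. The numbers below are just illustrative:

$ bash
$ ulimit -s
8192
$ ulimit -s 1024
$ f() { f; }
$ f
Segmentation fault (core dumped)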

I'm wondering about two things:

  • Shouldn't there be a check in bash to protect it against such situations? A clearer error message like "stack overflow in shell function" would certainly be better than a plain, unhelpful segfault. (One candidate guard, FUNCNEST, is sketched after this list.)

  • Could this be a security issue? Before this method writes into memory that is not assigned to the process (resulting in the segfault), it might overwrite other parts of memory that were not meant to be used for bash's internal stack.
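Regarding the first point: since version 4.2, bash actually has an opt-in guard. If the shell variable FUNCNEST is set to a value greater than 0, it defines a maximum function nesting level, and invocations that exceed it abort the current command with an error instead of recursing until the stack blows up. A minimal sketch (the exact error text may differ between versions):

$ bash
$ FUNCNEST=100
$ f() { f; }
$ f
bash: f: maximum function nesting level exceeded (100)

It is not enabled by default, though, so out of the box the segfault remains.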

Alfe

Posted 2015-02-19T11:47:33.540

Reputation: 263

Can confirm this for bash 3.2.57 (OS X 10.10.2 Yosemite) and 4.3.11 (Ubuntu 14.04 LTS). This should not happen (from man bash): "Functions may be recursive. No limit is imposed on the number of recursive calls." Though that may just mean that bash doesn't limit it, only available memory does :-) – jaume – 2015-02-19T13:43:01.310


It seems to be an old, known behaviour, dating back to at least 2003. Maybe you will find this interesting.

– Hastur – 2015-02-20T11:16:41.967

Thank you @hastur, that was enlightening about the history of the "bug". The question remains whether this is a security issue; probably too chaotic a behaviour to exploit, I guess. – Alfe – 2015-02-23T00:39:09.310

No answers