I am learning about buffer overflows and I’m developing my very first exploit.
There is a server process that listens on a socket and forks a new process for each client. The child process has a buffer overflow vulnerability, which I'm exploiting.
My exploit works if I start the server under gdb; however, the exploit causes a segfault if I simply start the server without gdb.
My question is: does gdb automatically deactivate some protection mechanisms, such as ASLR or stack protection? What would be a possible explanation for this behaviour?
I have compiled the server with `-z execstack -fno-stack-protector`, but I still can't exploit it without gdb.
This is on Debian x86. By "running the server with gdb" I mean I run `gdb server` and then just type `run` at the gdb prompt, with no breakpoints or anything else. This way my exploit is successful (it makes a curl request to my server).
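In case it's relevant: the system-wide ASLR setting can be read from Linux procfs (this is a standard kernel knob, and I haven't changed it from the default):

```shell
# 0 = ASLR disabled, 1 = conservative randomization,
# 2 = full randomization (the usual Debian default).
cat /proc/sys/kernel/randomize_va_space
```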