7

The CERT secure coding standard contains the recommendation "Always specify void even if a function accepts no arguments", and describes a possible security vulnerability that motivates it.

/* Compile using gcc 4.3.3 */
void foo() {
    /* Use asm code to retrieve i
     * implicitly from the caller
     * and transfer it to a less privileged file */
}

/* Caller */
foo(i); /* i is fed from user input */

In this noncompliant code example, a user with high privileges feeds some secret input to the caller, which the caller then passes to foo(). Because of the way foo() is defined, we might assume there is no way for foo() to retrieve information from the caller. However, because the value of i is actually pushed onto the stack (just before the caller's return address), a malicious programmer can change the internal implementation and copy the value manually into a less privileged file.

Is there any exploit which uses this "insecure" coding practice? If not, can someone explain or give sample code showing how this could be used? Also, if this is really hard to exploit, or not possible at all, could you please explain why?

Jor-el
  • 2,061
  • 17
  • 24

2 Answers

13

What happens here is that the foo() function uses a so-called old-style declaration, i.e. the way things were done in C before the first standardization (aka "ANSI C", from 1989). In pre-ANSI C, a function bar() which takes two arguments of types int and char * would be defined this way:

void bar(i, p)
    int i;
    char *p;
{
    /* do some stuff */
}

and it would be declared the following way (usually in a header file):

void bar();

This means that code which uses the function would include the header file, which would then convey information about the existence of the function and about its return type (here, void), but nothing whatsoever about the number of arguments and their types. Thus, upon usage, the caller must provide the parameters and hope that it sends the appropriate number and types. If the caller code does not do things properly, the compiler will not emit any warning.
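
To make this concrete, here is a minimal, self-contained sketch (bar()'s body and the call sites are invented for the example): with only an old-style declaration in scope, the caller can pass any number of arguments of any type and the compiler has nothing to check them against.

#include <stdio.h>

void bar();              /* old-style declaration: parameter types are unknown */

int main(void)
{
    /* None of these calls draws a diagnostic; mismatched arguments are
     * formally undefined behaviour, but a typical compiler accepts them
     * silently because it has no prototype to compare them with. */
    bar(1, "hello");
    bar(42);
    bar();
    return 0;
}

void bar()               /* old-style definition; ignores whatever was passed */
{
    printf("bar was called\n");
}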

As a security vulnerability, it is not very convincing. It makes sense only in a context of code inspection. Some auditor is reviewing a lot of source code. An evil developer is trying to do evil things which the auditor will not notice (this is the scenario which is explored in the Underhanded C Contest). Presumably, the auditor will look at the header files and also at the start of the function implementation. In your example, he will see:

void foo()
{
    /* some stuff */
}

then the auditor may just assume that foo() takes no parameter, since the opening brace immediately follows the foo(). However, the caller code (which is elsewhere) calls foo() with some parameters. The C compiler cannot warn: since the function is declared "old-style", the C compiler does not know, when compiling the caller code, that foo() does not actually use any parameter (or so it seems). The caller code will push the arguments on the stack (and remove them upon return). The evil programmer then includes in the definition of foo() some handmade assembly to retrieve the arguments from the stack, even if, at the C syntax level, they don't exist.
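
As a rough illustration, here is what such an "evil" foo() could look like. This is only a sketch under narrow assumptions: 32-bit x86, the cdecl calling convention, GCC inline assembly, a preserved frame pointer (e.g. compiled with -m32 -O0 -fno-omit-frame-pointer), and an invented destination file /tmp/leak.txt; on any other setup the offset, and indeed the whole trick, would have to change.

#include <stdio.h>

void foo()                       /* looks as if it takes no argument at all */
{
    int secret;

    /* Under the stated assumptions, the caller's first argument sits at
     * %ebp + 8 (saved %ebp at %ebp, return address at %ebp + 4). */
    __asm__ volatile ("movl 8(%%ebp), %0" : "=r" (secret));

    /* Copy the value somewhere less privileged. */
    FILE *f = fopen("/tmp/leak.txt", "a");
    if (f) {
        fprintf(f, "%d\n", secret);
        fclose(f);
    }
}

int main(void)
{
    int i = 42;                  /* stands in for the privileged input */
    foo(i);                      /* compiles without any warning */
    return 0;
}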

The result is a semi-hidden communication channel between two pieces of malicious code (the caller code and the called function), in a way which is not visible from a cursory inspection of the function declaration and the start of its definition, and which, crucially, the C compiler does not warn about either.

As a vulnerability, I find it pretty weak. The scenario is quite implausible.

The issue is more about quality assurance. Old-style declarations are dangerous, not because of evil programmers, but because of human programmers, who cannot think of everything and must be helped by compiler warnings. That is the whole point of function prototypes, introduced in ANSI C, which include type information for the function parameters. Here are prototypes for our two example functions:

void foo(void);
void bar(int, char *);

With these declarations, the C compiler will notice that the caller code is trying to send parameters to a function which does not use any, and will abort compilation or at least emit a sternly worded warning.
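
A minimal sketch of the difference (the exact wording of the diagnostic varies by compiler, but this will not compile as C, which is precisely the point):

void foo(void);          /* full prototype: no parameters at all */

int main(void)
{
    foo(42);             /* error: too many arguments to function 'foo' */
    return 0;
}

void foo(void)
{
    /* some stuff */
}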

A typical problem with old-style declarations is the lack of automatic argument conversions. For instance, with this function:

void qux(p, i)
    char *p;
    int i;
{
    /* some stuff */
}

and this call:

qux(0, 42);

The compiler, seeing the call, will believe that the two parameters are two int values. But the function really expects a pointer and then an int. If the architecture is such that a pointer takes the same size on the stack as an int (and also such that a NULL pointer is encoded the same way as an integer of value 0, which is a rather common feature), then things appear to work. Now compile that on an architecture where pointers are twice as large as integers: the code will fail because the 42 will be interpreted as part of the pointer value.

(The details depend on the architecture, but this would be typical of 16-bit C code compiled on a 16/32-bit architecture with 16-bit alignment, e.g. a 68000 CPU. On 64-bit modern architectures, int values tend to be 64-bit aligned on the stack, which saves the skin of many careless programmers. The problem is more general, though; similar issues occur with floating-point types.)
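
With a proper prototype, the same call is both checked and converted correctly. A small sketch (the body of qux is a placeholder):

void qux(char *p, int i)     /* the definition itself provides the prototype */
{
    /* some stuff */
    (void)p;
    (void)i;
}

int main(void)
{
    /* The literal 0 is converted to a null char * of the correct width for
     * the platform, and passing something grossly wrong (say, qux(42, 0))
     * would draw a diagnostic instead of silently misreading the stack. */
    qux(0, 42);
    return 0;
}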

You should use prototypes; not because old-style functions induce vulnerabilities, but because they induce bugs.

Side note: in C++, prototypes are mandatory, so the void foo(); declaration actually means "no argument whatsoever", like what void foo(void); would mean in ANSI C.

Tom Leek
  • 168,808
  • 28
  • 337
  • 475
  • Correct me if I am wrong, but specifying the C99 standard when compiling code with GCC prevents this case from occurring. – Jor-el Sep 27 '13 at 13:42
  • you can go back to the non-safe version if you declare with `...` and use the VARARG macros to retrieve the parameters – ratchet freak Sep 27 '13 at 13:48
  • 1
    In C99, old-style declarations and definitions are an "obsolescent feature" but still supported (see sections 6.11.6 and 6.11.7 of the C standard). GCC goes "out of its way" to actually abort compilation when in "C99 mode", which means that GCC's C99 mode is _not_ conforming to C99 -- but I agree that GCC's behaviour makes a lot of sense, and we cannot blame it for being a bit proactive. – Tom Leek Sep 27 '13 at 13:59
5

In C, the declaration foo(); declares a function which takes an unspecified number of parameters (it is not a prototype). This differs from many other languages (including C++), in which such a declaration would indicate a function that takes no parameters.
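
A small sketch of the difference; compile the same file as pre-C23 C and then as C++ to see it (the function name is just for illustration):

void foo() { }           /* C: parameters unspecified; C++: exactly none */

int main(void)
{
    foo(1, 2);           /* accepted when compiled as C89/C99; rejected by a C++ compiler */
    return 0;
}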

The two potential problems listed on the page you link to are given as Ambiguous Interface and Information Outflow.

Starting with the latter, one might imagine working for an organisation that handles classified information. Code audits could be performed to ensure that TOP SECRET information cannot flow into a file that is classified as only SECRET. Such a code audit could easily miss a declaration like foo();, which would actually allow information to flow into the function in the form of an undeclared parameter. If foo(); is something like update_log(); and our double agent is able to alter the code of update_log() to receive a parameter and write it as a log entry, you might see how outflow can be achieved.

The other potential issue, the ambiguous interface, refers to the fact that a function declared as foo(); can be called in different ways. It's a little far-fetched, but not impossible, that such ambiguity could lead to a software vulnerability.

If an implementation of foo() is doing something very non-standard, such as using pointer arithmetic to reach into its caller's stack frame, then passing parameters to foo() would break that pointer arithmetic, leading to arbitrary problems.
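
A purely illustrative sketch of what such non-standard code might look like; this is formally undefined behaviour and assumes a conventional downward-growing stack, no reordering of locals and a guessed offset, all of which differ between compilers, ABIs and optimisation levels:

void foo()
{
    int anchor;                  /* a local used only to locate our own frame */
    int *p = &anchor;

    /* "Reach upward" past our own frame toward the caller's. The offset
     * (four words here) is pure guesswork; if the caller starts passing
     * arguments to foo(), the layout shifts and the guess breaks, which is
     * exactly the fragility of the ambiguous interface. */
    int guess = *(p + 4);
    (void)guess;
}

int main(void)
{
    foo();
    return 0;
}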

lynks
  • 10,636
  • 5
  • 29
  • 54