Any number you get is going to be fairly meaningless -- some factors to consider:
Programming Language - Some languages let you do very unsafe things: C, for example, makes you allocate memory directly, allows arbitrary pointer arithmetic, and uses null-terminated strings, which introduces many potential security flaws that safer (but slightly slower) languages like Ruby or Python do not allow (see the sketch after this list).
Type of Application - If a non-malicious programmer writes a relatively complex Angry Birds-type game in Java (not using the unsafe module), there's a very good chance there aren't any "exploitable" bugs -- especially after testing -- with the possible exception of being able to crash the program. A web application in PHP written by amateurs has a good chance of having various exploitable flaws (SQL injection, cross-site scripting, bad session management, weak hashing, remote file inclusion, etc.).
Programmer Expertise at writing secure code - If you hire a high school student with no past experience to code up some web application, there's a reasonable chance there will be major flaws.
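As a minimal sketch of the kind of thing C happily permits (the function and buffer size here are made up for illustration, not taken from any real project), the following compiles cleanly but overflows a stack buffer whenever the argument is longer than the buffer -- exactly the class of bug that memory-safe languages rule out by construction:

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical example: attacker-controlled input is copied into a
 * fixed-size stack buffer with no bounds check. A long argv[1]
 * overflows `name` and corrupts adjacent stack memory. */
static void greet(const char *input)
{
    char name[16];
    strcpy(name, input);        /* no length check -- classic overflow */
    printf("Hello, %s\n", name);
}

int main(int argc, char **argv)
{
    if (argc > 1)
        greet(argv[1]);
    return 0;
}
```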
Furthermore, counting the number of "exploitable" bugs is not a straightforward task either; if finding bugs were straightforward, they'd be removed in code review. Many bugs only arise from subtle race conditions or complex interactions among programs/libraries, as in the sketch below.
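For instance, a time-of-check-to-time-of-use (TOCTOU) race looks perfectly reasonable in review and only becomes exploitable when an attacker wins a timing window. This is a hedged sketch (the path is hypothetical, not from any particular program):

```c
#include <stdio.h>
#include <unistd.h>

/* Hypothetical TOCTOU race: the file is checked with access() and then
 * opened with fopen(). Between the two calls, an attacker who controls
 * the directory can swap the file for a symlink, so a privileged
 * process ends up writing to something it never intended to touch. */
int main(void)
{
    const char *path = "/tmp/report.txt";    /* attacker-writable directory */

    if (access(path, W_OK) == 0) {           /* check */
        FILE *f = fopen(path, "w");          /* use -- the race window is here */
        if (f != NULL) {
            fputs("report data\n", f);
            fclose(f);
        }
    }
    return 0;
}
```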
However, if you take open-source projects, it's fairly easy to find a count of LoC at ohloh.net and a count of "exploitable" vulnerabilities at cvedetails.com (I arbitrarily defined 'exploitable' as a CVSS score over 7). I randomly decided to look at some web browsers, programming languages, and web frameworks and found:
Web Browsers:
Open-Source Programming Languages:
Web Frameworks:
So again, for these specific major programming projects (likely written by expert programmers), the rate of major exploitable vulnerabilities works out to roughly 0.003 to 0.08 per 1000 LoC (or 1 per 12,500 to 300,000 LoC). I wouldn't necessarily extrapolate that to non-major open-source projects.
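To make the arithmetic behind those rates explicit, here is a small sketch (the counts are made-up placeholders, not figures pulled from ohloh.net or cvedetails.com) that converts a vulnerability count and a LoC count into the per-1000-LoC rate quoted above:

```c
#include <stdio.h>

/* Placeholder numbers purely to illustrate the calculation:
 * vulns = CVE entries with CVSS > 7, loc = total lines of code. */
int main(void)
{
    double vulns = 100.0;       /* placeholder count of CVSS > 7 CVEs   */
    double loc   = 5000000.0;   /* placeholder lines of code (5 MLoC)   */

    double per_kloc     = vulns / (loc / 1000.0);  /* vulns per 1000 LoC    */
    double loc_per_vuln = loc / vulns;             /* LoC per vulnerability */

    printf("%.3f exploitable vulns per 1000 LoC\n", per_kloc);  /* 0.020   */
    printf("1 vuln per %.0f LoC\n", loc_per_vuln);              /* 50000   */
    return 0;
}
```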