I have seen proprietary testing results from hammering a CA over a long period and measuring its throughput (certificates issued per second). CAs become slower to issue new certificates as their database / revocation list grows (proprietary work, so sadly no results to link to). For example, some corporate infrastructure CAs (e.g., internal S/MIME CAs) bog down at around 100,000 certificates in the database; EJBCA bogs down at around 1 million certificates, etc.
My question is: databases can easily handle fast lookups / inserts on tables with millions of rows, so what is different about certificate issuance? Presumably there is some sort of crypto going on whose runtime depends on the size of the database — for instance, re-signing a CRL means hashing and signing over the entire list of revoked serials, which only grows over time.
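To back up the claim that plain database operations are not the bottleneck, here is a hypothetical micro-benchmark (my own sketch, not from the proprietary results above) that inserts a million rows into an indexed SQLite table and times point lookups — both stay fast at this scale:

```python
import sqlite3
import time

# Hypothetical micro-benchmark: indexed inserts/lookups on a million-row
# table remain fast, unlike the CA throughput degradation described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE certs (serial TEXT PRIMARY KEY, subject TEXT)")

rows = [(format(i, "032x"), f"CN=user{i}") for i in range(1_000_000)]
t0 = time.perf_counter()
conn.executemany("INSERT INTO certs VALUES (?, ?)", rows)
insert_s = time.perf_counter() - t0

t0 = time.perf_counter()
for i in range(10_000):
    conn.execute(
        "SELECT subject FROM certs WHERE serial = ?",
        (format(i * 97, "032x"),),
    ).fetchone()
lookup_s = time.perf_counter() - t0

print(f"1M inserts: {insert_s:.2f}s, 10k indexed lookups: {lookup_s:.2f}s")
```

On commodity hardware this completes in seconds, which is why I suspect the slowdown comes from the cryptographic side (e.g., CRL regeneration) rather than the database itself.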
Note that this is a follow-up to the comments on this question.