While this of course depends on the encryption method, the usual, short answer is: no, making the file bigger does not make exhaustive search harder.
In general, someone tries an exhaustive search on a key because he is interested in the file contents, meaning that he has context: he already knows the kind of data he will find. Decrypting only the start of the file is enough to tell whether he got something that "makes sense" (e.g. a recognizable header for a video file, picture, XML document...) or random junk. This test does not have to be very strict: all the attacker needs is to reject, say, 99% of candidate keys, and then decrypt some more of the file in the remaining 1% of cases where further investigation is needed.
This assumes that the encryption method works in a "streamed" fashion, where the file can be decrypted as the data flows. This is how most encryption systems work in practice, because it is a highly desirable feature, especially when handling huge files.
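For illustration, here is a minimal sketch of such a quick-reject test in Python, assuming AES-128 in CTR mode (via the `cryptography` package) and a file known to be a PNG image; `key_candidates`, `nonce` and `ciphertext` are hypothetical placeholders for whatever the attacker has at hand:

```python
# Sketch only: quick-reject of candidate keys by decrypting just the file header.
# Assumes AES-128-CTR and a plaintext known to start with the PNG magic bytes.
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

PNG_MAGIC = b"\x89PNG\r\n\x1a\n"  # well-known 8-byte PNG header

def plausible_keys(key_candidates, nonce, ciphertext):
    """Yield only the keys whose decryption of the first bytes 'makes sense'."""
    header = ciphertext[:len(PNG_MAGIC)]               # a few bytes are enough
    for key in key_candidates:
        cipher = Cipher(algorithms.AES(key), modes.CTR(nonce))
        start = cipher.decryptor().update(header)      # decrypt only the start
        if start == PNG_MAGIC:                         # cheap test rejects almost all wrong keys
            yield key                                  # worth decrypting further
```

Only a handful of bytes are decrypted per candidate key, so the cost of each guess is essentially independent of the file size.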
Of course, the right way to defeat exhaustive search is to use a random enough key in the first place. A 128-bit key, generated with a cryptographically secure PRNG, will thwart any exhaustive search, regardless of the encrypted file size. When a cracking attempt is so slow that it does not work in practice, it cannot be made "slower" in any meaningful way: it does not work, and that's the end of it.
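For example, one minimal way to obtain such a key is to pull 16 bytes from the operating system's CSPRNG, here shown in Python:

```python
import secrets

key = secrets.token_bytes(16)   # 128 bits of cryptographically secure randomness
```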
If you cannot get a strong key, e.g. because you are doing password-based encryption, and passwords are weak (almost by definition), then you indeed want to make each decryption attempt heavy and slow, so as to try to improve your security situation. But the right method is to do so when transforming the password into a key (something called password hashing), not when applying the key to encrypt the actual data.
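As a minimal sketch of that idea, here is password-based key derivation with the deliberately slow scrypt KDF from Python's standard library; the password, salt handling and cost parameters are illustrative assumptions, not a recommendation:

```python
import hashlib, secrets

password = b"correct horse battery staple"   # hypothetical (weak) user secret
salt = secrets.token_bytes(16)               # random salt, stored alongside the ciphertext

key = hashlib.scrypt(password, salt=salt,
                     n=2**14, r=8, p=1,      # work factors make each guess slow
                     dklen=16)               # 128-bit key for the actual cipher
```

Each password guess now costs the attacker a full scrypt computation, regardless of how large the encrypted file is.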