The other answer covers the reasoning behind this: for modern systems it's mostly about keeping load times in the Event Viewer GUI somewhat bearable. Copying the current log to a location that gets backed up, then clearing it, is also a good habit.
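If you go the copy-and-clear route, the built-in wevtutil can do both in one step; a minimal sketch, with an illustrative archive path:

# Back up the Security log to a file, then clear it (archive path is just an example)
wevtutil cl Security /bu:D:\LogArchive\Security-backup.evtx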
For parsing large log files that end up being generated anyway, two good options come to mind:
1) Parse the log faster than the current GUI can manage, or
2) Split the log into separate files.
I'm sure there are some easily-available utilities out there for 2) - in fact the built-in wevtutil can do it, as sketched just below - so I'll mostly focus on 1).
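That wevtutil approach, as a rough sketch - the export path is a placeholder, and 604800000 is just seven days expressed in milliseconds:

# Export only the last 7 days of the Security log into its own file
wevtutil epl Security D:\LogArchive\Security-LastWeek.evtx /q:"*[System[TimeCreated[timediff(@SystemTime) <= 604800000]]]"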
Firstly, PowerShell has an excellent cmdlet for this called Get-WinEvent. The fastest performance I've seen comes from filtering with hash tables. Here's an example that gets all events in the Security log pertaining to a specific user from the last day:
$timeframe = (Get-Date) - (New-TimeSpan -Days 1)
$userevt = Get-WinEvent -ComputerName <specify> -FilterHashtable @{LogName='Security'; Data='<enter username here>'; StartTime=$timeframe}
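If you also know the event ID you're after, adding Id to the hash table narrows the search further and speeds it up. For example, 4740 is the account-lockout event on 2008 R2 - treat this combination as a starting point rather than a recipe:

# Same query, restricted to account-lockout events (event ID 4740)
$lockouts = Get-WinEvent -ComputerName <specify> -FilterHashtable @{LogName='Security'; Id=4740; Data='<enter username here>'; StartTime=$timeframe}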
$userevt is now a collection of events. Depending on the number of matches, you can pipe it to format-list to easily read a small number of events. For a medium number, do the same but redirect the output to a file:
$userevt | format-list > <outputfile>.txt
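If you'd rather have something you can search with other tools or open in a spreadsheet later, exporting to CSV also works; the property list here is just a reasonable starting point:

# Export a few useful fields to CSV for later searching or sorting
$userevt | Select-Object TimeCreated, Id, Message | Export-Csv <outputfile>.csv -NoTypeInformation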
For a large number, start filtering (say you want the caller computer for a lockout event on the user we acquired above):
$userevt | %{if ($_.message -match "Caller Computer .*") {$matches[0]}}
This will show a single-line result for each lockout event. The above processes generally take 1-4 minutes for a 4GB log on 2008 R2.
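To take that one step further, you can capture just the machine name and tally how many lockouts came from each one. This assumes the 4740 message text reads 'Caller Computer Name:', which is what I've seen, but treat the exact wording as an assumption:

# Pull the caller computer name out of each lockout event and count occurrences per machine
$userevt | %{if ($_.message -match "Caller Computer Name:\s+(\S+)") {$matches[1]}} | Group-Object | Sort-Object Count -Descending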
Secondly, especially for any Server 2003 machines you might end up having to manage, you can right-click a particular log in the left pane of Event Viewer and select 'Save Log File As'.
If you are running event viewer on the local machine, you can save a .evt file that can be parsed by get-winevent.
Alternatively, you can save a text or CSV file (I find CSV easier), which can then be parsed by command-line utilities such as grep or findstr, or by programs like Notepad++.
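Either way, both kinds of saved file are easy to query afterwards. A rough sketch - paths and the search string are placeholders, and note that classic .evt files need the -Oldest switch:

# Parse a saved classic log file directly (-Oldest is required for .evt files)
Get-WinEvent -Path <savedlog>.evt -Oldest | where {$_.message -match "<enter username here>"}
# Or search an exported CSV from the command line
findstr /i "<enter username here>" <outputfile>.csv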