13

I found a Microsoft KB article that covers recommended Event Log size maximums for operating systems up to Windows Server 2008/Vista, and it recommends a maximum of 4 GB. I've also seen other vague references suggesting that an Event Log larger than 4 GB is not recommended on at least 2008 R2. But what actually happens if an event log exceeds this size?

I've exceeded this on a test server (2012 R2) and haven't noticed any ill effects such as high memory usage. We don't care about OSes before 2008 R2, but we want a large log because we are collecting events from many machines via Windows Event Forwarding and want all the events in one place.
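
For reference, here's how I've been checking the configured cap and the on-disk size on the collector (assuming the default ForwardedEvents destination; the path is the standard one on our servers):

# Show the log's configuration, including "maxSize" (in bytes)
wevtutil gl ForwardedEvents
# Compare against the actual file size on disk
Get-ChildItem C:\Windows\System32\winevt\Logs\ForwardedEvents.evtx | Select-Object Length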

lgaud
    As your question intrigues me, and my boss pissed me off today, I'll let the event log on one of our servers grow out of control tonight, and post the results back into my existing answer, but as I say, 4 GB isn't a hard limit in 64 bit OSes, and my experience has been that even 32 bit apps and APIs usually handle files >4 GB. – HopelessN00b Feb 24 '15 at 21:13
  • Ah, looks like it might be a bit longer to generate a >4 GB event log file. Our busiest domain controller cleared its log 20 minutes ago. – HopelessN00b Feb 24 '15 at 21:25

3 Answers

10

Other than the awful performance, the ridiculous wait times when you have to load a 4 GB log, and the hell of ever having to search through such a monstrous thing, not much. I think the largest one I've seen in my environments was 10 GB, and although I gave up waiting on it to load, it didn't seem to harm anything.

The 4 GB caution for Server 2008 stems from the 32-bit addressing limit that's often encountered at 4 GB. On a 64-bit system, you should be fine letting it grow up to 16 TB (or more, depending on the NTFS cluster size), though I don't know that anyone's come anywhere close to testing that limit.

Of course, if you haven't already, you'll discover that very large log files are simply impractical to use - the last time I tried to load a simple 100 GB (text) log file, it couldn't even be opened without crashing the application opening it, and I suspect you'll hit that issue well before 100 GB.

The far better approach is to limit the file size to something reasonable and use a script to clear it out from time to time. I use the script below in my environment, combined with a 1 GB size limit on our security log. Some (well, most) of our servers generate over 3 GB of security events per day, and we don't want to waste all that space on huge log files I'll quit before combing through, so my script copies the log contents to another folder and then clears the event log so it can be written to again. And since the folder I copy them to is backed up, we can always go back to the logs in the horrible event that we need to.

#Adapted from: http://blogs.technet.com/b/heyscriptingguy/archive/2009/04/08/how-can-i-check-the-size-of-my-event-log-and-then-backup-and-archive-it-if-it-is-more-than-half-full.aspx

Param($logName = "security", $backupFolder = "C:\backupLogs")

# Note: this function shadows the built-in Get-EventLog cmdlet while the script runs
Function Get-EventLog([string]$logName)
{
    $log = Get-WmiObject -Class Win32_NTEventLogFile -Filter "LogFileName = '$logName'"
    If (-not $log)
    {
        "Event log '$logName' was not found."
        Return
    }
    If ($log.FileSize / $log.MaxFileSize -ge .9)
    {
        "Log is at least 90% full. Backing up now."
        Backup-EventLog $log
    } #end if
    Else
    {
        "Not backed up: $logName is only " + ($log.FileSize / $log.MaxFileSize * 100).ToString("N2") + " percent full"
    } #end else
} #end Get-EventLog

Function Backup-EventLog($log)
{
    # Time-stamped subfolder so repeated backups don't overwrite each other
    $folder = Join-Path -Path $backupFolder -ChildPath (Get-Date).ToString("MMddyy_hhmm")
    If (-not (Test-Path $folder))
    {
        New-Item -Path $folder -ItemType Directory -Force | Out-Null
    }
    $rtn = $log.BackupEventLog("$folder\$logName.evt").ReturnValue
    If ($rtn -eq 0)
    {
        # Only clear the log once the backup has succeeded
        $log.ClearEventLog() | Out-Null
    } #end if
    Else
    {
        "$logName was not cleared; the backup failed with return code $rtn"
    }
} #end Backup-EventLog

# *** ENTRY POINT ***
Get-EventLog -logName $logName
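
If you go this route, you'll want the script running on a schedule. A minimal sketch of registering it as a task on 2012 R2 or later (the script path, task name, and 4-hour interval are placeholder examples, not part of my setup):

# Run the archive script every 4 hours as SYSTEM
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File C:\Scripts\Backup-EventLog.ps1'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Hours 4) -RepetitionDuration ([TimeSpan]::MaxValue)
Register-ScheduledTask -TaskName 'Archive-SecurityLog' -Action $action -Trigger $trigger -User 'SYSTEM' -RunLevel Highest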
HopelessN00b
    For anyone that remembers that Windows Event logs were memory-mapped files and the *entire* log was loaded into memory, that limitation was [eliminated](https://technet.microsoft.com/en-us/library/cc722385(WS.10).aspx) by the new event logging infrastructure introduced in Windows Vista/Server 2008. However, if you're still using Server 2003, you cannot create logs that exceed 1GB in size because in that OS no process can have more than 1 GB of memory-mapped files in total. – I say Reinstate Monica Feb 25 '15 at 01:14
  • You can split the file into folders afterwards. You can write a PHP script to do so. And let it run for half a year or so. That would help you to organize the data. You can let an internal server with a very basic PHP page that lets you access the data from the gigantic files in the individual folders, thus helping you to view quickly the data you need. Or you can make a simple program to do that. VB.net or C# are good candidates for that. – Ismael Miguel Feb 25 '15 at 10:10
3

The other answer covers the reasoning behind this: for modern systems, it's mostly about keeping load times in the Event Viewer GUI somewhat bearable. Copying the current log to a location that gets backed up, then clearing it, is also a good approach.

For parsing the large log files that end up being generated anyway, there are two good options:

1) Parse the log faster than the current GUI can manage or 2) Split the log into separate files.

I'm sure there are readily available utilities out there for 2), so I'll focus on 1).

First, PowerShell has an excellent cmdlet for this called Get-WinEvent. The fastest filtering performance I've seen involves using hash tables. Here's an example that gets all events in the Security log pertaining to a specific user from the last day:

# Events from the last 24 hours mentioning the given account name
$timeframe = (Get-Date) - (New-TimeSpan -Days 1)
$userevt = Get-WinEvent -ComputerName <specify> -FilterHashtable @{LogName='Security'; Data='<enter username here>'; StartTime=$timeframe}

$userevt is now a collection of events. Depending on the number of matches, you can pipe it through Format-List to read a small number of events easily. For a medium number, do the same but redirect the output to a file:

$userevt | Format-List > <outputfile>.txt

For a large number, start filtering (say you want the caller computer for a lockout event on the user we acquired above):

$userevt | ForEach-Object { if ($_.Message -match "Caller Computer .*") { $Matches[0] } }

This will show a single-line result for each lockout event. The above processes generally take 1-4 minutes for a 4GB log on 2008 R2.
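
If you want aggregates rather than raw lines, the same pipeline can group the results. A rough sketch, assuming 4740 (account lockout) events with the usual "Caller Computer Name" field in the message body:

# Tally lockout events per source machine (ID 4740 = account lockout)
$userevt |
  Where-Object { $_.Id -eq 4740 } |
  ForEach-Object { if ($_.Message -match 'Caller Computer Name:\s+(\S+)') { $Matches[1] } } |
  Group-Object -NoElement | Sort-Object Count -Descending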

Second, especially for any 2003 machines you might end up having to manage, you can right-click a particular log in the left pane of Event Viewer and select 'Save Log File As'.

If you are running Event Viewer on the local machine, you can save a .evt file that can later be parsed by Get-WinEvent.
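
For example (the path is a placeholder; classic .evt files need the -Oldest switch):

# Read an archived log file back in; -Oldest is required for classic .evt files
Get-WinEvent -Path 'C:\logs\security.evt' -Oldest | Select-Object TimeCreated, Id, Message -First 10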

Alternatively, you can save a text or CSV file (I find CSV easier) that can be parsed by command-line utilities such as grep or findstr, or by programs like Notepad++.
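
If you'd rather stay in one shell, PowerShell's Select-String fills the grep role; a sketch with a placeholder path and event ID:

# Pull lockout rows (4740) out of a CSV export
Select-String -Path 'C:\logs\security.csv' -Pattern '4740' | Select-Object -First 20 -ExpandProperty Line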

Bruno
1

Real-world example: we had this occur when Security logs were increased to 12 GB to allow six months of retention per a compliance requirement.

By month three we were unable to log on to our 2008 R2 and 2012 R2 servers; logons would get stuck at the "Welcome" screen. We tried increasing server memory to 20 GB to accommodate the large files being opened, and the servers were still unhappy. We ended up following ManageEngine's recommendation of 1 GB and configured the log to archive the old file when full rather than overwrite.
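
For reference, that cap and archive-when-full behavior can also be set from an elevated prompt; a sketch with our final 1 GB value (in bytes):

# /rt:true + /ab:true = archive the log when full instead of overwriting events
wevtutil sl Security /ms:1073741824 /rt:true /ab:true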

We have this script to clean up archived files older than 180 days if we need it, but we can likely just keep the files in place.

Get-ChildItem -Path "C:\Windows\System32\winevt\Logs" |
  Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-180) } |
  Remove-Item -WhatIf   # drop -WhatIf to actually delete

https://www.manageengine.com/products/active-directory-audit/help/getting-started/event-log-size-retention-settings.html

Phebs