
I have a Windows service that runs another process which prints many lines to its STDOUT and STDERR. I'm capturing those lines and redirecting them to a log file. I would like to rotate this log file: if it gets too big, create a new file and keep the previous one. Instead of implementing this myself, I thought of using a third-party log-rotation library such as spdlog.

The issue is that I obviously can't make that external process use the logger API, and if …


1 Answer


I put together a very basic PowerShell script to point you in the right direction.

Function logrotate (
    $activefile  = "C:\activelog.text",
    $dest        = "C:\",
    $newfilename = "arotatedlog.text")
{
    # Get the active log file and convert its size to kilobytes
    $activefile_size = Get-Item $activefile |
        Select-Object Name, CreationTime, @{Name = "Kbytes"; Expression = { $_.Length / 1Kb }}

    if ($activefile_size.Kbytes -ge 150)
    {
        Write-Host "File size equal to or greater than 150 KB"
        # Build the destination path and copy the active log's content into it
        $dest = Join-Path $dest $newfilename
        Get-Content $activefile | Set-Content -Path $dest
    }
}

logrotate
  1. Gets the activelog file
  2. Converts the size of the activelog file to kilobytes
  3. Checks whether the activelog file has reached 150 KB
  4. If it has, grabs the content of the file and writes it to the new file
  5. If the file has not reached 150 KB, the script ends
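
For reference, the function can also be called with explicit arguments instead of the defaults; the paths below are only placeholders:

# Example invocation with placeholder paths instead of the built-in defaults.
logrotate -activefile "C:\logs\service.log" -dest "C:\logs\" -newfilename "service_rotated.text"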

Things to look out for

The main thing to look out for is colliding with a rotated file that already exists from a previous run. Depending on the file threshold, you may want to add an incremental number to the NEW file name, as sketched below.
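
As an illustration, here is one way to build such a name; the helper function name and the "basename.N.text" pattern are my own assumptions, not part of the script above:

# Sketch: append an incremental number so an existing rotated file is never reused.
# Get-RotatedName and the naming pattern are illustrative assumptions.
Function Get-RotatedName ($dest = "C:\", $basename = "arotatedlog")
{
    $i = 1
    do {
        # Try basename.1.text, basename.2.text, ... until a free name is found
        $candidate = Join-Path $dest "$basename.$i.text"
        $i++
    } while (Test-Path $candidate)
    return $candidate
}

Inside logrotate you could then replace the fixed name with, for example, $dest = Get-RotatedName -dest $dest.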

Hope this helps.
