These PowerShell 3.0 scripts convert archived Security (auditing) logs from .evtx format to CSV. If you run them at night, configure the advanced power settings of your laptop to prevent automatic sleep/hibernation. Note that the field names in an .evtx file differ from those you would use to query a live (working) event log. Of the four scripts I tested, 'Function Convert-Logs3' has the best combination of optimized resource use and speed.
The wrinkle in the first script is the last two lines of memory-management code, which clear all variables for the next run. On my two-year-old i5 laptop, the PowerShell process and the svchost that hosts the event log service together take 20-25% of the processor and use at least 1.5 GB of physical memory to convert and export a 100 MB log.
Function Convert-Logs1 {
    # Collect the names of the archived Security logs in the current directory
    $list = (ls Archive-Security*.evtx).Name
    foreach ($i in $list) {
        $a = Get-WinEvent -Path "$i"
        $a | Select RecordID,ID,TimeCreated,Message | Export-Csv -NoTypeInformation -Path "$i.csv"
        # Memory management: remove every variable, then force garbage collection
        foreach ($j in (ls variable:/*)) { rv -ea 0 -Verbose $j.Name }
        [System.GC]::Collect()
    }
}
The second script doesn't select specific fields or capture output in a variable. This stabilizes (decreases) memory usage but increases CPU usage, the time to finish, and the amount of information exported. Because it streams directly to the export file rather than storing results in a variable, you can cancel at any time and still keep whatever data has already been written. You can also take advantage of this script to run two conversion lists simultaneously and max out your processor. I use an array variable of file names for the first argument, like this:
[array]$filelist =
"file1",
"file2",
"file3"
Function Convert-Logs2 {
    [CmdletBinding()]
    Param(
        $filelist = $NULL
    )
    $filelist | ForEach-Object {
        # Stream every field of every record straight to CSV; nothing is held in a variable
        Get-WinEvent -Path "$PSItem" | Export-Csv -NoTypeInformation -Path "$PSItem.csv"
        [System.GC]::Collect()
    }
}
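A usage sketch, assuming Convert-Logs2 has been loaded into the session and the archives (hypothetical file names below) sit in the current directory. The background job carries the second list so both processor cores stay busy; the dot-sourced script path is also hypothetical:

```powershell
# Hypothetical archive names -- substitute your own
[array]$filelist =
    "Archive-Security-2013-01.evtx",
    "Archive-Security-2013-02.evtx"

[array]$otherList =
    "Archive-Security-2013-03.evtx",
    "Archive-Security-2013-04.evtx"

# First list runs in the foreground
Convert-Logs2 -filelist $filelist

# Second list runs simultaneously in a background job
# ($using: passes a local variable into the job; requires PowerShell 3.0)
Start-Job -ScriptBlock {
    . C:\Scripts\convert-logs.ps1     # hypothetical path to the function definitions
    Convert-Logs2 -filelist $using:otherList
}
```

Receive-Job will show the job's verbose output once it finishes.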
The third script hybridizes the two methods above, selecting specific fields while streaming directly to the export file. It is probably the most reliable of the four, with the best compromise between speed and stability.
Function Convert-Logs3 {
    [CmdletBinding()]
    Param(
        $filelist = $NULL
    )
    $filelist | ForEach-Object {
        # Select only the fields of interest and stream them straight to CSV
        Get-WinEvent -Path "$PSItem" | Select RecordID,ID,TimeCreated,Message |
            Export-Csv -NoTypeInformation -Path "$PSItem.csv"
        [System.GC]::Collect()
    }
}
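To check the speed claims on your own hardware, Measure-Command can time each function over the same file list (file name hypothetical):

```powershell
# Time the field-selecting variant against the select-everything variant
[array]$filelist = "Archive-Security-2013-01.evtx"   # hypothetical archive

$t3 = Measure-Command { Convert-Logs3 -filelist $filelist }
$t2 = Measure-Command { Convert-Logs2 -filelist $filelist }
"Convert-Logs3: $($t3.TotalSeconds) s; Convert-Logs2: $($t2.TotalSeconds) s"
```

Measure-Command returns a TimeSpan, so TotalSeconds (or TotalMinutes for the big logs) gives a direct comparison.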
This last script is the least reliable but arguably the most aggressive of the four. It does not convert to CSV; instead it uses [System.IO.StreamWriter] to write each record as a hashed entry to a file. If you exit before it finishes, '$stream.Close()' is never called, so you will need to close your session to unlock or delete the file. This one is greedy with both memory and CPU, and despite that it is unclear whether it is much faster.
Function Convert-Logs4 {
    [CmdletBinding()]
    Param(
        $filelist = $NULL
    )
    $filelist | ForEach-Object {
        # The cast to StreamWriter accepts a path string and opens the file
        $stream = [System.IO.StreamWriter] "$pwd\$PSItem.csv"
        Get-WinEvent -Path "$PSItem" | Select RecordID,ID,TimeCreated,Message |
            ForEach-Object { $stream.WriteLine($PSItem) }
        # The stream must be closed or the file stays locked by the session
        $stream.Close()
    }
}
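One way to blunt the file-locking problem described above is to wrap the stream in try/finally, so Close() runs even if you break out of the pipeline early. A minimal sketch of that variation (the function name Convert-Logs4Safe is mine, not part of the original scripts):

```powershell
Function Convert-Logs4Safe {
    [CmdletBinding()]
    Param(
        $filelist = $NULL
    )
    $filelist | ForEach-Object {
        $stream = [System.IO.StreamWriter] "$pwd\$PSItem.csv"
        try {
            Get-WinEvent -Path "$PSItem" | Select RecordID,ID,TimeCreated,Message |
                ForEach-Object { $stream.WriteLine($PSItem) }
        }
        finally {
            # The finally block also runs on Ctrl-C, so the file is not left locked
            $stream.Close()
        }
    }
}
```

The trade-off is a little more ceremony for each file, but an interrupted run no longer forces you to close the session to free the output file.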