Transcript Logging – Why you should do it
I’d been writing PowerShell scripts for a couple of years before I came to athena health, and had a pretty good feel for how I liked to do things. I knew about transcript logging, but it seemed so inelegant. Most of the time it didn’t capture much anyway, because I really do believe in running clean scripts. A good script shouldn’t have any problems; it should just run without returning much feedback. Especially if it’s running as a scheduled task–no one’s there to read it anyway! And that’s the rub: if you’re not there to see what went wrong, how do you troubleshoot it? Logging, of course!
Try/Catch Logging
This is probably one of the most common ways of logging, though I’ve also seen people use the $? variable to check if there’s been an error and then react using an If/Then statement. This is an elegant way of logging, right? It captures the error, and even allows you to handle the error in the script so the whole thing doesn’t have to fail. The problem with this approach, from a logging perspective, is that your logging is only as good as your ability to predict where your script will fail–and wouldn’t you spend the most time there to make sure it doesn’t? The simple reality is that you won’t always know where something will fail, especially as you move into scripts that have multiple dependencies. If one of those dependencies fails and you didn’t properly Try/Catch it, you won’t know what happened to your script.
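For context, the Try/Catch logging pattern usually looks something like this (a generic sketch; the file paths and cmdlet choices here are illustrative, not from any particular script):

```powershell
Try {
    # Only this block is protected; a failure anywhere else in the
    # script goes completely unlogged
    Get-Content -Path "C:\Data\input.txt" -ErrorAction Stop
}
Catch {
    # The error only reaches the log if you predicted this failure point
    Add-Content -Path "C:\Logs\script.log" -Value "$(Get-Date): $($_.Exception.Message)"
}
```

Every failure point you want captured needs its own Try/Catch wrapper, which is exactly the scaling problem described below.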
What this boils down to is that your script will begin to grow–and grow–as you put more and more error catching in it to get a proper log. You’ll also spend more time troubleshooting when your script fails in an unexpected way and there’s nothing in the log. First you’ll have to back out any changes it may have made (if it’s that kind of script), run it manually to see what happened, back out the changes again, fix the code, and run it again. Rinse and repeat until clean.
A Better Way
What I described above is exactly what I’ve been doing for the past year. But where I work we run a lot of scripts. I mean a LOT. Heck, we even have plug-in-architecture-style scripts, so scripts running other scripts! I haven’t counted, but there are at least 50, and at least 60-70% of them were written by me, so that meant riding herd on a lot of different cats. And there’s nothing so frustrating as seeing that a script has failed–maybe you didn’t get a report, or your monitoring software notes an error–and having nothing to work with. The log is empty, no changes have been made, and you’re stuck there staring at a blank screen.
I had one script that was giving me some problems–you know, one of those scripts that runs fine in the shell but fails as a scheduled task; we’ve all had one–and I finally broke down and turned on a transcript. I just needed to see what the hell was going on. I don’t remember now what the problem was, but I diagnosed it in seconds once I saw the log. Quick fix, and the script has been running like a gem since. In one of those funny coincidences, my co-worker was having a similar problem, so I told him to try a transcript, and again he was able to fix the problem quickly.
One of my jobs at work is to bring our PowerShell up to a standard level, and with three of us writing code right now–and I’ll drag the rest of them in eventually!–we all have our own way of doing things. This is pretty unavoidable in my book, but by making things easier, providing snippets of code, and showing better techniques, I think I can bring the overall code quality up. This led me to transcript logging. Experience had shown me that this was the superior way of doing things, but how could I provide a code snippet that would work for any script?
There were a couple of requirements:
- The log had to have a proper name–no script.log or log.txt.
- The log had to keep multiple days of logs.
- Old log files needed to be cleaned up so I don’t fill up my drives with thousands of log files.
- It needed to capture all information (verbose), which would also allow the use of Write-Verbose without the -Verbose parameter.
- It had to be easy to implement.
That last one is huge, right?
Here’s what I came up with:
$KeepLogsFor = 15
$VerbosePreference = "Continue"
$LogPath = Split-Path $MyInvocation.MyCommand.Path
Get-ChildItem "$LogPath\*.log" | Where LastWriteTime -LT (Get-Date).AddDays(-$KeepLogsFor) | Remove-Item -Confirm:$false
$LogPathName = Join-Path -Path $LogPath -ChildPath "$($MyInvocation.MyCommand.Name)-$(Get-Date -Format 'MM-dd-yyyy').log"
Start-Transcript $LogPathName -Append
This sets the verbose preference so any verbose messaging will be shown, trims old log files, and then turns on transcript logging. Running Stop-Transcript at the end of your script will turn it off. I like to include Get-Date in my verbose logging just to get timestamps in there:
Write-Verbose "$(Get-Date): This is a verbose message with timestamp"
Now the log file meets all the criteria above, especially that last item. Because I’m using the $MyInvocation variables, the name of the log will be the name of the script with the date, and the log will append to the daily log if the script runs multiple times a day.
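As a quick illustration of how the name is built: if that snippet sits at the top of a script called, say, Update-Users.ps1 (a made-up name for the example), the pieces resolve roughly like this:

```powershell
# Inside a script named Update-Users.ps1 (hypothetical example):
$MyInvocation.MyCommand.Name      # -> Update-Users.ps1
Get-Date -Format 'MM-dd-yyyy'     # -> the current date, e.g. 07-15-2015
# Combined log name: Update-Users.ps1-07-15-2015.log, in the script's own folder
```

Any run on the same day appends to that same file, so one file equals one day of history for that script.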
There’s a side benefit here I’d like to mention. Because you’re capturing all the errors now, it really cuts down on how much you use Try/Catch. If the script doesn’t warrant a high level of error correction (a simple report, for example), then you just don’t have to do much. Just write the script and let the chips fall where they may. You don’t have to try to consider all the different places the script might have a problem; just write it out and check the log if something goes wrong.
Next Level
But who can sit back and just do that? Let’s take this to the next level and build some advanced functions that do the same work but can easily be included in your scripts by importing them with your custom PowerShell modules. You are using those, right?
Function Start-Log {
    Param (
        [int]$KeepLog = 15
    )
    $Script:VerbosePreference = "Continue"
    $LogPath = Join-Path -Path (Split-Path $Script:MyInvocation.MyCommand.Path) -ChildPath "Logs"
    If (-not (Test-Path $LogPath))
    {
        Try {
            New-Item -Path $LogPath -ItemType Directory -ErrorAction Stop | Out-Null
        }
        Catch {
            Write-Error "Unable to create log folder because ""$($Error[0])"", no logging will be available for this script"
            Return
        }
    }
    $LogPathName = Join-Path -Path $LogPath -ChildPath "$($Script:MyInvocation.MyCommand.Name)-$(Get-Date -Format 'MM-dd-yyyy').log"
    Try {
        Start-Transcript $LogPathName -Append -ErrorAction Stop
        Write-Verbose "$(Get-Date): Logging for $($Script:MyInvocation.MyCommand.Name) started"
    }
    Catch {
        Write-Error "Unable to create transcript log because ""$($Error[0])"""
        Return
    }
    If (@(Get-ChildItem "$LogPath\*.log" | Where LastWriteTime -LT (Get-Date).AddDays(-$KeepLog)).Count -gt 0)
    {
        Write-Verbose "$(Get-Date): Removing old log files:"
        Get-ChildItem "$LogPath\*.log" | Where LastWriteTime -LT (Get-Date).AddDays(-$KeepLog) | Remove-Item -Confirm:$false
    }
}

Function Stop-Log {
    Write-Verbose "$(Get-Date): Logging for $($Script:MyInvocation.MyCommand.Name) completed"
    Stop-Transcript
}
These two functions will create your logs in a sub-folder called “Logs” while maintaining all of the previous ease of use and functionality. There was a bit of a challenge here: when you execute a function, $MyInvocation.MyCommand.Path becomes $null, but that was easily solved by adding the $Script: scope to the variable, which allowed me to refer to the calling script, where that information is maintained.
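To show how little ceremony this adds to a script, here’s roughly what a calling script looks like (a sketch; it assumes the module containing Start-Log and Stop-Log has already been imported, and the Write-Verbose line stands in for your real work):

```powershell
# Assumes Start-Log / Stop-Log are available from your custom module
Start-Log -KeepLog 30

Write-Verbose "$(Get-Date): Doing the actual work here..."
# ...script body...

Stop-Log
```

Two lines of overhead, and every run gets a dated, self-cleaning transcript in a Logs sub-folder next to the script.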
Remember, a happy log is a happy scripter!
Nice one. However, why don’t you make LogPath a parameter with a default value instead of hardcoding it inside the cmdlet? I’d also split LogPathName in two, where the name of the file would be a parameter (precalculated as "$($Script:MyInvocation.MyCommand.Name)-$(Get-Date -Format 'MM-dd-yyyy').log") and the full path (the current LogPathName) would be calculated inside the cmdlet. You may never get to use it, but if one day you need it, you have the flexibility without sacrificing the ease of use of the cmdlet.
Also you may want to log the parameters that are provided to the cmdlet:
"Parameters Provided:$($PSBoundParameters.Keys | ForEach-Object { "`n{0} : {1}" -f $_, $PSBoundParameters.Item($_) })`n"