Friday, April 20, 2007

PowerShell, writing debug output

I have been playing around with redirecting verbose and debug output. I have pursued debugging the most, as I want to log debug information somewhere other than the user interface; verbose output, on the other hand, is intended for the user interface.

My first hope was that Write-Debug would write to the debugger, so I could see the messages in Sysinternals' DebugView. But no. Write-Debug writes to the debug pipeline, and I do not think PowerShell v1 has a way of redirecting that pipeline.

Luckily, PowerShell is a .Net application, so would it be possible to tamper with the trace listeners as I have done before? The answer to this rhetorical question is yes.

So I wrote two new scripts, Set-DebugOutput and Out-Debug. Ideally, I would just have overridden the existing Write-Debug cmdlet, but I cannot figure out how to do that. If I create a function called Write-Debug, that will be called instead - as PowerShell resolves functions before cmdlets - but I also want to call the original Write-Debug cmdlet, and I have searched the Microsoft.PowerShell and System.Management.Automation namespaces for a way to do so without luck. But Out-Debug is also a 'valid' name, as it sends information outside the PowerShell environment.
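For what it is worth, the precedence rule can be demonstrated with a small sketch. Note that the snap-in-qualified call at the end is an assumption on my part - I have not verified that it works in v1:

```powershell
# Functions are resolved before cmdlets, so this function shadows the
# built-in Write-Debug for any plain 'Write-Debug' call
function Write-Debug {
    param($Message)
    # extra work could go here, e.g. logging to a file
    # calling the cmdlet by its snap-in-qualified name bypasses the function
    Microsoft.PowerShell.Utility\Write-Debug $Message
}
Write-Debug "hello"   # invokes the function above, not the cmdlet directly
```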

The two scripts are placed in my script library, which is included in the path. I set that up like this in my $profile -

$myPSConfiguration=(split-path $myinvocation.mycommand.path -parent)
$libraryDir=join-path $myPSConfiguration Library
new-item -type directory -path $libraryDir -ea silentlycontinue | out-null
$env:path+=";$libraryDir"

Explanation: figure out where $profile is placed, build the library path, create the sub folder (ignoring the error if it already exists), and add it to the search path.

Both scripts are raw and uncommented, as they are so small. Out-Debug uses a technique that lets it process both arguments and pipeline input: create a local filter - f - that does the actual work (always reading pipeline input) and then pipe both $args and $input through it. Out-Debug calls Write-Debug, so the debug output is still visible if $DebugPreference is set to Continue. Set-DebugOutput lets you specify a file which then also receives the debug output.

Set-DebugOutput.ps1 -

param($file)
[System.Diagnostics.Debug]::Listeners | ? { $_.Name -eq "Default" } | % {
    Write-Verbose "Setting log file to $file"
    $_.LogFileName = $file
}
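Assuming Set-DebugOutput.ps1 takes the log file name as its first parameter, usage could look like this (a sketch; the path is made up):

```powershell
$DebugPreference = 'Continue'
Set-DebugOutput C:\temp\debug.log -Verbose
Out-Debug "this message goes to DebugView and to the log file"
```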

Out-Debug.ps1 -
filter f{Write-Debug "$_";[System.Diagnostics.Debug]::WriteLine("$_")}
$args | f
$input | f
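The args-plus-pipeline trick generalizes. Here it is in isolation, outside the debug scenario - saved as, say, Out-Demo.ps1 (a made-up name):

```powershell
# A filter always reads pipeline input, so piping both $args and $input
# through it handles '.\Out-Demo.ps1 a b' as well as '1..2 | .\Out-Demo.ps1'
filter f { "got: $_" }
$args | f
$input | f
```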

PS. If you see Dennis, remember to tell him that he has something in his eye...


aleksandar said...

You might like my solution. :-)

# profile folder
$p = ([System.IO.FileInfo]$PROFILE).DirectoryName

# put all subfolders with scripts in my path
dir $p | where {$_.PsIsContainer -and ($_.Name -ne "functions")} | %{$env:path += ";" + $p + "\" + $_.Name}

# dot-source functions from Functions subfolder

dir ($p + "\functions") | where {!$_.PsIsContainer} | %{. ($p + "\functions\" + $_.Name)}

More details at


Per Østergaard said...

Hi aleksandar

Thank you for leaving a comment.

The $profile commands were only a small subset of my $profile, and like you, I also have a way to load functions into the shell - maybe another blog entry?

This statement is short and smart, though -
$p = ([System.IO.FileInfo]$PROFILE).DirectoryName

I purposely used the $myinvocation… setup to make the readers aware that $myinvocation can give you the directory where the executing script is placed. This is very useful in other scenarios.
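A minimal illustration (the file name is made up): save the lines below as WhereAmI.ps1 and run it from any location; it reports its own folder:

```powershell
# $MyInvocation.MyCommand.Path is the full path of the running script
$scriptDir = Split-Path $MyInvocation.MyCommand.Path -Parent
"This script lives in $scriptDir"
```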

But, as we users of PowerShell soon start to realize - most things can be done in a number of ways :)


aleksandar said...

Hi Per,

You are absolutely right. There are so many ways in PowerShell to accomplish the same thing. (I've never heard of $myinvocation before. It could be handy.) Please post about your way to load functions into the shell.