Once a PowerShell pipeline is running, it runs to completion. There is no built-in way of cancelling it, which is bad news. Or is there a way after all?
Runaway Pipeline
Just imagine you are retrieving tons of data to find one piece of information. Once you have found it, you do not need to retrieve the rest. Since you cannot cancel the pipeline, you are stuck and have to wait until the initial pipeline command finishes.
There is a cmdlet that seems to provide an answer: Select-Object. This cmdlet has a parameter called -First, so you could try and output only the first n elements like this:
1..10 | Select-Object -First 5
1
2
3
4
5
As it turns out, though, Select-Object simply passes on the first 5 elements and discards the rest, but it does not stop the pipeline:
1..10 | % { Write-Host "PROCESSING $_" -ForegroundColor Red; $_ } |
  Select-Object -First 5 | % { "Outputting $_" }
PROCESSING 1
Outputting 1
PROCESSING 2
Outputting 2
PROCESSING 3
Outputting 3
PROCESSING 4
Outputting 4
PROCESSING 5
Outputting 5
PROCESSING 6
PROCESSING 7
PROCESSING 8
PROCESSING 9
PROCESSING 10
Or, in a more practical example, the next command will continue to run even though you have already received the information you were after, causing a delay and unnecessary system load:
Get-EventLog Application | Select-Object -First 5

Index Time          EntryType   Source                 InstanceID Message
----- ----          ---------   ------                 ---------- -------
53391 Jan 16 02:53  Error       TracerX - PowerSh...         1001 ...
53390 Jan 16 00:04  Information VSS                          8224 ...
53389 Jan 15 23:51  Information VSS                          8224 ...
53388 Jan 15 22:39  Information Outlook                1073741856 ...
53387 Jan 15 22:38  Information Outlook                1073741856 ...
A Surprising Answer: Yes, you can!
With a little trick, you can cancel the pipeline and stop any cmdlet upstream immediately. You need to use continue.
The next line "simulates" Select-Object -First by using a Foreach-Object (alias: %) statement. It counts up, and once it has received more than 5 elements, it calls continue. There is another Foreach-Object clause right before it that shows you the objects traversing the pipeline. As you will see, this approach not only filters the output but also cancels the upstream cmdlets.
1..10 | % { Write-Host "PROCESSING $_" -ForegroundColor Red; $_ } |
  % { $i=0 } {$i++; if ($i -gt 5) { continue }; "Outputting $_" }
PROCESSING 1
Outputting 1
PROCESSING 2
Outputting 2
PROCESSING 3
Outputting 3
PROCESSING 4
Outputting 4
PROCESSING 5
Outputting 5
PROCESSING 6
You may be wondering why you need to call continue in order to not continue. That doesn't exactly sound right. Continue was originally designed to skip the remainder of a loop iteration or to continue with the next statement after an exception was trapped, so its name does make sense. The only problem is that in the previous example, continue stops everything, so if you use it inside a script, the entire script gets cancelled. Not good.
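To see the problem in isolation, here is a minimal sketch using the same counting Foreach-Object as above. Run these three lines as a script: once continue fires, there is no enclosing loop, so the whole script is cancelled and the final line never executes.
'Before the pipeline'
1..10 | % { $i=0 } {$i++; if ($i -gt 5) { continue }; "Outputting $_" }
'After the pipeline'   # never reached: continue found no enclosing loop and cancelled the script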
This is because we called continue directly, not from within a loop or an error handler, so continue really continues with the next statement outside the loop or trap, and since there is none, everything stops. The real problem is that PowerShell does not identify a pipeline as the loop it really is. If it did, we would be done already. Instead, we need to work around it by wrapping the pipeline inside a dummy loop like this:
do {
  1..20 | % { Write-Host "Processing $_" -ForegroundColor Red; $_ } |
    % { $i=0 } {$i++; if ($i -gt 5) { continue }; "Outputting $_" }
} while ($false)
Processing 1
Outputting 1
Processing 2
Outputting 2
Processing 3
Outputting 3
Processing 4
Outputting 4
Processing 5
Outputting 5
Processing 6
Here again is a more practical example. It outputs events from your Application log until it hits one with an InstanceID greater than 10000. Once that happens, it cancels the entire upstream pipeline and continues with the next command:
do {
  Get-EventLog Application |
    % {
      $_
      if ($_.InstanceID -gt 10000) { continue }
    }
} while ($false)
Index Time          EntryType   Source                 InstanceID
----- ----          ---------   ------                 ----------
53391 Jan 16 02:53  Error       TracerX - PowerSh...         1001
53390 Jan 16 00:04  Information VSS                          8224
53389 Jan 15 23:51  Information VSS                          8224
53388 Jan 15 22:39  Information Outlook                1073741856
Wrapping the pipeline inside a dummy loop looks ugly but is key to using continue not just in regular loops but also to immediately cancel the pipeline.
Beware of the Spirits you called…
This is a hack. It works beautifully if done right, but it also poses some risk. Here is why.
When you call continue, all upstream cmdlets (the ones before the continue) somehow need to notice that continue has been called and respond to it. In other words, continue needs to actively communicate to all previous cmdlets in the pipeline that they need to stop because the pipeline is being shut down. This is necessary to shut down all upstream cmdlets gracefully. You do not want to just pull the plug on them.
Most cmdlets handle this just fine. Sometimes, though, this red alert gets lost, so upstream cmdlets continue to run and are surprised when the pipeline is pulled away from under their feet. When cmdlets get surprised, they throw an exception (a red wall of text full of cryptic technical details). Fortunately, that is the worst that can happen. There is no chance of blowing anything up.
The message is sent to upstream cmdlets by throwing a PipelineStoppedException. Upstream cmdlets receive this exception, and if they are implemented correctly, they will do two things:
- abort whatever they were doing
- pass on the exception so the next cmdlet gets a chance to see it and act accordingly
The problem can occur when a cmdlet fails to do either one.
- If it does not abort, it might throw an exception because the pipeline is gone, with all the side effects arising from that.
- If it does not pass on the exception, the cmdlet before it might throw an exception for the very same reason: the pipeline is gone.
How could a cmdlet fail to pass on the exception? Easy: if its author implemented a generic trap that catches all exceptions and chose continue at the end of that trap, the exception is then "handled" and no longer bubbles up to the other cmdlets in the pipeline. You can illustrate this quite easily:
do {
  Get-EventLog Application |
    & {
      process {
        trap { Write-Host -fore Red "I ate the exception so the upstream cmdlet continues: $_"; continue }
        $_
      }
    } |
    % {
      $_
      if ($_.InstanceID -gt 10000) { continue }
    }
} while ($false)
When you run this, the scriptblock following Get-EventLog has its own trap. When you call continue downstream, this trap catches the PipelineStoppedException and handles it. This way, Get-EventLog never gets to see it and happily continues to spit out results, causing additional exceptions for every result that can no longer be handled.
To correct this, the trap would have to call break instead of continue to signal back the exception it received.
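For illustration, here is the same pipeline with the trap corrected (a sketch following the example above; the message text is just illustrative). Calling break re-throws the trapped exception, so the PipelineStoppedException travels on to Get-EventLog instead of being swallowed:
do {
  Get-EventLog Application |
    & {
      process {
        # break re-throws the trapped exception instead of swallowing it,
        # so Get-EventLog gets to see the PipelineStoppedException and shuts down
        trap { Write-Host -fore Red "Passing the exception on: $_"; break }
        $_
      }
    } |
    % {
      $_
      if ($_.InstanceID -gt 10000) { continue }
    }
} while ($false)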
Creating your own Stop-Pipeline command
Now, you may argue this concept creates spaghetti code and makes the pipeline look ugly, and you are right. When you use continue, having to wrap the whole enchilada in a loop is not exactly aesthetic. It also violates the modular philosophy of PowerShell and instead uses the old and ugly "onion" concept. As it turns out, you do not need the wrapping loop. Read on.
As you have seen, upstream cmdlets are cancelled by sending them a PipelineStoppedException. In the previous examples, I did this indirectly using continue, and since continue can only live inside a loop, I had to place a loop around the pipeline. Why not throw a PipelineStoppedException myself and skip continue in the first place? As it turns out, this works just as well. So in order to retrieve Application log events only until I find one with an InstanceID greater than 10000, I could do this:
filter Stop-Pipeline([scriptblock]$condition = {$true}) {
  $_
  if (& $condition) {
    Throw (New-Object System.Management.Automation.PipelineStoppedException)
  }
}

Get-EventLog Application | Stop-Pipeline { $_.InstanceID -gt 10000 }
I use a filter called Stop-Pipeline which takes a condition. When the condition is met, the filter raises the PipelineStoppedException directly. It gets passed on to the upstream cmdlets and cancels them. No need for continue, no need for a wrapper loop, and a very straightforward pipeline design.
Since this kind of exception is a terminating error, it will cancel the entire script, so you should wrap the pipeline in a try/catch block like this:
try {
  Get-EventLog Application | Stop-Pipeline { $_.InstanceID -gt 10000 }
} catch {}
'Done'
Unfortunately, when the pipeline is cancelled this way, it also cancels an important last task a pipeline usually performs: if you want to store the result in a variable, the assignment no longer happens. Take a look:
try {
  $result = Get-EventLog Application | Stop-Pipeline { $_.InstanceID -gt 10000 }
} catch {}
$result
The answer is: no result at all. $result stays empty. Bummer.
Going back to the continue approach, it turns out this cancels the pipeline more gracefully after all, so the final solution looks like this:
filter Stop-Pipeline([scriptblock]$condition = {$true}) {
  $_
  if (& $condition) { continue }
}

$result = "No result"
$result = do {
  Get-EventLog Application | Stop-Pipeline { $_.InstanceID -gt 10000 }
} while ($false)
$result
Conclusion
Don't be overwhelmed by this. Being able to stop a pipeline can be a wonderful trick. Just make sure you "know" the cmdlets involved.
As long as they honor and pass on the PipelineStoppedException, you are fine. So in your cozy PS shop at your workbench, test the pipeline you designed and make sure no exceptions are raised by upstream parts of your pipeline, so that you can safely assume they know how to pass on the exception correctly. Once you know that, you also know that your particular pipeline is safe for this kind of technique. Thanks to Jaykul (fellow PS MVP) for pointing me to the PipelineStoppedException side effects.
It really boils down to this simple filter:
filter Stop-Pipeline([scriptblock]$condition = {$true}) {
  $_
  if (& $condition) { continue }
}
Make sure you embed your pipeline in a do/while block if it runs as part of a script, so that only that particular pipeline is cancelled:
do {
  Get-EventLog Application | Stop-Pipeline { $_.InstanceID -gt 10000 }
} while ($false)
Notice that Stop-Pipeline assigns a default script block to $condition, so if you call Stop-Pipeline without an argument, it will pass the first object through the pipeline and then cancel it:
Get-Process | Stop-Pipeline
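If you use this inside a script, remember the dummy loop. The following usage sketch assumes the continue-based Stop-Pipeline filter from above has been defined; it retrieves just the first process and keeps it in a variable:
$first = do {
  # only the first process makes it through before Stop-Pipeline cancels the pipeline
  Get-Process | Stop-Pipeline
} while ($false)
$first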
Maybe we will see this functionality as a built-in cmdlet in a future release of PowerShell. Meanwhile, simply copy & paste it into your scripts whenever you need it.
Have fun! Hey, and watch out for PowerShellPlus 3.1! Did you try the beta yet?
And one more thing: if you happen to be located in Germany, Switzerland or Austria, rent me!
I do in-house trainings, tutorials and projects. Simply drop me a mail at tobias@powershell.com, and maybe we will meet in person some time soon! I have tons of tricks for you and do trainings for midsize and large enterprises on a regular basis; it's always a lot of fun!
Cheerio
Tobias
PowerShellPlus-Editor Architect