I have recently been working with the Microsoft Azure Pack and, more specifically, SMA. SMA relies heavily on PowerShell workflows, so this was a great opportunity for me to refresh my workflow knowledge. I have documented below the most common things you need to know when working with PowerShell workflows.

A PowerShell workflow is the PowerShell implementation of Windows Workflow Foundation (WF). It brings a cool set of features, such as the ability to execute code in parallel, to create scripts that persist across reboots, and lots of other neat things.

Using PowerShell workflows is a piece of cake when you already have PowerShell scripting skills. PowerShell workflows are available from PowerShell version 3.0 up to version 5.0 (or should I say 'since' version 3.0?). Have you ever written a function? Then you already know how to write a PowerShell workflow.

PowerShell workflows: the basics

The most basic workflow ever would be written like the example below:
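A minimal sketch of such an empty workflow (the workflow name is just an illustration):

```powershell
# The simplest possible workflow: same shape as a function,
# but introduced with the Workflow keyword.
Workflow Get-HelloWorld
{
    Write-Output "Hello from a workflow!"
}

# Call it like any other command:
Get-HelloWorld
```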


Other keywords will need to be used in it, but this is the basic shape of a workflow. You are probably wondering why I asked whether you have already written a function in PowerShell. Well, simply because you can leverage that same knowledge to create workflows. You saw how simple it is to create an empty workflow. On top of that, all the knowledge of advanced parameters that you have acquired during your hard years of scripting can be used directly in your workflows for advanced parameter validation.

As I said above, the power of workflows is really unleashed when you use the correct keywords. Here are a few that you should remember:

  • PsPersist
  • Parallelism
    • Parallel
    • Foreach -Parallel
  • Sequence
  • InlineScript

Also, workflows have particular scoping possibilities.

  • $Using
  • $Workflow

Let’s go through them, and see which possibilities they offer us:

Workflow Checkpoints:

One really cool thing about a workflow is that it can be "persistent". This means that it can survive things such as reboots, network disruptions, etc.

We already know that a workflow is actually a set of independent activities running one after another. It is possible to make a workflow resilient either after one specific activity, or right from the moment the workflow is started.

The two ways of doing this are the following ones:

  • PsPersist
  • Checkpoint-Workflow

Bear in mind that using a checkpoint is great because it makes your script persistent, but remember that you have signed a pact with the "duration devil": this feature makes your script persistent, yes, but only in exchange for lower performance.


PsPersist

The PsPersist term isn't actually a keyword that you use inside a workflow. It is a common parameter that you add to the call of your workflow, similar to the -Verbose parameter of a normal function or any PowerShell cmdlet.

This is the parameter you want to add to your workflow call if you know that your workflow will face reboots. It makes the workflow "persistent", allowing it to survive them.

-PSPersist is a parameter that makes your workflows resilient to reboots. It is available to any PowerShell workflow, since it is part of the common set of workflow parameters.
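A quick sketch of how this looks in practice (the workflow name is hypothetical; -PSPersist itself is a workflow common parameter, so it does not need to be declared):

```powershell
# A hypothetical long-running workflow.
Workflow Invoke-LongTask
{
    "Step 1"
    "Step 2"
}

# With -PSPersist $true, a checkpoint is taken after each activity,
# so the workflow can be resumed after a reboot or interruption.
Invoke-LongTask -PSPersist $true
```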



Checkpoint-Workflow

Checkpoint-Workflow is a quite handy activity. Indeed, it makes persisting the workflow's state very easy. Why? Simply because you can write it (almost) anywhere in the workflow, and it will take a checkpoint for you.

I personally like to put a Checkpoint-Workflow after a part that might have taken a long time to run (specifically, after an InlineScript block).

To call Checkpoint-Workflow, simply add it anywhere you would like in your script (except in an InlineScript!).
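A simplified sketch of the kind of workflow described below; the workflow name, parameter, and PostConfig.ps1 path are hypothetical placeholders:

```powershell
Workflow Invoke-ServerPostConfig
{
    param([string]$ScriptPath)

    # Heavy, long-running work happens inside an InlineScript block.
    $Result = InlineScript {
        & (Join-Path $Using:ScriptPath 'PostConfig.ps1')
    }

    # Checkpoint right after the expensive part, so a reboot or
    # interruption does not force PostConfig.ps1 to run again.
    Checkpoint-Workflow

    # The returned object can be reused later in the workflow.
    Write-Output $Result
}
```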


In the example above, I have adapted a workflow I once wrote to show where I would place the Checkpoint-Workflow. I set it there because the PostConfig.ps1 script is a heavy script that sets specific registry keys, copies files, installs BGInfo, etc. It also sends back an object with values specific to the server on which it was executed, which I can then reuse later in my workflow for deeper automation.

Keep in mind that Checkpoint-Workflow cannot be placed inside an InlineScript section.


Parallelism

(Parallelism is actually not a keyword, but a concept that encompasses two keywords.)
By default, PowerShell workflows (and scripts, functions, etc.) only work sequentially. This means that each command of a workflow (called an activity) runs one after another. PowerShell workflows bring a new dimension to this by introducing the concept of parallelism.

The parallelism functionality can be used with two different keywords:

  • Parallel
  • Foreach -parallel


Parallel

The Parallel keyword lets you specify a block of code whose activities should run in parallel.
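A minimal sketch: the activities inside the Parallel block all start together and complete in no guaranteed order (the workflow name is an illustration):

```powershell
Workflow Get-MachineInfo
{
    Parallel {
        # These three activities run at the same time,
        # not one after another.
        Get-Process -Name 'powershell'
        Get-Service -Name 'WinRM'
        Get-HotFix
    }
}
```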


Foreach -parallel

Foreach -Parallel does exactly what it says it does: it runs tasks in parallel. It works just like the good old Foreach construct that we all know by now, but it processes the elements of the collection in parallel instead of one by one.

See the example below for a more real-life example:
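A hedged sketch; the server names are hypothetical placeholders. Each iteration of the loop runs in parallel:

```powershell
Workflow Test-ServerConnectivity
{
    param([string[]]$ComputerName = @('Server01','Server02','Server03'))

    # All servers are pinged at the same time instead of one by one.
    Foreach -Parallel ($Computer in $ComputerName)
    {
        Test-Connection -ComputerName $Computer -Count 1
    }
}
```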


Unfortunately, Foreach -Parallel only works in workflows.


Sequence

The Sequence keyword allows you to run activities sequentially. This means that the activities in the block run one after another.

But wait! Aren't workflows already supposed to work sequentially? That is correct! But Sequence blocks become extremely helpful when used inside a Parallel block, because you might want one set of activities in the parallel block to respect a certain order (like the installation of prerequisites prior to the installation of an application, for instance).
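A minimal sketch of that pattern; the Install-* commands and the file paths are hypothetical placeholders:

```powershell
Workflow Install-Environment
{
    Parallel {
        # This activity runs independently of the Sequence block...
        Copy-Item -Path 'C:\Source\App.msi' -Destination 'C:\Install'

        # ...while inside the Sequence block the prerequisite is
        # always installed before the application.
        Sequence {
            Install-Prerequisite
            Install-Application
        }
    }
}
```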

Again, Sequence is only allowed in a PowerShell workflow.

Sequences should be considered as independent blocks. Avoid trying to get data back out of them.


InlineScript

Not everything works as straightforwardly in PowerShell workflows as in PowerShell functions or scripts.
Why is that?

Well, for a workflow to become a real workflow, it is converted to WF, and unfortunately not every PowerShell command can be converted to WF.

But no worries, we have an alternative: InlineScript.

Indeed, InlineScript lets you use any PowerShell command, not just the ones that can be converted to WF.

An InlineScript section is executed as one single unit, and, one very important point, an InlineScript has its own scope.

The $Workflow scope cannot be used in an InlineScript; you must use $Using instead.

$Using: –> use this to access variables that have been declared in the $Workflow scope.


Scoping

So, now that we have started to talk a little bit about scoping, let's get into the details.

Two elements should be kept in mind here:

$Using: –> use this when writing InlineScript parts. The $Using scope modifier gives you access to a variable, such as $Path in this example, that was declared previously in the workflow:

The following code will NOT work
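A minimal sketch of both variants, assuming $Path was set in the workflow body; the first InlineScript is the failing form, the second the corrected one:

```powershell
Workflow Copy-WithUsing
{
    $Path = 'C:\Temp'

    # Does NOT work: InlineScript has its own scope,
    # so $Path is empty in here.
    InlineScript {
        Write-Output "Path is: $Path"
    }

    # Works: $Using:Path reaches back into the workflow scope.
    InlineScript {
        Write-Output "Path is: $Using:Path"
    }
}
```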

$Using:Path tells the InlineScript section to get the variable's value from the enclosing workflow scope.


As described above, there are actually two different scopes:

  • $workflow

This scope is, as its name says, global to the complete workflow. Anything declared in this scope is available throughout the workflow.

  • $Using

As explained above, the InlineScript {} section allows you to use PowerShell code and cmdlets that would not be permitted in a normal section of a workflow. It is possible to use a variable that has been declared in the $Workflow scope via the keyword $Using:NameOfYourVariable.

For example, say you have a variable called $ScriptPath in your workflow, and you want to use the value of that same variable in your InlineScript section. You would do it like this:
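A simplified sketch; the workflow name and path are hypothetical:

```powershell
Workflow Invoke-ScriptFromPath
{
    $ScriptPath = 'C:\Scripts\PostConfig.ps1'

    InlineScript {
        # $Using:ScriptPath pulls the value from the workflow scope.
        Write-Output "About to run: $Using:ScriptPath"
    }
}
```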

The example is very simplified, just to make it easier to understand. Here we get access to the $ScriptPath value simply by using the keyword $Using:ScriptPath.


Workflow variables –> http://technet.microsoft.com/fr-fr/library/jj574187.aspx

PowerShell Magazine article –> http://www.powershellmagazine.com/2012/11/14/powershell-workflows/

Output information in SMA Azure pack –> http://blogs.technet.com/b/orchestrator/archive/2014/01/16/sma-capabilities-in-depth-controlling-runbook-streams-for-testing-and-troubleshooting.aspx