vRO F5 Plugin – Pools Not In Common Partition Not Shown

Introduction

I’ve been working with the vRO F5 plugin recently and found an interesting behaviour that I would like to share. In this blog post, I will be discussing:

  1. Environment
  2. Task
  3. Interesting Behaviour
  4. Wrap-Up

Environment

The following is the list of products with versions I used:

  1. vRealize Orchestrator version 5.5
  2. F5 version 11.5.2
  3. F5 vRO plugin version 2.0.1

Task

The task I tried to achieve with the F5 vRO plugin is to:

  1. Attach LTM server from vRO
  2. Create a pool under a partition called “Production”
  3. Add a member under the pool created
  4. Delete the pool

The list of built-in workflows is attached below:

1

First of all, I tried to attach the LTM server by running the “Attach LTM” workflow:

1

F5 plugin version 1.x.x uses the SOAP API and, from 2.0.0, support for the REST API was introduced. As I wanted to stick with the SOAP API, I set the REST API flag to false and clicked Submit.

As per the task above, the second step was to create a pool under the Production partition. Running the “Create Pool” workflow, the required inputs were:

  1. LTM Instance
  2. Name for pool
  3. Method

As shown below, there was no input to specify which partition the pool should be created in:

2

I wanted to check which partition the pool would go into, so I executed the workflow against the attached LTM server, giving it the name “StevenTest”.

Looking at the Logs tab:

3

And the F5 Pools List page:

4

As expected, it was created under the Common partition. How could I add a pool under the Production partition? One thing that came to mind was: what would happen if I gave the full path, i.e. /Production/StevenTest, as the name? This time, for the name, I used /Production/StevenTest1:

5

Voilà, it worked! The pool was created under the Production partition:

6
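
For reference, the workaround is purely a naming convention: the pool name is the full folder-style path. Below is a minimal sketch of how the name could be built in a scriptable task before calling the “Create Pool” workflow (the partition and baseName variables are illustration-only assumptions, not inputs exposed by the plugin):

// Hypothetical example: build a fully-qualified F5 pool name
var partition = "Production";
var baseName = "StevenTest1";
var poolName = "/" + partition + "/" + baseName;    // "/Production/StevenTest1"
System.log("Pool name passed to Create Pool: " + poolName);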

On the pool I created, I executed the “Add Pool Member” workflow to add a member to it. One of the inputs was LTM Pool. I clicked on it and found that I couldn’t see the /Production/StevenTest1 pool:

Screenshot 2015-04-20 17.46.35

7

So by the look of it, only pools in the Common partition could be found. Why is this the case?

Interesting Behaviour

I looked at API calls, blog posts from F5, literally everything I could find, but had no luck fixing this issue. Then I wondered: would this still be the case if I used the REST API instead of the SOAP API? I gave it a go, detached the LTM server and attached it again with the REST API enabled:

8

9

Once done, I navigated to the Inventory tab, expanded F5 Networks and, guess what, the pool created under the Production partition was there:

10

I ran the “Add Pool Member” workflow again, using port 443:

13

14

Checking the member from the Members tab in the F5 interface, it was successfully created under the Production path.

15

The final step was to delete the pool. I ran the “Delete a pool” workflow, selecting the /Production/StevenTest1 pool:

4

Looking at the pools list in the F5 interface, it was gone:

18

Wrap-Up

I’m wondering whether the behaviour I found is a bug. I will be sending an email to the F5 vRO plugin team to confirm whether that is the case; otherwise, I will try to find a solution or workaround. I will update this post once I get an answer from them.

Hope this was helpful, and feel free to leave a message if you need any clarification 😀

vRO Deepdive Series Part 2 – Action & Workflow

Introduction

This is the second post in the vRO deep dive series, where I will be discussing actions and workflows. I recommend reading the first post before going through this one.

I will start with actions and then move on to workflows.

Action

What is an action? A simple way of describing it: think of it as a scriptable task that you can load anywhere and that is saved in a module (a module is the equivalent of a folder). Whether to use a scriptable task or an action is largely personal preference, but if it’s a task that could be used in multiple workflows, i.e. a repeatable task, it is easier to save it as an action and call it from those workflows.

Let’s do an exercise to see what an action is. I will be creating an action and reusing the previous post’s example. Navigate to the Design section and go to the Actions tab:

Screenshot 2015-03-24 12.55.03

Screenshot 2015-03-24 12.55.18

Right-click on the root folder and create a module. You can call it anything; in my case, I named it org.company.test:

Screenshot 2015-03-24 12.57.02

Right-click on the created module, click Add Action and name it sampleAction:

Screenshot 2015-03-24 13.00.48

Screenshot 2015-03-24 13.01.02

Right-click on the created action, click Edit and navigate to the Scripting tab:

Screenshot 2015-03-24 13.01.13

You will then see a scripting area similar to a scriptable task’s. At the top of the scripting area, you can define multiple inputs but only one output. This is the main difference between a scriptable task and an action: an action can define only one output.

Let’s continue: click Add Parameter and create two input parameters as below:

  • VM with VC:VirtualMachine Type
  • Cluster with VC:ClusterComputeResource Type

Screenshot 2015-03-24 13.01.37

Screenshot 2015-03-24 13.02.50

In the scripting area below, copy and paste the following:

if (VM && Cluster) {
    System.log("Please specify either VM or Cluster");
} else if (Cluster) {
    System.log(Cluster.datastore);
} else if (VM){
    System.log(VM.datastore);
}

Screenshot 2015-03-24 13.02.56

Click Save and Close and you will see sampleAction saved in the module you created:

Screenshot 2015-03-24 13.07.53

Let’s replace the scriptable task in the previous Sample Workflow with the action we just created. Delete everything within the Sample Workflow built last time and drag & drop the new action between start and end:

Screenshot 2015-03-24 13.08.40

Click Edit on the workflow to finish off the Visual Binding:

Screenshot 2015-03-24 13.08.59

One thing to note is that the Scripting tab won’t show you the full details; it contains a pre-generated command that simply loads the action:

Screenshot 2015-03-24 13.08.48

Let’s run the workflow! Same as last time, I will cover 2 scenarios:

  • VM or Cluster only
  • VM and Cluster

Defining VM only:

 Screenshot 2015-03-24 13.09.35
Screenshot 2015-03-24 13.10.40

Giving values to both VM and Cluster:

Screenshot 2015-03-24 13.10.53

In this exercise, we’ve replaced a scriptable task with an action. Instead of using the action element itself as above, you can also call the action from a scriptable task. Let’s try it. Edit the Sample Workflow, delete the action and put a scriptable task in between instead:

Screenshot 2015-03-24 13.14.47

Screenshot 2015-03-12 11.33.40

Edit the scriptable task and complete the Visual Binding:

Screenshot 2015-03-24 13.16.51

Type the following on the Scripting tab:

System.getModule("org.company.test").sampleAction(VM,Cluster);

Save and Close, then run it. Same as above, I only defined VM for the input:

Screenshot 2015-03-24 13.18.45

This exercise was to show that an action can be called from a scriptable task. Some of you might ask, “why would you load an action from a scriptable task, why not just use the action directly?” Let me explain.

Go back to the action created under the org.company.test module, edit it and navigate to the Scripting tab:

Screenshot 2015-03-24 13.21.59

Take a close look at “Return type: void”. What does it mean? As mentioned earlier, an action can return only one output, and this is where you set the output’s type. Click on void and select VC:Datastore as the type:

Screenshot 2015-03-24 13.22.11

For the script below, change it to the following:

if (VM && Cluster) {
    System.log("Please specify either VM or Cluster");
} else if (Cluster) {
    return Cluster.datastore;
} else if (VM){
    return VM.datastore;
}

Screenshot 2015-04-02 13.25.42

From now on, this action will return a datastore object, which can be saved to an attribute within a workflow.

Save and Close the action and go back to the Sample Workflow. Edit the scriptable task and drag & drop the Datastore attribute into it:

Screenshot 2015-03-24 13.21.05

As the action returns a VC:Datastore object, you can simply do the following to save the result from the action into a variable:

Datastore = System.getModule("org.company.test").sampleAction(VM,Cluster);
System.log(Datastore);

Running the workflow:

Screenshot 2015-03-24 13.23.23

In summary, I have gone through:

  1. Using an action within a workflow
  2. Calling an action from a scriptable task
  3. Saving the output from an action in a scriptable task and mapping it to an attribute

I will now continue with workflows.

Workflow

In the previous post, I demonstrated workflows using scriptable tasks only. Moving forward, I will be using several of the different elements that can be used within workflows.

In this section, even though there are lots of other components you could use, I will be concentrating on the following only:

  • Decision
  • Custom Decision
  • User Interaction
  • Sleep
  • Workflow Note

Before wrapping up, I will be providing an exercise to utilise all elements above.

Decision

A decision is normally used to decide whether to proceed with or terminate the workflow. Once again, I will be reusing the previous post’s example. What I will show you is a workflow that outputs datastore information only if there are datastores mapped to a cluster or a VM. Let’s start!

Go back to Sample Workflow and create one more attribute called DatastoreArrayLength with Type number:

Screenshot 2015-03-27 09.56.11

The DatastoreArrayLength attribute will hold the length of the Datastore array to verify whether it has any values, i.e. 0 means the array is empty; otherwise, it has values.

In the scriptable task, drag & drop DatastoreArrayLength into Output:

Screenshot 2015-03-27 09.57.12

Navigate to Scripting and type the following in:

Screenshot 2015-03-27 09.57.30
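
The exact script lives in the screenshot above, but based on the description and where the Sample Workflow was left earlier (the scriptable task calling sampleAction and saving the result to Datastore), it boils down to something like this sketch, assuming DatastoreArrayLength is bound as an output:

Datastore = System.getModule("org.company.test").sampleAction(VM, Cluster);
System.log(Datastore);

// 0 means no datastores were found
DatastoreArrayLength = Datastore ? Datastore.length : 0;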

Save and Close the scriptable task and drag & drop a Decision element between the scriptable task and the end:

Screenshot 2015-03-27 09.57.38

You will then see two arrows:

  • The green arrow representing success
    • Same as “Return true”
  • The red arrow representing failure
    • Same as “Return false”

Based on the decision you define, i.e. the length of the Datastore array, the workflow follows either the success or the failure branch.

Edit the decision, navigate to Decision tab, click on Not set and select DatastoreArrayLength:

Screenshot 2015-03-27 09.58.05

Click on the equals and change it to greater:

Screenshot 2015-03-27 09.58.15

As mentioned earlier, the workflow will only continue if the Datastore array isn’t empty. Therefore, type 0.

Screenshot 2015-03-27 09.58.23

Once the decision is made, it would be good to print out whether the process was successful or not. Drag & drop two scriptable tasks like the following:

Screenshot 2015-03-27 09.58.57

For the success scriptable task, navigate to Visual Binding tab and drag & drop Datastore attribute to input:

Screenshot 2015-03-27 09.59.06

Go to the Scripting tab and type the following in to output the Datastore value:

Screenshot 2015-03-27 09.59.25
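
In case the screenshot is hard to read, the script amounts to a single log call (a sketch, assuming the Datastore attribute is bound as an input):

System.log(Datastore);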

Save and Close, then edit the scriptable task on the failure branch. This time, you won’t have to do any Visual Binding as no datastore was found. You only need to write the following in the Scripting tab:

Screenshot 2015-03-27 09.59.45
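
Again, a sketch of what that script amounts to (the exact message lives in the screenshot, so treat this wording as an assumption):

System.log("No datastore found");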

Save and Close; time to run the workflow. I’ve chosen a cluster with one datastore attached:

Screenshot 2015-03-27 10.00.47

Choosing a cluster with no ESXi servers, i.e. no datastores:

Screenshot 2015-03-27 10.00.57

Hopefully this example helped you understand the Decision element; I will now move on to Custom Decision.

Custom Decision

A custom decision is exactly the same as the decision above, but it provides the ability to write a script, making it a much richer decision tool. I will repeat the above example, replacing the decision with a custom decision element.

Go back to Sample Workflow and delete the DatastoreArrayLength attribute:

Screenshot 2015-03-27 10.02.42

Also delete the Decision created:

Screenshot 2015-03-27 10.03.06

When you delete it, it will ask you which branches you want to delete. In this case, as we are planning to replace it with a custom decision, select Both branches:

Screenshot 2015-03-27 10.03.12

Edit the scriptable task and delete the output from it:

Screenshot 2015-03-27 10.03.26

Also, remove the DatastoreArrayLength line:

Screenshot 2015-03-27 10.03.35

Everything is back to the original scriptable task. Let’s try the custom decision: drag & drop a custom decision between the scriptable task and the end:

Screenshot 2015-03-27 10.03.47

Edit the custom decision and navigate to IN tab:

Screenshot 2015-03-27 10.13.44

Click on “Bind to workflow parameter/attribute” and select Datastore:

Screenshot 2015-03-27 10.14.06

Navigate to the Scripting tab and type the following in. To explain the screenshot below: a custom decision’s output is always a boolean, either true or false. Hence, if the length of the Datastore array is greater than 0, return true, as it means the workflow should take the success branch; else, return false:

Screenshot 2015-03-27 10.14.59
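
A sketch of the logic described above (assuming Datastore is the bound input):

if (Datastore.length > 0) {
    return true;
} else {
    return false;
}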

Same as in the decision exercise, create two scriptable tasks and configure them:

Screenshot 2015-03-27 10.15.45

Save and Close, then run the workflow against the cluster with a datastore mapped:

Screenshot 2015-03-27 10.16.03

And the result with a cluster with no datastore:

Screenshot 2015-03-27 10.16.14

Decision and custom decision provide the same functionality, but a custom decision is more flexible. For example, you could remove the two scriptable tasks created to print the output and modify the script within the custom decision to the following:

Screenshot 2015-04-02 15.00.53
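
A sketch of what that combined script could look like, assuming the same Datastore input binding (the exact log messages are an assumption):

if (Datastore.length > 0) {
    System.log(Datastore);
    return true;
} else {
    System.log("No datastore found");
    return false;
}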

Hope this was useful; time to look at User Interaction.

User Interaction

User Interaction is used when you want users to specify inputs while the workflow is running. I will be using the above example again, but will change it slightly:

  1. Specify either Cluster or VM
  2. The user chooses one datastore from the list of datastores mapped to either the Cluster or the VM
  3. Show the details of the chosen datastore

Let us begin!

Edit the Sample Workflow and create an attribute FinalDatastore as the following:

Screenshot 2015-03-27 10.23.42

This attribute will be used to save the datastore selected by the user. Go back to the Schema, delete the scriptable task on the success branch and drag & drop a User Interaction element:

Screenshot 2015-03-27 10.24.02

 

Screenshot 2015-03-27 10.24.12

Edit User Interaction, navigate to External inputs and click “Bind to workflow parameter/attribute”:

Screenshot 2015-03-27 10.24.34

Select FinalDatastore attribute defined:

Screenshot 2015-03-27 10.24.46

What we’ve just done means that when a user selects a datastore, its value will be saved to the FinalDatastore attribute. The remaining work is to present the datastore(s) found to the user. This requires some work on the Presentation layer, which will be discussed in depth in the next post. For now, follow the instructions below.

Go to Presentation tab, select FinalDatastore and click “Add property”:

Screenshot 2015-03-27 10.24.58

Select Predefined list of elements:

Screenshot 2015-03-27 10.25.07

Click edit “pencil” button:

Screenshot 2015-03-27 10.25.13

Select Datastore as the linked parameter and Accept:

Screenshot 2015-03-27 10.25.19

Saving the Sample Workflow will raise validation issues saying two attributes are not set yet. Go back to the User Interaction element and set the following two parameters to NULL:

Screenshot 2015-03-27 10.25.41

Screenshot 2015-03-27 10.25.49

I will explain these separately in the near future, so for now, set them to NULL!

Go back to the Schema and drag & drop a scriptable task between the User Interaction element and the end. We want to output the datastore the user selected. Edit the scriptable task, do the Visual Binding and type the following script in:

Screenshot 2015-03-27 10.26.50

Screenshot 2015-03-27 10.27.05
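
A sketch of that script, assuming FinalDatastore is bound as an input to the scriptable task:

System.log(FinalDatastore);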

Save and Close; the time has come, let’s execute the workflow! This time, I’ve chosen a cluster with two datastores, and when the process reaches the User Interaction element, it will prompt you with a screen like the following:

Screenshot 2015-03-27 10.28.43

Click on Not Set and you will see 2 datastores:

Screenshot 2015-03-27 10.28.55

I’ve selected datastore11 and it printed out the details of datastore11:

Screenshot 2015-03-27 10.29.05

Running the workflow selecting a cluster with no datastores:

Screenshot 2015-03-27 10.29.28

This is how you would use the User Interaction element; moving on to Sleep.

Sleep

Sleep is very simple: the workflow literally sleeps for x seconds before moving on to the next process. I will quickly show you how to use this element.
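
(As an aside, the scripting equivalent inside a scriptable task is System.sleep(), which takes milliseconds, for example:)

// Pause the current script for 10 seconds
System.sleep(10000);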

Edit Sample Workflow, define an attribute called SleepTimer with Type number and give it a value of 10:

Screenshot 2015-03-27 14.14.41

Drag & drop Sleep between custom decision and User Interaction:

Screenshot 2015-03-27 14.14.51

Edit the Sleep element and complete the Visual Binding like the following:

Screenshot 2015-03-27 14.14.58

 

Running the workflow, it will sleep for 10 seconds before asking the user to select a datastore:

Screenshot 2015-03-27 14.15.45

Screenshot 2015-03-27 14.15.57

Hope this was straightforward; I will continue with the last element, workflow notes.

Workflow Notes

A workflow note is used to visually annotate the process within a workflow. Let’s take a look at the example below; it’s a workflow I wrote to ensure a VM provisioned from vRA is added to the proper DRS group in a vMSC cluster, making sure uniform access is achieved. From start to end, I highlighted in yellow the section “Find cluster, convert vCAC:VM to VC:VirtualMachine and find/move to DRS group” and highlighted in red the section that outputs error logs and throws exceptions. This way, it becomes much easier to debug the process or hand this workflow over to others:

Screenshot 2015-04-02 15.32.40

You can change its colour to something else: select the workflow note, right-click and Edit:

Screenshot 2015-04-02 15.33.37

Choose a colour and click OK:

Screenshot 2015-04-02 15.33.47

Exercise

The scenario below is an example I came up with for this section. Use it for learning!

Using all elements discussed above, let’s develop a workflow with the sample scenario given below:

Scenario: As part of Windows Server VM decommissioning process, Windows administrator requested VMware team to develop a workflow that:

  1. Un-join the Windows Server from the domain
    • PowerShell command provided: “cmd /c netdom remove /d:ssbkang.com SERVER_NAME /ud:ssbkang\ACCOUNT /pd:PASSWORD”
    • Service account provided
  2. Reboot the VM to apply the change above
  3. If the server was successfully un-joined from the domain, power off and rename the VM to “Name-Decommission”

Looking at the request from the Windows administrator, you will have to:

  1. Run the command provided within the Windows OS with PowerShell
  2. Reboot the VM
  3. If the change is successfully made:
    1. Power-Off the VM
    2. Rename the VM
  4. Else, notify the workflow runner

Let’s take a look at the built-in workflows that you could use for this work. In the top right-hand corner of the vRO client, search for the term “Guest”:

Screenshot 2015-04-02 14.19.37

You will then see a workflow called “Run program in guest”. Click Go to selection and close the window. Reading the description, it looks like you will be able to run a PowerShell command with this workflow:

Screenshot 2015-04-02 14.21.09

Looking at the inputs to this workflow, your decommissioning workflow will require the following inputs:

  • Local Administrator / Password
  • Name of the VM

And for the input attributes of this workflow:

  • interactiveSession
    • Set to false as interaction is not required
  • programPath
    • The program path will be PowerShell, i.e. “C:\Windows\SysWOW64\WindowsPowershell\v1.0\powershell.exe”
  • arguments
    • This is the PowerShell command provided, i.e. “cmd /c netdom remove /d:ssbkang.com SERVER_NAME /ud:ssbkang\ACCOUNT /pd:PASSWORD”
  • workingDirectory
    • Set to Null as it isn’t required for this work
  • environment
    • Set to Null as it isn’t required for this work

Screenshot 2015-04-02 14.22.01

The next bit is rebooting the VM gracefully. Search for “reboot” and you will find the following workflow, called “Reboot guest OS”, with one input:

Screenshot 2015-04-02 15.52.57

Screenshot 2015-04-02 15.56.05

Screenshot 2015-04-02 15.53.06

Similar to above, you will be able to find “Shut down guest OS and wait” and “Rename virtual machine”:

Screenshot 2015-04-02 15.55.45

Screenshot 2015-04-02 15.55.50

Screenshot 2015-04-02 15.56.14

Screenshot 2015-04-02 15.56.19

Assembling all the information above, we’re ready to start building a workflow. First of all, create a workflow, call it something like “Decommission Windows Server”, and create the required attributes and inputs as per below:

Attributes:

Screenshot 2015-04-09 09.04.49

Inputs:

Screenshot 2015-04-02 16.08.04

Everything is prepared; let’s start working on the Schema. Putting together the above workflows, attributes and inputs, the schema I came up with looks like the following:

Screenshot 2015-04-02 16.12.23

The first scriptable task builds the argument string, which is saved to an output attribute and then passed to the “Run program in guest” workflow:

// Build the netdom command; the log line masks the service account password
attrArgument = "cmd /c netdom remove /d:ssbkang.com " + VM.name + " /ud:" + attrWindowsServiceAccount + " /pd:" + attrWindowsServiceAccountPassword;

System.log("Argument: " + "cmd /c netdom remove /d:ssbkang.com " + VM.name + " /ud:" + attrWindowsServiceAccount + " /pd:" + "**********");

Screenshot 2015-04-09 08.37.15

And the Visual Binding:

Screenshot 2015-04-09 08.37.08

 

With the argument defined above, it runs “Run program in guest”:

Screenshot 2015-04-09 08.37.25

Once the command has run within the guest OS, I sleep for 5 seconds before restarting the VM.

After 5 seconds, the guest OS is rebooted. The Visual Binding will be:

Screenshot 2015-04-09 08.37.35

Once the “Reboot guest OS” workflow has executed, a scriptable task checks whether the server was un-joined from the domain; the script is attached below:

Screenshot 2015-04-09 08.37.53

var state = "on";
var expression = new RegExp("SSBKANG.COM", "i");

System.log("Current hostname is: " + VM.summary.guest.hostName);

while (attrToolStatusBoolean) {
    // VMware Tools stops while the guest OS is rebooting
    if (VM.guest.toolsRunningStatus === "guestToolsNotRunning" && state === "on") {
        state = "off";
    }

    // Tools running again after being down means the reboot has completed
    if (VM.guest.toolsRunningStatus === "guestToolsRunning" && state === "off") {
        // Give VMware Tools a minute to refresh the guest information
        System.sleep(60000);
        System.log(VM.name + " has rebooted and the current hostname is: " + VM.summary.guest.hostName);

        if ((VM.summary.guest.hostName).match(expression)) {
            System.log("Computer " + VM.name + " was not un-joined from Domain, please check");
            attrContinue = false;
        } else {
            System.log("Computer " + VM.name + " was successfully un-joined from domain");
            attrContinue = true;
        }
        attrToolStatusBoolean = false;
    }
}

To briefly explain the script above: I used a while loop controlled by a boolean attribute, attrToolStatusBoolean, which checks the status of VMware Tools to confirm the guest OS has rebooted and then fetches the latest hostname. While the VM is restarting, VM.guest.toolsRunningStatus becomes “guestToolsNotRunning” and, once the server is back up, it changes to “guestToolsRunning”. I then sleep for one minute to make sure VMware Tools updates the guest information, in this case the hostname. The script then reads VM.summary.guest.hostName to find out whether the server was un-joined from the domain. If it was un-joined, the hostname will be just the VM name, so attrContinue is set to true; otherwise, it will still be the VM name plus the domain, so attrContinue is set to false. Feel free to use this function for your personal usage if needed 😀

The Visual Binding:

Screenshot 2015-04-09 08.37.47

Based on the result from this scriptable task, if the server was un-joined from the domain, the workflow renames the VM. Otherwise, it finishes:

Screenshot 2015-04-09 08.38.04

Visual Binding of “Rename virtual machine”:

Screenshot 2015-04-09 08.38.22

And for the last step, drag & drop a Workflow note to briefly explain what the workflow does; an example is attached below:

Screenshot 2015-04-09 09.14.33

 

Time for testing! I deployed a Windows Server 2012 VM that is currently joined to a domain. Screenshots are attached below:

Screenshot 2015-04-02 16.11.04

Screenshot 2015-04-02 16.11.44

Time to run the workflow. Fill in the required information and click run:

Screenshot 2015-04-10 10.49.47

Logging into vCenter Server, you will see tasks run by the com.vmware.orchestrator user:

Screenshot 2015-04-09 09.07.39

The VM is renamed, which indicates the server was un-joined from the domain:

Screenshot 2015-04-09 09.07.46

Logging into the Windows Server, it was indeed un-joined from the domain!

Screenshot 2015-04-09 09.08.06

Screenshot 2015-04-10 10.52.11

Wrap-Up

Hope this post helped you understand actions & workflows. In the next post, I will come back with the Presentation Layer. Stay tuned 😀

vRO Deepdive Series Part 1 – Introduction

Introduction

Finally, it’s here: the deep dive series on vRealize Orchestrator (vRO), previously known as vCenter Orchestrator (vCO). The purpose of this series is to explain and discuss developing custom workflows to automate IT processes. My plan is to go through the following:

  • vRO Part 1 – Introduction
  • vRO Part 2 – Action & Workflow
  • vRO Part 3 – Presentation Layer
  • vRO Part 4 – Log Handling, Throwing Exceptions and Fail-back
  • vRO Part 5 – Introduction to Relationship Between vRO & vRA
  • vRO Part 6 – Integration of vRO & vRA

I won’t be going through how to install/configure vRO; there are many blogs out there for reference. Rather, I will be deep diving into the development side.

In this blog post, I will be discussing:

  • Language to learn
  • Object and Type
  • Input, Output and Attribute

Let’s get started!

Language to learn

First of all, vRO scripting is JavaScript based. So, if you are not familiar with the language, I suggest you Google it and read some JavaScript basics; there are tons of resources out there!

Preparation

Before we start, let’s create a simple workflow for the exercises later:

Log in to vRO, right-click on a folder and create a workflow:

Screenshot 2015-03-12 11.31.11

Name it Sample Workflow, or whatever you would like to call it:

Screenshot 2015-03-12 11.31.18

Edit the workflow, navigate to Schema and between start and end, drag and drop a scriptable task:

Screenshot 2015-03-12 11.31.56

Navigate to Inputs tab and click Add parameter:

Screenshot 2015-03-12 13.37.10

Name it VM and for the Type, search for VC:VirtualMachine and Accept:

Screenshot 2015-03-12 13.38.25

Go back to Schema, edit Scriptable Task and navigate to Visual Binding:

Screenshot 2015-03-12 13.54.04

Drag VM and drop it into the IN box:

Screenshot 2015-03-12 13.55.31

Save and Close. You can safely ignore the validation for now.

So everything’s ready, let’s get started!

Object & Type

Starting with the definition of an object:

A JavaScript object is an unordered collection of variables called named values

In vRO, there are lots of predefined objects VMware has created, for example VC:VirtualMachine, and to develop workflows properly, you must get familiar with these objects. Let me make it clear with an example: go back to the workflow created above, click edit on the scriptable task and navigate to the Scripting tab:

Screenshot 2015-03-12 11.33.40

Screenshot 2015-03-12 11.34.39

In the top left-hand corner, you will see the list of APIs in a box. Let’s search for VC:VirtualMachine and click Go to selection:

Screenshot 2015-03-12 11.36.05

When you close the window, you will see:

Screenshot 2015-03-12 11.36.59

To view more details, click on VcVirtualMachine and expand:

Screenshot 2015-03-12 11.42.43

Now you will see the list of all properties and functions, and this is where you will be searching and reading all the time. In the screenshot above, empty rectangles represent properties and filled rectangles represent functions; together they make up the VcVirtualMachine object, whose type is VC:VirtualMachine. You can call a property or a function like the following:

  • Property => VM.datastore
  • Function => VM.destroy_Task()

In summary, an object has a type of xxx which consists of properties and functions.

To view more details, you can click on any property or function. For example, let’s click datastore:

Screenshot 2015-03-12 11.50.52

One thing to note: have a look at the return type, “Array of VcDatastore”. It means that if you call the datastore property, the returned value will be an array of VcDatastore. I will go through types shortly, so for now, let’s call some properties and output them on the Logs tab to see what they look like.

Edit the scriptable task, navigate to the Scripting tab and type System.log(VM.datastore);

Screenshot 2015-03-12 14.12.50

Before moving on, I would like to emphasise one thing. Property calls are case-sensitive, meaning that if you type “System.log(VM.Datastore)” it will fail because the Datastore property doesn’t exist! Ensure you look at the API box in the top left-hand corner and type exactly what it says.

It’s time to run the workflow. Right-click, run, and you will see the workflow asking for the VM input.

Screenshot 2015-03-12 14.16.36

Click on Not set, select a VM and submit. You will then see the VcDatastore type returned along with the value, which represents the name of the datastore.

Screenshot 2015-03-12 14.19.04

It outputs the VMFS volumes being used by this VM. That was quite simple, wasn’t it? The reason I wanted to go through objects is that when you start developing workflows or utilising pre-built workflows, you must understand and use the correct objects. I will give you an example. Let’s say an administrator from CompanyA is trying to automate the following process:

  • Create a VM
  • Move the VM to resource pool
  • Power On

As there are already pre-built workflows available, he wants to use them instead of building from scratch. He creates a workflow called “Custom VM Build” and puts the pre-built workflows into it. Then he defines 2 inputs for this workflow:

  • VM, String
  • ResourcePool, String

Then, from Visual Binding, he tries to connect the workflow inputs to the pre-built ones and realises the operation is denied. The reason is simple: he hasn’t checked the inputs of the 3 workflows above:

  • Create a VM (pre-built)
    • Input Type = VC:VirtualMachine
  • Move the VM to resource pool (pre-built)
    • Input Type = VC:VirtualMachine, VC:ResourcePool
  • Power On (pre-built)
    • Input Type = VC:VirtualMachine

Hence, he should have checked the input types against the pre-built workflows’ inputs from the beginning. Make sure you know which objects and types you are planning to use; this looks very basic, but it is the most important aspect.

Alternatively, he could still have used String inputs, but then an extra scriptable task or action would be required to convert the String to either the VC:VirtualMachine or VC:ResourcePool type. This will be discussed in the next post.
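
As a teaser, one minimal sketch of such a conversion is to look the VM up by name through the vCenter plugin (this assumes VcPlugin.getAllVirtualMachines() is available and that vmName is a String input; the next post may take a different approach):

// Hypothetical String-to-VC:VirtualMachine conversion by name
var vm = null;
var allVms = VcPlugin.getAllVirtualMachines();
for each (var candidate in allVms) {
    if (candidate.name === vmName) {
        vm = candidate;
        break;
    }
}
System.log("Matched VM: " + vm);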

Input, Output and Attributes

Input

Time to look into inputs, outputs and attributes. Inputs and outputs are simple to understand: they are literally the inputs and outputs of a workflow. To recap the example above, I added an input called VM of type VC:VirtualMachine to the Sample Workflow, and it asked me to specify a VM when I ran it. The workflow prompts users to provide inputs before it runs.

One thing to note is that the above statement doesn’t mean you must specify an input value to run a workflow. I will show you why: go back to the Orchestrator client, edit the Sample Workflow and navigate to the Inputs tab:

Screenshot 2015-03-17 11.54.56

Create one more input called “Cluster” with the type “VC:ClusterComputeResource”:

Screenshot 2015-03-17 11.54.47

Save and Close, ignore the warning.

Now when you run it, the workflow will ask you to specify both VM and Cluster:

Screenshot 2015-03-17 11.57.04

Remember, the scriptable task in this workflow only looks at the VM input and calls its datastore property, i.e. System.log(VM.datastore). This means that whether you specify a Cluster value or not has no impact on the workflow. OK, so why would you want to do this? I will give you another scenario.

An administrator from CompanyA wants to develop a workflow that allows users to look up datastore(s). For the inputs, he/she wants users to specify either a VM or a Cluster.

Assuming you created the Cluster input, navigate to the scriptable task’s Visual Binding tab and drag and drop Cluster into it:

Screenshot 2015-03-17 12.02.59

Go to Scripting tab and type the following in:

if (VM && Cluster) {
    System.log("Please specify either VM or Cluster");
} else if (VM) {
   System.log(VM.datastore);
} else if (Cluster){
   System.log(Cluster.datastore);
}

What this scriptable task does is:

  • If both VM and Cluster are specified, it asks the user to specify only one of them
  • If VM is specified, it outputs the datastore(s) attached to that VM
  • If Cluster is specified, it outputs the datastore(s) attached to that Cluster

Running the workflow specifying Cluster only:

Screenshot 2015-03-17 12.08.21

Screenshot 2015-03-17 12.09.18

Specifying both VM and Cluster:

Screenshot 2015-03-17 12.11.48

Output

The next bit is output. Once more, refer to the example above. This time, rather than displaying the output directly, I am going to save the datastore property to an output.

First of all edit the workflow, navigate to Outputs tab and create an output parameter called Datastore with Array of VC:Datastore type:

Screenshot 2015-03-17 14.18.24

Then, on the Visual Binding tab, drag and drop Datastore to Out as per below:

Screenshot 2015-03-17 14.18.51

Edit scriptable task and modify the existing script to the following:

if (VM && Cluster) {
    System.log("Please specify either VM or Cluster");
} else if (Cluster) {
    Datastore = Cluster.datastore;
} else if (VM) {
    Datastore = VM.datastore;
}

OK, then you might ask: how do I output Datastore? The thing is, this output is the output of the workflow itself, meaning you would need to create another workflow to consume it. An example will be provided in the next post; for now, let me start discussing attributes.

Attribute

What is an attribute? I personally like to call it a “global variable”, which can be defined at the beginning or set during the workflow run. The reason I call it a global variable is that once a value is given to it, it can be used anywhere within the workflow. So, in summary:

  • An attribute can be pre-defined, i.e. act as an input within the workflow
  • An attribute can be set by a scriptable task, i.e. act as an output within the workflow

Time for exercise!

Edit the workflow, navigate to the General tab and click Add attribute:

Screenshot 2015-03-17 14.53.48

Call it tempString and leave the type as string:

Screenshot 2015-03-17 14.54.15

Go to the Schema tab, navigate to Visual Binding and you will see that tempString is available in both In and Out:

Screenshot 2015-03-17 14.55.31

This time, let’s try an input attribute. Drag and drop tempString into the input:

Screenshot 2015-03-17 14.59.31

As mentioned earlier, the value for an input attribute should be pre-defined; otherwise the default value will be ‘’, i.e. empty!

Close the window, navigate to the General tab and type “Hello World” in the value field:

Screenshot 2015-03-17 15.02.47

Then go back to the scriptable task and type the following in:

System.log(tempString);

And running the workflow will give you the following:

Screenshot 2015-03-17 15.03.51

The exercise we’ve just gone through shows you how to pre-define an attribute and output it in the workflow. In this case, the attribute is set to “Hello World” with type String, and it can be referenced anywhere within this workflow. Literally, you could create 10 scriptable tasks and use this value in all of them.

Next is the output attribute, where you give the attribute a value from within the workflow.

Edit the workflow, go back to the scriptable task, navigate to Visual Binding and disconnect tempString:

Screenshot 2015-03-17 15.08.00
And remove tempString from Input:

Screenshot 2015-03-17 15.08.14

Next is migrating an output parameter to an attribute. Instead of creating an attribute manually, you can always migrate an existing input or output parameter to an attribute. Go to Outputs and click the option to move it to attributes, which will automatically convert the output into an attribute:

Screenshot 2015-03-17 15.12.38

Screenshot 2015-03-18 11.23.16

You will see that Datastore has now been moved to the attributes. Go back to the scriptable task, navigate to the Visual Binding tab and drag and drop Datastore into the Out box:

Screenshot 2015-03-17 15.13.22

Then add the following in the Scripting field:

if (VM && Cluster) {
    System.log("Please specify either VM or Cluster");
} else if (Cluster) {
    Datastore = Cluster.datastore;
} else if (VM) {
    Datastore = VM.datastore;
}

Let’s go back to this statement from above:

The thing is, this output is the output of the workflow itself, meaning you would need to create another workflow to consume it.

This time, rather than saving Datastore as an output of the workflow, it is an attribute set by the scriptable task. This means it can be used anywhere within this workflow.

Let me show you how to do this. Drag and drop one more scriptable task just after the original scriptable task:

Screenshot 2015-03-17 15.14.55

This time, drag and drop Datastore from Input Attributes:

Screenshot 2015-03-17 15.15.39

On the second scriptable task, type the following in the Scripting field:

System.log(Datastore);

Running it with Cluster defined:

Screenshot 2015-03-17 15.16.46

The output is the same as last time, but the difference now is that:

  • The datastore output is saved to the Datastore attribute
  • The attribute is passed to a second scriptable task, which prints its value

The purpose of this exercise was to make you familiar with attributes, as moving forward, they will be used most of the time.

Wrap-Up

Hope this post helped; you are always welcome to leave a reply below or reach out on Twitter for any clarifications.

As mentioned in the introduction, the next post will be Action & Workflow. Stay tuned!