Azure – Intra Subscription Azure SQL Database Migration Tip

Introduction

First blog for the year! It’s been extremely busy, but that means I’ve got lots of things to share.

In this blog post, I will be sharing an efficient way of migrating (copying) an Azure SQL database from one subscription to another.

Let’s get started!

Traditional Way

Prior to the ARM model, and even after ARM was introduced, I believe many of you would still be using the following method:

  • Make a copy from a production database
  • Export the copied database to a blob storage
  • Import the .bacpac to the destination SQL server
  • Delete the database copy
  • Delete the .bacpac file

This method is completely fine, but it has some cons:

  • The export & import process takes a long time, which ends up incurring extra operational cost
  • Possibility of leaving the .bacpac file behind in blob storage (if the process is not automated)
  • You need to know the administrator login details for both the source and destination SQL servers

New Way

The new ARM portal allows you to move a resource to a resource group in a different subscription.
Using this functionality, the following approach can be taken:

  • Create a temporary SQL server in the resource group where a production SQL server is running
  • Make a copy of the production database to the temporary SQL server
  • Move the temporary SQL server to another subscription
  • Make a copy of the copied database to the destination SQL server
  • Delete the temporary SQL server

What are the benefits?

The benefits are pretty much the reverse of the cons listed for the traditional way.
One of the main benefits is reduced operational cost: it doesn’t require a copied database to stay up for the export & import processes.

Also, no additional blob storage space is required to hold the .bacpac file.

Hope this helps and feel free to leave a comment for any questions or clarifications 🙂

vRO – Convert CSV String Input to Multidimensional Array

Introduction

I’ve been playing around with workflow scripts, and one thing that popped into my mind as potentially useful in future was parsing a CSV from a 3rd-party source into a vRO workflow and utilising it. I created a simple CSV file, passed it to vRO, and realised that it arrives as a plain string.

In this blog post, I will be sharing a script that converts a CSV string input into a multidimensional array, and at the end I will go through some of the use cases.

Walk-through

In this example, what I will be presenting is:
  1. Use the SSH workflow to get a CSV file from a remote Linux server and parse the CSV content as an attribute
  2. Pass the above attribute to a scriptable task and convert it to a multidimensional array
  3. Utilise it

Let’s get started!

1. First of all, create a workflow, I called it “Convert CSV String Input to Hash Table”.

2. Edit the workflow, navigate to the General tab and create attributes as per below. These may differ based on how you want to retrieve the CSV file from the external source. In this case, I used a cat command to simply output the CSV file.

3. Go to the Schema tab, drop in a scriptable task and call it “Convert”.

4. Go to the Script tab and put the following script in. Ensure attrCsvInput is recognised; it will be highlighted in pink.

var output = new Object();
var rows = attrCsvInput.split('\n');
var headers = rows[0].split(',');
var numberOfColumns = headers.length;
var numberOfRows = rows.length - 1;
var tempArray = new Array();

// Note: the loop conditions must use < and <= (not >), otherwise the loops never run
for (var i = 0; i < numberOfColumns; i++) {
    for (var j = 1; j <= numberOfRows; j++) {
        tempArray.push(rows[j].split(',')[i]);
    }
    output[headers[i]] = tempArray;
    tempArray = new Array();
}
for (var k in output) {
    System.log('key is: ' + k + ', value is: ' + output[k]);
}
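
Outside vRO, the same column-per-key parsing can be sanity-checked in plain JavaScript. Below is a minimal standalone sketch: the sample CSV is made up for illustration, and console.log stands in for vRO’s System.log.

```javascript
// Standalone sketch of the scriptable task's parsing logic.
// Sample CSV input (in the workflow this arrives via attrCsvInput):
var attrCsvInput = 'Name,Age,Sex\nAlice,25,F\nBob,34,M';

var rows = attrCsvInput.split('\n');
var headers = rows[0].split(',');
var output = {};

for (var i = 0; i < headers.length; i++) {
    var column = [];
    // Row 0 holds the headers, so data rows start at index 1
    for (var j = 1; j < rows.length; j++) {
        column.push(rows[j].split(',')[i]);
    }
    output[headers[i]] = column;   // e.g. output['Name'] = ['Alice', 'Bob']
}

for (var k in output) {
    console.log('key is: ' + k + ', value is: ' + output[k]);
}
```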

5. Finish off the visual binding as per below.

6. Let’s set up an SSH connection to a remote Linux box. Search for the SSH workflow and place it between the start element and the Convert scriptable task.

7. Finish off the binding as per below and make sure the unused inputs (path, passphrase and encoding) are set to NULL.

8. Run the workflow and you will see that the CSV file is converted to a multidimensional array: three keys (Name, Age and Sex) with their corresponding values.

Use Cases

Replace the following snippet at the end of the scriptable task with the one below.

From:

for (var k in output) {
    System.log('key is: ' + k + ', value is: ' + output[k]);
}

To:

System.log("People younger than 30 are: ");
for (var o = 0; o < output["Age"].length; o++) {
    if (output["Age"][o] < 30) {
        System.log(output["Name"][o] + ": " + output["Age"][o]);
    }
}

Run the workflow. An example output is attached below; as shown, people younger than 30 years old are listed.

Further, you could filter it by sex, and so on.
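
A sex filter follows the same pattern as the age filter. Here is a plain-JavaScript sketch; the sample data is made up but has the same shape as the parsed output, and console.log stands in for System.log.

```javascript
// Made-up sample data in the same shape as the parsed CSV output
var output = {
    'Name': ['Alice', 'Bob', 'Carol'],
    'Age': ['25', '34', '29'],
    'Sex': ['F', 'M', 'F']
};

// Collect the names of everyone whose Sex column is 'F'
var women = [];
for (var o = 0; o < output['Sex'].length; o++) {
    if (output['Sex'][o] === 'F') {
        women.push(output['Name'][o]);
    }
}
console.log('Women are: ' + women.join(', '));
```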

One of the recommendations is to create this parsing function as an action, with the CSV string as the input and the resulting JavaScript object (of arrays) as the output.
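
As a sketch of such an action, the parsing logic can be wrapped in a reusable function. The name csvToTable and the sample data are my own assumptions, and console.log stands in for System.log inside vRO.

```javascript
// Hypothetical action: takes a CSV string, returns an object keyed by header,
// where each value is the array of that column's cells.
function csvToTable(csvInput) {
    var rows = csvInput.split('\n');
    var headers = rows[0].split(',');
    var table = {};
    for (var i = 0; i < headers.length; i++) {
        var column = [];
        for (var j = 1; j < rows.length; j++) {
            column.push(rows[j].split(',')[i]);
        }
        table[headers[i]] = column;
    }
    return table;
}

// Example usage with a made-up CSV:
var people = csvToTable('Name,Age,Sex\nAlice,25,F\nBob,34,M');
for (var o = 0; o < people['Age'].length; o++) {
    // Number() avoids relying on implicit string-to-number coercion
    if (Number(people['Age'][o]) < 30) {
        console.log(people['Name'][o] + ': ' + people['Age'][o]);
    }
}
```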

Hope this helps anyone who needs to parse an external CSV input into a vRO workflow. As always, feel free to leave a comment for any clarifications or questions 😀
