Setting up Automation in Azure

As I’ve started digging deeper into using Azure, the need to automate tasks (turn off all VMs, turn on all VMs, automatically resize all VMs, etc.) has become very apparent. In April 2014 Microsoft announced the release of Microsoft Azure Automation to handle such operations.

Activate Azure Automation

This new feature is still in preview, but you can sign up your Windows Azure account to try it by going here:

https://account.windowsazure.com/PreviewFeatures?fid=automation

Once the Automation feature is enabled for your Azure account you will see it listed with all the other available features on your account.

Microsoft has some documentation on what to do next:

http://azure.microsoft.com/en-us/documentation/articles/automation-create-runbook-from-samples/?WT.mc_id=Blog_SC_Automation_Azure

This link gives you a basic "Hello World" example, but I thought I would take you through a more interesting example that I'm using to turn off all my VMs at a certain time. The high-level steps are:

  1. Create an Automation Account
  2. Create a Management Certificate
  3. Upload the Management Certificate
  4. Create a Connection Asset
  5. Create a Credential Asset
  6. Upload a Runbook
  7. Schedule the Runbook

Create an Automation Account

  • Log in to the Azure Management Portal.
  • Navigate to the Automation feature on the side pane and select Create an Automation Account.
  • I’m creating this automation to turn off and back on Azure VMs that I use for training classes so I’ll name my account appropriately.

Create a Management Certificate

Azure Automation authenticates to Microsoft Azure subscriptions using certificate-based authentication. There are several ways to create a new management certificate. I used the makecert utility that ships with Visual Studio to create mine, but if you don't have Visual Studio there are other methods for creating a certificate (one alternative is sketched below).

  • Launch the Visual Studio Command Prompt with Run as Administrator.

  • Run the following command to produce the certificate:

makecert -sky exchange -r -n "CN=<CertificateName>" -pe -a sha1 -len 2048 -ss My "<CertificateName>.cer"

Microsoft Documentation on this command: http://msdn.microsoft.com/en-us/library/azure/gg551722.aspx
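
If you don't have Visual Studio, one possible alternative on Windows 8.1 / Server 2012 R2 or later is the New-SelfSignedCertificate cmdlet. I used makecert myself, so treat this as an untested sketch (the subject name is just an example):

# Creates a self-signed certificate in the current user's Personal store,
# which you can then export exactly as described in the next steps
New-SelfSignedCertificate -DnsName "AzureAutomation" -CertStoreLocation "Cert:\CurrentUser\My"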

  • Next you'll need to export the private key and certificate from the Certificate Manager. You can launch it by typing certmgr.msc from your Start menu.
  • Once the Certificate Manager launches, open the Personal folder and then the Certificates folder.
  • Right-click on the certificate that you created previously and select All Tasks and then Export.
  • Let's start by exporting the private key. In the Certificate Export Wizard select Yes, export the private key.

  • Leave the default selection of Personal Information Exchange and then click Next.

  • Provide a password for the private key. You will be required to type it in later in Azure.

  • Name the private key file and store it on a local drive for now. It will be imported into Azure later.

  • Click Finish to execute the export.

  • We still need to export the certificate file. Right-click on the certificate in the Certificate Manager again and select All Tasks and then Export.
  • This time select No, do not export the private key.

  • Keep the default format selection and then click Next.

  • Name the certificate file and store it on a local drive for now. It will be imported into Azure later. If you'd rather script these exports, see the sketch below.
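
Something like the following should produce the same two files as the wizard (the subject name, paths, and password handling here are placeholders for my values; Export-PfxCertificate and Export-Certificate require Windows 8.1 / Server 2012 R2 or later):

# Find the certificate created earlier in the Personal store
$cert = Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.Subject -eq "CN=AzureAutomation" }

# Prompt for the password that protects the private key
$pwd = Read-Host -AsSecureString "PFX password"

# Private key (.pfx) -- used later for the credential asset
Export-PfxCertificate -Cert $cert -FilePath "C:\Certs\AzureAutomation.pfx" -Password $pwd

# Public certificate (.cer) -- uploaded as the management certificate
Export-Certificate -Cert $cert -FilePath "C:\Certs\AzureAutomation.cer"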

Upload the Management Certificate

With the certificate now created we’re ready to upload it to Azure.

  • Select Management Certificates from the Settings page.
  • Then select Upload from the bottom of the page.
  • Browse to the certificate (.cer) file we created in the previous steps and select the Azure subscription you would like to apply it to. Click the check to upload the certificate.

Create a Connection Asset

The next step is to create a connection asset. Doing this allows you to easily pass information about your Azure subscription into the scripts, called runbooks, that we will create later. It also helps when information about your Azure subscription changes: you only have to change it in one place instead of in all your runbooks.

  • Go back to the Automation page in the Azure Management Portal.
  • Click the arrow on the Automation account created earlier.
  • Select Assets from the top of the account page.
  • Then select Add Setting from the bottom of the Assets view.

  • Select Add Connection.

  • In the Configure connection page select Azure as the Connection Type and then provide a name for the connection. Click the arrow in the bottom right to move to the next configuration step.

  • On the next page you must provide a name for the automation certificate. You can use any name you want, but you must remember this exact name for the next step, so I'd recommend copying it somewhere for later use.
  • You're also required to provide the Subscription ID that you wish to apply this connection to. You can find the Subscription ID on the Settings page. Click the check to complete the connection asset.

Create a Credential Asset

Before we can create an automation to shut down VMs automatically we have to add credentials. These are required to authenticate to the subscription that hosts the virtual machines.

  • Select Add Setting again.

  • Select Add Credential.

  • Choose Certificate for the Credential Type and ensure that you give the credential exactly the same name you provided in the connection asset in the last section of steps. Click the arrow to continue.

  • Browse to and select the private key (.pfx) created earlier and then provide the password you entered when creating the private key. Click the check to complete the creation of the credential asset.

Upload a Runbook

We are now ready to schedule scripts, like turning VMs on and off, through runbooks.

There are a few lines of code that are used to connect a runbook to your Azure subscription through the certificate and connection assets we just created. To make this easier the Azure Automation team has consolidated that script into a runbook called Connect-Azure. There are many template runbooks available for you to download at the Azure Automation Script Center. The great thing about how runbooks work is that you can call other runbooks from inside your code.
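
For context, here is roughly what Connect-Azure does internally, simplified from the published sample (treat this as a sketch; download the real runbook from the Script Center):

workflow Connect-Azure {
    Param
    (
        # The name of the connection asset created earlier
        [Parameter(Mandatory=$true)]
        [String]$AzureConnectionName
    )

    # The connection asset holds the subscription ID and the name
    # of the certificate credential asset
    $con = Get-AutomationConnection -Name $AzureConnectionName

    # Retrieve the management certificate from the credential asset
    $cert = Get-AutomationCertificate -Name $con.AutomationCertificateName

    # Register the subscription with the Azure cmdlets using that certificate
    Set-AzureSubscription -SubscriptionName $AzureConnectionName -SubscriptionId $con.SubscriptionID -Certificate $cert
}

This is also why my runbook later in this post passes the same connection name to both Connect-Azure and Select-AzureSubscription.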

Let's start by downloading the Connect-Azure runbook provided by the Microsoft team and publishing it to your Azure portal.

  • Go to the script center and download the Connect-Azure runbook template.
  • Select your automation account by clicking on the arrow next to the name of the account.
  • Select Runbooks to import the Connect-Azure runbook.

  • At the bottom of the page click Import and browse to the location you saved the Connect-Azure.ps1 file.

  • On the Runbooks page click on the newly imported Connect-Azure runbook.
  • Then click the Author tab and select Publish at the bottom of the page. This makes the runbook ready to use. You'll notice you can also test and even edit the script from inside the Azure portal.

  • With this fundamental runbook imported we can start creating our own. In my case I wish to turn off all the VMs that are part of a cloud service, so I select Import again to bring in another runbook I've created to do this.
  • If you would like, you can download the PowerOffVM.ps1 I'm using for this demonstration and then import it the same way.

  • Note that the PowerShell script must be contained in a workflow to be consumed as an automation. I'm new to PowerShell, so here's the TechNet description of what a workflow is: "A workflow is a sequence of programmed, connected steps that perform long-running tasks or require the coordination of multiple steps across multiple devices or managed nodes. The benefits of a workflow over a normal script include the ability to simultaneously perform an action against multiple devices and the ability to automatically recover from failures." Here's what the inside of my PowerShell script looks like in case you're curious:

<#
.SYNOPSIS
Automates the process of powering down all your Azure VMs in a particular cloud service

.NOTES
Author: Devin Knight
Last Updated: 6/16/2014
#>

workflow PowerOffVMs {
    Param
    (
        # Optional parameter of type string.
        # If you do not enter anything, the default value
        # will be MyCloudService
        [parameter(Mandatory=$false)]
        [String]$cloudSvcName = "MyCloudService",

        # Optional VM name filter; the default "*" matches all VMs
        [parameter(Mandatory=$false)]
        [String]$vmName = "*"
    )

    # Select the subscription to use
    Connect-Azure -AzureConnectionName "AzureAutomationConnection"
    Select-AzureSubscription -SubscriptionName "AzureAutomationConnection"

    # Turn off the virtual machines
    Stop-AzureVM -ServiceName $cloudSvcName -Name $vmName -Force
}

  • You'll notice that the name of the VM is populated with the wildcard "*" to return all VMs that are part of my cloud service. If you're just finding this script and attempting to use it without Automation, you do not need a workflow. You can simply run the following:

Stop-AzureVM -ServiceName "MyCloudService" -Name "*" -Force

  • With the runbook imported you will see it listed as available. Click the arrow next to the name of the workflow.

  • Go to the Author page. Here you can do things like modify, test, and publish the script. Select Publish to make this runbook available to schedule.

Schedule the Runbook

  • Next click the Schedule page and select Link to a New Schedule.
  • Provide a name for the schedule, then click the arrow in the bottom right to continue.

  • I need my VMs to power down every day at 8:00 PM, so I've configured the schedule to run daily at 20:00. Then click the check to complete the configuration of the schedule.

  • This runbook has two parameters, so you can type the values that should be used when it executes on this schedule. I provided the cloud service that my VMs are located in and '*' as a wildcard to include all VMs in the cloud service.

Congratulations! You've just automated the powering down of all your Azure VMs in a cloud service. You can of course create another runbook for powering up all your VMs in the morning. If you'd like, you can download my runbook called PowerOnVMs.ps1 for doing this.

**Update**

I was sent a brilliant update to my PowerOnVMs script from Ken Ross (@kzhen | http://kzhendev.wordpress.com). You can download my script above again and now it will skip Saturday and Sunday when it automatically turns on your VMs. Thanks Ken! A rough sketch of the approach follows.
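
In case you'd rather write your own, here is a minimal sketch of the shape such a runbook might take; the weekend check is my paraphrase of the idea, not Ken's exact code:

workflow PowerOnVMs {
    Param
    (
        [parameter(Mandatory=$false)]
        [String]$cloudSvcName = "MyCloudService",
        [parameter(Mandatory=$false)]
        [String]$vmName = "*"
    )

    # Select the subscription to use
    Connect-Azure -AzureConnectionName "AzureAutomationConnection"
    Select-AzureSubscription -SubscriptionName "AzureAutomationConnection"

    # Work out the current day (InlineScript keeps the workflow engine happy)
    $day = InlineScript { (Get-Date).DayOfWeek.ToString() }

    # Only power the VMs on during the work week
    if ($day -ne "Saturday" -and $day -ne "Sunday") {
        Start-AzureVM -ServiceName $cloudSvcName -Name $vmName
    }
}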

Using a Hyper-V VHD in an Azure VM

I'm in the process of moving some of my lab environments that I use for teaching SQL Server classes to Azure. I'm pretty new to the Azure infrastructure world, but I thought I could share some of my lessons learned in doing this. My first post will walk you through moving an existing Hyper-V virtual machine image to be hosted in Azure.

Limitations

Before you start this process, here are a couple of things to be aware of:

  • Virtual machine disks in Azure must be .vhd files. They cannot be .vhdx or another format, though there are easy ways to convert to .vhd. Supposedly the Add-AzureVHD cmdlet will automatically convert .vhdx files to traditional .vhd files; however, when I attempted this it did not work and produced the following error:

    Add-AzureVHD : 'C:\VM\Virtual Hard disks\MyVM.vhdx' is not a valid VHD file.

    No problem though. With a quick Windows PowerShell command I converted my .vhdx file to a .vhd file type.

    PS C:\>Convert-VHD -Path "C:\VM\Virtual Hard Disks\MyVM.vhdx" -DestinationPath "C:\VM\Virtual Hard Disks\MyNewVM.vhd"

  • You should also be aware that there are limits to the size of the disk that you can attach. I went through the process I'm about to describe only to find out that my .vhd was too large. The maximum supported OS disk size is 128 GB; any larger and you will see an error when you try to use the disk. Notice this limit applies only to the OS disk, so larger disks can be attached later. Use this post to guide you through shrinking your virtual disk size: Reducing the Disk Size of a Hyper-V Virtual Machine. A quick way to check a disk against the limit is sketched after this list.
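
Before you upload, here is a quick check of a disk against that limit, assuming the Hyper-V PowerShell module is installed (the path is a placeholder):

# Size is the disk's maximum size in bytes; it must be 128 GB or less for an OS disk
$vhd = Get-VHD -Path "C:\VM\Virtual Hard Disks\MyNewVM.vhd"
"{0:N0} GB" -f ($vhd.Size / 1GB)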

Uploading Your VM

This first set of steps will walk you through uploading an existing Hyper-V VHD into Azure.

  • If you haven't already, download and install the Windows Azure PowerShell module: http://go.microsoft.com/fwlink/p/?linkid=320376&clcid=0x409
  • Launch Windows Azure PowerShell.
  • Before you start using the Windows Azure PowerShell cmdlets, you need to configure connectivity between your machine and Windows Azure. One way to do this is by downloading the PublishSettings file from Windows Azure and importing it. From the PowerShell prompt enter the following commands:
    • Get-AzurePublishSettingsFile
    • Import-AzurePublishSettingsFile "C:\Subscription-6-5-2014-credentials.publishsettings"
  • Next set up source and destination parameters for your .vhd file. From the PowerShell prompt enter the following commands:
    • $sourceVHD = "C:\VM\Virtual Hard Disks\MyNewVM.vhd"
    • $destinationVHD = "https://yourcontainer.blob.core.windows.net/vhds/MyNewVM.vhd"
  • Using these parameters, use the Add-AzureVHD cmdlet to upload your .vhd to an Azure blob storage container. You can also specify the number of threads you would like to use for the upload. Once it completes you can verify the upload as sketched after this list.
    • Add-AzureVHD -LocalFilePath $sourceVHD -Destination $destinationVHD -NumberOfUploaderThreads 3
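
If you want to confirm the blob landed in the container, a quick sketch (the storage account name, key, and container here are placeholders for mine):

# List the blobs in the vhds container of the target storage account
$ctx = New-AzureStorageContext -StorageAccountName "yourcontainer" -StorageAccountKey "<storage account key>"
Get-AzureStorageBlob -Container "vhds" -Context $ctx | Select-Object Name, Length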

Map the VHD to an Azure Virtual Machine Disk

  • Once the upload completes, log in to your Azure portal: https://manage.windowsazure.com
  • Click on the Virtual Machines page, then select Disks and then Create at the bottom of the screen.
  • Provide a Name and then click browse to find the VHD URL that you uploaded in the previous steps.
  • You should also check the box indicating that The VHD contains an operating system. Then select the appropriate OS from the Operating System Family property. Click OK.

Creating the Azure VM

With this complete you can now create an Azure virtual machine from the .vhd you uploaded from your local machine.

  • Click on the Virtual Machines page again and click New at the bottom of the screen.
  • Select From Gallery.

  • Click on My Disks and you will find the .vhd uploaded and assigned as a Virtual Machine disk.
  • Select your machine then click the arrow in the bottom right to continue.

  • Provide a name in the Virtual Machine Name property and select the resource plan you would like this virtual machine to use.
  • Click the arrow in the bottom right to continue.

  • Configure the DNS name for the virtual machine and the region or virtual network you would like to join it to, then click the arrow in the bottom right to continue.

  • Optionally, you can add third-party extensions to the VM. Click the check to confirm your configuration of the new VM.

Your VM will be provisioned and should be available immediately once these steps are complete. Keep in mind that many of the steps I did through the management portal could certainly be done using PowerShell as well; a rough sketch follows.
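
For example, once the disk is registered, a rough PowerShell equivalent of the VM creation might look like this (the disk name, service name, size, and location are placeholders, not values from the walkthrough above):

# Build a VM configuration that boots from the uploaded OS disk
$vm = New-AzureVMConfig -Name "MyNewVM" -InstanceSize "Medium" -DiskName "MyNewVM"

# Create the VM (and a new cloud service to host it) in the chosen region
New-AzureVM -ServiceName "MyTrainingService" -Location "East US" -VMs $vm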

Power Pivot as the Core to your Power BI Solution

I've noticed a recent trend with many presentations and articles on Power BI and Excel functionality. Many of these presentations show how quickly you can take tools like Power Query to find and discover data and then quickly present it through tools like Power View and Power Map. These are really impressive demonstrations of what you can do in Excel: in a matter of a few minutes you go from no data at all to making business decisions through some form of a reporting layer.

While I really enjoy these demonstrations, my one gripe is that they don't tell the full story. Take the example I just gave of discovering data with Power Query and then presenting it in Power View. In the excitement of taking the results of Power Query into Power View we often forget there is another important tool involved. I'm certainly guilty of this as well, but I thought it was worth some explanation of what's really going on behind the scenes.

By now you have probably learned that Power Query provides the option to put your data into Power Pivot immediately after you have finished transforming the data by selecting Load to Data Model.

But what if you don’t choose this option? You go with the default Load to worksheet setting and go about your business. Your data gets imported into an Excel worksheet and then you have to make a decision on how to visualize it.

This is what is often skipped in presentations. As soon as the decision is made to present your data through either Power View or Power Map the data in this Excel worksheet is automatically pushed into Power Pivot.

Let’s get a bit of an expanded view of how all these parts work together:

[Diagram: Power Query loading data into Power Pivot, which in turn feeds Power View, Power Map, and Power BI Q&A]

Another way of looking at this is based on the tool:

Power View

Power View reports can be developed from either Excel 2013 or SharePoint (On Premises) 2010 SP1 with SQL Server 2012.

If you are using Excel 2013, at some point the data must be placed in Power Pivot to visualize it in Power View.

If designing reports through SharePoint, you can design reports on top of a Power Pivot workbook deployed to SharePoint, a connection to Analysis Services Tabular deployed to SharePoint, or a connection to Analysis Services Multidimensional deployed to SharePoint.

Power Map

Power Map can currently only be used within Excel 2013 with Office 365 (a Power Map preview is still available for on-premises Office). The data in Power Map must be based on a Power Pivot data model.

Power BI Q&A

Power BI Q&A can only be done through a Power BI site and an Office 365 subscription. The Q&A interaction is all based on top of a Power Pivot workbook that has been deployed to a Power BI site. You can optimize the Q&A interaction by adding things like synonyms to your Power Pivot workbook.

Conclusion

What you can take from this diagram is that Power Pivot is central to creating a complete Power BI solution. A well-designed data model can only increase the chances of your reporting layer being successful, while a poorly designed data model can make it difficult to build appropriate visualizations. So as you start exploring the Power BI tools, make it a point to have a well-designed Power Pivot data model.

Power Query to HDFS – Remote Name Could Not Be Resolved

I wanted to share a quick resolution I had to a problem with connecting Power Query to data stored in HDFS.

Seems easy enough. Launch Excel and through Power Query select From Other Sources -> From Hadoop Files (HDFS)

Next you’re prompted to provide your Hadoop server name then click OK.  I’ll be using the Hortonworks Sandbox environment for this demo.

I then see all the files available on HDFS.

If I try to explore the data in these files by clicking Binary next to any of the files listed I'm presented with the following error: DataSource.Error: HDFS cannot connect to server 'sandbox.hortonworks.com'. The remote name could not be resolved: 'sandbox.hortonworks.com'.

The issue here is name resolution, and it is a configuration problem. To fix this you must put your Hadoop cluster details in the HOSTS file on the machine that's running Power Query.

To edit the HOSTS file (I recommend you take a backup of it first) you’ll need to launch a text editor as administrator then open the HOSTS file found in the following folder:

C:\Windows\System32\drivers\etc

Provide the IP address and host name of the server, then save.
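
The entry is just the IP address followed by the host name; the address below is only an example (use whatever IP your sandbox VM actually answers on):

192.168.56.101    sandbox.hortonworks.com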

The next time I try to connect to a file on HDFS, everything works properly.

Hope this helps!

Creating Power Map Reporting Solutions Recording and Q&A

I hope you were able to attend my free webinar on Creating Power Map Reporting Solutions on April 17, 2014. If you weren't, you can now watch the recording here.

Following each webinar I like to post a quick Q&A session for everything I was not able to answer in the allotted time.

Q: Am I correct in that you need the Pro Plus version of Office 365 and not the Home or University version?

Yes, that's correct. Most of the Power BI features in Excel (Power Pivot, Power View, etc.) require the Office Pro Plus edition.

Q: I thought I could do Power Map without Office 365?

Prior to Power Map's official release it was available to anyone running Excel 2013 (Office 365 or traditional), but when Power Map was officially released it became an Office 365-only feature. You can, however, still download the preview and try it on non-Office 365 versions.

Q: Is there any way to use custom geographic data using a shape file? For example: school boundaries

I get this question a lot. If your school boundaries fit within traditional geopolitical boundaries (zip, county, etc.) you can make this work now, but you cannot upload your own shapes yet. To make this work today you would categorize each of your boundaries as something Power Map can understand.

Bringing Power BI Q&A to your Organization Recording and Q&A

I hope you were able to attend my free webinar on Bringing Power BI Q&A to your Organization on April 15, 2014. If you weren't, you can now watch the recording here.

Traditionally following a webinar I try to post a quick Q&A session for everything I was not able to answer in the allotted time.

Q: Does Power BI require SharePoint Online or can it work with public-facing on-premises SharePoint?

Yes, Power BI requires SharePoint Online: Power BI sites and Power BI Q&A are only available there. Public-facing traditional SharePoint servers do not have the ability to add the Power BI functionality.

Q: If you create a PivotChart or Power View report, can Q&A show that chart when a user searches something relevant to the chart?

No. Currently Power BI Q&A can only create Power View reports on the fly; it cannot reference preexisting reports and display those. You can however influence the way your data is presented by defining proper relationships, data types, and data categories. Once a Power View report is created on the fly you can change the visualization easily by selecting the chart type you would prefer.

Q: Can I export the results to Excel and/or PowerPoint?

The current use of Q&A is as a quick ad-hoc analysis tool, and it does not have an output like this, although I think this is an interesting thought.

Q: Can I upload an Excel file to the Power BI site which is data connected to an SSAS source (locally hosted)?

You can, as long as the SSAS source is exposed through Power Pivot. If you want to refresh any locally hosted data sources then you must configure the Data Management Gateway to allow local data sources to interact with the Power BI cloud.

Upcoming Webinar – Creating Power Map Reporting Solutions

Come join me for this webinar on 4/17/2014 at 11:00 AM EST

Creating Power Map Reporting Solutions

Excel 2013 with Office 365 brings many new features to the realm of Self-Service BI. There are now new ways to extract, model and present data all within Excel. One of these newest ways to present data is called Power Map. Power Map allows you to create engaging 3D map visualizations to help understand your geographical data better. This session will walk you through the beginning to end process of creating Power Map tours. Register now here.
