In this two-part series, I’ll take you through the necessary steps to take data from Azure Migrate and visualise it in a much friendlier way in Power BI. In this first part, I’ll share how I worked through Azure Migrate’s problems and started putting together the data sources in Power BI.

Azure Migrate

Azure Migrate is a great tool to help businesses lift-and-shift workloads into Azure pretty quickly. Unfortunately, though, its use case largely stops there at present. While useful features are being added all the time, there is still some work to do before it reaches feature parity with competing products like Cloudamize, which support tagging and better workload separation.

Sadly, these current limitations are most evident in how cumbersome it is to get data out in a way that makes sense for a business. Microsoft recommends creating groups to break down servers/workloads logically, but then gives absolutely no way to visualise a holistic overview of the entire digital estate! This is a major pain point if you're looking to present your Azure Migrate data to key stakeholders who each care about different things. For example, a CFO may want to know the running cost of the entire digital estate or a subset of logical groupings, while an application owner would probably only care about making sure their workloads are compatible with Azure.

At this point, you really only have two options: create one big group and get all the workloads in a single report, or create many logically separated groups and then somehow deal with the multiple reports they generate.

For my use case, and most likely for yours, many Azure Migrate groups still make the most sense. Creating groups based on application and environment (e.g. “Intranet – Dev”) provides greater visibility into whether an application is suitable for Azure. After all, not every workload in your digital estate can be moved at once!

When creating the Azure Migrate groups, keep in mind that any assessments you create are scoped to the group. So while you now have a nice breakdown of your digital estate, you also have the problem of tying it all together. This is where Power BI can help!

Exporting Azure Migrate Assessments with the REST API

Once you have created all your groups and assessments in Azure Migrate, you need to get your data somewhere Power BI can use it. Assessment exports (Excel) are possible, but depending on how many assessments you’ve created, there can be many files to download. The problem, however, is that the Azure Portal UI makes getting this data out painful! It’s at least five (5) clicks to reach the assessment export button, and there is no guarantee that clicking it actually downloads anything! Thankfully, you can get these Excel files much more quickly with the REST API. Here’s a simple script that does it for you.
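As a rough sketch of the download flow, here it is in Python using only the standard library. The subscription, resource group, project names, and API version below are placeholders, and the script assumes you already have a valid Azure Resource Manager bearer token; double-check endpoint shapes against the Azure Migrate REST API docs before relying on it.

```python
import json
import urllib.request

API_VERSION = "2019-10-01"  # assumed API version; verify against the Azure Migrate REST docs
ARM = "https://management.azure.com"

def assessments_url(subscription, resource_group, project, group):
    # Build the ARM URL that lists all assessments inside one Azure Migrate group
    return (
        f"{ARM}/subscriptions/{subscription}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.Migrate"
        f"/assessmentProjects/{project}"
        f"/groups/{group}/assessments"
        f"?api-version={API_VERSION}"
    )

def _call(url, token, method="GET"):
    # Minimal helper: authenticated ARM call returning parsed JSON
    req = urllib.request.Request(
        url, data=b"" if method == "POST" else None,
        headers={"Authorization": f"Bearer {token}"}, method=method)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def download_all(subscription, resource_group, project, token):
    base = (f"{ARM}/subscriptions/{subscription}/resourceGroups/{resource_group}"
            f"/providers/Microsoft.Migrate/assessmentProjects/{project}")
    groups = _call(f"{base}/groups?api-version={API_VERSION}", token)
    for group in groups.get("value", []):
        g = group["name"]
        for assessment in _call(assessments_url(subscription, resource_group, project, g), token).get("value", []):
            a = assessment["name"]
            # The downloadUrl action returns a short-lived link to the Excel report
            report = _call(f"{base}/groups/{g}/assessments/{a}/downloadUrl"
                           f"?api-version={API_VERSION}", token, method="POST")
            with urllib.request.urlopen(report["assessmentReportUrl"]) as resp:
                with open(f"{g}-{a}.xlsx", "wb") as out:
                    out.write(resp.read())
```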


The script above will inspect your Azure Migrate project and download all assessment workbooks (Excel) to the same folder. For reference on what API calls are made, check out Azure Migrate REST API | Microsoft Docs.

Source

Depending on how many assessments you’ve made, you could now have hundreds of Excel files to manage, so the next step is to bring these files into Power BI. There are two ways you could do this from a data source perspective. These are:

  1. Use a folder path data source for files on local storage, or
  2. Move the files to a SharePoint Online document library and use a SharePoint site data source. 

Here are some power queries to help you get started.

SharePoint query
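A minimal sketch of what this query can look like (the site URL is a placeholder for your own SharePoint site; `SharePoint.Files` returns every file in the site, so we filter down to the assessment workbooks):

```
let
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Migration", [ApiVersion = 15]),
    Assessments = Table.SelectRows(Source, each Text.EndsWith([Name], ".xlsx"))
in
    Assessments
```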

Local Folder query
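And a sketch of the local folder equivalent (the folder path is a placeholder for wherever the script saved your workbooks):

```
let
    Source = Folder.Files("C:\AzureMigrate\Assessments"),
    Assessments = Table.SelectRows(Source, each [Extension] = ".xlsx")
in
    Assessments
```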

In my experience, the OData query with the SharePoint data source is incredibly slow, so I’d recommend keeping it locally sourced. That way, you shouldn’t face the query timeout issues I initially faced while building out this solution.

Transform and Combine

Once you have the sources right and can see the Excel files in Power BI, you need to transform and combine the worksheets inside each assessment workbook. Each Azure Migrate workbook has four worksheets we need data from. These are:

  • Assessment_Summary
  • All_Assessed_Disks
  • All_Assessed_Machines
  • Assessment_Properties

Azure Migrate Assessment Workbook

To make this easy, Power BI gives us a “Combine Data” process when looking at source files from the local/SharePoint folder source. To make use of this, open up the Power Query editor for the source folder and use the combine button (see the screenshot below) on the Content column.

Combine Content in Power BI

In the transformation wizard, you can only combine one of the four worksheets at a time. To get all of them, there are two paths you can take.

  1. You could repeat this process four times, once for each worksheet. This works fine but makes a mess of sample files and parameters. Or,
  2. You could do this for just one worksheet and then make multiple copies of the generated function, making some tiny edits so each of the four functions targets the desired worksheet.

If the second option is the path you want to take, here are the function queries for your reference.

All Assessed Machines
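As a sketch, the function for this worksheet looks something like the following (this mirrors the shape of the function Power BI’s Combine Data wizard generates; `Parameter1` is the binary file parameter covered further down):

```
(Parameter1) =>
let
    Source = Excel.Workbook(Parameter1, null, true),
    Sheet = Source{[Item = "All_Assessed_Machines", Kind = "Sheet"]}[Data],
    PromotedHeaders = Table.PromoteHeaders(Sheet, [PromoteAllScalars = true])
in
    PromotedHeaders
```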

All Assessed Disks
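A sketch of the disks variant, assuming the same generated-function shape with only the worksheet name changed:

```
(Parameter1) =>
let
    Source = Excel.Workbook(Parameter1, null, true),
    Sheet = Source{[Item = "All_Assessed_Disks", Kind = "Sheet"]}[Data],
    PromotedHeaders = Table.PromoteHeaders(Sheet, [PromoteAllScalars = true])
in
    PromotedHeaders
```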

Assessment Properties
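A sketch of the properties variant, again differing only in the worksheet name:

```
(Parameter1) =>
let
    Source = Excel.Workbook(Parameter1, null, true),
    Sheet = Source{[Item = "Assessment_Properties", Kind = "Sheet"]}[Data],
    PromotedHeaders = Table.PromoteHeaders(Sheet, [PromoteAllScalars = true])
in
    PromotedHeaders
```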

Assessment Summary
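And a sketch of the summary variant, following the same pattern:

```
(Parameter1) =>
let
    Source = Excel.Workbook(Parameter1, null, true),
    Sheet = Source{[Item = "Assessment_Summary", Kind = "Sheet"]}[Data],
    PromotedHeaders = Table.PromoteHeaders(Sheet, [PromoteAllScalars = true])
in
    PromotedHeaders
```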

As you can see in these power queries, the Source refers to a parameter called Parameter1. Because there are potentially multiple Azure Migrate assessment workbooks in the folder, we need to filter on one of them to define the custom function query. Then, when we use this custom function in the power query for each table, all the worksheets will be merged. Here is what you need for Parameter1 and the supporting Sample File query if you’re having trouble making these custom functions work.

Parameter1
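Power BI generates this parameter for you during Combine Data; a sketch of its definition, which binds the parameter to the Sample File query as a binary:

```
#"Sample File" meta [IsParameterQuery = true, BinaryIdentifier = #"Sample File", Type = "Binary", IsParameterQueryRequired = true]
```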

Sample File
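A sketch of the Sample File query (the folder path is a placeholder and should match your source folder query):

```
let
    Source = Folder.Files("C:\AzureMigrate\Assessments"),
    FirstFile = Source{0}[Content]
in
    FirstFile
```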

As you can see, Sample File uses the first file in the directory via the Source index of 0. This does not mean we’ll only pull the worksheets from one file; as the name implies, it’s only a sample.

Summary

Constructing the Tables in Power BI

With the custom functions and sample file parameter created, you should now be able to make the necessary tables by calling the right custom function in the table query to get all the data. If you’re unsure how to do this, keep an eye out for part two, where I’ll provide the power queries needed for the tables, build out the data model and create some reports. Stay tuned!