How to use Power BI to present Data Migration Assistant 5.0 assessment results

If you are planning to migrate your on-premises SQL Servers to the Azure platform, or to upgrade to a later version of SQL Server, such as SQL Server 2017 or 2019, Microsoft's Data Migration Assistant (DMA) is a great tool to assist with these tasks.

The Microsoft DMA tool can be used to complete a thorough assessment of SQL Servers and provide recommendations for selected migration targets. However, reading the actual assessment result can be a bit daunting, especially if you have multiple SQL Server instances to review with different migration target types.

This article will show you how to consolidate DMA assessment results into a SQL database, which can then be used with Power BI to create visualisations that make the DMA results easier to interpret and act on.

Prerequisites for using Microsoft’s DMA tool:

  1. DMA 5.0. Download from: https://www.microsoft.com/en-us/download/details.aspx?id=53595
  2. PowerShell 5.0.
  3. PowerShell Modules to process DMA report data. https://techcommunity.microsoft.com/gxcuf89792/attachments/gxcuf89792/MicrosoftDataMigration/56/4/PowerShell-Modules2.zip
  4. An on-premises SQL Server instance. This will be used to consolidate the DMA result data and will serve as the Power BI data source.
  5. DMA data processing T-SQL scripts. https://techcommunity.microsoft.com/gxcuf89792/attachments/gxcuf89792/MicrosoftDataMigration/56/1/LoadWarehouse1.zip
  6. DMA Report Power BI template. https://techcommunity.microsoft.com/gxcuf89792/attachments/gxcuf89792/MicrosoftDataMigration/56/2/PowerBI-Reports.zip

Download and install the DMA application (no reboot required).


When you open the DMA application, you will need to create a new project. It is recommended to include a keyword for the target server type in the project name, as one DMA project can only assess one target server type. For example, if you want to assess your servers against both Azure SQL Database and Azure SQL Managed Instance, you must create two DMA projects.



Next, add all source instances and databases. Please avoid busy hours for production servers, and follow the best practices from here: https://docs.microsoft.com/en-us/sql/dma/dma-bestpractices?view=sql-server-ver15



Once the assessment is complete, export the result as a JSON file.
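
If you have many instances to assess, DMA also ships with a command-line runner, DmaCmd.exe, which can write the JSON result straight to your consolidation folder instead of exporting each assessment by hand. The sketch below is indicative only: the assessment name, connection string and output path are placeholders, and you should confirm the available switches against the DMA command-line documentation.

# A minimal sketch of running an assessment unattended with DmaCmd.exe and
# writing the JSON result straight to the consolidation folder.
# The assessment name, connection string and output path are placeholders.
$dmaCmd  = "C:\Program Files\Microsoft Data Migration Assistant\DmaCmd.exe"
$dmaArgs = @(
    '/AssessmentName="Prod01-AzureSQLDB"'
    '/AssessmentDatabases="Server=PROD-SQL01;Initial Catalog=SalesDB;Integrated Security=true"'
    '/AssessmentEvaluateCompatibilityIssues'
    '/AssessmentOverwriteResult'
    '/AssessmentResultJson="C:\Cubesys\JSON\Prod01-SalesDB.json"'
)
& $dmaCmd @dmaArgs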



Once all assessment results have been exported as JSON files, it is time to consolidate them into a single JSON directory.

You will require a computer with PowerShell 5 installed (Prerequisites item 2, listed above).
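
You can confirm the installed version from a PowerShell prompt:

# Confirm PowerShell 5.0 or later is installed (Prerequisites item 2)
$PSVersionTable.PSVersion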

Download the PowerShell Modules zip file (Prerequisites item 3, listed above) and extract the four folders it contains. Copy these four folders into the PowerShell modules folder (the default is C:\Program Files\WindowsPowerShell\Modules\); this way you will not need to import the modules every time you run the PowerShell scripts.
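
As a minimal sketch, assuming the zip file was extracted to C:\Temp\PowerShell-Modules (a placeholder path), the copy can be done from an elevated PowerShell prompt:

# Copy the four extracted module folders into the default PowerShell modules path.
# C:\Temp\PowerShell-Modules is a placeholder for wherever you extracted the zip file;
# run this from an elevated prompt because the destination is under Program Files.
Copy-Item -Path 'C:\Temp\PowerShell-Modules\*' `
          -Destination 'C:\Program Files\WindowsPowerShell\Modules\' `
          -Recurse -Force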



Now run the script:

dmaProcessor -processTo SQLSERVER `
-servername "localhost" `
-CreateDMAReporting 1 `
-CreateDataWarehouse 1 `
-databaseName DMAReporting `
-warehouseName DMAWarehouse `
-jsonDirectory "C:\Cubesys\JSON\"

This script will:

  1. Create the DMAReporting database,
  2. Create the DMAWarehouse database,
  3. Populate the DMAReporting database with the content of all JSON files in the C:\Cubesys\JSON folder.

-jsonDirectory is the folder where all the JSON files are saved. Multiple JSON files can be processed together.
-servername is the SQL instance name. This needs to be an on-premises SQL Server instance.
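
As a quick sanity check, you can count the rows that dmaProcessor loaded into the ReportData table. This is just a sketch: it assumes the SqlServer PowerShell module (which provides Invoke-Sqlcmd) is installed and that the table sits in the default dbo schema.

# Verify that the JSON content was loaded into the DMAReporting database.
# Assumes the SqlServer module is installed (Install-Module SqlServer) and the
# dbo schema; adjust -ServerInstance to match the -servername value used above.
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'DMAReporting' `
              -Query 'SELECT COUNT(*) AS LoadedRows FROM dbo.ReportData'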



Once this step has completed successfully, as in the screenshot above, you can use SQL Server Management Studio to connect to the SQL instance.

Download Prerequisites item 5 (listed above) and run the SQL script to load the data into the DMAWarehouse database. Please read the comments and follow the instructions for each step in the script.

The original script from Microsoft does not have the database names populated. Please be sure to update ALL occurrences of <Report Data Database, sysname, DMAReporting> to DMAReporting and <Warehouse Database, sysname, DMAWarehouse> to DMAWarehouse. These need to match the parameter values from the PowerShell command.

Note: when copying and pasting code from this blog into SQL Server Management Studio, please replace any invalid characters (such as curly quotes) before executing the script. These are introduced by the blog's HTML editor.
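
If you prefer to clean these characters in bulk, a small PowerShell sketch like the one below replaces the usual offenders (curly quotes and em/en dashes). C:\Temp\LoadWarehouse.sql is a placeholder for wherever you saved the copied script.

# Replace curly quotes and em/en dashes that the HTML editor may have introduced.
# C:\Temp\LoadWarehouse.sql is a placeholder path for the saved script.
$path = 'C:\Temp\LoadWarehouse.sql'
(Get-Content -Path $path -Raw) `
    -replace '[\u2018\u2019]', "'" `
    -replace '[\u201C\u201D]', '"' `
    -replace '[\u2013\u2014]', '--' |
    Set-Content -Path $path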

/*
IMPORTANT - Before running the script, press CTRL+SHIFT+M to set the database name parameters.
RUN EACH STEP INDIVIDUALLY.
TAKE CARE TO READ THE COMMENTS.
Step 1 - cleans object types. (this should be a temp step until the tool is fixed)
*/

UPDATE DMAReporting..ReportData
SET ImpactedObjectType = 'Database Options'
WHERE ImpactedObjectType = 'DatabaseOptions'

/*
-- These scripts use 3-part names. Ensure you update your database names.
-- Step 2 - Capture any missing rules
*/

INSERT INTO DMAWarehouse..dimRules (RuleID, Title, Impact, Recommendation, MoreInfo, ChangeCategory)
SELECT DISTINCT rd.Ruleid
, rd.Title
, rd.Impact
, rd.Recommendation
, rd.MoreInfo
, rd.ChangeCategory
FROM DMAReporting..ReportData rd
LEFT JOIN DMAWarehouse..dimRules dr
ON rd.RuleId = dr.RuleID
AND rd.Title = dr.Title
WHERE dr.RuleID IS NULL

/*
-- Capture any missing object types
*/

INSERT INTO DMAWarehouse..dimObjectType (ObjectType)
SELECT DISTINCT rd.ImpactedObjectType
FROM DMAReporting..ReportData rd
LEFT JOIN DMAWarehouse..dimObjectType ot
ON rd.ImpactedObjectType = ot.ObjectType
WHERE ot.ObjectType IS NULL

/*
-- Step 3 - Run the select statement first to ensure there are no NULL keys (except DBOwnerKey, which is expected to be NULL at this point).
-- IMPORTANT - Once happy, uncomment the INSERT statement below and run again
-- INSERT INTO DMAWarehouse..FactAssessment (DateKey, StatusKey, SourceCompatKey, TargetCompatKey, CategoryKey, SeverityKey, ChangeCategoryKey, RulesKey, AssessmentTargetKey, ObjectTypeKey, DBOwnerKey, InstanceName, DatabaseName, SizeMB, ImpactedObjectName, ImpactDetail, AssessmentName, AssessmentNumber)
*/

SELECT dd.DateKey AS "DateKey"
,ds.StatusKey AS "StatusKey"
,sc.SourceCompatKey AS "SourceCompatKey"
,tc.TargetCompatKey AS "TargetCompatKey"
,dc.CategoryKey AS "CategoryKey"
,dsev.SeverityKey AS "SeverityKey"
,dcc.ChangeCategoryKey AS "ChangeCategoryKey"
,dr.RulesKey AS "RulesKey"
,ast.AssessmentTargetKey AS "AssessmentTargetKey"
,ot.ObjectTypeKey AS "ObjectTypeKey"
,dbo.DBOwnerKey AS "DBOwnerKey"
,dma_rd.InstanceName AS "InstanceName"
,[Name] AS "DatabaseName"
,SizeMB AS "SizeMB"
,COALESCE(ImpactedObjectName, 'NA') AS "ImpactedObjectName"
,COALESCE(ImpactDetail, 'NA') AS "ImpactDetail"
,AssessmentName
,AssessmentNumber
FROM DMAReporting..ReportData dma_rd
LEFT JOIN DMAWarehouse..dimDate dd
ON CONVERT(CHAR(8),dma_rd.ImportDate,112) = dd.[Date]
LEFT JOIN DMAWarehouse..dimStatus ds
ON dma_rd.[Status] = ds.[Status]
LEFT JOIN DMAWarehouse..dimSourceCompatibility sc
ON dma_rd.SourceCompatibilityLevel = sc.SourceCompatibilityLevel
LEFT JOIN DMAWarehouse..dimTargetCompatibility tc
ON dma_rd.TargetCompatibilityLevel = tc.TargetCompatibilityLevel
LEFT JOIN DMAWarehouse..dimCategory dc
ON dma_rd.Category = dc.Category
LEFT JOIN DMAWarehouse..dimSeverity dsev
ON dma_rd.Severity = dsev.Severity
LEFT JOIN DMAWarehouse..dimRules dr
ON dma_rd.RuleId = dr.RuleID
AND dma_rd.title = dr.Title -- there is a ruleid being used for 2 different titles
LEFT JOIN DMAWarehouse..dimAssessmentTarget ast
ON dma_rd.AssessmentTarget = ast.AssessmentTarget
LEFT JOIN DMAWarehouse..dimChangeCategory dcc
ON dma_rd.ChangeCategory = dcc.ChangeCategory
LEFT JOIN DMAWarehouse..dimObjectType ot
ON CASE WHEN dma_rd.ImpactedObjectType IS NULL OR dma_rd.ImpactedObjectType = '' THEN 'NA' ELSE ImpactedObjectType END = ot.ObjectType
LEFT JOIN DMAWarehouse..dimDBOwner dbo
ON dma_rd.InstanceName = dbo.InstanceName
AND dma_rd.Name = dbo.DatabaseName
WHERE IsLoaded = 0

/*
-- Step 4 - update database owners
-- This step is optional. It will enable the Power BI reports to filter on specific teams / owners.

-- Populate database owners
-- Repeat for every instance / database
*/

INSERT INTO DMAWarehouse..dimDBOwner (InstanceName, DatabaseName, DBOwner)
VALUES ('ExampleInstance', 'ExampleDatabase', 'ExampleOwner')

/*
-- Once the DBOwner dimension is populated, update the FactAssessment table
*/

UPDATE fa
SET fa.DBOwnerKey = db.DBOwnerKey
FROM DMAWarehouse..FactAssessment fa
JOIN DMAWarehouse..dimDBOwner db
ON fa.InstanceName = db.InstanceName
AND fa.DatabaseName = db.DatabaseName;
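
If you have a long list of databases, repeating the Step 4 INSERT by hand gets tedious. The loop below is a sketch only: it assumes the SqlServer PowerShell module is installed and that you have prepared a hypothetical owners.csv file with InstanceName, DatabaseName and DBOwner columns; run it before the FactAssessment update above.

# Bulk-populate DMAWarehouse..dimDBOwner from a CSV file before running the
# FactAssessment update. owners.csv is a hypothetical file with InstanceName,
# DatabaseName and DBOwner columns; requires the SqlServer module for Invoke-Sqlcmd.
Import-Csv -Path 'C:\Temp\owners.csv' | ForEach-Object {
    $query = "INSERT INTO dbo.dimDBOwner (InstanceName, DatabaseName, DBOwner)
              VALUES (N'$($_.InstanceName)', N'$($_.DatabaseName)', N'$($_.DBOwner)')"
    Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'DMAWarehouse' -Query $query
}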

After the data is loaded into the DMAWarehouse database, it is ready to import into a Power BI report.

Download the DMA Report Power BI template (Prerequisites item 6, listed above). Select the SQL instance hosting the DMAWarehouse database as the data source for your Power BI report. The database name is the value of the -warehouseName parameter from the PowerShell script:

-warehouseName DMAWarehouse `



Once the data is refreshed, you will have a report that looks like the screenshot below:



You can view individual database assessment results by right-clicking and selecting "Drillthrough" to see remediation plans.



You can also publish this report to your Power BI workspace and share it with team members.




In my next post I will explain how to:

  1. Fix the NULL value issue when populating the DMAWarehouse database.
  2. Add SQL Server 2019 support to the report (currently the PowerShell module only supports up to SQL Server 2017).

Nelson Jin
Senior Cloud Consultant