Dataflows for Power BI with Dynamics GP Data

Part 2

In Part 1 of Dataflows for Power BI with Dynamics GP Data, we talked about what dataflows are and started our first dataflow by creating a Date dimension entity. Now let’s create the remaining entities:

  • Dimensions from Dynamics GP card windows via SQL queries:
      • Customer data from the Sales module’s Customer card
      • Salesperson data from the Sales module’s Salesperson card
      • Sales Territory data from the Sales module’s Sales Territory card
      • Product data from the Inventory module’s Item card
  • Fact data from our Dynamics GP transaction entry windows via SQL queries:
      • Sales data from the Sales Transaction Entry window
      • Inventory data from the Inventory Transaction Entry window

Select the “Add entities” button to add additional data.

Add entities

Select the “SQL Server database” tile.

Choose data source window

Fill out your Server, Database and Connection credentials for the Enterprise gateway.

Fill in your data source connection info

I like to connect to my data via SQL views, and you can download the queries for those views here. Once you’re done creating all of the entities, your dataflows screen should look something like this.

View of entities from data source based on SQL views
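If you don’t want to download the scripts, here is a minimal sketch of what one of these views might look like. This is an illustration only, not the downloadable view_Customers definition: it pulls a handful of columns from RM00101, the Dynamics GP customer master table, while the real views include more columns and joins.

-- Minimal sketch only; the downloadable view_Customers script is more complete.
-- RM00101 is the Dynamics GP customer master table.
CREATE VIEW dbo.view_Customers
AS
SELECT
  CUSTNMBR AS CustomerNumber,
  CUSTNAME AS CustomerName,
  ADDRESS1 AS Address1,
  ADDRESS2 AS Address2,
  CITY AS City,
  STATE AS State,
  ZIP AS Zip,
  PHONE1 AS Phone
FROM dbo.RM00101;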

Creating Custom Functions in Dataflows

The easiest way to create a custom function in the Power BI service’s dataflows is to create it in Power BI Desktop’s Power Query, open the Advanced Editor, and copy and paste the M code into a blank query.

let
  Source = (input) =>
    let
      values = {
        {"ALABAMA","AL"},
        {"ALASKA","AK"},
        {"ARIZONA","AZ"},
        {"ARKANSAS","AR"},
        {"CALIFORNIA","CA"},
        {"COLORADO","CO"},
        {"CONNECTICUT","CT"},
        {"DELAWARE","DE"},
        {"FLORIDA","FL"},
        {"GEORGIA","GA"},
        {"HAWAII","HI"},
        {"IDAHO","ID"},
        {"ILLINOIS","IL"},
        {"INDIANA","IN"},
        {"IOWA","IA"},
        {"KANSAS","KS"},
        {"KENTUCKY","KY"},
        {"LOUISIANA","LA"},
        {"MAINE","ME"},
        {"MARYLAND","MD"},
        {"MASSACHUSETTS","MA"},
        {"MICHIGAN","MI"},
        {"MINNESOTA","MN"},
        {"MISSISSIPPI","MS"},
        {"MISSOURI","MO"},
        {"MONTANA","MT"},
        {"NEBRASKA","NE"},
        {"NEVADA","NV"},
        {"NEW HAMPSHIRE","NH"},
        {"NEW JERSEY","NJ"},
        {"NEW MEXICO","NM"},
        {"NEW YORK","NY"},
        {"NORTH CAROLINA","NC"},
        {"NORTH DAKOTA","ND"},
        {"OHIO","OH"},
        {"OKLAHOMA","OK"},
        {"OREGON","OR"},
        {"PENNSYLVANIA","PA"},
        {"RHODE ISLAND","RI"},
        {"SOUTH CAROLINA","SC"},
        {"SOUTH DAKOTA","SD"},
        {"TENNESSEE","TN"},
        {"TEXAS","TX"},
        {"UTAH","UT"},
        {"VERMONT","VT"},
        {"VIRGINIA","VA"},
        {"WASHINGTON","WA"},
        {"WEST VIRGINIA","WV"},
        {"WISCONSIN","WI"},
        {"WYOMING","WY"},
        {"WASHINGTON, D.C.","DC"},
        {"WAHINGTON","WA"}, // presumably intentional, to catch a common misspelling in the source data
        {"ONTARIO","ON"},
        {"ONTARIO CANADA","ON"},
        {"QUEBEC","QC"},
        {"QUEBEC CANADA","QC"},
        {"NEWFOUNDLAND","NL"},
        {"ALBERTA","AB"},
        {"ALBERTA CANADA","AB"},
        {"BRITISH COLUMBIA","BC"},
        {"BRITISH COLUMBIA CANADA","BC"},
        {"MANITOBA","MB"}
      },
      // Return the abbreviation paired with the matching full name;
      // errors if there is no match, so callers wrap this in try ... otherwise
      Result = List.First(List.Select(values, each _{0} = input)){1}
    in
      Result
in
  Source

Get data screen of custom function

Rename the custom function to “fnLookup”.
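To sanity-check the function, you can call it from another blank query. This is a hypothetical test, not part of the original post:

let
  // Should return "BC". An unmatched input would raise an error here,
  // which is why the Customer query below wraps the call in try ... otherwise
  Test = fnLookup("BRITISH COLUMBIA")
in
  Test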

Edit queries

Now we need to modify our Customer entity to clean up the state column. Right click on the Customer entity and select “Advanced Editor”.

Advanced editor

Copy and paste the below Power Query code into the Advanced Editor window. This custom function helps replace the data in the State column with the accepted two-character State abbreviations.

let
  Source = Sql.Database("YOUR SQL SERVER", "YOUR DATABASE"),
  #"Navigation 1" = Source{[Schema = "dbo", Item = "view_Customers"]}[Data],
  #"Renamed columns" = Table.RenameColumns(#"Navigation 1", {{"State", "State_old"}}),
  #"Trimmed text" = Table.TransformColumns(#"Renamed columns", {{"State_old", each Text.Trim(_), type text}}),
  #"Inserted conditional column" = Table.AddColumn(#"Trimmed text", "State", each try fnLookup([State_old]) otherwise [State_old]),
  #"Reordered columns" = Table.ReorderColumns(#"Inserted conditional column", {"Customer_Key", "Company_Key", "CustomerNumber", "CustomerName", "Group Name", "SubGroup Name", "Address1", "Address2", "Address3", "City", "State", "State_old", "Zip", "Phone", "Region", "GeographyKey", "Active", "CreatedBy", "CreatedOn", "UpdatedBy", "UpdatedOn", "ActivatedOn", "DeactivatedOn", "Source", "CheckSum", "CompanyID"}),
  #"Transform columns" = Table.TransformColumnTypes(#"Reordered columns", {{"State", type text}}),
  #"Replace errors" = Table.ReplaceErrorValues(#"Transform columns", {{"State", null}}),
  #"Removed columns" = Table.RemoveColumns(#"Replace errors", {"State_old"})
in
  #"Removed columns"

After all your dataflow entities have been created, save your changes and create a schedule to refresh your data.

Connecting our Dataflows to Power BI Desktop

From your workstation, with the December release of Power BI Desktop installed, click the “Get Data” button and select the “Power BI dataflows (Beta)” connector.

Bringing our dataflow data into Power BI

Select the dataflow entities we created in the Power BI Service.

Selecting our entities

Once all of your entities have been imported, navigate to the Relationships view to verify the correct relationships exist and make any needed changes. Your relationship view should look like the screenshot below.

Power BI relationships

Don’t forget to mark the dataflow Date entity as a date table after importing it into Power BI Desktop.

Marking the date table

I’m going to build out a Product Performance report from the dataflow entities that I created, adding ABC segmentation, moving averages, and time comparisons using DAX measures. ABC segmentation sorts a list of values into three groups that have different impacts on the final result. It is based on the famous Pareto principle, which states that 20% of efforts give 80% of the result. The meaning of the segments: A is the most important to the total (20% of items give 80% of the results), B is average in importance (30% of items give 15% of the results), and C is the least important (50% of items give 5% of the results). Are three segments enough? Should we use more? What percentages? To answer these questions, you need to know your data and what the ABC segmentation is being done for. I have seen companies use four classes (ABCD) or even more. For this example, three classes have the advantage that they separate the assortment into categories of high, medium, and low importance, which is easy to communicate.

In the Product Performance report below, I used a scatter chart to analyze each product class’s profit margin and total profits; the Sales ABC by ABC Class stacked bar chart segments the products using ABC segmentation and allows you to drill down into each segment to see the underlying detail. Additionally, there is a total cost by warehouse location chart and a weekly moving average cost comparison chart at the bottom of the report. Here are the DAX measures and DAX columns that I used to create the Product Performance report:

DAX Columns:

ProductSales = 
CALCULATE(SUM('Sales'[ExtendedPrice]))

CumulatedSales = 
CALCULATE(
    SUM('Products'[ProductSales]),
    FILTER(
        ALL('Products'),
        'Products'[ProductSales] >= EARLIER('Products'[ProductSales])
    )
)

CumulatedPercentage = 'Products'[CumulatedSales] / SUM('Products'[ProductSales])
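The post doesn’t show the column that assigns each product to a class, so here is a minimal sketch of a static ABC class calculated column on the Products table, assuming the 80%/95% boundaries described above:

ABC Class =
IF (
    'Products'[CumulatedPercentage] <= 0.8, "A",
    IF ( 'Products'[CumulatedPercentage] <= 0.95, "B", "C" )
)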

DAX Measures:

Profit Margin = DIVIDE ([Total Profits], [Total Sales], 0)

Total Sales = SUM(Sales[ExtendedPrice])

Units Sold = SUM('Sales'[Quantity])

Total Profits = [Total Sales] - [Total Cost]

Costs Weekly MA = 
AVERAGEX( 
    DATESINPERIOD( 
        Dates[Date] , 
        LASTDATE( Dates[Date] ),
        -7,
        DAY),
    [Total Cost]
)

// [Costs LY] is not defined in this post; presumably something like
// CALCULATE( [Total Cost], SAMEPERIODLASTYEAR( Dates[Date] ) )
Costs Weekly MA LY = 
AVERAGEX( 
    DATESINPERIOD( 
        Dates[Date] , 
        LASTDATE( Dates[Date] ),
        -7,
        DAY),
    [Costs LY]
)


YTD COGS =
CALCULATE (
    SUM ( Sales[ExtendedCost] ),
    FILTER ( ALL ( 'Dates' ), 'Dates'[Year] = MAX ( 'Dates'[Year] ) )
)

Qty on Hand = 
SUM (Inventory[Quantity on Hand])

Turn-Earn Index =
 ( [Inventory Turnover Ratio] * 'Sales Measures'[Profit Margin] )

Total Cost = SUM(Sales[ExtendedCost])

Sales ABC = 
CALCULATE(
    [Total Sales] , 
    VALUES( Products[ItemDescription] ),
    FILTER(
        CALCULATETABLE(
            ADDCOLUMNS(
                ADDCOLUMNS(
                    VALUES( Products[ItemDescription] ),
                    "OuterValue", [Total Sales]
                ),
                "CumulatedSalesPercentage" , DIVIDE(
                    SUMX(
                        FILTER(
                            ADDCOLUMNS(
                                VALUES( Products[ItemDescription] ),
                                "InnerValue", [Total Sales]
                            ),
                            [InnerValue] >= [OuterValue]
                        ),
                        [InnerValue]
                    ),
                    CALCULATE(
                        [Total Sales],
                        VALUES( Products[ItemDescription] )
                    )
                )
            ),
            ALL( Products )
        ),
        [CumulatedSalesPercentage] > [Min Boundary]
        && [CumulatedSalesPercentage] <= [Max Boundary]
    )
)
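The Sales ABC measure above references [Min Boundary] and [Max Boundary], which aren’t defined in the post. One common way to supply them (an assumption on my part, not necessarily how the original report did it) is a small disconnected boundaries table, with its Class column used on the axis of the stacked bar chart:

ABC Classes = DATATABLE (
    "Class", STRING, "Lower Boundary", DOUBLE, "Upper Boundary", DOUBLE,
    { { "A", 0.0, 0.8 }, { "B", 0.8, 0.95 }, { "C", 0.95, 1.0 } }
)

Min Boundary = MIN ( 'ABC Classes'[Lower Boundary] )

Max Boundary = MAX ( 'ABC Classes'[Upper Boundary] )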

Product performance visualization

Ready to do more with your data? Check out these other great websites:

Dataflows for Power BI with Dynamics GP data

Part 1

It’s been a while since my last blog post, so I’m rather late in talking about dataflows with Power BI, but I was really excited about the announcement in November and finally got a chance to play around with them. If you haven’t heard about dataflows or don’t know what they are, here is an excerpt from that announcement:

“In the modern BI world, data preparation is considered the most difficult, expensive, and time-consuming task, estimated by experts as taking 60%-80% of the time and cost of a typical analytics project. Some of the challenges in those projects include fragmented and incomplete data, complex system integration, business data without any structural consistency, and of course, a high skillset barrier. Specialized expertise, typically reserved for data warehousing professionals, is often required. Such advanced skills are rare and expensive.

To answer many of these challenges, Power BI serves analysts today with industry leading data preparation capabilities using Power Query in Power BI Desktop. Now, with Power BI dataflows, we’re bringing these self-service data preparation capabilities into the Power BI online service, and significantly expanding the capabilities in the following ways:

  • Self-service data prep for big data in Power BI – Dataflows can be used to easily ingest, cleanse, transform, integrate, enrich, and schematize data from a large array of transactional and observational sources, encompassing all data preparation logic. Previously, ETL logic could only be included within datasets in Power BI, copied over and over between datasets and bound to dataset management settings. With dataflows, ETL logic is elevated to a first-class artifact within Power BI and includes dedicated authoring and management experiences. Business analysts and BI professionals can use dataflows to handle the most complex data preparation challenges and build on each other’s work, thanks to a revolutionary model-driven calculation engine, which takes care of all the transformation and dependency logic—cutting time, cost, and expertise to a fraction of what’s traditionally been required for those tasks. Better yet, analysts can now easily create dataflows using familiar self-service tools, such as the well known Power Query data preparation experience. Dataflows are created and easily managed in app workspaces, enjoying all the capabilities that the Power BI service has to offer, such as permission management, scheduled refreshes, and more.
  • Advanced Analytics and AI with Azure – Power BI dataflows store data in Azure Data Lake Storage Gen2 – which means that data ingested through a Power BI dataflow is now available to data engineers and data scientists to leverage the full power of Azure Data Services such as Azure Machine Learning, Azure Databricks, and Azure SQL Datawarehouse for advanced analytics and AI. This allows business analysts, data engineers, and data scientists to collaborate on the same data within their organization.
  • Support for the Common Data Model – The Common Data Model (CDM) is a set of standardized data schemas and a metadata system to allow consistency of data and its meaning across applications and business processes. Dataflows support the CDM by offering easy mapping from any data in any shape into the standard CDM entities, such as Account, Contact, etc. Dataflows also land the data, both standard and custom entities, in schematized CDM form. Business analysts can take advantage of the standard schema and its semantic consistency, or customize their entities based on their unique needs. The Common Data Model continues to evolve as part of the recently announced Open Data Initiative. 

Once dataflows are created, users can use Power BI Desktop and the Power BI service to create datasets, reports, dashboards, and apps to drive deep insights into their business.”

Microsoft has released a white paper on dataflows, which you can find here: https://docs.microsoft.com/en-us/power-bi/whitepapers. Also check out Matthew Roche’s great blog posts at BI Polar: https://ssbipolar.com/2018/10/23/dataflows-in-power-bi/. They really are a great guide to get you started and up to speed on dataflows.

Getting started with Dataflows

So, let’s try doing some of this ourselves with Dynamics GP data. To do this, you will need to have already installed an Enterprise gateway connected to your Dynamics GP SQL Server. Log into your Power BI Pro or Premium service and create a new workspace. You should now see a “Dataflows (Preview)” option. Click on that option, then click the “+ Create” button and select Dataflow from the dropdown menu.

Dataflows Preview Window

Dataflows Preview Menu

Creating your Entities

This will open up the below window with the option to “Add new entities” or “Add linked entities”. Let’s select “Add new entities” so we can add a date dimension to our Dataflow prep.

Define new entities window

From the “Choose data source” window, select the “Blank query” tile.

Choose data source window

This will open up the “Connect to data source” window. Copy and paste the below Power Query code to create your date dimension.

Blank query Window

let
  Source = List.Dates(StartDate, Length, #duration(1, 0, 0, 0)),
  #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
  #"Renamed Columns" = Table.RenameColumns(#"Converted to Table", {{"Column1", "Date"}}),
  #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns", {{"Date", type date}}),
  StartDate = Date.StartOfYear(Date.AddYears(DateTime.Date(DateTime.LocalNow()),-10)),
  Today = Date.EndOfYear(Date.AddYears(DateTime.Date(DateTime.LocalNow()),5)),
  Length = Duration.Days(Today - StartDate),
  Custom1 = #"Changed Type",
  #"Inserted Year" = Table.AddColumn(Custom1, "Fin Year", each Date.Year([Date]+#duration(184,0,0,0)), Int64.Type),
  #"Inserted Month Name" = Table.AddColumn(#"Inserted Year", "Month Name", each Date.MonthName([Date]), type text),
  #"Inserted Day Name" = Table.AddColumn(#"Inserted Month Name", "Day Name", each Date.DayOfWeekName([Date]), type text),
  #"Inserted Month" = Table.AddColumn(#"Inserted Day Name", "Fin Month", each if Date.Month([Date]) >=7 then Date.Month([Date])-6 else Date.Month([Date])+6  , Int64.Type),
  #"Inserted Day of Week" = Table.AddColumn(#"Inserted Month", "Day of Week", each Date.DayOfWeek([Date])+1, Int64.Type),
  #"Inserted First Characters" = Table.AddColumn(#"Inserted Day of Week", "Short Month", each Text.Start([Month Name], 3), type text),
  #"Inserted First Characters1" = Table.AddColumn(#"Inserted First Characters", "DDD", each Text.Start([Day Name], 3), type text),
  #"Reordered Columns" = Table.ReorderColumns(#"Inserted First Characters1", {"Date", "Fin Year", "Month Name", "Short Month", "Fin Month", "Day Name", "DDD", "Day of Week"}),
  #"Added Custom1" = Table.AddColumn(#"Reordered Columns", "Month Number", each (Date.Month([Date]))),
  #"Inserted Start of Month" = Table.AddColumn(#"Added Custom1", "Start of Month", each Date.StartOfMonth([Date]), type date),
  #"Inserted End of Month" = Table.AddColumn(#"Inserted Start of Month", "End of Month", each Date.EndOfMonth([Start of Month]), type date),
  #"Duplicated Column" = Table.DuplicateColumn(#"Inserted End of Month", "Date", "Date - Copy"),
  #"Calculated Quarter" = Table.TransformColumns(#"Duplicated Column",{{"Date - Copy", Date.QuarterOfYear, Int64.Type}}),
  #"Renamed Columns1" = Table.RenameColumns(#"Calculated Quarter", {{"Date - Copy", "Quarter Number"}}),
  #"Inserted Merged Column" = Table.AddColumn(#"Renamed Columns1", "Merged", each Text.Combine({"Q", Text.From([Quarter Number], "en-US")}), type text),
  #"Current Date" = Table.AddColumn(#"Inserted Merged Column", "Current Date", each Date.From(DateTimeZone.FixedLocalNow())),
  #"Added Custom10" = Table.AddColumn(#"Current Date", "Is Work Day", each if Date.DayOfWeek([Date]) >=0 and Date.DayOfWeek([Date]) <= 4 then "Is Work Day" else "Weekend"),
  #"Renamed Columns2" = Table.RenameColumns(#"Added Custom10", {{"Merged", "Calendar Quarter"}, {"DDD", "Short Day"}}),
  #"Duplicated Column1" = Table.DuplicateColumn(#"Renamed Columns2", "Date", "Date - Copy"),
  #"Extracted Year" = Table.TransformColumns(#"Duplicated Column1",{{"Date - Copy", Date.Year, Int64.Type}}),
  #"Renamed Columns3" = Table.RenameColumns(#"Extracted Year", {{"Date - Copy", "Calendar Year"}}),
  #"Changed Type3" = Table.TransformColumnTypes(#"Renamed Columns3", {{"Fin Year", Int64.Type}, {"Current Date", type date}}),
  #"Added Custom Column" = Table.AddColumn(#"Changed Type3", "DateKey", each Text.Combine({Date.ToText([Date], "yyyy"), Date.ToText([Date], "MM"), Date.ToText([Date], "dd")}), Int64.Type),
  #"Transform columns" = Table.TransformColumnTypes(#"Added Custom Column", {{"Month Number", type text}, {"Is Work Day", type text}}),
  #"Replace errors" = Table.ReplaceErrorValues(#"Transform columns", {{"Month Number", null}, {"Is Work Day", null}}),
  #"Added Custom" = Table.AddColumn(#"Replace errors", "YYYYMM", each Text.Combine({Date.ToText([Date],"yyyy"), Date.ToText([Date], "MM")})),
  #"Added Custom3" = Table.AddColumn(#"Added Custom", "YYYY-MM", each Text.Combine({Date.ToText([Date],"yyyy"),"-", Date.ToText([Date], "MM")})),
    #"Changed Type1" = Table.TransformColumnTypes(#"Added Custom3",{{"YYYYMM", type text}, {"YYYY-MM", type text}})
in
    #"Changed Type1"

On the “Edit queries” window, I changed the Name to “Date” and then clicked “Done”.

Edit queries Window

That’s enough for today’s blog post. In part 2, we will create entities for our customer, salesperson, sales territory, product, sales transaction, and inventory transaction data, plus a custom function in our dataflow to help clean up our customer data. Then we will connect Power BI Desktop to our dataflows so we can create the relationships and DAX measures needed to start visualizing our Dynamics GP data.

How to Create Awesome Business Central Data Analysis – Part 2

Wow! Microsoft’s Business Application Summit just wrapped up, and I can’t believe all of the great things that were announced. Check out the following links for details on the summit.

Here is a list of some of the features that are coming:

  1. Integration with Python – Connector and custom visuals like R
  2. Composite models – Composite models allow you to mix import and DirectQuery sources.
  3. Aggregations – Massive volumes of data require new ways of storing information to balance the needs of slice-and-dice interactive analysis with deep, detail-level reporting.
  4. Intellisense support for the M formula language
  5. Smart Data Prep – Data profiling and fuzzy-logic matching

Find out more about these features and more by viewing the on-demand sessions from the Business Applications Summit – https://www.microsoft.com/en-us/businessapplicationssummit

October 2018 Release Overview – https://docs.microsoft.com/en-us/business-applications-release-notes/October18/

Now on to part two of my Create Awesome Business Central Data Analysis. In my last post, How to Create Awesome Business Central Data Analysis, one of the things I discussed was how to connect Power BI to Dynamics 365 Business Central and build an Inventory Performance report.

Creating our Sales Performance Visual

Here are the data tables we are going to import to analyze our Business Central sales data:

  • SalesDashboard
  • Customers
  • Customer List

After importing the SalesDashboard data we need to make the following changes.

  1. Rename several columns
  2. Remove several columns listed below in the M Code.
  3. Add a custom column to calculate Profit

Here is the M code to help you with the transformations. You will need to change the source information to match your environment.

let
    Source = Dynamics365BusinessCentral.Contents(null),
    #"CRONUS USA, Inc." = Source{[Name="CRONUS USA, Inc."]}[Data],
    SalesDashboard_table = #"CRONUS USA, Inc."{[Name="SalesDashboard",Signature="table"]}[Data],
    #"Renamed Columns" = Table.RenameColumns(SalesDashboard_table,{{"AuxiliaryIndex1", "Country"}, {"AuxiliaryIndex3", "Item No"}}),
    #"Removed Columns" = Table.RemoveColumns(#"Renamed Columns",{"ETag"}),
    #"Renamed Columns1" = Table.RenameColumns(#"Removed Columns",{{"SalesPersonName", "SalesPerson Name"}}),
    #"Removed Columns1" = Table.RemoveColumns(#"Renamed Columns1",{"Sales_Amount_Expected", "Cost_Amount_Expected", "Customer_Disc_Group", "Country"}),
    #"Added Custom" = Table.AddColumn(#"Removed Columns1", "Profit", each [Sales_Amount_Actual]+[Cost_Amount_Actual]),
    #"Calculated Absolute Value" = Table.TransformColumns(#"Added Custom",{{"Quantity", Number.Abs, type number}, {"Cost_Amount_Actual", Number.Abs, type number}}),
    #"Renamed Columns2" = Table.RenameColumns(#"Calculated Absolute Value",{{"AuxiliaryIndex2", "Customer No"}})
in
    #"Renamed Columns2"

Now with all three Business Central views imported, we can Save & Apply our changes.

Creating the DAX for our Sales Performance Report

On the Sales Performance report, I’ve added the ability to segment my Accounts Receivable into aging buckets. Here are the steps I used to add the changes to the report.

  1. First, I clicked the “Enter Data” icon on the Home tab in Power BI Desktop, entered the aging bucket information, and named the table “AR Buckets” (a hypothetical version of this table appears after this list). AR Buckets Table
  2. The DAX Columns & Measures needed for our visuals:
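The screenshot of the entered table isn’t reproduced here, so as a stand-in, here is a hypothetical “AR Buckets” table sketched as the kind of M query that “Enter Data” produces. The bucket names and Min/Max day values are assumptions; use whatever you entered:

let
    // Buckets are matched with InvoiceAge >= Min and InvoiceAge < Max
    Source = #table(
        type table [Bucket = text, Min = number, Max = number],
        {
            {"Current", 0, 31},
            {"31-60", 31, 61},
            {"61-90", 61, 91},
            {"91+", 91, 10000}
        }
    )
in
    Source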

First, our two new columns:

InvoiceAge =
 ( TODAY () )
    - IF (
        ISBLANK ( 'Customer List'[Posting_Date] ),
        TODAY (),
        'Customer List'[Posting_Date]
    )
ARBucket = 
CALCULATE(
    VALUES('AR Buckets'[Bucket]),
    FILTER(
        'AR Buckets',
        'Customer List'[InvoiceAge] >='AR Buckets'[Min]
        && 'Customer List'[InvoiceAge] < 'AR Buckets'[Max]
    )
)

Then our DAX measure:

Aged_AR = 
IF (
    COUNTROWS ( 'AR Buckets' ) = 1,
    CALCULATE (
        [AR_Balance],
        FILTER (
            'Date',
            MAXX ( 'Date', MAX ( 'Date'[Calendar EndOfMonth] ) - EARLIER ( 'Date'[Date] ) )
                >= MAX ( 'AR Buckets'[Min] )
                && MAXX ( 'Date', MAX ( 'Date'[Calendar EndOfMonth] ) - EARLIER ( 'Date'[Date] ) )
                    < MAX ( 'AR Buckets'[Max] )
        )
    ),
    [AR_Balance]
)

With the columns and measure created for the Aged Accounts Receivable visual, let’s work on our Gross Sales, Revenue Total, Revenue Percent, Cost Total and Cost Percent DAX measures.

Gross Sales CY = SUM('Sales'[Sales_Amount_Actual])

Gross Sales LY = CALCULATE( 
    [Gross Sales CY] , 
    SAMEPERIODLASTYEAR( 'Date'[Date] ))

RevenueTotal =
CALCULATE ( SUM ( 'Sales'[Sales_Amount_Actual] ), ALL ( 'Sales' ) )

RevenuePct = 
    SUM('Sales'[Sales_Amount_Actual])/[RevenueTotal]

SalesCostTotal = 
    CALCULATE(SUM('Sales'[Cost_Amount_Actual]),ALL('Sales'))

SalesCostPct = 
    SUM('Sales'[Cost_Amount_Actual])/[SalesCostTotal]

With all of the DAX columns and measures created, we can now put them all together to create the below Sales Performance report, which gives us the ability to track KPIs against goals and see Sales, Profit and Cost trends over time. Here is how we create our visuals:

  • Gross Sales KPI – Add the Gross Sales CY measure to the Indicator field in the Visualizations pane, Calendar Year from the Date dimension to the Trend axis field, and the Gross Sales LY measure to the Target goals field.
  • Gross Sales and Profit Gauge – Add Gross Sales to the Value field of the Visualization pane and Profit to the Minimum value field.
  • Aged Accounts Receivable clustered bar chart – Add ARBucket column from Customer List to the Axis field in the Visualization pane and Aged_AR measure to the Value field.
  • Cost vs Revenue % by Product Enhanced Scatter chart – Add Item Description to the Details field in the Visualizations pane, the RevenuePct measure to the X Axis field, SalesCostPct to the Y Axis field, Sales_Amount_Actual to the Size field and Profit to the Color saturation field.
  • Current to Prior Year Sales by Month clustered column chart – Add Calendar Month from the Date dimension to the Axis field in the visualization pane, Gross Sales CY and Gross Sales LY measures to the Value field.
  • Gross Sales by State line and clustered column chart – Add State from the Customers dimension to the Shared Axis field in the visualization pane and Gross Sales to the Columns Value field.

Power BI Sales Performance Report

Start your Power BI learning for free and bring your company data to life!

Power BI transforms your company’s data into rich visuals for you to collect and organize so you can focus on what matters to you. Stay in the know, spot trends as they happen, and push your business further.

Looking for new ways to find and visualize data and to share your discoveries? Power BI can help. This collection of tools, online services and features from Microsoft transforms your company’s data into rich visuals for you to collect and organize, so you can spot trends, collaborate in new ways, and make sense of your data. Start your Power BI learning at some of these great sites:

Happy learning and visualizing your data!

How to Create Awesome Business Central Data Analysis

In my last post, 3 Things to Know About Dynamics 365 Business Central, one of the things I discussed was how to connect Power BI to Dynamics 365 Business Central. The Power BI Content Pack has several reports; two of those are “Item Sales Dashboard” and “Customer Sales Dashboard” (screenshots below). These are a good start, but we can enhance our data analysis by connecting to our data, transforming it, and adding some DAX measures.

Item Sales Dashboard

Customer Sales Dashboard

Creating our Visuals

First, some basics before we connect to our Dynamics 365 Business Central data: we need a Power BI model with a good date dimension table, and there is no better place to start than with the sqlbi.com DAX Date template. You can get it here: https://www.sqlbi.com/tools/dax-date-template/. Alberto and Marco do a great job teaching all things DAX, so explore around while you’re on their website.

Now let’s add Dynamics 365 Business Central as a data source in Power BI Desktop.

  1. In Power BI Desktop, in the left navigation pane, choose Get Data.
  2. In the Get Data window, choose Online Services, choose Microsoft Dynamics 365 Business Central, and then choose the Connect button.Get Data connect login to Business Central
  3. Power BI displays a wizard that will guide you through the connection process. You will be prompted to sign into the service. Select Sign in and choose the account you would like to sign in as. This should be the same account you sign into Dynamics 365 Business Central with.Business Central Login
  4. Choose the Connect button to continue. The Power BI wizard shows a list of Microsoft Business Central companies and data sources. These data sources represent all the web services that you have published from each company in Microsoft Dynamics 365 Business Central. Connect screen
  5. Alternatively, create a new web service URL in Dynamics 365 Business Central by using the Create Data Set action in the Web Services page, using the Set Up Reporting Assisted Setup guide, or by choosing the Edit in Excel action in any lists.
  6. Specify the data you want to add to your data model, and then choose the Load button.
  7. Repeat the previous steps to add additional Microsoft Dynamics 365 Business Central data, or other data, to your Power BI data model.

Once the data is loaded it will appear in the right navigation on the page. At this point, you have successfully connected to your Microsoft Dynamics 365 Business Central data and are ready to begin building your Power BI report.

Before building your report, we recommend that you import the Microsoft Dynamics 365 Business Central theme file. The theme file will create a color palette so that you can build reports with the same color styling as the Microsoft Dynamics 365 Business Central content packs without requiring you to define custom colors for each visual. You can find the download for the json theme file here: https://community.powerbi.com/t5/Themes-Gallery/Microsoft-Dynamics-365-Business-Central/m-p/385875

Business Central available data

Here are the data tables we are going to import:

  • InventoryItems
  • InventoryTransactions
  • ItemLedgerEntries
  • ItemSalesandProfits

After importing the InventoryItems data we need to make the following changes.

  1. Rename the “No” column to “Item No”
  2. Remove several columns listed below in the M Code.
  3. Add a custom column to calculate Unit Profit

Here is the M code to help you with the transformations. You will need to change the source information to match your environment.

let
    Source = Dynamics365BusinessCentral.Contents(null),
    #"CRONUS USA, Inc." = Source{[Name="CRONUS USA, Inc."]}[Data],
    InventoryItems_table = #"CRONUS USA, Inc."{[Name="InventoryItems",Signature="table"]}[Data],
    #"Renamed Columns" = Table.RenameColumns(InventoryItems_table,{{"No", "Item No"}}),
    #"Trimmed Text" = Table.TransformColumns(#"Renamed Columns",{{"Description", Text.Trim, type text}}),
    #"Removed Columns" = Table.RemoveColumns(#"Trimmed Text",{"Created_From_Nonstock_Item", "Substitutes_Exist", "Stockkeeping_Unit_Exists", "Assembly_BOM", "Production_BOM_No", "Routing_No", "Shelf_No", "Cost_is_Adjusted", "Standard_Cost", "Last_Direct_Cost", "Price_Profit_Calculation", "VAT_Prod_Posting_Group", "Item_Disc_Group", "Tariff_No", "Search_Description", "Overhead_Rate", "Indirect_Cost_Percent", "Blocked", "Last_Date_Modified", "Manufacturing_Policy", "Flushing_Method", "Assembly_Policy", "Default_Deferral_Template_Code", "Global_Dimension_1_Filter", "Global_Dimension_2_Filter", "Location_Filter", "Drop_Shipment_Filter", "Variant_Filter", "Lot_No_Filter", "Serial_No_Filter", "ETag", "Item_Tracking_Code"}),
    #"Added Custom" = Table.AddColumn(#"Removed Columns", "Unit Profit", each [Unit_Price]-[Unit_Cost]),
    #"Replaced Value" = Table.ReplaceValue(#"Added Custom","","KIT",Replacer.ReplaceValue,{"Item_Category_Code"}),
    #"Renamed Columns1" = Table.RenameColumns(#"Replaced Value",{{"Gen_Prod_Posting_Group", "Channel Name"}, {"Item_Category_Code", "Item Category"}, {"Base_Unit_of_Measure", "Unit of Measure"}, {"Profit_Percent", "Profit Percent"}, {"Inventory", "Inventory Qty"}, {"Unit_Cost", "Unit Cost"}, {"Unit_Price", "Unit Price"}})
in
    #"Renamed Columns1"

Now let’s duplicate the Inventory Transactions data and make some changes to it.

  1. Right click on the InventoryTransactions table and select “Duplicate”
  2. Rename the duplicated table to Inventory Amounts
  3. Remove several columns. They are listed below in the M code
  4. Select the Transform tab and then the Group By icon on the menu
  5. Select the advanced option and group the data by Posting_Date and Item_No and sum the Inventory Quantity
    Group By screen

Here is the M code to help you with the transformations. You will need to change the source information to match your environment.

let
    Source = Dynamics365BusinessCentral.Contents(null),
    #"CRONUS USA, Inc." = Source{[Name="CRONUS USA, Inc."]}[Data],
    InventoryTransactions_table = #"CRONUS USA, Inc."{[Name="InventoryTransactions",Signature="table"]}[Data],
    #"Renamed Columns" = Table.RenameColumns(InventoryTransactions_table,{{"Document_No", "Document No"}}),
    #"Removed Columns" = Table.RemoveColumns(#"Renamed Columns",{"Expiration_Date", "ETag", "Document_Type", "Document No", "Document_Line_No", "Variant_Code", "Description", "Return_Reason_Code", "Global_Dimension_1_Code", "Global_Dimension_2_Code", "Serial_No", "Lot_No", "Cost_Amount_Non_Invtbl", "Cost_Amount_Expected_ACY", "Cost_Amount_Actual_ACY", "Cost_Amount_Non_Invtbl_ACY", "Completely_Invoiced", "Open", "Drop_Shipment", "Assemble_to_Order", "Applied_Entry_to_Adjust", "Order_Type", "Order_No", "Order_Line_No", "Prod_Order_Comp_Line_No", "Job_No", "Job_Task_No", "Dimension_Set_ID", "Item_No_Link", "Job_No_Link", "Job_Task_No_Link", "Cost_Amount_Expected", "Reserved_Quantity", "Qty_per_Unit_of_Measure", "Sales_Amount_Expected", "Entry_No", "Location_Code", "Shipped_Qty_Not_Returned", "Sales_Amount_Actual", "Cost_Amount_Actual", "Invoiced_Quantity"}),
    #"Renamed Columns1" = Table.RenameColumns(#"Removed Columns",{{"Remaining_Quantity", "Inventory_Quantity"}}),
    #"Grouped Rows" = Table.Group(#"Renamed Columns1", {"Posting_Date", "Item_No"}, {{"InvQty", each List.Sum([Inventory_Quantity]), type number}}),
    #"Sorted Rows" = Table.Sort(#"Grouped Rows",{{"Posting_Date", Order.Ascending}})
in
    #"Sorted Rows"

Creating the DAX for our Visuals

On the Inventory Performance report, I’ve added the ability to segment my quantity sold by growth rate over the previous period selected. Here are the steps I used to add the changes to the report.

  1. First, I clicked the “Enter Data” icon on the Home tab in Power BI Desktop, entered the segment information, and named the table “Growth Segments” (a hypothetical version of this table appears after the measures below).
  2. The DAX measures needed for our visuals:

Total Qty Sold CY = SUM('Inventory Amounts'[InvQty])

Total Qty Sold LY = CALCULATE( 
    [Total Qty Sold CY], 
    SAMEPERIODLASTYEAR( 'Date'[Date] ))

Qty Sold Growth = DIVIDE([Total Qty Sold CY],[Total Qty Sold LY],0)-1

Qty Sold per Growth Group =
CALCULATE (
    [Total Qty Sold CY],
    FILTER (
        VALUES ( 'Inventory Items'[Item No] ),
        COUNTROWS (
            FILTER (
                'Growth Segments',
                [Qty Sold Growth] >= 'Growth Segments'[Min]
                    && [Qty Sold Growth] < 'Growth Segments'[Max]
            )
        ) > 0
    )
)

InventoryAvgAmt = SUM('Inventory Amounts'[Inventory Cost])/DISTINCTCOUNT('Inventory Amounts'[Posting_Date])

COGS_Actual = ABS(SUM('Inventory Transactions'[Cost_Amount_Actual]))

Turnover = [COGS_Actual]/[InventoryAvgAmt]

Inventory period = 365/[Turnover]
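These measures depend on the “Growth Segments” table entered in step 1. Since that screenshot isn’t reproduced here, here is a hypothetical version, sketched as the kind of M query “Enter Data” produces; the segment names and boundaries are assumptions:

let
    // Segments are matched with Qty Sold Growth >= Min and < Max
    Source = #table(
        type table [Segment = text, Min = number, Max = number],
        {
            {"Declining", -10, 0},
            {"Stable", 0, 0.1},
            {"Growing", 0.1, 10}
        }
    )
in
    Source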

With all of the DAX measures created, we put them all together to create the below Inventory Performance report, which now gives us the ability to track KPIs against goals, see sales and inventory quantity trends over time, and visualize quantity sold per growth group compared to the last period.

Inventory Performance Report

In my next post, I will create the Customer Sales Performance report to complete the enhancement of the Business Central Sales content pack. Until then, go sign up for a free trial of Dynamics 365 Business Central at https://trials.dynamics.com/. You just need to provide your work email address and phone number. Dynamics 365 Business Central is an all-in-one business management solution that helps organizations streamline business processes, improve customer interactions and enable growth by offering:

  • Business without silos. Unify business and boost efficiency with automated tasks and workflows—all from within familiar Office tools like Outlook, Word, and Excel.
  • Actionable insights. Achieve greater outcomes and gain a complete view across the business with connected data, business analytics, and guidance delivered by Microsoft’s leading intelligent technologies.
  • Solutions built to evolve. Start quickly, grow and adapt in real time with a flexible platform that makes it easy to extend beyond Business Central based on evolving business needs.

3 Things to Know About Dynamics 365 Business Central

Just got done with two days of learning about Dynamics 365 Business Central at the Microsoft Center in New York City. Before the training, my main focus was on how to use it with Power BI. This was my first training related to Business Central, and the trainers, Craig (@craigcrescas) and Tom, did a great job!

Dynamics 365 Business Central is an all-in-one business management solution that’s easy to use and adapt, helping you connect your business and make smarter decisions. It is built on functionality within Microsoft Dynamics NAV and can extend business applications to other Microsoft cloud services such as Office 365, Microsoft Flow, Power BI and PowerApps.

In this post I’m going to discuss some of my biggest takeaways: Ready To Go, Built for Development, and Office 365 and Power BI integration.

Ready To Go

If you haven’t found it yet, go to http://aka.ms/readytogo. There you will find information on:

* “Ready To Go” online learning

* “Ready To Go” coaching

* “Ready To Go” platform

“Ready To Go” online learning

The “Ready to Go” online learning catalog is an extensive library of training materials, hosted on the Dynamics Learning Portal. You can find the “Ready to Go” online learning catalog here.

This catalog contains readiness information for marketers, business decision makers, architects, and developers. The content addresses the needs of both reselling partners of Dynamics 365 Business Central and App builders who want to provide Apps on Microsoft AppSource. Currently there are over 15 hours of online learning. Here are some of the catalog categories:

Online learning catalog

“Ready To Go” coaching

The “Ready to Go” coaching is executed by Microsoft Development Centers and Master VARs, which are independently owned and operated. They bring technical services and coaching to market and have industry-specific knowledge that you can contract and leverage when developing your offerings. Over time they have developed lots of best practices by supporting multiple partners; they also have strong ties with Microsoft’s R&D teams and are always up to date on the latest strategy and product innovation.

The “Ready to Go” validation workshop is offered by all Development Centers and Master VARs at fixed price and is designed to coach you in bringing your Dynamics 365 Business Central offerings to market faster and with less friction by providing individual coaching. This 8-hour workshop helps partners make the right decisions and is based upon a unified checklist and set of training materials. The workshop validates and coaches the partner’s readiness on their go-to market business model, marketing plans, architectural plans, app Development and internal readiness.

Where needed the Microsoft Development Centers and Master VARs have more offerings to support you. We recommend reaching out if you want more information.

“Ready To Go” platform

To support you in App development, testing, and learning scenarios, the “Ready to Go” platform is a place where you can find current, upcoming, and daily builds of Dynamics 365 Business Central. It also provides a way to give feedback to the Microsoft engineers working on Dynamics 365 Business Central. These builds are made available to you on the Microsoft Collaborate platform. To get access to the “Ready to Go” platform, you need to go through these steps:

  1. Register on Microsoft Collaborate using your AAD Global Admin account.
  2. Contact us, once registered on Microsoft Collaborate. Your company’s Azure Active Directory (AAD) global administrator should contact us to complete the on-boarding. We need to manually assign you to the right programs and engagements. Expect a response from us within 1-2 business days.
  3. The Ready to Go team will notify you and confirm your registration.

Built for Development

Dynamics 365 is a flexible business platform that can be scaled to meet your business needs in terms of size and complexity. The system can be customized in three ways: Personalize, Designer, or Visual Studio Code for AL language development. Personalization can be done by the Business Central end user by simply navigating to the gear icon in the top right corner of the window and selecting “Personalize”.

Personalize Screenshot

In this example, I’m on the Sales Order window. I select the Sales Order line detail section and then the “Field” option in the top gray bar. From there you will get the “Add Field to Page” section on the right of the below screenshot. Use the search function and type in “Bin”. Now drag the Bin Code field over the Lines section of the Sales Order. Select “Done” to complete your personalization.

Screen Personalization

Designer is a little more work in the customization process but is applied globally instead of at a personal level. First you will need to create a sandbox. A sandbox environment (Preview) is a non-production instance of Business Central. Isolated from production, a sandbox environment is the place to safely explore, learn, demo, develop, and test the service without the risk of affecting the data and settings of your production environment. Here is the link to how to create one. Once you’re done, you should see “Sandbox” in the black toolbar on the top right of the screen. Selecting the gear icon once again will bring up the additional “Designer” option. Both the Designer and custom AL language options should be handled by a developer or more technical user. You can read more about these options in the “Ready To Go” online learning Technical catalog categories discussed above.

Power BI Integration

Getting insights into your Business Central data is easy with Power BI and the Business Central content packs. Power BI retrieves your data and then builds an out-of-the-box dashboard and reports based on that data.

You must have a valid account with Dynamics 365 and with Power BI. Also, you must download Power BI Desktop if you wish to create your own Power BI reports. Power BI content packs require permissions to the tables where data is retrieved from.

How to Connect

  1. Select Get Data at the bottom of the left navigation pane. Get Data
  2. You may also get started from within Dynamics 365 Business Edition. From the role center, navigate to Report Selection in the Power BI Role Center part. Select either Service or My Organization from the ribbon. When either of these actions is selected, you will be taken to either the Organization gallery in Power BI or to the services library in Power BI, which will be filtered to only display content packs related to Dynamics 365 Business Central.
  3. In the Services box, select Get. This will open a window with the AppSource and Apps for Power BI apps.Get Services
  4. Select Apps from the Apps for Power BI apps tab, search for the “Business Central” content pack that you want to use, and then select Get it now for the app you want to load. I chose “Microsoft Dynamics 365 Business Central – Sales”. Dynamics 365 Business Central Apps
  5. When prompted, enter the name of your company in Dynamics 365 Business Central. This is not the display name. The company name can be found on the ‘Companies’ page within your Dynamics 365 Business Central instance. Company Name
  6. You can find the Company Name in Business Central by searching with the magnifying glass and typing in “companies”. Your screen should look like the below screenshot. Business Central Companies screen
  7. Once connected, a dashboard, report and dataset will automatically be loaded into your Power BI workspace. Power BI Business Central - Sales Report
  8. When completed, navigate back to Business Central’s home page and down to the Power BI section in the bottom right corner. Click the ellipsis (“…”) in the top right of the Power BI section, then Select Report, and enable the reports you just deployed. Select Report Enable Report
  9. You should now see your report and you can use the Previous Report and Next Report menu options to view all reports available.

Stay tuned for more on Business Central and Power BI

Power BI Dynamics GP Sales and Inventory Analysis Report Modifications

Recently, I’ve made some updates to my Power BI Sales and Inventory Analysis Reports and wanted to share those with you. On the Sales Analysis report, I’ve added the ability to segment my sales revenue by growth rate and compare that growth rate to the previous period selected. Here are the steps I used to add the changes to the report.

  1. First, I clicked on the “Enter Data” icon from the Home tab in Power BI Desktop and then entered the growth group information (a hypothetical version of this table appears after this list). Power BI Create Table
  2. I then created a DAX Sales Growth measure with the following code.
    Sales Growth = DIVIDE([Sales CY],[Sales LY],0)-1
  3. Next, I created a DAX measure to segment the data based on the Sales Growth Groups that we created in step 1.
    Sales per Growth Group =
    CALCULATE (
        [Sales CY],
        FILTER (
            VALUES ( Customers[Customer Class] ),
            COUNTROWS (
                FILTER (
                    'Sales Growth Groups',
                    [Sales Growth] >= 'Sales Growth Groups'[Min]
                        && [Sales Growth] < 'Sales Growth Groups'[Max]
                )
            ) > 0
        )
    )

  4. Next, I modified my Product by Sales and Customer Class by Sales visuals to use the new Sales per Growth Group measure by dragging the Segment column from the Sales Growth Groups table into the Legend field, the Sales Growth measure into the Tooltips field, and the Sales per Growth Group measure into the Values field. Screenshot of field wells
  5. Below is a screenshot of the end results. You can now clearly tell which segments are performing better or worse than last year.Power BI Sales Summary Report
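As referenced in step 1, here is a hypothetical version of the “Sales Growth Groups” table, sketched as the kind of M query “Enter Data” produces. The group names and boundaries are assumptions; use the values from your own entered table:

let
    // Groups are matched with Sales Growth >= Min and < Max
    Source = #table(
        type table [Segment = text, Min = number, Max = number],
        {
            {"Shrinking", -10, 0},
            {"Flat", 0, 0.05},
            {"Growing", 0.05, 10}
        }
    )
in
    Source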

My next modification was to my Inventory Order Analysis report. This one was just a small change: adding one DAX measure to calculate slow-moving inventory and a Table visual to view the results.

  1. First, I added a DAX measure for slow moving inventory
    SM Inventory Value = 
    VAR SumOf = 
    CALCULATE(
        SUMX(
            Inventory,
            Inventory[Quantity on Hand] * Inventory[Unit Cost]
        )
    )
    VAR Range = DATESINPERIOD(Dates[Date], EDATE(TODAY(), -12), -24, MONTH)
    RETURN
    CALCULATE(
        IF(
            COUNTROWS(
            INTERSECT(
               VALUES(Inventory[Last Sale Date]),
               Range)
           ) > 0, 
           SumOf, 0
        )
    )

  2. Then I added the following columns and measure to a table visual to show my slow-moving inventory. Table Visual Field Wells Screenshot
  3. Below is a screenshot of the modification to the Inventory Order Analysis report. Power BI Inventory Order Analysis

Until next time, Enjoy the code and Happy Analyzing!

How to Create a Better RMA Dashboard for Mammoth Savings

OK, maybe not mammoth savings, but controlling quality and costs during the Return Material Authorization (RMA) process has a multiplicative effect on growth in a company. Finding new ways to improve perfect order performance continually reduces RMAs and increases customer satisfaction, leading to more sales.

RMAs are a direct measure of product quality and a product’s nonconformance to customers’ specifications and requirements. They are issued for a wide variety of reasons. The RMA module within Dynamics GP is designed to provide detailed traceability for both customer and supplier returns. Before we get to designing our Power BI RMA dashboard, let’s look at the Dynamics GP RMA module.

Return Material Authorization Module

The Returns Management module for Microsoft Dynamics GP enables you to meet customer and vendor demands for product and part returns by streamlining tasks and improving your responsiveness to customer queries and complaints. Generate returns from historical customer invoices, service calls, or manually. Return an item to a vendor to fix it within your company and automatically transfer items from the returns warehouse to the main warehouse. Give your customers up-to-date information about the status of their returned items.

You can use Returns Management to enter, set up, and maintain your RMA and Return To Vendor (RTV) documents. If Returns Management is integrated with Service Call Management, an RMA is created automatically from a returnable parts line on a service call. If Returns Management is integrated with Sales Order Processing, you can select items directly from the historical Sales Order Processing invoice that was generated for a specific customer.

Dynamics GP’s RMA Life Cycle

When operating a returns warehouse, you sometimes interact with customers who need to return equipment for various reasons. When they bring you a part, you complete one of a few tasks: issue a credit, repair and return the item to the customer, or replace the item. You also may be accepting the item in exchange for an item you already provided to the customer. Once you’ve completed your transaction with the customer, you have a few more options: repair the item and return it to your inventory, return the item to the vendor, or scrap the item.

The following diagram outlines the life cycle of RMAs, from entry through completion. When Returns Management is integrated with the other modules of the Field Service Series (Service Call Management, Preventive Maintenance, Contract Administration, and Depot Management), many new options and paths become available.

RMA Life Cycle

You can create RMAs from two different points of access throughout Field Service:

  • Manual entry in the RMA Entry/Update window
  • From return lines for returnable items on a service call

Entry in the RMA Entry/Update window is the method described in this manual. Refer to the Service Call Management documentation for more information regarding service calls and returnable items.

RMA types Inside Dynamics GP

RMA documents are used to track an item return from your customers. The available RMA document types are as follows:

  • Credit – Provide a credit to your customer’s account in Receivables Management for the value of the items the customer returned to you.
  • Replacement – Provide the same item, or a similar item, as a replacement to your customer. You must receive the original item from your customer before you send the replacement item on a new order document in Sales Order Processing.
  • Advance Cross–ship – Provide the same item, or a similar item, as a replacement to your customer. You can send the replacement item using a new Sales Order Processing order document prior to receiving the original item from your customer.
  • Repair and Return – You, or your vendor, will repair the item that is received from the customer. Your customer will receive the item after it’s been repaired.
  • None – The customer’s original item is picked up by your field service technician and returned to your returns warehouse. This type of RMA document was designed to integrate directly with Service Call Management.

Analyzing Our RMA Data with Power BI

To analyze our RMA data, we will be using the following three visuals and associated measures:

  • Rate of Return – This is an incredibly useful KPI in a distribution center, especially when segmented by cause for return. Identifying causes for returns — damage, late delivery, inaccurate product description, wrong item shipped, etc. — helps warehouse managers address underlying issues and make necessary improvements. Rate of Return = Number of Units Returned / Number of Units Sold.
  • Perfect Order Rate – This KPI measures how many orders your warehouse successfully delivers without incident: the correct item, shipped on time and received in good condition by the customer who ordered it. Lean practices help identify errors or inaccuracies before orders leave the warehouse. Perfect Order Rate = Orders Completed Without Incident / Total Orders Placed.
  • RMA Pareto Analysis – Done on the top 20% of factors that drive 80% of the returns. This will make cause troubleshooting more efficient, leading to permanent solutions to problems that may be causing RMAs to begin with. I’m not going to go into detail on how to build this chart in this blog post. You can find the steps on how to complete it here: https://powerbi.tips/2016/10/pareto-charting/.

First, we need to get our Dynamics GP RMA data into Power BI and we’ll do that with the SQL script below:

WITH CTE
AS (SELECT
  300 AS Company_Key,
  b.RETDOCID AS [RMA Document ID],
  b.LNSEQNBR,
  a.Return_Record_Type,
  b.RETREF AS [RMA Reference],
  b.RETSTAT AS [RMA Status],
  b.RETTYPE AS [RMA Type],
  a.COMPDTE AS [DocDate],
  a.USERID,
  b.OFFID AS [Office ID],
  b.LOCNCODE AS [Location Code],
  a.CUSTNMBR,
  b.SOPTYPE,
  b.SOPNUMBE AS [Invoice Number],
  b.SVC_RMA_Reason_Code AS [RMA Reason Code],
  b.SVC_RMA_Reason_Code_Desc AS [RMA Reason Code Description],
  b.ITEMNMBR,
  b.QUANTITY,
  b.UNITCOST AS [Unit Cost],
  b.EXTDCOST AS [Extended Cost],
  b.[Credit_SOP_Number]
FROM dbo.SVC05200 b (NOLOCK)
JOIN dbo.SVC05000 a (NOLOCK)
  ON b.RETDOCID = a.RETDOCID
WHERE b.QUANTITY > 0
AND b.COMPDTE >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 4, 0) -- Didn't want everything so limiting to last 4 years of RMA data 
AND a.COMPDTE >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 4, 0) -- Didn't want everything so limiting to last 4 years of RMA data

UNION ALL

SELECT
  300 AS Company_Key,
  b.RETDOCID AS [RMA Document ID],
  b.LNSEQNBR,
  a.Return_Record_Type,
  b.RETREF AS [RMA Reference],
  b.RETSTAT AS [RMA Status],
  b.RETTYPE AS [RMA Type],
  a.COMPDTE AS [DocDate],
  a.USERID,
  b.OFFID AS [Office ID],
  b.LOCNCODE AS [Location Code],
  a.CUSTNMBR,
  b.SOPTYPE,
  b.SOPNUMBE AS [Invoice Number],
  b.SVC_RMA_Reason_Code AS [RMA Reason Code],
  b.SVC_RMA_Reason_Code_Desc AS [RMA Reason Code Description],
  b.ITEMNMBR,
  b.QUANTITY,
  b.UNITCOST AS [Unit Cost],
  b.EXTDCOST AS [Extended Cost],
  b.[Credit_SOP_Number]
FROM dbo.SVC35200 b (NOLOCK)
JOIN dbo.SVC35000 a (NOLOCK)
  ON b.RETDOCID = a.RETDOCID
WHERE b.QUANTITY > 0
AND b.COMPDTE >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 4, 0) -- Didn't want everything so limiting to last 4 years of RMA data
AND a.COMPDTE >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 4, 0)) -- Didn't want everything so limiting to last 4 years of RMA data

SELECT
  b.*,
  rs.STSDESCR AS [RMA Status Description],
  iv.ITMCLSCD AS [Item Class]
FROM CTE b (NOLOCK)
LEFT JOIN dbo.IV00101 iv (NOLOCK)
  ON iv.ITEMNMBR = b.ITEMNMBR
LEFT JOIN dbo.SVC05500 rs (NOLOCK)
  ON b.[RMA Status] = rs.RETSTAT
WHERE b.QUANTITY > 0
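One way to run this script from Power BI Desktop is with Value.NativeQuery in a blank M query. This is an assumption about workflow on my part; you can just as easily paste the script into the SQL statement box of the SQL Server connector:

let
    Source = Sql.Database("YOUR SQL SERVER", "YOUR GP COMPANY DATABASE"),
    // Paste the full RMA script above in place of the elided text
    RMAData = Value.NativeQuery(Source, "WITH CTE AS (SELECT ...) SELECT ... FROM CTE b ...")
in
    RMAData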

With the data now loaded into Power BI Desktop, I created relationships with the Inventory, Date and Customer dimensions that I already had in Power BI Desktop from my previous blog posts.

Creating Our Measures

Time to build the Rate of Return measure. Rather than create one big DAX measure, we will build several small measures that build upon each other. Here are the DAX formulas that we will be using for the Rate of Return measure:

  • Returns Lbs. CY
Returns Lbs. CY = CALCULATE(
    ABS(
        SUM(Returns[Total Return Lbs])
        )
)
  • Lbs. CY

Lbs. CY = SUM(Sales[ExtendedWeight])

  • Rate of Return
Rate of Return = DIVIDE 
    ([Returns Lbs. CY], [Lbs. CY], 
    Blank()
 )
  • Monthly Average Rate of Return
Monthly Avg. Rate of Return = AVERAGEX( 
    DATESINPERIOD( 
        Dates[Date] , 
        LASTDATE( Dates[Date] ),
        -30,
        DAY),
    [Rate of Return]
)

With our calculations now complete, we create a line chart with Date on the axis and Monthly Avg. Rate of Return as the value, as seen in the bottom-left line chart visual of our Returns Analysis report screenshot below.

Up next, our Perfect Order Rate measure. Once again we will build several small calculations and combine them to create our Monthly Avg. Perfect Order Rate. Here are the DAX formulas that I used for this:

  • Total Invoice Count – Since my data includes line-level detail for each sales order type in the Sales table, I’ll count all invoice lines and evaluate each line later to make sure it was a perfect order.
Total Invoices Count = CALCULATE(
    COUNTROWS('Sales'),
    FILTER('Sales','Sales'[SOPTYPE] = 3)
    )
  • Perfect Order – Now we will determine if the order was fulfilled before the customer’s required date.
Perfect Order = CALCULATE(COUNTROWS
    ('Sales'),
    FILTER('Sales','Sales'[SOPTYPE] = 3),
    FILTER('Sales',[Document Date]<='Sales'[Req Ship Date])
    ) 
  • Perfect Order Rate
Perfect Order Rate = DIVIDE(
    [Perfect Order],[Total Invoices Count],
    BLANK()
    )
  • Monthly Average Perfect Order Rate
Monthly Avg. Perfect Order Rate = AVERAGEX( 
    DATESINPERIOD( 
        Dates[Date] , 
        LASTDATE( Dates[Date] ),
        -30,
        DAY),
    [Perfect Order Rate]
)

With all of our DAX measures complete, we can create the visuals needed for Monthly Average Perfect Order Rate and Monthly Average Return Rate of Lbs. Here is what my final Returns Analysis report looks like.

RMA Dashboard

Until next time, Happy Analyzing!