Dataflows for Power BI with Dynamics GP Data

Part 2

In Part 1 of Dataflows for Power BI with Dynamics GP Data, we talked about what dataflows are and created our first dataflow entity, a Date dimension. Now let’s start creating the remaining entities:

  • Dimensions from Dynamics GP card windows via SQL queries:
      • Customer data from the Sales module’s Customer card
      • Salesperson data from the Sales module’s Salesperson card
      • Sales Territory data from the Sales module’s Sales Territory card
      • Product data from the Inventory module’s Item card
  • Fact data from our Dynamics GP transaction entry windows via SQL queries:
      • Sales data from the Sales Transaction Entry window
      • Inventory data from the Inventory Transaction Entry window

Select the “Add entities” button to add additional data.

Add entities

Select the “SQL Server database” tile.

Choose data source window

Fill out your Server, Database and Connection credentials for the Enterprise gateway.

Fill in your data source connection info

I like to connect to my data via SQL views, and you can download the queries for those views here. Once you’re done creating all of the entities, your dataflows screen should look something like this.

View of entities from data source based on SQL views

Creating Custom Functions in Dataflows

The easiest way to create a custom function in the Power BI service’s dataflows is to create it in Power BI Desktop’s Power Query, open the Advanced Editor, and copy and paste the M code into a blank query.

  let
    Source = (input) =>
      let
        values = {
          {"NEW YORK","NY"},
          {"QUEBEC", "QC"}
          // the remaining state/province pairs were omitted here
        },
        Result = List.First(List.Select(values, each _{0} = input)){1}
      in
        Result
  in
    Source



Get data screen of custom function

Rename the custom function to “fnLookup”.

Edit queries

Now we need to modify our Customer entity to clean up the state column. Right click on the Customer entity and select “Advanced Editor”.

Advanced editor

Copy and paste the below Power Query code into the Advanced Editor window. It uses the custom function to replace the data in the State column with the accepted two-character State abbreviations.

  let
  Source = Sql.Database("YOUR SQL SERVER", "YOUR DATABASE"),
  #"Navigation 1" = Source{[Schema = "dbo", Item = "view_Customers"]}[Data],
  #"Renamed columns" = Table.RenameColumns(#"Navigation 1", {{"State", "State_old"}}),
  #"Trimmed text" = Table.TransformColumns(#"Renamed columns", {{"State_old", each Text.Trim(_), type text}}),
  #"Inserted conditional column" = Table.AddColumn(#"Trimmed text", "State", each try fnLookup([State_old]) otherwise [State_old]),
  #"Reordered columns" = Table.ReorderColumns(#"Inserted conditional column", {"Customer_Key", "Company_Key", "CustomerNumber", "CustomerName", "Group Name", "SubGroup Name", "Address1", "Address2", "Address3", "City", "State", "State_old", "Zip", "Phone", "Region", "GeographyKey", "Active", "CreatedBy", "CreatedOn", "UpdatedBy", "UpdatedOn", "ActivatedOn", "DeactivatedOn", "Source", "CheckSum", "CompanyID"}),
  #"Transform columns" = Table.TransformColumnTypes(#"Reordered columns", {{"State", type text}}),
  #"Replace errors" = Table.ReplaceErrorValues(#"Transform columns", {{"State", null}}),
  #"Removed columns" = Table.RemoveColumns(#"Replace errors", {"State_old"})
  in
  #"Removed columns"
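The try ... otherwise step above is what makes the cleanup safe: when fnLookup finds no match, the original value is kept. Here is the same fallback idea as a minimal Python sketch; the two state pairs mirror the fnLookup example, and the upper-casing is an extra robustness step not present in the M code:

```python
STATE_ABBREVIATIONS = {
    "NEW YORK": "NY",
    "QUEBEC": "QC",
    # extend with the remaining state/province pairs as needed
}

def clean_state(value):
    """Return the two-character abbreviation, or the trimmed original
    value when no mapping exists (mirrors try ... otherwise in M)."""
    trimmed = value.strip()
    return STATE_ABBREVIATIONS.get(trimmed.upper(), trimmed)

print(clean_state(" New York "))  # matches after trimming
print(clean_state("TX"))          # no mapping, passes through unchanged
```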

After all your dataflow entities have been created, save your changes and create a schedule to refresh your data.

Connecting our Dataflows to Power BI Desktop

From your workstation with the December release of Power BI Desktop, click the “Get Data” button and select the “Power BI dataflows (Beta)” connection.

Bringing our dataflow data into Power BI

Select the dataflow entities we created in the Power BI Service.

Selecting our entities

Once all of your entities have been imported, navigate to the Relationships view to verify the correct relationships exist and make any needed changes. Your relationship view should look like the screenshot below.

Power BI relationships

Don’t forget to mark our dataflow Date entity as a date table after importing it into Power BI Desktop.

Marking the date table

I’m going to build out a Product Performance report from the dataflow entities that I created, adding ABC segmentation, moving averages and time comparisons using DAX measures. ABC segmentation sorts a list of values into three groups that have different impacts on the final result. It works on the famous Pareto principle, which states that 20% of efforts give 80% of the result. The meaning of the segments:

  • A – the most important (20% of the items give 80% of the results).
  • B – average in importance (30% of the items give 15% of the results).
  • C – the least important (50% of the items give 5% of the results).

Are three segments enough? Should we use more? What percentages? To answer these questions, you need to know your data and what the ABC segmentation is being done for. I have seen companies use four classes (ABCD) or even more. For this example, three classes have the advantage that they separate the assortment into categories of high, medium and low importance, which is easy to communicate.
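The segmentation rules above are easy to prototype outside of DAX. Here is a minimal Python sketch of ABC classification by cumulative share of sales, using hypothetical product data and the 80%/95% cumulative boundaries implied by the segment definitions:

```python
def abc_classify(sales_by_item, boundaries=(0.80, 0.95)):
    """Assign A/B/C classes by cumulative share of total sales.

    Items covering the first 80% of sales are 'A', the next 15% 'B',
    and the remaining 5% 'C' (boundaries are configurable).
    """
    total = sum(sales_by_item.values())
    classes = {}
    cumulated = 0.0
    # Walk items from largest to smallest sales.
    for item, sales in sorted(sales_by_item.items(), key=lambda kv: -kv[1]):
        cumulated += sales
        share = cumulated / total
        if share <= boundaries[0]:
            classes[item] = "A"
        elif share <= boundaries[1]:
            classes[item] = "B"
        else:
            classes[item] = "C"
    return classes

# Hypothetical example: one dominant product, a mid product, a tail product.
print(abc_classify({"Widget": 80, "Gadget": 15, "Gizmo": 5}))
```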

In the Product Performance report below, I used a scatter chart to analyze each product class’s profit margin and total profits. The Sales ABC by ABC Class stacked bar chart segments the products using ABC segmentation and allows you to drill down into each segment to see the underlying detail. Additionally, there is a total cost by warehouse location chart and a weekly moving average cost comparison chart at the bottom of the report. Here are the DAX measures and DAX columns that I used to create the Product Performance report:

DAX Columns:

ProductSales = CALCULATE( SUM( Sales[ExtendedPrice] ) )  -- calculated column on Products; aggregation assumed from the Total Sales measure

CumulatedSales = CALCULATE(
    SUM( 'Products'[ProductSales] ),
    ALL( 'Products' ),
    'Products'[ProductSales] >= EARLIER('Products'[ProductSales]))

CumulatedPercentage = 'Products'[CumulatedSales] / SUM('Products'[ProductSales])

DAX Measures:

Profit Margin = DIVIDE ([Total Profits], [Total Sales], 0)

Total Sales = SUM(Sales[ExtendedPrice])

Units Sold = SUM('Sales'[Quantity])

Total Profits = [Total Sales] - [Total Cost]

Costs Weekly MA = AVERAGEX(
    DATESINPERIOD(
        Dates[Date] , 
        LASTDATE( Dates[Date] ),
        -7, DAY ), -- weekly window assumed from the measure name
    [Total Cost]
)

Costs Weekly MA LY = AVERAGEX(
    DATESINPERIOD(
        Dates[Date] , 
        LASTDATE( Dates[Date] ),
        -7, DAY ),
    [Costs LY]
)

Costs LY = CALCULATE(
    SUM ( Sales[ExtendedCost] ),
    FILTER ( ALL ( 'Dates' ), 'Dates'[Year] = MAX ( 'Dates'[Year] ) - 1 ) -- prior-year offset assumed
)

Qty on Hand = 
SUM (Inventory[Quantity on Hand])

Turn-Earn Index =
 ( [Inventory Turnover Ratio] * 'Sales Measures'[Profit Margin] )

Total Cost = SUM(Sales[ExtendedCost])

Sales ABC = CALCULATE( -- reconstructed following the standard ABC classification pattern; only fragments survived
    [Total Sales] , 
    VALUES( Products[ItemDescription] ),
    FILTER(
        CALCULATETABLE(
            ADDCOLUMNS(
                ADDCOLUMNS( VALUES( Products[ItemDescription] ),
                    "OuterValue", [Total Sales] ),
                "CumulatedSalesPercentage" , DIVIDE(
                    SUMX(
                        FILTER(
                            ADDCOLUMNS( VALUES( Products[ItemDescription] ),
                                "InnerValue", [Total Sales] ),
                            [InnerValue] >= [OuterValue] ),
                        [InnerValue] ),
                    CALCULATE( [Total Sales],
                        VALUES( Products[ItemDescription] ) )
                )
            ),
            ALL( Products )
        ),
        [CumulatedSalesPercentage] > [Min Boundary]
            && [CumulatedSalesPercentage] <= [Max Boundary]
    )
)

Product performance visualization

Ready to do more with your data? Check out these other great websites:


Dataflows for Power BI with Dynamics GP data

Part 1

It’s been a while since my last blog post, so I’m rather late in talking about Dataflows with Power BI, but I was really excited about the announcement in November and finally got a chance to play around with it. If you haven’t heard about Dataflows or don’t know what they are, here is an excerpt from that announcement:

“In the modern BI world, data preparation is considered the most difficult, expensive, and time-consuming task, estimated by experts as taking 60%-80% of the time and cost of a typical analytics project. Some of the challenges in those projects include fragmented and incomplete data, complex system integration, business data without any structural consistency, and of course, a high skillset barrier. Specialized expertise, typically reserved for data warehousing professionals, is often required. Such advanced skills are rare and expensive.

To answer many of these challenges, Power BI serves analysts today with industry leading data preparation capabilities using Power Query in Power BI Desktop. Now, With Power BI dataflows, we’re bringing these self-service data preparation capabilities into the Power BI online service, and significantly expanding the capabilities in the following ways:

  • Self-service data prep for big data in Power BI – Dataflows can be used to easily ingest, cleanse, transform, integrate, enrich, and schematize data from a large array of transactional and observational sources, encompassing all data preparation logic. Previously, ETL logic could only be included within datasets in Power BI, copied over and over between datasets and bound to dataset management settings. With dataflows, ETL logic is elevated to a first-class artifact within Power BI and includes dedicated authoring and management experiences. Business analysts and BI professionals can use dataflows to handle the most complex data preparation challenges and build on each other’s work, thanks to a revolutionary model-driven calculation engine, which takes care of all the transformation and dependency logic—cutting time, cost, and expertise to a fraction of what’s traditionally been required for those tasks. Better yet, analysts can now easily create dataflows using familiar self-service tools, such as the well known Power Query data preparation experience. Dataflows are created and easily managed in app workspaces, enjoying all the capabilities that the Power BI service has to offer, such as permission management, scheduled refreshes, and more.
  • Advanced Analytics and AI with Azure – Power BI dataflows store data in Azure Data Lake Storage Gen2 – which means that data ingested through a Power BI dataflow is now available to data engineers and data scientists to leverage the full power of Azure Data Services such as Azure Machine Learning, Azure Databricks, and Azure SQL Datawarehouse for advanced analytics and AI. This allows business analysts, data engineers, and data scientists to collaborate on the same data within their organization.
  • Support for the Common Data Model – The Common Data Model (CDM) is a set of a standardized data schemas and a metadata system to allow consistency of data and its meaning across applications and business processes.  Dataflows support the CDM by offering easy mapping from any data in any shape into the standard CDM entities, such as Account, Contact etc. Dataflows also land the data, both standard and custom entities, in schematized CDM form. Business analysts can take advantage of the standard schema and its semantic consistency, or customize their entities based on their unique needs. The Common Data Model continues to evolve as part of the recently announced Open Data Initiative. 

Once dataflows are created, users can use Power BI Desktop and the Power BI service to create datasets, reports, dashboards, and apps to drive deep insights into their business.”

Microsoft has released a white paper on Dataflows, and you can find that here; or check out Matthew Roche’s great blog posts at BI Polar. They really are a great guide to get you started and up to speed on Dataflows.

Getting started with Dataflows

So, let’s try doing some of this ourselves with Dynamics GP data. To do this, you will need to have already installed an Enterprise gateway on your Dynamics GP SQL server. Log into your Power BI Pro or Premium service and create a new workspace. You should now see a “Dataflows (Preview)” option. Click on that option, then the “+ Create” button, and select Dataflows from the dropdown menu.

Dataflows Preview Window

Dataflows Preview Menu

Creating your Entities

This will open up the below window with the option to “Add new entities” or “Add linked entities”. Let’s select “Add new entities” so we can add a date dimension to our Dataflow prep.

Define new entities window

From the “Choose data source” window, select the “Blank query” tile.

Choose data source window

This will open up the “Connect to data source” window. Copy and paste the below Power Query code to create your date dimension.

Blank query Window

  let
  Source = List.Dates(StartDate, Length, #duration(1, 0, 0, 0)),
  #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
  #"Renamed Columns" = Table.RenameColumns(#"Converted to Table", {{"Column1", "Date"}}),
  #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns", {{"Date", type date}}),
  StartDate = Date.StartOfYear(Date.AddYears(DateTime.Date(DateTime.LocalNow()),-10)),
  Today = Date.EndOfYear(Date.AddYears(DateTime.Date(DateTime.LocalNow()),5)),
  Length = Duration.Days(Today - StartDate),
  Custom1 = #"Changed Type",
  #"Inserted Year" = Table.AddColumn(Custom1, "Fin Year", each Date.Year([Date]+#duration(184,0,0,0)), Int64.Type),
  #"Inserted Month Name" = Table.AddColumn(#"Inserted Year", "Month Name", each Date.MonthName([Date]), type text),
  #"Inserted Day Name" = Table.AddColumn(#"Inserted Month Name", "Day Name", each Date.DayOfWeekName([Date]), type text),
  #"Inserted Month" = Table.AddColumn(#"Inserted Day Name", "Fin Month", each if Date.Month([Date]) >=7 then Date.Month([Date])-6 else Date.Month([Date])+6  , Int64.Type),
  #"Inserted Day of Week" = Table.AddColumn(#"Inserted Month", "Day of Week", each Date.DayOfWeek([Date])+1, Int64.Type),
  #"Inserted First Characters" = Table.AddColumn(#"Inserted Day of Week", "Short Month", each Text.Start([Month Name], 3), type text),
  #"Inserted First Characters1" = Table.AddColumn(#"Inserted First Characters", "DDD", each Text.Start([Day Name], 3), type text),
  #"Reordered Columns" = Table.ReorderColumns(#"Inserted First Characters1", {"Date", "Fin Year", "Month Name", "Short Month", "Fin Month", "Day Name", "DDD", "Day of Week"}),
  #"Added Custom1" = Table.AddColumn(#"Reordered Columns", "Month Number", each (Date.Month([Date]))),
  #"Inserted Start of Month" = Table.AddColumn(#"Added Custom1", "Start of Month", each Date.StartOfMonth([Date]), type date),
  #"Inserted End of Month" = Table.AddColumn(#"Inserted Start of Month", "End of Month", each Date.EndOfMonth([Start of Month]), type date),
  #"Duplicated Column" = Table.DuplicateColumn(#"Inserted End of Month", "Date", "Date - Copy"),
  #"Calculated Quarter" = Table.TransformColumns(#"Duplicated Column",{{"Date - Copy", Date.QuarterOfYear, Int64.Type}}),
  #"Renamed Columns1" = Table.RenameColumns(#"Calculated Quarter", {{"Date - Copy", "Quarter Number"}}),
  #"Inserted Merged Column" = Table.AddColumn(#"Renamed Columns1", "Merged", each Text.Combine({"Q", Text.From([Quarter Number], "en-US")}), type text),
  #"Current Date" = Table.AddColumn(#"Inserted Merged Column", "Current Date", each Date.From(DateTimeZone.FixedLocalNow())),
  #"Added Custom10" = Table.AddColumn(#"Current Date", "Is Work Day", each if Date.DayOfWeek([Date]) >=0 and Date.DayOfWeek([Date]) <= 4 then "Is Work Day" else "Weekend"),
  #"Renamed Columns2" = Table.RenameColumns(#"Added Custom10", {{"Merged", "Calendar Quarter"}, {"DDD", "Short Day"}}),
  #"Duplicated Column1" = Table.DuplicateColumn(#"Renamed Columns2", "Date", "Date - Copy"),
  #"Extracted Year" = Table.TransformColumns(#"Duplicated Column1",{{"Date - Copy", Date.Year, Int64.Type}}),
  #"Renamed Columns3" = Table.RenameColumns(#"Extracted Year", {{"Date - Copy", "Calendar Year"}}),
  #"Changed Type3" = Table.TransformColumnTypes(#"Renamed Columns3", {{"Fin Year", Int64.Type}, {"Current Date", type date}}),
  #"Added Custom Column" = Table.AddColumn(#"Changed Type3", "DateKey", each Text.Combine({Date.ToText([Date], "yyyy"), Date.ToText([Date], "MM"), Date.ToText([Date], "dd")}), Int64.Type),
  #"Transform columns" = Table.TransformColumnTypes(#"Added Custom Column", {{"Month Number", type text}, {"Is Work Day", type text}}),
  #"Replace errors" = Table.ReplaceErrorValues(#"Transform columns", {{"Month Number", null}, {"Is Work Day", null}}),
  #"Added Custom" = Table.AddColumn(#"Replace errors", "YYYYMM", each Text.Combine({Date.ToText([Date],"yyyy"), Date.ToText([Date], "MM")})),
  #"Added Custom3" = Table.AddColumn(#"Added Custom", "YYYY-MM", each Text.Combine({Date.ToText([Date],"yyyy"),"-", Date.ToText([Date], "MM")})),
  #"Changed Type1" = Table.TransformColumnTypes(#"Added Custom3",{{"YYYYMM", type text}, {"YYYY-MM", type text}})
  in
  #"Changed Type1"
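The fiscal-year logic buried in the query above (the 184-day offset for Fin Year and the plus-or-minus-6 month shift for Fin Month, which together model a July to June fiscal year) can be sanity-checked with a short Python sketch:

```python
from datetime import date, timedelta

def fin_year(d):
    """Fiscal year as computed by Date.Year([Date] + 184 days):
    dates from July onward roll into the next calendar year."""
    return (d + timedelta(days=184)).year

def fin_month(d):
    """Fiscal month: July becomes month 1, June becomes month 12."""
    return d.month - 6 if d.month >= 7 else d.month + 6

print(fin_year(date(2018, 7, 1)), fin_month(date(2018, 7, 1)))    # start of the fiscal year
print(fin_year(date(2019, 6, 30)), fin_month(date(2019, 6, 30)))  # end of the same fiscal year
```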

On the “Edit queries” window, I changed the Name to “Date” and then clicked “Done”.

Edit queries Window

That’s enough for today’s blog post. In Part 2, we will create entities for our customer, salesperson, sales territory, product, sales transaction and inventory transaction data, along with a custom function in our dataflow to help clean up our customer data. Then we will connect Power BI Desktop to our dataflows so we can create the relationships and DAX measures needed to start visualizing our Dynamics GP data.

Power BI Dynamics GP Sales and Inventory Analysis Report Modifications

Recently, I’ve made some updates to my Power BI Sales and Inventory Analysis Reports and wanted to share those with you. On the Sales Analysis report, I’ve added the ability to segment my sales revenue by growth rate and compare that growth rate to the previous period selected. Here are the steps I used to add the changes to the report.

  1. First, I clicked on the “Enter Data” icon from the Home tab in Power BI Desktop and then entered the following information. Power BI Create Table
  2. I then created a DAX Sales Growth measure with the following code.
    Sales Growth = DIVIDE([Sales CY],[Sales LY],0)-1
  3. Next, I created a DAX measure to segment the data based on the Sales Growth Groups that we created in step 1.
    Sales per Growth Group =
            CALCULATE (
                [Sales CY],
                FILTER (
                    VALUES ( Customers[Customer Class] ),
                    COUNTROWS (
                        FILTER (
                            'Sales Growth Groups',
                            [Sales Growth] >= 'Sales Growth Groups'[Min]
                                && [Sales Growth] < 'Sales Growth Groups'[Max]
                        )
                    ) > 0
                )
            )
  4. Next, I modified my Product by Sales and Customer Class by Sales visuals to use the new Sales per Growth Group measure by dragging the segment from the Sales Growth Groups table into the Legend field, the Sales Growth measure into the Tooltips field and the Sales per Growth Group measure into the Values field. Screenshot of field wells
  5. Below is a screenshot of the end results. You can now clearly tell which segments are performing better or worse than last year. Power BI Sales Summary Report
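The segmentation in steps 1 through 3 boils down to computing a growth rate and finding which [Min, Max) band it falls into. Here is a minimal Python sketch of that banding logic, with hypothetical band boundaries standing in for the 'Sales Growth Groups' table rows:

```python
# Hypothetical stand-ins for the 'Sales Growth Groups' table rows:
GROWTH_GROUPS = [
    ("Declining", -10.0, 0.0),  # growth below 0%
    ("Stable",      0.0, 0.1),  # 0% up to 10%
    ("Growing",     0.1, 10.0), # 10% and above
]

def growth_group(sales_cy, sales_ly):
    """Growth rate = CY / LY - 1, then find the [min, max) band,
    mirroring [Sales Growth] >= [Min] && [Sales Growth] < [Max]."""
    growth = sales_cy / sales_ly - 1
    for name, lo, hi in GROWTH_GROUPS:
        if lo <= growth < hi:
            return name
    return None

print(growth_group(120, 100))  # 20% growth
print(growth_group(95, 100))   # -5% growth
```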

My next modification was to my Inventory Order Analysis report. This one was just a small change by adding one DAX measure to calculate slow moving inventory and a Table visual to view the results.

  1. First, I added a DAX measure for slow moving inventory:
    SM Inventory Value = 
    VAR SumOf = 
        SUMX( Inventory, -- SUMX assumed; only the multiplication survived in the original
            Inventory[Quantity on Hand] * Inventory[Unit Cost] )
    VAR Range = DATESINPERIOD(Dates[Date], EDATE(TODAY(), -12), -24, MONTH)
    RETURN IF(
           COUNTROWS( INTERSECT( -- INTERSECT against the window assumed
               VALUES(Inventory[Last Sale Date]),
               Range )
           ) > 0, 
           SumOf, 0
    )

  2. Then I added the following columns and measure to a table visual to show my slow moving inventory. Table Visual Field Wells Screenshot
  3. Below is a screenshot of the modification to the Inventory Order Analysis report. Power BI Inventory Order Analysis
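The SM Inventory Value measure above only counts an item’s value when its last sale falls inside an older date window; DATESINPERIOD(Dates[Date], EDATE(TODAY(), -12), -24, MONTH) puts that window roughly between 36 and 12 months ago. The gist as a Python sketch with hypothetical dates:

```python
from datetime import date, timedelta

def is_slow_moving(last_sale_date, today):
    """Slow-moving: the last sale falls between roughly 36 and 12
    months ago (365-day months approximate the DAX MONTH window)."""
    window_end = today - timedelta(days=365)        # ~12 months ago
    window_start = today - timedelta(days=3 * 365)  # ~36 months ago
    return window_start <= last_sale_date <= window_end

def sm_inventory_value(qty_on_hand, unit_cost, last_sale_date, today):
    """Inventory value counted only when the item is slow-moving."""
    return qty_on_hand * unit_cost if is_slow_moving(last_sale_date, today) else 0.0

today = date(2019, 1, 1)
print(sm_inventory_value(10, 2.5, date(2017, 6, 1), today))   # inside the stale window
print(sm_inventory_value(10, 2.5, date(2018, 12, 1), today))  # sold recently, excluded
```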

Until next time, Enjoy the code and Happy Analyzing!

How to Create a Better RMA Dashboard For Mammoth Savings

OK, maybe not mammoth savings, but controlling quality and costs during the Return Material Authorization (RMA) process has a multiplicative effect on growth in a company. Finding new ways to improve perfect order performance continually reduces RMAs and increases customer satisfaction, leading to more sales.

RMAs are a direct measure of product quality and a product’s nonconformance to customers’ specifications and requirements. They are issued for a wide variety of reasons. The RMA module within Dynamics GP is designed to provide detailed traceability for both customer and supplier returns. Before we get to designing our Power BI RMA dashboard, let’s look at the Dynamics GP RMA module.

Return Material Authorization Module

The Returns Management module for Microsoft Dynamics GP enables you to meet customer and vendor demands for product and part returns by streamlining tasks and improving your responsiveness to customer queries and complaints. Generate returns from historical customer invoices, service calls, or manually. Return an item to a vendor to fix it within your company and automatically transfer items from the returns warehouse to the main warehouse. Give your customers up-to-date information about the status of their returned items.

You can use Returns Management to enter, set up, and maintain your RMA and Return To Vendor (RTV) documents. If Returns Management is integrated with Service Call Management, an RMA is created automatically from a returnable parts line on a service call. If Returns Management is integrated with Sales Order Processing, you can select items directly from the historical Sales Order Processing invoice that was generated for a specific customer.

Dynamics GP’s RMA Life Cycle

When operating a returns warehouse, you sometimes interact with customers who need to return equipment for various reasons. When they bring you a part, you complete one of a few tasks: issue a credit, repair and return the item to the customer, or replace the item. You also may be accepting the item in exchange for an item you already provided to the customer. Once you’ve completed your transaction with the customer, you have a few more options: repair the item and return it to your inventory, return the item to the vendor, or scrap the item.

The following diagram outlines the life cycle of RMAs, from entry through completion. When Returns Management is integrated with the other modules of the Field Service Series (Service Call Management, Preventive Maintenance, Contract Administration, and Depot Management), many new options and paths become available.

RMA Life Cycle

You can create RMAs from two different points of access throughout Field Service:

  • Manual entry in the RMA Entry/Update window
  • From return lines for returnable items on a service call

Entry in the RMA Entry/Update window is the method described in this manual. Refer to the Service Call Management documentation for more information regarding service calls and returnable items.

RMA Types Inside Dynamics GP

RMA documents are used to track an item return from your customers. The available RMA document types are as follows:

  • Credit – Provide a credit to your customer’s account in Receivables Management for the value of the items the customer returned to you.
  • Replacement – Provide the same item, or a similar item, as a replacement to your customer. You must receive the original item from your customer before you send the replacement item on a new order document in Sales Order Processing.
  • Advance Cross–ship – Provide the same item, or a similar item, as a replacement to your customer. You can send the replacement item using a new Sales Order Processing order document prior to receiving the original item from your customer.
  • Repair and Return – You, or your vendor, will repair the item that is received from the customer. Your customer will receive the item after it’s been repaired.
  • None – The customer’s original item is picked up by your field service technician and returned to your returns warehouse. This type of RMA document was designed to integrate directly with Service Call Management.

Analyzing Our RMA Data with Power BI

With all of the RMA data in hand, we will be using the following three visuals and associated measures:

  • Rate of Return – This is an incredibly useful KPI in a distribution center, especially when segmented by cause for return. Identifying causes for returns (damage, late delivery, inaccurate product description, wrong item shipped, etc.) helps warehouse managers address underlying issues and make necessary improvements. Number of Units Returned / Number of Units Sold = Rate of Return.
  • Perfect Order Rate – This KPI measures how many orders your warehouse successfully delivers without incident: the correct item, shipped on time and received in good condition by the customer who ordered it. Lean practices help identify errors or inaccuracies before orders leave the warehouse. Orders Completed Without Incident / Total Orders Placed = Perfect Order Rate.
  • RMA Pareto Analysis – Done on the top 20% of factors that drive 80% of the returns. This makes root-cause troubleshooting more efficient, leading to permanent solutions to the problems that may be causing RMAs in the first place. I’m not going to go into detail on how to build this chart in this blog post; you can find the steps on how to complete it here:
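Both ratio KPIs above reduce to simple divisions, as this small Python sketch with hypothetical counts shows:

```python
def rate_of_return(units_returned, units_sold):
    """Rate of Return = units returned / units sold."""
    return units_returned / units_sold if units_sold else 0.0

def perfect_order_rate(orders_without_incident, total_orders):
    """Perfect Order Rate = incident-free orders / total orders."""
    return orders_without_incident / total_orders if total_orders else 0.0

print(rate_of_return(25, 1000))       # 2.5% of units came back
print(perfect_order_rate(950, 1000))  # 95% of orders shipped clean
```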

First, we need to get our Dynamics GP RMA data into Power BI and we’ll do that with the SQL script below:

SELECT
  b.*,
  rs.STSDESCR AS [RMA Status Description],
  iv.ITMCLSCD AS [Item Class]
FROM (
  SELECT
    300 AS Company_Key,
    b.RETDOCID AS [RMA Document ID],
    b.RETREF AS [RMA Reference],
    b.RETSTAT AS [RMA Status],
    b.RETTYPE AS [RMA Type],
    a.COMPDTE AS [DocDate],
    b.OFFID AS [Office ID],
    b.LOCNCODE AS [Location Code],
    b.SOPNUMBE AS [Invoice Number],
    b.SVC_RMA_Reason_Code AS [RMA Reason Code],
    b.SVC_RMA_Reason_Code_Desc AS [RMA Reason Code Description],
    b.UNITCOST AS [Unit Cost],
    b.EXTDCOST AS [Extended Cost],
    b.ITEMNMBR AS [Item Number] -- kept for the item master join below
  FROM dbo.SVC05200 b (NOLOCK) -- open RMA lines
  JOIN dbo.SVC05000 a (NOLOCK) -- open RMA headers
    ON b.RETDOCID = a.RETDOCID -- join key assumed; verify against your SVC tables
   AND b.COMPDTE >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 4, 0) -- Didn't want everything so limiting to last 4 years of RMA data
   AND a.COMPDTE >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 4, 0)

  UNION ALL

  SELECT
    300 AS Company_Key,
    b.RETDOCID AS [RMA Document ID],
    b.RETREF AS [RMA Reference],
    b.RETSTAT AS [RMA Status],
    b.RETTYPE AS [RMA Type],
    a.COMPDTE AS [DocDate],
    b.OFFID AS [Office ID],
    b.LOCNCODE AS [Location Code],
    b.SOPNUMBE AS [Invoice Number],
    b.SVC_RMA_Reason_Code AS [RMA Reason Code],
    b.SVC_RMA_Reason_Code_Desc AS [RMA Reason Code Description],
    b.UNITCOST AS [Unit Cost],
    b.EXTDCOST AS [Extended Cost],
    b.ITEMNMBR AS [Item Number]
  FROM dbo.SVC35200 b (NOLOCK) -- historical RMA lines
  JOIN dbo.SVC35000 a (NOLOCK) -- historical RMA headers
    ON b.RETDOCID = a.RETDOCID
   AND b.COMPDTE >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 4, 0)
   AND a.COMPDTE >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 4, 0)
) b
LEFT JOIN dbo.IV00101 iv (NOLOCK)
  ON b.[Item Number] = iv.ITEMNMBR -- join key assumed
LEFT JOIN dbo.SVC05500 (NOLOCK) rs
  ON b.[RMA Status] = rs.RETSTAT

With the data now loaded into Power BI Desktop, I created relationships with the Inventory, Date and Customer dimensions that I already had in Power BI Desktop from my previous blog posts.

Creating Our Measures

Time to build the Rate of Return Measure. Rather than create one big DAX measure we will be building several small measures that build upon each other. Here are the DAX formulas that we will be using for the Rate of Return measure:

  • Returns Lbs. CY
Returns Lbs. CY = CALCULATE(
        SUM(Returns[Total Return Lbs])
)
  • Lbs. CY

Lbs. CY = SUM(Sales[ExtendedWeight])

  • Rate of Return
Rate of Return = DIVIDE(
    [Returns Lbs. CY], [Lbs. CY], 0)
  • Monthly Average Rate of Return
Monthly Avg. Rate of Return = AVERAGEX(
    DATESINPERIOD(
        Dates[Date] , 
        LASTDATE( Dates[Date] ),
        -1, MONTH ), -- one-month window assumed from the measure name
    [Rate of Return]
)
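The AVERAGEX over DATESINPERIOD pattern used above averages a daily measure over a trailing window that ends at the last visible date. Here is a minimal Python sketch of the same idea over a list of daily values, using a 30-day window as a stand-in for the one-month window:

```python
def trailing_average(daily_values, window=30):
    """Average of the last `window` daily values, mirroring
    AVERAGEX over DATESINPERIOD ending at LASTDATE."""
    recent = daily_values[-window:]
    return sum(recent) / len(recent) if recent else 0.0

# Hypothetical daily Rate of Return values: 20 quiet days, 10 busier days.
rates = [0.02] * 20 + [0.04] * 10
print(trailing_average(rates, window=30))
print(trailing_average(rates, window=10))
```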

With our calculations now complete, we create a line chart with Date on the Axis and Monthly Avg. Rate of Return as the Value, as seen in the bottom-left line chart visual of our Returns Analysis report screenshot below.

Up next, our Perfect Order Rate measure. Once again, we will build several small calculations and add them together to create our Monthly Avg. Perfect Order Rate. Here are the DAX formulas that I used for this:

  • Total Invoice Count – Since my data includes line level detail for each Sales Order type in the Sales table, I’ll count all invoice lines and evaluate each line later to make sure it was a perfect order.
Total Invoices Count = CALCULATE(
    COUNTROWS('Sales'), -- aggregation assumed: counting invoice lines, per the note above
    FILTER('Sales','Sales'[SOPTYPE] = 3)
)
  • Perfect Order – Now we will determine if the order was fulfilled before the customer’s required date.
Perfect Order = CALCULATE(
    COUNTROWS('Sales'),
    FILTER('Sales','Sales'[SOPTYPE] = 3),
    FILTER('Sales','Sales'[Document Date] <= 'Sales'[Req Ship Date])
)
  • Perfect Order Rate
Perfect Order Rate = DIVIDE(
    [Perfect Order],[Total Invoices Count],
    0)
  • Monthly Average Perfect Order Rate
Monthly Avg. Perfect Order Rate = AVERAGEX(
    DATESINPERIOD(
        Dates[Date] , 
        LASTDATE( Dates[Date] ),
        -1, MONTH ), -- one-month window assumed
    [Perfect Order Rate]
)

With all of our DAX measures complete, we can create the visuals needed to show the Monthly Average Perfect Order Rate and the Monthly Average Rate of Return. Here is what my final Returns Analysis report looks like.

RMA Dashboard

Until next time, Happy Analyzing!

Enhanced Dynamics GP Payables Workflow Reporting

A couple of days ago, Gina left a comment on my 3 Secrets To An Awesome ERP System! blog post. She was wondering if it was possible for the Payables Transaction Workflow to include Document Type, Document Date, Document Number, Transaction Description, Vendor Name and Amount. The answer is YES, it is possible. The below SQL view only works with the Payables Transaction Workflow but can be modified to work with other transactional workflows. It will not work with batch workflows because batches lack a unique ID to create the relationship against. Here is the modified SQL view to complete the request:


CREATE VIEW [dbo].[vw_WorkFlow_Status]
AS
-- NOTE: missing keywords and join conditions have been reconstructed below;
-- join keys flagged as assumptions should be verified against your GP schema.
WITH CTE_FINAL (WorkflowInstanceID, Workflow_Name, Workflow_Step_Name, Approver, Workflow_Action, Completion_Date, Completion_Time, Comments)
AS
(SELECT d.WorkflowInstanceID,
        d.Workflow_Name,
        d.Workflow_Step_Name,
        CASE WHEN a.ADDisplayName IS NULL THEN ''
           ELSE a.ADDisplayName
           END AS [Assigned Approver],
        CASE WHEN d.Workflow_Action = 1 THEN 'Submit'
           WHEN d.Workflow_Action = 2 THEN 'Resubmit'
           WHEN d.Workflow_Action = 3 THEN 'Approve'
           WHEN d.Workflow_Action = 4 THEN 'Task Complete'
           WHEN d.Workflow_Action = 5 THEN 'Reject'
           WHEN d.Workflow_Action = 6 THEN 'Delegate'
           WHEN d.Workflow_Action = 7 THEN 'Recall'
           WHEN d.Workflow_Action = 8 THEN 'Escalate'
           WHEN d.Workflow_Action = 9 THEN 'Edit'
           ELSE 'Final Approve'
           END AS Workflow_Action,
        CONVERT(VARCHAR(10), d.Workflow_Completion_Date, 101) AS [Completion_Date],
        RIGHT('0' + LTRIM(RIGHT(CONVERT(VARCHAR, d.Workflow_Completion_Time, 100), 8)), 7) AS Completion_Time,
        d.Workflow_Comments -- comments column assumed
 FROM dbo.WF30100 d
 LEFT JOIN WF40200 a ON d.Workflow_Step_Assign_To = a.UsersListGuid
 WHERE d.Workflow_Action = 10),
CTE_PM AS
(SELECT P.VENDORID Vendor_ID,
        V.VENDNAME Vendor_Name,
        P.VCHRNMBR Voucher,
        CASE P.DOCTYPE
             WHEN 1 THEN 'Invoice'
             WHEN 2 THEN 'Finance Charge'
             WHEN 3 THEN 'Misc Charge'
             WHEN 4 THEN 'Return'
             WHEN 5 THEN 'Credit Memo'
             WHEN 6 THEN 'Payment'
             END Document_Type,
        P.DOCDATE Document_Date,
        P.PSTGDATE GL_Posting_Date,
        P.DUEDATE Due_Date,
        P.DOCNUMBR Document_Number,
        P.DOCAMNT Document_Amount,
        P.CURTRXAM Unapplied_Amount,
        P.TRXDSCRN [Description],
        CASE P.VOIDED
             WHEN 0 THEN 'No'
             WHEN 1 THEN 'Yes'
             END Voided
 -- open, history and work payables tables; column lists reconstructed, adjust as needed
 FROM (SELECT VENDORID, VCHRNMBR, DOCTYPE, DOCDATE, PSTGDATE, DUEDATE, DOCNUMBR, DOCAMNT, CURTRXAM, TRXDSCRN, VOIDED FROM PM20000
       UNION ALL
       SELECT VENDORID, VCHRNMBR, DOCTYPE, DOCDATE, PSTGDATE, DUEDATE, DOCNUMBR, DOCAMNT, CURTRXAM, TRXDSCRN, VOIDED FROM PM30200
       UNION ALL
       SELECT VENDORID, VCHRNMBR, DOCTYPE, DOCDATE, PSTGDATE, DUEDATE, DOCNUMBR, DOCAMNT, CURTRXAM, TRXDSCRN, 0 AS VOIDED FROM PM10000) P -- VOIDED default for the work table assumed
 JOIN PM00200 V ON P.VENDORID = V.VENDORID) -- vendor master
 SELECT  WF.* 
        ,COALESCE(pm.Document_Type, '') AS Document_Type
        ,COALESCE(pm.Document_Date, '') AS Document_Date
        ,COALESCE(pm.Document_Number, '') AS Document_Number
        ,COALESCE(pm.Description, '') AS Description
        ,COALESCE(pm.Vendor_Name, '') AS Vendor_Name
        ,COALESCE(pm.Document_Amount, 0) AS Document_Amount
 FROM (SELECT ISNULL(C.WfBusObjKey, '') WFBusObjKey ,
             LEFT(WfBusObjKey, ISNULL(NULLIF(CHARINDEX('~', WFBusObjKey) - 1, -1), LEN(WfBusObjKey))) as Split,
             A.Workflow_History_User ,
             A.Workflow_Name ,
             A.Workflow_Step_Name ,
             ISNULL(B.WorkflowTaskAssignedTo, '') WorkflowTaskAssignedTo ,
             CASE C.Workflow_Status
               WHEN 1 THEN 'Not Submitted'
               WHEN 2 THEN 'Submitted (Deprecated)'
               WHEN 3 THEN 'No Action Needed'
               WHEN 4 THEN 'Pending User Action'
               WHEN 5 THEN 'Recalled'
               WHEN 6 THEN 'Completed'
               WHEN 7 THEN 'Rejected'
               WHEN 8 THEN 'Workflow Ended (Deprecated)'
               WHEN 9 THEN 'Not Activated'
               WHEN 10 THEN 'Deactivated (Deprecated)'
               ELSE ''
             END AS Workflow_Status ,
             CASE A.Workflow_Action
               WHEN 1 THEN 'Submit'
               WHEN 2 THEN 'Resubmit'
               WHEN 3 THEN 'Approve'
               WHEN 4 THEN 'Task Complete'
               WHEN 5 THEN 'Reject'
               WHEN 6 THEN 'Delegate'
               WHEN 7 THEN 'Recall'
               WHEN 8 THEN 'Escalate'
               WHEN 9 THEN 'Edit'
               WHEN 10 THEN 'Final Approve'
               ELSE ''
             END AS Workflow_Action ,
             A.Workflow_Due_Date ,
             A.Workflow_Completion_Date ,
             A.DEX_ROW_ID ,
         CASE WHEN d.Approver is null THEN ''
           ELSE d.Approver
        END as Approver,
        CASE WHEN d.Completion_Date is null THEN ''
           ELSE d.Completion_Date
        END as Completion_Date,
        CASE WHEN d.Completion_Time is null THEN ''
           ELSE d.Completion_Time
        END as Completion_Time,
        CASE WHEN d.Comments is null THEN ''
           ELSE d.Comments
        END as Comments
     FROM    WF30100 AS A
             LEFT OUTER JOIN WFI10004 AS B ON A.WorkflowInstanceID = B.WorkflowInstanceID
             LEFT OUTER JOIN WFI10002 AS C ON A.WorkflowInstanceID = C.WorkflowInstanceID -- workflow master; join assumed
             LEFT OUTER JOIN CTE_FINAL AS d ON A.WorkflowInstanceID = d.WorkflowInstanceID) WF
 LEFT JOIN CTE_PM pm ON WF.Split = pm.Voucher
            AND A.WorkflowStepInstanceID = B.WorkflowStepInstanceID
            LEFT OUTER JOIN WFI10002 AS C ON C.WorkflowInstanceID = A.WorkflowInstanceID
            LEFT OUTER JOIN CTE_FINAL d ON D.WorkflowInstanceID = c.WorkflowInstanceID) WF
            LEFT OUTER JOIN CTE_PM pm on pm.Voucher = WF.Split
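The final join works because the Payables voucher number is the first segment of the workflow business object key, which the query extracts with LEFT/CHARINDEX into the Split column. A minimal sketch of that same logic, with hypothetical key values for illustration (Python used only to show the idea):

```python
def voucher_from_bus_obj_key(key: str) -> str:
    """Return the text before the first '~' in a workflow business
    object key, mirroring the LEFT/CHARINDEX/NULLIF logic in the SQL.
    If there is no '~', the whole key is returned unchanged."""
    tilde = key.find("~")
    return key if tilde == -1 else key[:tilde]

# Hypothetical examples -- real WfBusObjKey values come from WFI10002.
print(voucher_from_bus_obj_key("00000000000000123~1"))  # voucher portion only
print(voucher_from_bus_obj_key("NOKEY"))                # no '~', returned as-is
```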

Until next time. Enjoy the code!

3 Secrets To An Awesome ERP System!

Part of being a Microsoft Dynamics Consultant is seeing all the inventive ways clients use the software we deploy, taking that knowledge, and providing best practices to everyone! Here are just some of the things that can enhance your system processes:

  • Best Practices
  • Customization and Modifications
  • Reporting Enhancements

Best Practices

Customization, Modifications and Third Party Add-ins

  • General Ledger, Sales Distribution, and Payables Distribution intercompany Excel paste – James Lyn’s Excel Paste add-in is great at extending Dynamics GP’s out-of-the-box functionality. While you’re on his site, check out his other add-ins, like GP Batch Attach for Payables.
  • Create Dynamics GP macros or use PowerShell scripts to automate tasks – e.g., a macro to log into Dynamics GP and run the inventory reconcile process, or a script to reboot your web client servers to remove hung processes.
  • Custom workflows and reporting – reporting to provide detail information on current/open and historical/approved workflows. Find out how to do that here.

Reporting Enhancements

Dynamics GP comes with some good reporting capabilities:

  • Management Reporter.
  • Excel refreshable reports.
  • SmartList.
  • Jet Express for Dynamics GP.
  • Solver’s BI360.
  • Power BI.

With Dynamics GP 2018 you can now deploy the Power BI GP content pack or embed Power BI visuals inside of Dynamics GP. So what do the Power BI content pack visuals look like, and how do we get them installed? As of Microsoft Dynamics GP 2018, the GP OData service was updated to OData version 4. This redesign also brought paging and filtering of OData requests, creating a more stable and robust platform for delivering Microsoft Dynamics GP content to authenticated users. The Power BI content pack features sample reports for Financial, Sales, Purchasing and Inventory data. Each report utilizes relationships built between GP tables and various filters that can be used to display the information that is important to you. You can also review the included calculated columns as examples for including calculations on your Power BI reports, such as Net Debit/Credit, Profit, and Item Sales amounts.
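To make the OData v4 paging and filtering concrete, here is a sketch of what such a request URL looks like. The host and service path below are placeholders, not your actual GP OData endpoint, and the filter field is illustrative:

```python
from urllib.parse import urlencode

# Hypothetical GP OData v4 endpoint -- replace host/company/entity with your own.
base = "https://gpserver/gpodata/Company/Customers"

# $top/$skip page through results; $filter limits rows on the server side.
params = {"$top": "50", "$skip": "100", "$filter": "State eq 'NY'"}
url = base + "?" + urlencode(params)
print(url)
```

Paging like this is what keeps a large customer or transaction table from being pulled across the wire in one request.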

In order to use the Power BI content pack with Dynamics GP, install OData Services. Once this is complete, you will need to publish the following Data Sources inside Dynamics GP (Administration > System > OData > Data Sources) for the GP content pack.

  • Accounts
  • Account Transactions
  • Customers
  • Inventory Sales Summary Period History
  • Inventory Transactions
  • Item Quantities
  • Purchase Line Items
  • Purchase Requisition Lines
  • Purchase Requisitions
  • Receiving Line Items
  • Sales Line Items
  • Vendors

The Power BI Content Pack will also have to be configured to point to your existing Microsoft Dynamics GP OData Service. This can be done by modifying the existing data source in Power BI, or by creating a new data source and configuring the content to use the new source. The screenshots below show what the GP content pack sample reports look like.

Finance Dashboard


Sales Dashboard


Purchasing Dashboard


Inventory Dashboard


Enhancing Our Dashboards

These GP content pack reports are a good starting point and can speed up the process of implementing a Power BI solution. With a little work from your Microsoft Dynamics Consultant, we can provide you so much more. In February of 2017, I started a blog series that provided a “how to” on building a Finance, Sales, Customer, Product, and Inventory dashboard. Since my main goal was to show an update of the Excel multi-company dashboard, I chose to first build a small DataMart and integrate the data from the Dynamics GP databases before building my visuals. Follow the links below to find out how I built each one of the enhanced dashboards.

Enhanced Finance Dashboard

The finance dashboard from the blog series now provides a summary profit & loss statement that lets you drill down into line-level detail. Additionally, it shows total sales by inventory class and customer class, along with a weekly moving average.

Enhanced Finance Dashboard

Enhanced Sales Dashboard

The sales summary dashboard provides an analysis of sales by inventory item class, customer class and weekly moving averages for total sales dollars and transactions. The report also includes a cumulative sales and cost comparison.

Enhanced Sales

Product Performance and Inventory Reorder Dashboard

In the product performance dashboard, I’ve added comparisons of total profit and profit margin, cost by warehouse, and an ABC segmentation analysis. Additionally, there is a weekly moving average cost comparison chart at the bottom of the report.

Enhanced Product Performance

For my inventory reorder analysis report, I’m using some of the concepts discussed by Belinda Allen in her Inventory Item Reorder Dashboard. I converted it to Power BI to help your procurement manager evaluate what’s on hand, what’s allocated to open orders, and what’s been sold within a given time period.

Enhanced Inventory Dashboard

Future Developments

Returns are generally thought of as losses, and return percentages can depend on both the type of product and the company’s returns policy. While the average industry rate is four percent, consumer durable goods can range from two to 10 percent and apparel can be in excess of 20 percent. There are several reasons for merchandise returns, and tracking the costs and reasons associated with them can increase revenue, lower costs, improve profitability and enhance levels of customer service. Using Dynamics GP’s RMA module to capture returns, and then analyzing that data in Power BI, is one of the things that I’m currently working on. Below is a screenshot of that analysis, the subject of a future blog post.

Sales Returns Dashboard

Ready to Do Even More with Dynamics and Power BI?

Stay tuned for more help in leading your organization toward becoming a data-driven organization by exploring your Business Intelligence, BI360, Power BI, and Microsoft Dynamics GP journey.

How to Enhance Dynamics GP’s Inventory BOM Report Using SSRS

During a recent project, a client wanted to enhance a custom inventory BOM report made by another vendor. Not having access to the original code, I proposed modifying the Bill of Materials Maintenance window to add a print button and developing the report in SSRS.

Modifying the window

1. Log into Dynamics GP and open the Bill of Material Maintenance window in the Inventory Module.

2. On the window, navigate to Tools>>Customize>>Modify Current Window. This will open Modifier.

Dynamics GP BOM Window

3. Inside of Modifier, click the “OK” button in the toolbox on the left and drag it onto the menu bar of the Bill of Material Maintenance window, as shown in the screenshot below.

BOM window in Modifier

4. Open the Properties of the new “OK” button and change the text for the button. I changed mine to “Print BOM”. Save the changes and Exit Modifier to get back into Dynamics GP.

BOM window in Modifier 2

Adding VBA code to our window

1. With the Window now modified, we need to navigate back to Tools>>Customize>>Add Current Window to Visual Basic. We also need to add the “Print BOM” button and Bill Number field to Visual Basic by selecting the “Add Fields to Visual Basic…” menu.

Add BOM window to VBA Project

2. Now open the Visual Basic Editor (Tools>>Customize>>Visual Basic Editor). We need to make the button functional by adding some VBA code.

BOM window in VBA

a. First let’s add the References that we need for the project by navigating to Tools>>References from the menu bar. Add or verify that the following References are selected: Visual Basic for Applications, Microsoft Dynamics GP VBA 18.0 Type Library, OLE Automation and Microsoft ActiveX Data Objects 2.1 Library.

Dynamics GP VBA project references

b. Before writing our VBA code, we should verify that the window and both fields were added to the VBA project. We can do that by clicking on the dropdown menu where you see “(General)”. Your screen should look similar to my screenshot below.

VBA project windows and fields

c. Add the following VBA code to the VBA project. You will need to change the highlighted VBA code based on your SSRS server name, the folder location of the report that we will be building, and the name of the SSRS report. In the VBA code below:

i. XXXXX – my SSRS server name

ii. TWO – the folder that I saved the report in

iii. BOM Indented – what I named my report

Sub window_Open(strLocation As String, Menubar As Boolean, top As Long, left As Long, height As Long, width As Long, resizable As Boolean)
    With CreateObject("InternetExplorer.Application")
        .Visible = False
        .top = top
        .left = left
        .height = height
        .width = width
        .Menubar = Menubar
        .Visible = True
        .resizable = resizable
        .Navigate strLocation
    End With
End Sub

Private Sub PrintBOM_BeforeUserChanged(KeepFocus As Boolean, CancelLogic As Boolean)
    Dim IBOM As String
    IBOM = CStr(BillNumber)
    window_Open "http://XXXXXXX/ReportServer/Pages/ReportViewer.aspx?%2fTWO%2fBOM+Indented&rs:Command=Render&BOMItem=" & IBOM, False, 10, 10, 750, 1250, True
End Sub
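The string handed to window_Open is a standard SSRS URL-access request: the URL-encoded report path, then the rs:Command=Render directive, then the report parameters. A small sketch of how those pieces compose (server, folder, and report names here are placeholders matching the XXXXX/TWO/BOM Indented example above; Python used only for illustration):

```python
def ssrs_render_url(server: str, folder: str, report: str, **params: str) -> str:
    """Build an SSRS URL-access string like the one passed to window_Open.
    '/' in the report path becomes '%2f'; spaces in the report name become '+'."""
    path = f"%2f{folder}%2f{report.replace(' ', '+')}"
    query = "&".join(f"{k}={v}" for k, v in params.items())
    return f"http://{server}/ReportServer/Pages/ReportViewer.aspx?{path}&rs:Command=Render&{query}"

# Placeholder server/folder/report; BOMItem is the report parameter we added.
print(ssrs_render_url("XXXXX", "TWO", "BOM Indented", BOMItem="100XLG"))
```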

Creating the stored procedure for the SSRS report

Now the tricky part: dealing with the unbounded number of levels that a BOM hierarchy can go down. With the help of Google, I found this forum post. Original credit and thanks go to Beat Bucher and Tim Foster for the post and the original SQL code. I have modified the original stored procedure to fit my client’s needs. Here is the stored procedure code that I used:
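Before reading the T-SQL, it may help to see the idea in miniature: treat the BOM as a stack, emit each item with its level, and keep descending until no components remain. A sketch of that traversal over a toy BOM (illustrative data only, not GP tables):

```python
def explode_bom(bom: dict, top: str):
    """Depth-first BOM explosion -- the same idea the stored procedure
    implements with #stack/#result temp tables and a WHILE loop."""
    result = []           # (level, item) rows, in BOM order
    stack = [(1, top)]
    while stack:
        lvl, item = stack.pop()
        result.append((lvl, item))
        # Push components in reverse so they pop back out in original order.
        for comp in reversed(bom.get(item, [])):
            stack.append((lvl + 1, comp))
    return result

# Toy BOM: finished good -> subassembly -> parts.
toy = {"BIKE": ["FRAME", "WHEEL"], "WHEEL": ["RIM", "SPOKE"]}
print(explode_bom(toy, "BIKE"))
# [(1, 'BIKE'), (2, 'FRAME'), (2, 'WHEEL'), (3, 'RIM'), (3, 'SPOKE')]
```

The stored procedure does the same walk, but against BM00111 and entirely in set-based T-SQL, because SSRS needs the result as a query.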





CREATE PROCEDURE [dbo].[usp_BOM_Level_Details] (@current char(31) --, @BOMType smallint
) AS
BEGIN

IF OBJECT_ID('tempdb..#result') IS NOT NULL DROP TABLE #result;
IF OBJECT_ID('tempdb..#stack') IS NOT NULL DROP TABLE #stack;

--DECLARE @current char(31) --> use the 2 lines here to test directly in SQL Studio Mgmt
--SET @current = 'BB00069E0000BT9'

--if @current is null set @current = 'BB00069E0000BT9'

DECLARE @lvl int, @line char(31), @Qty numeric(19,5), @ORD1 int, @UOFM char(9), @comptitm char(31)

CREATE TABLE #stack (item char(31), comptitm char(31), Design_Qty numeric(19,5), ORD1 int, UOFM char(9), lvl int)
CREATE TABLE #result (lvl int, item char(31), comptitm char(31), Qty numeric(19,5), ORD1 int, UOFM char(9),
ord int identity(1,1))

INSERT INTO #stack VALUES (@current, '', 1, 1, '', 1)
SELECT @lvl = 1

WHILE @lvl > 0
BEGIN
    IF EXISTS (SELECT * FROM #stack WHERE lvl = @lvl)
    BEGIN
        SELECT @current = item, @comptitm = comptitm, @Qty = Design_Qty, @ORD1 = ORD1, @UOFM = UOFM
        FROM #stack
        WHERE lvl = @lvl

        -- SELECT @line = replicate('-', (@lvl - 1)) + ' ' + @current --> spacing by level
        --PRINT @line --> replace this print with an INSERT to another table like #BOM

        INSERT #result SELECT @lvl, @current, @comptitm, @Qty, @ORD1, @UOFM

        DELETE FROM #stack
        WHERE lvl = @lvl
          AND item = @current

        INSERT #stack
        SELECT CMPTITNM, ITEMNMBR, Design_Qty, ORD, UOFM, @lvl + 1
        FROM BM00111 --> I edited this for GP
        WHERE ITEMNMBR = @current and Component_Status = 1 --and BOMCAT_I = @BOMType --1 --> added BOMCAT for MFG(1) or ENG(2)
        ORDER BY ORD --Order by Part Number as in GP Reports CMPTITNM
        --ORDER BY POSITION_NUMBER ASC --Order by Position Number--other sequence in GP

        SELECT @lvl = @lvl + 1
    END
    ELSE
    BEGIN
        SELECT @lvl = @lvl - 1
    END
END

drop table #stack

;WITH CTE as (select ITEMNMBR FROM BM00101)
SELECT T1.lvl,
       CASE WHEN T1.lvl = 1 THEN ''
            ELSE RTRIM(T1.item)
       END as item,
       CASE WHEN T1.lvl = 1 THEN RTRIM(T1.item)
            ELSE RTRIM(T1.comptitm)
       END as comptitm,
       T1.Qty, T1.ORD1,
       T1.UOFM,
       T2.ITEMDESC,
       T2.CURRCOST
       ,CASE WHEN T1.item in (select ITEMNMBR FROM CTE)
             THEN 0 -- assemblies roll up from their components, so no extended cost at this level
             ELSE T1.Qty * T2.CURRCOST
       END as [Ext Cost]
FROM #result T1 INNER JOIN IV00101 T2
     ON T2.ITEMNMBR = T1.item
ORDER BY T1.ord

DROP TABLE #result;

END


Building our SSRS BOM Indented Report

SSRS (SQL Server Reporting Services) is a server-based reporting system from Microsoft and part of the SQL Server stack. It can be used to prepare and deliver reports through a web site. You can build your reports with either Report Builder or Visual Studio. For this project, we are going to be using Report Builder. Let’s start creating our Indented BOM report using a wizard.

Creating the Data Connection

1. Start Report Builder either from your computer or the Reporting Services web portal. Select New Report in the left pane and Table or Matrix Wizard in the right pane.

SSRS wizard

2. Specify a Data Connection in the Table Wizard

A data connection contains the information to connect to an external data source such as a SQL Server database. Usually, you get the connection information and the type of credentials to use from the data source owner. To specify a data connection, you can use a shared data source from the report server or create an embedded data source that is used only in this report.

3. Click the New button to create a new Data Source, and then build on the Data Source Properties window that opens.

SSRS Data Source

4. Enter your Connection Properties for the SQL Server and Dynamics GP company database.

SSRS Data Source connection

5. Click the General tab again. To verify that you can connect to the data source, click Test Connection.

The message “Connection created successfully” appears.

6. Click OK.

Creating our Datasets

1. Right click on your new Data Source to create a Dataset.

2. Choose to use a dataset embedded in the report and a query type of Stored Procedure. Choose the stored procedure we created earlier (usp_BOM_Level_Details).

SSRS Dataset

3. Add a Parameter for the Dataset Properties. I named mine @BOMItem

Organize Data into Groups in the Table Wizard

When you select fields to group on, you design a table that has rows and columns that display detail data and aggregated data. To start organizing your data into groups:

1. Navigate to the Insert menu, click on the Table icon and select Table Wizard.

SSRS Table Wizard

2. On the New Table or Matrix window drag the lvl, comptitm, item, ITEMDESC and UOFM to the Row Groups section. Drag the Qty, CURRCOST and Ext_Cost fields to the Values Section.

SSRS Table or Matrix Wizard

3. Click Next and de-select Expand/collapse groups.

Table Wizard step two

4. Remove the sub-grouping totals, format the numbers to the fourth decimal place and change the row height to 0.025. Your table should look like the screenshot below when you’re done.


5. Change the Tablix Properties for the Row and Column Headers to Repeat on each page.

Tablix Properties

The Final Product

Once complete, you should have a Dynamics GP Bill of Materials Maintenance window like the first screenshot below, and when you select a Bill Number and click the Print BOM button, the SSRS report will open and produce the results in the second screenshot. You can download a zip file with the stored procedure, Dynamics GP package file and SSRS .rdl report from here.

Modified Dynamics GP BOM Window

SSRS Indented Inventory BOM Report

Ready to Do Even More with Dynamics and Business Intelligence?

Check out my other blog posts that can help your company become a data-driven organization by exploring Business Intelligence, BI360, Power BI, Microsoft Dynamics GP and CRM.