How to Use Power BI to Analyze Microsoft Dynamics Data

Over the past several months, I had the pleasure of presenting at the Power Platform World Tour in Reston at the Microsoft Center and teaching Dashboard in a Day at the Bethesda Microsoft Center. If you haven’t done either, I highly recommend them. The Power Platform World Tour’s lineup of great content showcases modern data visualization, app customization, and innovation at its core. The event series, brought to you by your Power Platform User Groups, gives you the opportunity to meet both Microsoft and industry experts, explore the latest business challenges and solutions across various industries, and see the innovative world of technology in action. Join us on this exploration around the world, gaining peer-to-peer networking opportunities and building a local support system sure to provide value 365 days a year.

Innovate for Greatness

The theme of this year’s Power Platform World Tour is pursuing greatness through innovation. We’ll be focusing on how the Power Platform applications can push the envelope on your business needs. Industry experts will be showing you the power and agility of Power BI, PowerApps, and Microsoft Flow through robust learning sessions, demos, and Q&A opportunities.

The Power Platform World Tour approach is one that focuses on you, the end user, pushing the boundaries of current business practice toward empowerment, customization, and innovation through a community approach. We believe these applications, along with this incredible event, are the recipe for success in 2019. We invite you to join us as we take a trip around the world.

Power BI – make sense of your data, produce beautiful reports, and uncover hidden insights for your business.

PowerApps – listen in as peers talk through how they have created apps that extend and mobilize their business using the no-code PowerApps option.

Microsoft Flow – find out how to automate processes, making your business more efficient.

My Power Platform Presentation

During the Power Platform World Tour, I was able to present a Power BI solution that I created for one of KTL Solutions’ clients. The screenshots below aren’t the full solution provided to the client. The Power BI PBIX file has been updated to embed the SQL queries in the Power Query code and to expose SQL Server and database parameters so it can easily be added to your environment. This Power BI Desktop file works with Dynamics GP and has the Fabrikam demo data loaded into the reports. You will need to modify the PBIX file to work with other ERP systems. So, let’s go over some of the reports within the file to get a general understanding of them.

Sales Overview

The first report is a Sales Overview that, as the name implies, provides an overview of the current year’s sales.  Across the top is the breakout of sales by location and the ability to filter the overview by ABC classification.  The report also includes performance trends, sales growth by segment, sales growth by salesperson, and sales by inventory item category that can be drilled down to the inventory item level.

Sales Overview

Customer Analysis Report

The Customer Analysis Report allows you to dig into Dynamics GP sales data by customer, year, and month.  By selecting a customer in the right-hand filter pane, you get to see three years of comparative sales by month, sales by item category, open sales orders, and a weekly moving average. Across the top of the report, you see the last sales amount, last sales date, total sales amount, total quantity sold CY (current year), total sales LY (last year), and percent sales growth for that customer.
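
For readers who want to reproduce the headline figures, the comparative measures can be written in a few lines of DAX. The sketch below is a minimal example, assuming a Sales fact table with ExtendedPrice and Document Date columns and a marked Dates table; it mirrors the [Sales CY] and [Sales LY] style measures used later in this post, but the exact names in the PBIX file may differ.

Sales CY = SUM ( Sales[ExtendedPrice] )

Sales LY = CALCULATE ( [Sales CY], SAMEPERIODLASTYEAR ( Dates[Date] ) )

-- Percent growth over the prior year; blank when there were no prior-year sales
Sales Growth % = DIVIDE ( [Sales CY] - [Sales LY], [Sales LY] )

-- Most recent invoice date for the customer in the current filter context
Last Sales Date = MAX ( Sales[Document Date] )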

Customer Analysis

Inventory Order Analysis

Inventory analysis refers to a set of metrics used to optimize inventory levels, minimizing stockouts without overstocking. To remain liquid, you must turn inventory into cash so you can pay your bills, including employee wages. The more frequently you can turn over inventory, the higher your gross profits. Stockouts and backorders cut into gross profits, but so does excess stock held beyond the safety stock you actually need. These items take up valuable space and require extra operations that drain cash from the business. To help you better analyze and manage your inventory levels, we’ve broken inventory analysis into a set of three standard calculations you can easily compute. A good inventory reorder analysis recalculates forecasted future demand and safety stock, and uses them to find your reorder point.  For the sales quantity forecasts behind my lead time demand, I’m going to use the DAX time series approach discussed here by Fountain Analytics.
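
To make the three calculations concrete before opening the PBIX file, here is a rough DAX sketch. It is only an illustration under simplifying assumptions: [Avg Daily Qty Sold] stands in for the Fountain Analytics sales quantity forecast, and the 14-day lead time and 0.5 service factor are placeholder values rather than numbers from the actual solution.

-- Average units sold per day over the dates in the current filter context (a stand-in for a real forecast)
Avg Daily Qty Sold = DIVIDE ( SUM ( Sales[Quantity] ), DISTINCTCOUNT ( Dates[Date] ) )

-- Demand expected while a replenishment order is in transit (assumes a 14-day lead time)
Lead Time Demand = [Avg Daily Qty Sold] * 14

-- Buffer against demand variability; 0.5 is an illustrative service factor
Safety Stock = [Lead Time Demand] * 0.5

-- Trigger a purchase when on-hand quantity falls to this level
Reorder Point = [Lead Time Demand] + [Safety Stock]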

Inventory Analysis

Product Performance

Procurement is an important part of any business, and regularly analyzing this data is easier with Power BI. It ensures that you are making the best use of your company’s resources. Here are a couple of things to look at when analyzing your purchasing department: cost analysis and delivery/lead time analysis.

Cost Analysis

There are several things to consider when analyzing inventory costs: obviously the unit cost of the item, but we also have to look at delivery, handling, storage, deposits, and item returns from customers.  In the Power BI analysis below, I’m simply looking at the item’s Current Cost from the Dynamics GP Inventory Item Card versus the average purchasing and per-order unit costs.
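
A hedged sketch of that comparison is below. It assumes the item card’s current cost comes in as a Products[CurrentCost] column and that purchase receipt lines land in a Purchases table with ExtendedCost and Quantity columns; those names are placeholders, so swap in whatever your model actually uses.

-- Weighted average unit cost across the purchase receipt lines in context
Avg Purchase Unit Cost = DIVIDE ( SUM ( Purchases[ExtendedCost] ), SUM ( Purchases[Quantity] ) )

-- Current cost from the Dynamics GP item card (one value per item)
Current Item Cost = MAX ( Products[CurrentCost] )

-- Positive values mean you are paying more than the item card's current cost
Cost Variance = [Avg Purchase Unit Cost] - [Current Item Cost]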

Delivery/Lead Time Analysis

Delivery/lead times can be important to companies, particularly when perishable or large goods are concerned.  Delivery/lead times that start to creep higher will hamper your item resource planning process. Your staff needs to analyze delivery/lead times and plan accordingly to prevent stockouts and to limit storage space and cash resource requirements.
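
One simple way to watch for creeping lead times is an average-days measure. The sketch below assumes a Purchases table that carries a PO date and a receipt date on each line; both column names are hypothetical.

-- Average days between placing a purchase order line and receiving it
Avg Lead Time Days =
AVERAGEX (
    Purchases,
    DATEDIFF ( Purchases[PO Date], Purchases[Receipt Date], DAY )
)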

Product Performance

Updated Power BI Desktop File

The updated Power BI Desktop file is no longer a template but comes pre-loaded with Dynamics GP Fabrikam demo data.  The data is limited since this is the demo company within Dynamics GP, but at least you can see how the reports work.  You can download it here.  Additionally, all of the SQL queries are embedded in the Power Queries that pull the data.  To add these reports to your system, you will need your Active Directory/Windows login ID added to the SQL Server.  Once that is done, follow these steps to add your own data:

  1. Click on the “Edit Queries” button to open the Power Query Editor.
  2. Navigate to either the SQLServer or Database parameter and then click on the Manage Parameters button.
  3. Enter your SQL Server and Dynamics GP company database information.
  4. Click Close & Apply to save the changes and load your Dynamics GP data.

Ready to Do Even More with Your Data?

Start organizing, knowing and executing on your data today with dataflows and Power BI to provide a self-service data lake in the future.

Dataflows for Power BI with Dynamics GP Data

Part 2

In Part 1 of Dataflows for Power BI with Dynamics GP Data, we talked about what dataflows are and started creating our first dataflow entity by creating a Date dimension. Now let’s start creating the remaining entities for:

  • Dimensions from Dynamics GP card windows via SQL queries:
    • Customer data from the Sales module’s Customer card
    • Salesperson data from the Sales module’s Salesperson card
    • Sales Territory data from the Sales module’s Sales Territory card
    • Product data from the Inventory module’s Item card
  • Fact data from our Dynamics GP transaction entry windows via SQL queries:
    • Sales data from the Sales Transaction Entry window
    • Inventory data from the Inventory Transaction Entry window

Select the “Add entities” button to add additional data.

Add entities

Select the “SQL Server database” tile.

Choose data source window

Fill out your Server, Database and Connection credentials for the Enterprise gateway.

Fill in your data source connection info

I like to connect to my data via SQL views and you can download the queries for those views here. Once you’re done creating all of the entities your dataflows screen should look something like this.

View of entities from data source based on SQL views

Creating Custom Functions in Dataflows

The easiest way to create a custom function in the Power BI service’s dataflows is to create it in Power BI Desktop’s Power Query and then open the Advanced Editor to copy and paste the M code into a blank query.

let
  Source = (input) =>

let
values = {
{"ALABAMA","AL"},
{"ALASKA","AK"},
{"ARIZONA","AZ"},
{"ARKANSAS","AR"},
{"CALIFORNIA","CA"},
{"COLORADO","CO"},
{"CONNECTICUT","CT"},
{"DELAWARE","DE"},
{"FLORIDA","FL"},
{"GEORGIA","GA"},
{"HAWAII","HI"},
{"IDAHO","ID"},
{"ILLINOIS","IL"},
{"INDIANA","IN"},
{"IOWA","IA"},
{"KANSAS","KS"},
{"KENTUCKY","KY"},
{"LOUISIANA","LA"},
{"MAINE","ME"},
{"MARYLAND","MD"},
{"MASSACHUSETTS","MA"},
{"MICHIGAN","MI"},
{"MINNESOTA","MN"},
{"MISSISSIPPI","MS"},
{"MISSOURI","MO"},
{"MONTANA","MT"},
{"NEBRASKA","NE"},
{"NEVADA","NV"},
{"NEW HAMPSHIRE","NH"},
{"NEW JERSEY","NJ"},
{"NEW MEXICO","NM"},
{"NEW YORK","NY"},
{"NORTH CAROLINA","NC"},
{"NORTH DAKOTA","ND"},
{"OHIO","OH"},
{"OKLAHOMA","OK"},
{"OREGON","OR"},
{"PENNSYLVANIA","PA"},
{"RHODE ISLAND","RI"},
{"SOUTH CAROLINA","SC"},
{"SOUTH DAKOTA","SD"},
{"TENNESSEE","TN"},
{"TEXAS","TX"},
{"UTAH","UT"},
{"VERMONT","VT"},
{"VIRGINIA","VA"},
{"WASHINGTON","WA"},
{"WEST VIRGINIA","WV"},
{"WISCONSIN","WI"},
{"WYOMING","WY"},
{"WASHINGTON, D.C.","DC"},
{"WAHINGTON","WA"},
{"ONTARIO","ON"},
{"ONTARIO CANADA","ON"},
{"QUEBEC", "QC"},
{"QUEBEC CANADA","QC"},
{"NEWFOUNDLAND","NL"},
{"ALBERTA","AB"},
{"ALBERTA CANADA","AB"},
{"BRITISH COLUMBIA","BC"},
{"BRITISH COLUMBIA CANADA","BC"},
{"MANITOBA","MB"}

},

Result = List.First(List.Select(values, each _{0}=input)){1}

in

Result
in
  Source

Get data screen of custom function

Rename the custom function to “fnLookup”.

Edit queries

Now we need to modify our Customer entity to clean up the state column. Right click on the Customer entity and select “Advanced Editor”.

Advanced editor

Copy and paste the below Power Query code into the Advanced Editor window. This custom function helps replace the data in the State column with the accepted two-character State abbreviations.

let
  Source = Sql.Database("YOUR SQL SERVER", "YOUR DATABASE"),
  #"Navigation 1" = Source{[Schema = "dbo", Item = "view_Customers"]}[Data],
  #"Renamed columns" = Table.RenameColumns(#"Navigation 1", {{"State", "State_old"}}),
  #"Trimmed text" = Table.TransformColumns(#"Renamed columns", {{"State_old", each Text.Trim(_), type text}}),
  #"Inserted conditional column" = Table.AddColumn(#"Trimmed text", "State", each try fnLookup([State_old]) otherwise [State_old]),
  #"Reordered columns" = Table.ReorderColumns(#"Inserted conditional column", {"Customer_Key", "Company_Key", "CustomerNumber", "CustomerName", "Group Name", "SubGroup Name", "Address1", "Address2", "Address3", "City", "State", "State_old", "Zip", "Phone", "Region", "GeographyKey", "Active", "CreatedBy", "CreatedOn", "UpdatedBy", "UpdatedOn", "ActivatedOn", "DeactivatedOn", "Source", "CheckSum", "CompanyID"}),
  #"Transform columns" = Table.TransformColumnTypes(#"Reordered columns", {{"State", type text}}),
  #"Replace errors" = Table.ReplaceErrorValues(#"Transform columns", {{"State", null}}),
  #"Removed columns" = Table.RemoveColumns(#"Replace errors", {"State_old"})
in
  #"Removed columns"

After all your dataflow entities have been created, save your changes and create a schedule to refresh your data.

Connecting our Dataflows to Power BI Desktop

From your workstation with the December release of the Power BI Desktop, click the “Get Data” button and select the “Power BI dataflows (Beta)” connection.

Bringing our dataflow data into Power BI

Select the dataflow entities we created in the Power BI Service.

Selecting our entities

Once all of your entities have been imported, navigate to the Relationships view to verify the correct relationships exist and make any needed changes. Your relationship view should look like the screenshot below.

Power BI relationships

Don’t forget to mark the dataflow Date entity as a date table after importing it into Power BI Desktop.

Marking the date table

I’m going to build out a Product Performance report from the dataflow entities that I created, adding ABC segmentation, moving averages, and time comparisons using DAX measures. ABC segmentation allows you to sort a list of values into three groups that have different impacts on the final result. ABC segmentation works on the famous Pareto principle, which states that 20% of efforts give 80% of the result. The meaning of the segments: A, the most important items (20% of items give 80% of the results); B, average in importance (30% of items give 15%); C, the least important (50% of items give 5%). Are three segments enough? Should we use more? What percentages? To answer these questions, you need to know your data and what the ABC segmentation is being done for. I have seen companies using four classes (ABCD) or even more. For this example, three classes have the advantage that they separate the assortment into categories of high, medium, and low importance, which is easy to communicate.
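
The Sales ABC measure shown further down filters on [Min Boundary] and [Max Boundary] values for each class. One way to supply those boundaries, sketched below with the classic 80/15/5 cut-offs, is a small calculated table plus two helper measures; the actual PBIX file may define them differently, so treat the names and percentages as adjustable assumptions.

-- A covers cumulative sales 0.00 to 0.80 (~80%), B covers 0.80 to 0.95 (~15%), C covers 0.95 to 1.00 (~5%)
ABC Classes =
DATATABLE (
    "ABC Class", STRING,
    "Min Boundary", DOUBLE,
    "Max Boundary", DOUBLE,
    { { "A", 0.00, 0.80 }, { "B", 0.80, 0.95 }, { "C", 0.95, 1.00 } }
)

-- Boundaries of the ABC class currently in filter context
Min Boundary = MIN ( 'ABC Classes'[Min Boundary] )
Max Boundary = MAX ( 'ABC Classes'[Max Boundary] )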

In the Product Performance report below, I used a scatter chart to analyze each product class’s profit margin and total profits; the Sales ABC by ABC Class stacked bar chart segments the products using ABC segmentation and allows you to drill down into each segment to see the underlying detail. Additionally, there is a total cost by warehouse location and a weekly moving average cost comparison chart at the bottom of the report. Here are the DAX measures and DAX columns that I used to create the Product Performance report:

DAX Columns:

ProductSales = 
CALCULATE(SUM('Sales'[ExtendedPrice]))

CumulatedSales = 
CALCULATE(
    SUM('Products'[ProductSales]),
    FILTER(
        ALL('Products'),
        'Products'[ProductSales] >= EARLIER('Products'[ProductSales])
    )
)

CumulatedPercentage = 'Products'[CumulatedSales] / SUM('Products'[ProductSales])

DAX Measures:

Profit Margin = DIVIDE ([Total Profits], [Total Sales], 0)

Total Sales = SUM(Sales[ExtendedPrice])

Units Sold = SUM('Sales'[Quantity])

Total Profits = [Total Sales] -[Total Cost]

Costs Weekly MA = 
AVERAGEX( 
    DATESINPERIOD( 
        Dates[Date] , 
        LASTDATE( Dates[Date] ),
        -7,
        DAY),
    [Total Cost]
)

Costs Weekly MA LY = 
AVERAGEX( 
    DATESINPERIOD( 
        Dates[Date] , 
        LASTDATE( Dates[Date] ),
        -7,
        DAY),
    [Costs LY]
)


YTD COGS =
CALCULATE (
    SUM ( Sales[ExtendedCost] ),
    FILTER ( ALL ( 'Dates' ), 'Dates'[Year] = MAX ( 'Dates'[Year] ) )
)

Qty on Hand = 
SUM (Inventory[Quantity on Hand])

Turn-Earn Index =
 ( [Inventory Turnover Ratio] * 'Sales Measures'[Profit Margin] )

Total Cost = SUM(Sales[ExtendedCost])

Sales ABC = 
CALCULATE(
    [Total Sales] , 
    VALUES( Products[ItemDescription] ),
    FILTER(
        CALCULATETABLE(
            ADDCOLUMNS(
                ADDCOLUMNS(
                    VALUES( Products[ItemDescription] ),
                    "OuterValue", [Total Sales]
                ),
                "CumulatedSalesPercentage" , DIVIDE(
                    SUMX(
                        FILTER(
                            ADDCOLUMNS(
                                VALUES( Products[ItemDescription] ),
                                "InnerValue", [Total Sales]
                            ),
                            [InnerValue] >= [OuterValue]
                        ),
                        [InnerValue]
                    ),
                    CALCULATE(
                        [Total Sales],
                        VALUES( Products[ItemDescription] )
                    )
                )
            ),
            ALL( Products )
        ),
        [CumulatedSalesPercentage] > [Min Boundary]
        && [CumulatedSalesPercentage] <= [Max Boundary]
    )
)

Product performance visualization

Ready to do more with your data? Check out these other great websites:

Dataflows for Power BI with Dynamics GP data

Part 1

It’s been a while since my last blog post, so this is rather late when it comes to talking about dataflows with Power BI, but I was really excited about the announcement in November and finally got a chance to play around with it. If you haven’t heard about dataflows or don’t know what they are, here is an excerpt from that announcement:

“In the modern BI world, data preparation is considered the most difficult, expensive, and time-consuming task, estimated by experts as taking 60%-80% of the time and cost of a typical analytics project. Some of the challenges in those projects include fragmented and incomplete data, complex system integration, business data without any structural consistency, and of course, a high skillset barrier. Specialized expertise, typically reserved for data warehousing professionals, is often required. Such advanced skills are rare and expensive.

To answer many of these challenges, Power BI serves analysts today with industry leading data preparation capabilities using Power Query in Power BI Desktop. Now, With Power BI dataflows, we’re bringing these self-service data preparation capabilities into the Power BI online service, and significantly expanding the capabilities in the following ways:

  • Self-service data prep for big data in Power BI – Dataflows can be used to easily ingest, cleanse, transform, integrate, enrich, and schematize data from a large array of transactional and observational sources, encompassing all data preparation logic. Previously, ETL logic could only be included within datasets in Power BI, copied over and over between datasets and bound to dataset management settings. With dataflows, ETL logic is elevated to a first-class artifact within Power BI and includes dedicated authoring and management experiences. Business analysts and BI professionals can use dataflows to handle the most complex data preparation challenges and build on each other’s work, thanks to a revolutionary model-driven calculation engine, which takes care of all the transformation and dependency logic—cutting time, cost, and expertise to a fraction of what’s traditionally been required for those tasks. Better yet, analysts can now easily create dataflows using familiar self-service tools, such as the well known Power Query data preparation experience. Dataflows are created and easily managed in app workspaces, enjoying all the capabilities that the Power BI service has to offer, such as permission management, scheduled refreshes, and more.
  • Advanced Analytics and AI with Azure – Power BI dataflows store data in Azure Data Lake Storage Gen2 – which means that data ingested through a Power BI dataflow is now available to data engineers and data scientists to leverage the full power of Azure Data Services such as Azure Machine Learning, Azure Databricks, and Azure SQL Datawarehouse for advanced analytics and AI. This allows business analysts, data engineers, and data scientists to collaborate on the same data within their organization.
  • Support for the Common Data Model – The Common Data Model (CDM) is a set of a standardized data schemas and a metadata system to allow consistency of data and its meaning across applications and business processes.  Dataflows support the CDM by offering easy mapping from any data in any shape into the standard CDM entities, such as Account, Contact etc. Dataflows also land the data, both standard and custom entities, in schematized CDM form. Business analysts can take advantage of the standard schema and its semantic consistency, or customize their entities based on their unique needs. The Common Data Model continues to evolve as part of the recently announced Open Data Initiative. 

Once dataflows are created, users can use Power BI Desktop and the Power BI service to create datasets, reports, dashboards, and apps to drive deep insights into their business.”

Microsoft has released a white paper on dataflows, which you can find here: https://docs.microsoft.com/en-us/power-bi/whitepapers, or check out Matthew Roche’s great blog posts at BI Polar: https://ssbipolar.com/2018/10/23/dataflows-in-power-bi/. They really are a great guide to get you started and up to speed on dataflows.

Getting started with Dataflows

So, let’s try doing some of this ourselves with Dynamics GP data. To do this, you will need to have already installed an enterprise gateway that can reach your Dynamics GP SQL Server. Log into your Power BI Pro or Premium service and create a new workspace. You should now see a “Dataflows (Preview)” option. Click on that option, then click the “+ Create” button and select Dataflow from the dropdown menu.

Dataflows Preview Window

Dataflows Preview Menu

Creating your Entities

This will open up the below window with the option to “Add new entities” or “Add linked entities”. Let’s select “Add new entities” so we can add a date dimension to our Dataflow prep.

Define new entities window

From the “Choose data source” window, select the “Blank query” tile.

Choose data source window

This will open up the “Connect to data source” window. Copy and paste the below Power Query code to create your date dimension.

Blank query Window

let
  Source = List.Dates(StartDate, Length, #duration(1, 0, 0, 0)),
  #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
  #"Renamed Columns" = Table.RenameColumns(#"Converted to Table", {{"Column1", "Date"}}),
  #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns", {{"Date", type date}}),
  StartDate = Date.StartOfYear(Date.AddYears(DateTime.Date(DateTime.LocalNow()),-10)),
  Today = Date.EndOfYear(Date.AddYears(DateTime.Date(DateTime.LocalNow()),5)),
  Length = Duration.Days(Today - StartDate),
  Custom1 = #"Changed Type",
  #"Inserted Year" = Table.AddColumn(Custom1, "Fin Year", each Date.Year([Date]+#duration(184,0,0,0)), Int64.Type),
  #"Inserted Month Name" = Table.AddColumn(#"Inserted Year", "Month Name", each Date.MonthName([Date]), type text),
  #"Inserted Day Name" = Table.AddColumn(#"Inserted Month Name", "Day Name", each Date.DayOfWeekName([Date]), type text),
  #"Inserted Month" = Table.AddColumn(#"Inserted Day Name", "Fin Month", each if Date.Month([Date]) >=7 then Date.Month([Date])-6 else Date.Month([Date])+6  , Int64.Type),
  #"Inserted Day of Week" = Table.AddColumn(#"Inserted Month", "Day of Week", each Date.DayOfWeek([Date])+1, Int64.Type),
  #"Inserted First Characters" = Table.AddColumn(#"Inserted Day of Week", "Short Month", each Text.Start([Month Name], 3), type text),
  #"Inserted First Characters1" = Table.AddColumn(#"Inserted First Characters", "DDD", each Text.Start([Day Name], 3), type text),
  #"Reordered Columns" = Table.ReorderColumns(#"Inserted First Characters1", {"Date", "Fin Year", "Month Name", "Short Month", "Fin Month", "Day Name", "DDD", "Day of Week"}),
  #"Added Custom1" = Table.AddColumn(#"Reordered Columns", "Month Number", each (Date.Month([Date]))),
  #"Inserted Start of Month" = Table.AddColumn(#"Added Custom1", "Start of Month", each Date.StartOfMonth([Date]), type date),
  #"Inserted End of Month" = Table.AddColumn(#"Inserted Start of Month", "End of Month", each Date.EndOfMonth([Start of Month]), type date),
  #"Duplicated Column" = Table.DuplicateColumn(#"Inserted End of Month", "Date", "Date - Copy"),
  #"Calculated Quarter" = Table.TransformColumns(#"Duplicated Column",{{"Date - Copy", Date.QuarterOfYear, Int64.Type}}),
  #"Renamed Columns1" = Table.RenameColumns(#"Calculated Quarter", {{"Date - Copy", "Quarter Number"}}),
  #"Inserted Merged Column" = Table.AddColumn(#"Renamed Columns1", "Merged", each Text.Combine({"Q", Text.From([Quarter Number], "en-US")}), type text),
  #"Current Date" = Table.AddColumn(#"Inserted Merged Column", "Current Date", each Date.From(DateTimeZone.FixedLocalNow())),
  #"Added Custom10" = Table.AddColumn(#"Current Date", "Is Work Day", each if Date.DayOfWeek([Date]) >=0 and Date.DayOfWeek([Date]) <= 4 then "Is Work Day" else "Weekend"),
  #"Renamed Columns2" = Table.RenameColumns(#"Added Custom10", {{"Merged", "Calendar Quarter"}, {"DDD", "Short Day"}}),
  #"Duplicated Column1" = Table.DuplicateColumn(#"Renamed Columns2", "Date", "Date - Copy"),
  #"Extracted Year" = Table.TransformColumns(#"Duplicated Column1",{{"Date - Copy", Date.Year, Int64.Type}}),
  #"Renamed Columns3" = Table.RenameColumns(#"Extracted Year", {{"Date - Copy", "Calendar Year"}}),
  #"Changed Type3" = Table.TransformColumnTypes(#"Renamed Columns3", {{"Fin Year", Int64.Type}, {"Current Date", type date}}),
  #"Added Custom Column" = Table.AddColumn(#"Changed Type3", "DateKey", each Text.Combine({Date.ToText([Date], "yyyy"), Date.ToText([Date], "MM"), Date.ToText([Date], "dd")}), Int64.Type),
  #"Transform columns" = Table.TransformColumnTypes(#"Added Custom Column", {{"Month Number", type text}, {"Is Work Day", type text}}),
  #"Replace errors" = Table.ReplaceErrorValues(#"Transform columns", {{"Month Number", null}, {"Is Work Day", null}}),
  #"Added Custom" = Table.AddColumn(#"Replace errors", "YYYYMM", each Text.Combine({Date.ToText([Date],"yyyy"), Date.ToText([Date], "MM")})),
  #"Added Custom3" = Table.AddColumn(#"Added Custom", "YYYY-MM", each Text.Combine({Date.ToText([Date],"yyyy"),"-", Date.ToText([Date], "MM")})),
    #"Changed Type1" = Table.TransformColumnTypes(#"Added Custom3",{{"YYYYMM", type text}, {"YYYY-MM", type text}})
in
    #"Changed Type1"

On the “Edit queries” window, I changed the Name to “Date” and then clicked “Done”.

Edit queries Window

That’s enough for today’s blog post. In Part 2, we will create entities for our customer, salesperson, sales territory, product, sales transactions, and inventory transactions, build a custom function in our dataflow to help clean up our customer data, and then connect Power BI Desktop to our dataflows so we can create any relationships and DAX measures needed to start visualizing our Dynamics GP data.

Power BI Dynamics GP Sales and Inventory Analysis Report Modifications

Recently, I’ve made some updates to my Power BI Sales and Inventory Analysis Reports and wanted to share those with you. On the Sales Analysis report, I’ve added the ability to segment my sales revenue by growth rate and compare that growth rate to the previous period selected. Here are the steps I used to add the changes to the report.

  1. First, I clicked on the “Enter Data” icon from the Home tab in Power BI Desktop and then entered the Sales Growth Groups table information.
  2. I then created a DAX Sales Growth measure with the following code.
    Sales Growth = DIVIDE([Sales CY],[Sales LY],0)-1
  3. Next, I created a DAX measure to segment the data based on the Sales Growth Groups that we created in step 1.
    Sales per Growth Group =
    CALCULATE (
        [Sales CY],
        FILTER (
            VALUES ( Customers[Customer Class] ),
            COUNTROWS (
                FILTER (
                    'Sales Growth Groups',
                    [Sales Growth] >= 'Sales Growth Groups'[Min]
                        && [Sales Growth] < 'Sales Growth Groups'[Max]
                )
            ) > 0
        )
    )
    
  4. Next, I modified my Product by Sales and Customer Class by Sales visuals to use the new Sales per Growth Group measure by dragging the segment from the Sales Growth Groups table into the Legend field, the Sales Growth measure into the Tooltips, and the Sales per Growth Group measure into the Values field.
  5. Below is a screenshot of the end results. You can now clearly tell which segments are performing better or worse than last year.

My next modification was to my Inventory Order Analysis report. This one was just a small change, adding one DAX measure to calculate slow-moving inventory and a table visual to view the results.

  1. First, I added a DAX measure for slow-moving inventory:
    SM Inventory Value = 
    -- Total value of the inventory currently on hand
    VAR SumOf = 
    CALCULATE(
        SUMX(
            Inventory,
            Inventory[Quantity on Hand] * Inventory[Unit Cost]
        )
    )
    -- Dates from roughly 36 months ago up to 12 months ago
    VAR Range = DATESINPERIOD(Dates[Date], EDATE(TODAY(), -12), -24, MONTH)
    RETURN
    -- Report the on-hand value only when the item's last sale date falls inside that stale window
    CALCULATE(
        IF(
            COUNTROWS(
                INTERSECT(
                    VALUES(Inventory[Last Sale Date]),
                    Range
                )
            ) > 0, 
            SumOf, 0
        )
    )

  2. Then I added the following columns and measure to a table visual to show my slow-moving inventory.
  3. Below is a screenshot of the modification to the Inventory Order Analysis report.

Until next time, Enjoy the code and Happy Analyzing!

How to Create a Better RMA Dashboard For Mammoth Savings

OK, maybe not mammoth savings, but controlling quality and costs during the Return Material Authorization (RMA) process has a multiplicative effect on growth in a company. Finding new ways to improve perfect order performance continually reduces RMAs and increases customer satisfaction, leading to more sales.

RMAs are a direct measure of product quality and a product’s nonconformance to customers’ specifications and requirements. They are issued for a wide variety of reasons. The RMA module within Dynamics GP is designed to provide detailed traceability for both customer and supplier returns. Before we get to designing our Power BI RMA dashboard, let’s look at the Dynamics GP RMA module.

Return Material Authorization Module

The Returns Management module for Microsoft Dynamics GP enables you to meet customer and vendor demands for product and part returns by streamlining tasks and improving your responsiveness to customer queries and complaints. Generate returns from historical customer invoices, service calls, or manually. Return an item to a vendor to fix it within your company and automatically transfer items from the returns warehouse to the main warehouse. Give your customers up-to-date information about the status of their returned items.

You can use Returns Management to enter, set up, and maintain your RMA and Return To Vendor (RTV) documents. If Returns Management is integrated with Service Call Management, an RMA is created automatically from a returnable parts line on a service call. If Returns Management is integrated with Sales Order Processing, you can select items directly from the historical Sales Order Processing invoice that was generated for a specific customer.

Dynamics GP’s RMA Life Cycle

When operating a returns warehouse, you sometimes interact with customers who need to return equipment for various reasons. When they bring you a part, you complete one of a few tasks: issue a credit, repair and return the item to the customer, or replace the item. You also may be accepting the item in exchange for an item you already provided to the customer. Once you’ve completed your transaction with the customer, you have a few more options: repair the item and return it to your inventory, return the item to the vendor, or scrap the item.

The following diagram outlines the life cycle of RMAs, from entry through completion. When Returns Management is integrated with the other modules of the Field Service Series (Service Call Management, Preventive Maintenance, Contract Administration, and Depot Management), many new options and paths become available.

RMA Life Cycle

You can create RMAs from two different points of access throughout Field Service:

  • Manual entry in the RMA Entry/Update window
  • From return lines for returnable items on a service call

Entry in the RMA Entry/Update window is the method described in this manual. Refer to the Service Call Management documentation for more information regarding service calls and returnable items.

RMA types Inside Dynamics GP

RMA documents are used to track an item return from your customers. The available RMA document types are as follows:

  • Credit – Provide a credit to your customer’s account in Receivables Management for the value of the items the customer returned to you.
  • Replacement – Provide the same item, or a similar item, as a replacement to your customer. You must receive the original item from your customer before you send the replacement item on a new order document in Sales Order Processing.
  • Advance Cross–ship – Provide the same item, or a similar item, as a replacement to your customer. You can send the replacement item using a new Sales Order Processing order document prior to receiving the original item from your customer.
  • Repair and Return – You, or your vendor, will repair the item that is received from the customer. Your customer will receive the item after it’s been repaired.
  • None – The customer’s original item is picked up by your field service technician and returned to your returns warehouse. This type of RMA document was designed to integrate directly with Service Call Management.

Analyzing Our RMA Data with Power BI

With all the RMA data we will be using the following three visuals and associated measures:

  • Rate of Return – This is an incredibly useful KPI in a distribution center, especially when segmented by cause for return. Identifying causes for returns (damage, late delivery, inaccurate product description, wrong item shipped, etc.) helps warehouse managers address underlying issues and make necessary improvements. Number of Units Returned / Number of Units Sold = Rate of Return.
  • Perfect Order Rate – This KPI measures how many orders your warehouse successfully delivers without incident: the correct item, shipped on time and received in good condition by the customer who ordered it. Lean practices help identify errors or inaccuracies before orders leave the warehouse. Orders Completed Without Incident / Total Orders Placed = Perfect Order Rate.
  • RMA Pareto Analysis – Done on the top 20% of factors that drive 80% of the returns. This will make cause troubleshooting more efficient, leading to permanent solutions to problems that may be causing RMAs to begin with. I’m not going to go into detail on how to build this chart in this blog post. You can find the steps on how to complete it here: https://powerbi.tips/2016/10/pareto-charting/.

First, we need to get our Dynamics GP RMA data into Power BI and we’ll do that with the SQL script below:

WITH CTE
AS (SELECT
  300 AS Company_Key,
  b.RETDOCID AS [RMA Document ID],
  b.LNSEQNBR,
  a.Return_Record_Type,
  b.RETREF AS [RMA Reference],
  b.RETSTAT AS [RMA Status],
  b.RETTYPE AS [RMA Type],
  a.COMPDTE AS [DocDate],
  a.USERID,
  b.OFFID AS [Office ID],
  b.LOCNCODE AS [Location Code],
  a.CUSTNMBR,
  b.SOPTYPE,
  b.SOPNUMBE AS [Invoice Number],
  b.SVC_RMA_Reason_Code AS [RMA Reason Code],
  b.SVC_RMA_Reason_Code_Desc AS [RMA Reason Code Description],
  b.ITEMNMBR,
  b.QUANTITY,
  b.UNITCOST AS [Unit Cost],
  b.EXTDCOST AS [Extended Cost],
  b.[Credit_SOP_Number]
FROM dbo.SVC05200 b (NOLOCK)
JOIN dbo.SVC05000 a (NOLOCK)
  ON b.RETDOCID = a.RETDOCID
WHERE b.QUANTITY > 0
AND b.COMPDTE >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 4, 0) -- Didn't want everything so limiting to last 4 years of RMA data 
AND a.COMPDTE >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 4, 0) -- Didn't want everything so limiting to last 4 years of RMA data

UNION ALL

SELECT
  300 AS Company_Key,
  b.RETDOCID AS [RMA Document ID],
  b.LNSEQNBR,
  a.Return_Record_Type,
  b.RETREF AS [RMA Reference],
  b.RETSTAT AS [RMA Status],
  b.RETTYPE AS [RMA Type],
  a.COMPDTE AS [DocDate],
  a.USERID,
  b.OFFID AS [Office ID],
  b.LOCNCODE AS [Location Code],
  a.CUSTNMBR,
  b.SOPTYPE,
  b.SOPNUMBE AS [Invoice Number],
  b.SVC_RMA_Reason_Code AS [RMA Reason Code],
  b.SVC_RMA_Reason_Code_Desc AS [RMA Reason Code Description],
  b.ITEMNMBR,
  b.QUANTITY,
  b.UNITCOST AS [Unit Cost],
  b.EXTDCOST AS [Extended Cost],
  b.[Credit_SOP_Number]
FROM dbo.SVC35200 b (NOLOCK)
JOIN dbo.SVC35000 a (NOLOCK)
  ON b.RETDOCID = a.RETDOCID
WHERE b.QUANTITY > 0
AND b.COMPDTE >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 4, 0) -- Didn't want everything so limiting to last 4 years of RMA data
AND a.COMPDTE >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 4, 0)) -- Didn't want everything so limiting to last 4 years of RMA data

SELECT
  b.*,
  rs.STSDESCR AS [RMA Status Description],
  iv.ITMCLSCD AS [Item Class]
FROM CTE b (NOLOCK)
LEFT JOIN dbo.IV00101 iv (NOLOCK)
  ON iv.ITEMNMBR = b.ITEMNMBR
LEFT JOIN dbo.SVC05500(nolock) rs
  ON b.[RMA Status] = rs.RETSTAT
WHERE b.QUANTITY > 0

With the data now loaded into Power BI Desktop, I created relationships with the Inventory, Date, and Customer dimensions that I already had in Power BI Desktop from my previous blog posts.

Creating Our Measures

Time to build the Rate of Return Measure. Rather than create one big DAX measure we will be building several small measures that build upon each other. Here are the DAX formulas that we will be using for the Rate of Return measure:

  • Returns Lbs. CY
Returns Lbs. CY = CALCULATE(
    ABS(
        SUM(Returns[Total Return Lbs])
        )
)
  • Lbs. CY

Lbs. CY = SUM(Sales[ExtendedWeight])

  • Rate of Return
Rate of Return = DIVIDE 
    ([Returns Lbs. CY], [Lbs. CY], 
    Blank()
 )
  • Monthly Average Rate of Return
Monthly Avg. Rate of Return = AVERAGEX( 
    DATESINPERIOD( 
        Dates[Date] , 
        LASTDATE( Dates[Date] ),
        -30,
        DAY),
    [Rate of Return]
)

With our calculations now complete, we create a line chart with Date on the axis and Monthly Avg. Rate of Return as the value, as seen in the bottom-left line chart visual of the Returns Analysis report screenshot below.

Up next, our Perfect Order Rate measure. Once again, we will build several small calculations and combine them to create our Monthly Avg. Perfect Order Rate. Here are the DAX formulas that I used for this:

  • Total Invoice Count – Since my data includes line-level detail for each sales order type in the Sales table, I’ll count all invoice lines and evaluate each line later to make sure it was a perfect order.
Total Invoices Count = CALCULATE(
    COUNTROWS('Sales'),
    FILTER('Sales','Sales'[SOPTYPE] = 3)
    )
  • Perfect Order – Now we will determine if the order was fulfilled before the customer’s required date.
Perfect Order = CALCULATE(COUNTROWS
    ('Sales'),
    FILTER('Sales','Sales'[SOPTYPE] = 3),
    FILTER('Sales',[Document Date]<='Sales'[Req Ship Date])
    ) 
  • Perfect Order Rate
Perfect Order Rate = DIVIDE(
    [Perfect Order],[Total Invoices Count],
    BLANK()
    )
  • Monthly Average Perfect Order Rate
Monthly Avg. Perfect Order Rate = AVERAGEX( 
    DATESINPERIOD( 
        Dates[Date] , 
        LASTDATE( Dates[Date] ),
        -30,
        DAY),
    [Perfect Order Rate]
)

With all of our DAX measures complete, we can create the visuals needed to show the Monthly Average Perfect Order Rate and the Monthly Average Rate of Return. Here is what my final Returns Analysis report looks like.

RMA Dashboard

Until next time, Happy Analyzing!

Enhanced Dynamics GP Payables Workflow Reporting

A couple of days ago, Gina left a comment on my 3 Secrets To An Awesome ERP System! blog post. She was wondering if it was possible, on the Payables Transaction Workflow, to include Document Type, Document Date, Document Number, Transaction Description, Vendor Name, and Amount. The answer is YES, it is possible. The below SQL view only works with the Payables Transaction Workflow but can be modified to work with other transactional workflows. It will not work with batch workflows because batch IDs lack a unique ID to create the relationship against. Here is the modified SQL view to complete the request:

GO
SET QUOTED_IDENTIFIER OFF
GO


CREATE VIEW [dbo].[vw_WorkFlow_Status]

AS

 WITH CTE_FINAL (WorkflowInstanceID, Workflow_Name, Workflow_Step_Name, Approver, Workflow_Action, Completion_Date, Completion_Time, Comments  )
AS
(select  d.WorkflowInstanceID, 
       d.Workflow_Name, 
       d.Workflow_Step_Name,
       CASE WHEN a.ADDisplayName is null THEN ''
          ELSE a.ADDisplayName
          END as [Assigned Approver],
       CASE WHEN d.Workflow_Action = 1 THEN 'Submit'
          WHEN d.Workflow_Action = 2 THEN 'Resubmit'
          WHEN d.Workflow_Action = 3 THEN 'Approve'
          WHEN d.Workflow_Action = 4 THEN 'Task Complete'
          WHEN d.Workflow_Action = 5 THEN 'Reject'
          WHEN d.Workflow_Action = 6 THEN 'Delegate'
          WHEN d.Workflow_Action = 7 THEN 'Recall'
          WHEN d.Workflow_Action = 8 THEN 'Escalate'
          WHEN d.Workflow_Action = 9 THEN 'Edit'
          ELSE 'Final Approve'
          END as Workflow_Action,
       convert(varchar(10),d.Workflow_Completion_Date, 101) as [Completion_Date],
       right('0'+LTRIM(right(convert(varchar,d.Workflow_Completion_Time,100),8)),7) as Completion_Time,
       d.Workflow_Comments
         from dbo.WF30100 d
LEFT JOIN WF40200 a ON d.Workflow_Step_Assign_To = a.UsersListGuid
WHERE d.Workflow_Action = 10), CTE_PM as (SELECT    P.VENDORID Vendor_ID,
    V.VENDNAME Vendor_Name,
    P.VCHRNMBR Voucher,
    CASE P.DOCTYPE
         WHEN 1 THEN 'Invoice'
         WHEN 2 THEN 'Finance Charge'
             WHEN 3 THEN 'Misc Charge'
             WHEN 4 THEN 'Return'
             WHEN 5 THEN 'Credit Memo'
         WHEN 6 THEN 'Payment'
         END Document_Type,
    P.DOCDATE Document_Date,
    P.PSTGDATE GL_Posting_Date,
    P.DUEDATE Due_Date,
    P.DOCNUMBR Document_Number,
    P.DOCAMNT Document_Amount,
    P.CURTRXAM Unapplied_Amount,
    P.TRXDSCRN [Description],
    CASE P.VOIDED
         WHEN 0 THEN 'No'
         WHEN 1 THEN 'Yes'
         END Voided

FROM    (SELECT VENDORID, VCHRNMBR, DOCTYPE, DOCDATE, PSTGDATE,
     DUEDATE, DOCNUMBR, DOCAMNT, CURTRXAM, TRXDSCRN, VOIDED
     FROM PM20000
    UNION ALL
     SELECT VENDORID, VCHRNMBR, DOCTYPE, DOCDATE, PSTGDATE,
     DUEDATE, DOCNUMBR, DOCAMNT, CURTRXAM, TRXDSCRN, VOIDED
     FROM PM30200
     UNION ALL
     SELECT VENDORID, VCHRNMBR, DOCTYPE, DOCDATE, PSTGDATE,
     DUEDATE, DOCNUMBR, DOCAMNT, CURTRXAM, TRXDSCRN, 0 as VOIDED
     FROM PM10000) P

     INNER JOIN
    PM00200 V
    ON V.VENDORID = P.VENDORID)
 
 Select  WF.* 
        ,coalesce(pm.Document_Type,'') as Document_Type
        ,coalesce(pm.Document_Date,'') as Document_Date
        ,coalesce(pm.Document_Number, '') as Document_Number
        ,pm.Voucher
        ,coalesce(pm.Description, '') as Description
        ,coalesce(pm.Vendor_Name,'') as Vendor_Name
        ,coalesce(pm.Document_Amount,0) as Document_Amount 
FROM 
 (SELECT    ROW_NUMBER() OVER ( ORDER BY A.DEX_ROW_ID ) RecordSequence ,
            ISNULL(C.WfBusObjKey, '') WFBusObjKey ,
            LEFT(WfBusObjKey, ISNULL(NULLIF(CHARINDEX('~', WFBusObjKey) - 1, -1), LEN(WfBusObjKey))) as Split,
            Workflow_History_User ,
            C.Workflow_Type_Name,
            A.Workflow_Name ,
            A.Workflow_Step_Name ,
            ISNULL(B.WorkflowTaskAssignedTo, '') WorkflowTaskAssignedTo ,
            CASE C.Workflow_Status
              WHEN 1 THEN 'Not Submitted'
              WHEN 2 THEN 'Submitted (Deprecated)'
              WHEN 3 THEN 'No Action Needed'
              WHEN 4 THEN 'Pending User Action'
              WHEN 5 THEN 'Recalled'
              WHEN 6 THEN 'Completed'
              WHEN 7 THEN 'Rejected'
              WHEN 8 THEN 'Workflow Ended (Deprecated)'
              WHEN 9 THEN 'Not Activated'
              WHEN 10 THEN 'Deactivated (Deprecated)'
              ELSE ''
            END AS WORKFLOW_STATUS ,
            CASE A.Workflow_Action
              WHEN 1 THEN 'Submit'
              WHEN 2 THEN 'Resubmit'
              WHEN 3 THEN 'Approve'
              WHEN 4 THEN 'Task Complete'
              WHEN 5 THEN 'Reject'
              WHEN 6 THEN 'Delegate'
              WHEN 7 THEN 'Recall'
              WHEN 8 THEN 'Escalate'
              WHEN 9 THEN 'Edit'
              WHEN 10 THEN 'Final Approve'
              ELSE ''
            END AS Workflow_Action ,
            A.Workflow_Due_Date ,
            A.Workflow_Completion_Date ,
            A.DEX_ROW_ID ,
            Workflow_Comments,
        CASE WHEN d.Approver is null THEN ''
          ELSE d.Approver
       END as Approver,
       CASE WHEN d.Completion_Date is null THEN ''
          ELSE d.Completion_Date
       END as Completion_Date,
       CASE WHEN d.Completion_Time is null THEN ''
          ELSE d.Completion_Time
       END as Completion_Time,
       CASE WHEN d.Comments is null THEN ''
          ELSE d.Comments
       END as Comments
    FROM    WF30100 AS A
            LEFT OUTER JOIN WFI10004 AS B ON A.WorkflowInstanceID = B.WorkflowInstanceID
            AND A.WorkflowStepInstanceID = B.WorkflowStepInstanceID
            LEFT OUTER JOIN WFI10002 AS C ON C.WorkflowInstanceID = A.WorkflowInstanceID
            LEFT OUTER JOIN CTE_FINAL d ON D.WorkflowInstanceID = c.WorkflowInstanceID) WF
            LEFT OUTER JOIN CTE_PM pm on pm.Voucher = WF.Split
GO

Until next time. Enjoy the code!

3 Secrets To An Awesome ERP System!

Part of being a Microsoft Dynamics Consultant is seeing all the inventive ways clients use the software we deploy, taking that knowledge, and providing best practices to everyone! Here are just some of the things you can do to enhance your system processes:

  • Best Practices
  • Customization and Modifications
  • Reporting Enhancements

Best Practices

Customization, Modifications and Third Party Add-ins

  • General Ledger, Sales Distribution, and Payables Distribution intercompany Excel paste – James Lyn’s Excel Paste add-in is great at extending Dynamics GP’s out-of-the-box functionality. While you’re on his site, check out his other add-ins, like GP Batch Attach for Payables.
  • Create Dynamics GP macros or use PowerShell scripts to automate tasks – e.g., a macro to log into Dynamics GP and run the inventory reconcile process, or a script to reboot your web client servers to remove hung processes.
  • Custom workflows and reporting – reporting that provides detailed information on current/open and historical/approved workflows. Find out how to do that here.

Reporting Enhancements

Dynamics GP comes with some good reporting capabilities:

  • Management Reporter.
  • Excel refreshable reports.
  • SmartList.
  • Jet Express for Dynamics GP.
  • Solver’s BI360.
  • Power BI.

With Dynamics GP 2018, you can now deploy the Power BI GP content pack or embed Power BI visuals inside of Dynamics GP. So what do the Power BI content pack visuals look like, and how do we get them installed? As of Microsoft Dynamics GP 2018, the GP OData service was updated to OData version 4. This redesign also brought paging and filtering of OData requests, creating a more stable and robust platform for delivering Microsoft Dynamics GP content to authenticated users. The Power BI content pack features sample reports for Financial, Sales, Purchasing, and Inventory data. Each report utilizes relationships built between GP tables and various filters that can be used to display the information that is important to you. You can also review the included calculated columns as examples for including calculations on your Power BI reports, such as Net Debit/Credit, Profit, and Item Sales amounts.
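
For example, a Net Debit/Credit style calculated column over the Account Transactions data is a one-liner in DAX. The sketch below assumes the published OData source exposes Debit Amount and Credit Amount columns; check the actual column names your data source returns before using it.

-- Positive for net debits, negative for net credits, per transaction row
Net Debit/Credit =
'Account Transactions'[Debit Amount] - 'Account Transactions'[Credit Amount]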

In order to use the Power BI content pack with Dynamics GP, install OData Services. Once this is complete, you will have to publish the following data sources inside Dynamics GP (Administration > System > OData > Data Sources) for the GP content pack:

  • Accounts
  • Account Transactions
  • Customers
  • Inventory Sales Summary Period History
  • Inventory Transactions
  • Item Quantities
  • Purchase Line Items
  • Purchase Requisition Lines
  • Purchase Requisitions
  • Receiving Line Items
  • Sales Line Items
  • Vendors

The Power BI content pack will also have to be configured to point to your existing Microsoft Dynamics GP OData service. This can be done by modifying the existing data source in Power BI, or by creating a new data source and configuring the content to use the new source. The screenshots below show what the GP content pack sample reports look like.

Finance Dashboard

Finance

Sales Dashboard

Sales

Purchasing Dashboard

Purchasing

Inventory Dashboard

Inventory

Enhancing Our Dashboards

These GP content pack reports are a good starting point and can speed up the process of implementing a Power BI solution. With a little work from your Microsoft Dynamics Consultant, we can provide you with so much more. In February of 2017, I started a blog series that provided a “how to” on building a Finance, Sales, Customer, Product, and Inventory dashboard. Since my main goal was to show an update of the Excel multi-company dashboard, I chose to first build a small DataMart and integrate the data from the Dynamics GP databases before building my visuals. Follow the links below to find out how I built each one of the enhanced dashboards.

Enhanced Finance Dashboard

The finance dashboard, from the blog series, now provides a summary profit & loss statement that you can drill down into for line-level detail. Additionally, you see total sales by inventory class and customer class and a weekly moving average.

Enhanced Finance Dashboard

Enhanced Sales Dashboard

The sales summary dashboard provides an analysis of sales by inventory item class, customer class and weekly moving averages for total sales dollars and transactions. The report also includes a cumulative sales and cost comparison.

Enhanced Sales

Product Performance and Inventory Reorder Dashboard

In the product performance dashboard, I’ve added comparisons of total profits and profit margin, cost by warehouse, and an ABC segmentation analysis. Additionally, there is a weekly moving average cost comparison chart at the bottom of the report.

Enhanced Product Performance

For my inventory reorder analysis report, I’m using some of the concepts discussed by Belinda Allen in her Inventory Item Reorder Dashboard. I converted it to Power BI to help your procurement manager evaluate what’s on hand, what’s allocated to open orders, and what’s been sold within a given time period.

Enhanced Inventory Dashboard

Future Developments

Returns are generally thought of as losses, and return percentages depend on both the type of product and the company’s returns policy. While the average industry rate is four percent, consumer durable goods can range from two to ten percent, and apparel can be in excess of 20 percent. There are several reasons for merchandise returns, and tracking the costs and reasons associated with them can increase revenues, lower costs, improve profitability, and enhance levels of customer service. Using Dynamics GP’s RMA module can help, and analyzing that data in Power BI is one of the things that I’m currently working on. Below is a screenshot of that analysis and the subject of a future blog post.

Sales Returns Dashboard

Ready to Do Even More with Dynamics and Power BI?

Stay tuned for more help in leading your organization to become a data-driven organization by exploring your Business Intelligence, BI360, Power BI, and Microsoft Dynamics GP journey.