Enhancing our Inventory Reorder Point Analysis in Power BI

In my original Inventory Reorder Analysis report, I’m using some of the techniques discussed by Belinda Allen in her Inventory Item Reorder Dashboard. I converted it to Power BI and added a little extra by also analyzing SOP product sales data and computing “Inventory Logic” based on whether the total of the “Inventory Sold in Time Period” columns is three times the current “Qty Available”, producing a report that looked like the screenshot below.

Original Inventory Re-Order Analysis

Enhancing the Inventory Reorder Analysis

A good inventory reorder analysis includes calculations that forecast future demand and safety stock to find your reorder point. For the sales quantity forecasts behind my lead time demand, I’m going to use the DAX time series approach discussed here by Fountain Analytics. After following the time series instructions, my sales forecast results looked like this.

Sales Quantity Usage Forecast

With the Sales forecast quantity usage in place, I then needed to calculate our Safety Stock based on the below formula.

Safety Stock Formula
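In plain terms, the standard safety stock formula pictured above is:

Safety Stock = (Maximum Daily Usage × Maximum Lead Time in Days) − (Average Daily Usage × Average Lead Time in Days)

The four DAX measures below supply those four inputs.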

Using the data from the Procurement Analysis and the Sales Analysis reports, I calculated the four data points with the below DAX formulas.

Max Days Consumption =
MAXX (
    VALUES ( Dates[Date] ),
    MAXX (
        VALUES ( 'Products'[ItemNumber] ),
        'Sales Measures'[Units Sold]
    )
)

Max Days Inventory Lead Time =
MAXX (
    VALUES ( Dates[Date] ),
    MAXX (
        VALUES ( 'Products'[ItemNumber] ),
        [Lead Time]
    )
)

Average Days Consumption =
AVERAGEX (
    VALUES ( Dates[Date] ),
    AVERAGEX (
        VALUES ( 'Products'[ItemNumber] ),
        'Sales Measures'[Units Sold]
    )
)

Average Days Inventory Lead Time =
AVERAGEX (
    VALUES ( Dates[Date] ),
    AVERAGEX (
        VALUES ( 'Products'[ItemNumber] ),
        [Lead Time]
    )
)

Putting it all together gives your reorder point: (Lead Time Demand) + (Safety Stock) = Reorder Point. A quick DAX sketch of these final measures follows, and the end result is the updated Inventory Reorder Analysis report shown below.
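As a minimal sketch only (not the exact code in the template), the final measures could look like this. Note that [Lead Time Demand] is an assumed helper name; in my report the lead time demand comes from the sales quantity forecast above rather than the simple average used here.

Lead Time Demand =
-- Assumption: approximating lead time demand as average daily consumption
-- times average lead time; the report uses the forecasted quantities instead.
[Average Days Consumption] * [Average Days Inventory Lead Time]

Safety Stock =
( [Max Days Consumption] * [Max Days Inventory Lead Time] )
    - ( [Average Days Consumption] * [Average Days Inventory Lead Time] )

Reorder Point =
[Lead Time Demand] + [Safety Stock]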

Updated Inventory Re-Order Point Analysis

Updated Template

You can download the updated template here. In the template, I also updated the Inventory Reorder Point report using the new Matrix functionality.

Stay tuned for future developments of these reports, an overview of their functionality, and help with building reports like this.

Use Power BI and R to Quickly Identify Business Insights

In statistics, the correlation coefficient, or Pearson’s correlation, is a measure of the strength and direction of the association between two continuous variables. The value of the correlation, r, is always between +1 and –1. To interpret its value, see which of the following values your correlation r is closest to:

  • Exactly –1. A perfect downhill (negative) linear relationship
  • –0.70. A strong downhill (negative) linear relationship
  • –0.50. A moderate downhill (negative) relationship
  • –0.30. A weak downhill (negative) linear relationship
  • 0. No linear relationship
  • +0.30. A weak uphill (positive) linear relationship
  • +0.50. A moderate uphill (positive) relationship
  • +0.70. A strong uphill (positive) linear relationship
  • Exactly +1. A perfect uphill (positive) linear relationship

Don’t make the mistake of thinking that a correlation of –1 is a bad thing, indicating no relationship. Just the opposite is true! A correlation of –1 means the data are lined up in a perfect straight line, the strongest negative linear relationship you can get. The “–” (minus) sign just happens to indicate a negative relationship, a downhill line. Most statisticians like to see correlations beyond at least +0.5 or –0.5 before drawing any conclusions. Additional information on Pearson’s correlation can be found here.
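For reference, Pearson’s r for paired observations (xᵢ, yᵢ) is r = Σ(xᵢ − x̄)(yᵢ − ȳ) / √( Σ(xᵢ − x̄)² · Σ(yᵢ − ȳ)² ), where x̄ and ȳ are the sample means.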

Putting it All to Use

This is all great knowledge, but how can we apply it in a business environment? When I worked for a casino, we kept a daily record of Coin In (gross sales), High Temp, Low Temp, and Fuel Price in an Excel spreadsheet and tried to find correlations between these data points. Try finding the correlations in this data yourself.

Excel Casino data

With Power BI and R, we can find these associations much faster, with slicers for interactivity to drill into the data points. I first downloaded the R Correlation Plot custom visual located here and then imported the Excel spreadsheet from above into Power BI. I was then able to easily identify the correlations in a matter of minutes.
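If you ever want to sanity-check the R visual’s output, Pearson’s r can also be computed as a DAX measure. Below is a minimal sketch; the table name 'Casino Data' and its column names are assumptions for illustration, not the names in my actual model.

Coin In vs Low Temp r =
VAR t =
    ADDCOLUMNS (
        VALUES ( 'Casino Data'[Date] ),
        "@x", CALCULATE ( SUM ( 'Casino Data'[Coin In] ) ),
        "@y", CALCULATE ( SUM ( 'Casino Data'[Low Temp] ) )
    )
VAR n = COUNTROWS ( t )
VAR sx = SUMX ( t, [@x] )
VAR sy = SUMX ( t, [@y] )
VAR sxy = SUMX ( t, [@x] * [@y] )
VAR sxx = SUMX ( t, [@x] * [@x] )
VAR syy = SUMX ( t, [@y] * [@y] )
RETURN
    -- Computational form of Pearson's correlation coefficient
    DIVIDE (
        n * sxy - sx * sy,
        SQRT ( n * sxx - sx * sx ) * SQRT ( n * syy - sy * sy )
    )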

Positive correlations are identified with increasingly dark blue circles; negative correlations with red. The stronger the correlation, the bigger and darker the circle. Selecting “Nov” from the calendar slicer, you will notice that the intersection between “Low Temp” and “Coin In” shows a positive correlation (blue circle), so the lower the temperature, the less coin in you will realize for that month.

Power BI and R Correlation Plot

Select “Jul” from the calendar slicer and notice that the correlation is different. In this case, the correlation between temperature and coin in is zero.

Power BI and R Correlation Plot image 2

Conclusion

Hope this helps you quickly identify business insights in your organization.

Supercharge Your Dynamics GP Dashboards & Reports with Power BI

In my last blog post, I walked you through creating a financial dashboard in Power BI with Dynamics GP data. Since then, I have created additional analyses and measures.

clip_image002

If you look closely at the bottom left, you will see two additional reports for Sales Summary and Customer Analysis.

Sales Summary

The data used from Dynamics GP for this report comes from the Sales Order Processing (SOP), Inventory (IV), and Customer (RM) tables. The Sales Summary contains the same report filters, Year, Month, and Company, as the Financial Summary report. The top left visual shows total sales by product class ranked in descending order. The bottom left visual shows total sales by customer class/sales channel. The top two visuals in the middle and right show total sales and transactions based on the filters selected, with a seven-day moving average behind them. The bottom right visual shows a comparison between cumulative sales and cumulative costs.
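As an aside, a seven-day moving average like the one behind those visuals can be written as a simple DAX measure. A minimal sketch, assuming a [Total Sales] base measure (an assumed name) and the Dates table used earlier:

Sales 7-Day Moving Avg =
AVERAGEX (
    -- The 7-day window ending on the last visible date
    DATESINPERIOD ( Dates[Date], LASTDATE ( Dates[Date] ), -7, DAY ),
    [Total Sales]  -- assumed base measure name
)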

clip_image004

Customer Analysis

The Customer Analysis report allows you to dig into Dynamics GP sales data by Customer, Year, Month, and Company. By selecting a customer on the left, you can see three years of comparative sales, sales detail, last sale date and amount, total sales, sales last year, total profits, and percent sales growth for that customer.
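For instance, the percent sales growth figure can come from a simple measure pair; a minimal sketch, again assuming a [Total Sales] base measure and the Dates table:

Sales Last Year =
CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( Dates[Date] ) )

Sales Growth % =
-- Year-over-year growth; DIVIDE returns blank when last year is zero
DIVIDE ( [Total Sales] - [Sales Last Year], [Sales Last Year] )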

clip_image006

Future Dashboard developments

In the coming months, I plan to build additional Power BI reports on my Dynamics GP data. These reports will include:

  • Vendor Analysis
  • Inventory Analysis
  • Field Service – Contract and Service Call Analysis

Stay tuned for future developments of these reports and an overview of their functionality.

WOW! A Dynamics GP multi-company Financial Dashboard – Part Deux

Back in October of 2015, I posted a blog about how to modify Jared Hall’s awesome Excel financial dashboard, which you can download here. The problem with his solution is that most Dynamics GP environments are multi-company setups, which makes these dashboards not very functional: you have to navigate and open multiple Excel spreadsheets to view each company’s dashboard. So I took it upon myself to modify the one provided by Jared Hall to work in a multi-company Dynamics GP setup. But that was so 2015!

This is 2017, so let’s create our financial dashboard in Power BI. The best part about using Power BI over Excel is that we can make it available through the Power BI service and via the iOS and Android apps, making it a much better option for mobility and availability.

The Parts to the Solution

For my solution, I chose to use the following tools:

  • A Small DataMart
  • Power BI Designer
  • Power BI Service

The DataMart

I chose to implement a DataMart to simplify the importing of the tables into the Power BI Designer. I created the following tables to implement my Financial Dashboard:

  • DimAccount
  • DimHeader – to summarize the report layout
  • DimDate
  • DimCompany
  • FactFinance – General Ledger data from the GL20000 and GL30000 tables of each company

Dashboard Design

I imported the above tables into Power BI Designer rather than using DirectQuery mode. Using the Import option allows for the full functionality of DAX and, more importantly, its Time Intelligence functions. Your table relationships should look like the screenshot below after you import the data.

clip_image002

The DAX Measures

The Profit & Loss statement layout is handled by several DAX measures, the DimHeader table, and the Sign and Report Sign columns within the Account table. I created the following DAX measures in the order listed below:

  1. Header Order
  2. HeaderCalcType
  3. Dollars
  4. Dollars with Sign
  5. Dollars with Report Sign
  6. Running Dollars with Sign
  7. Current Period
  8. Cumulative Sales (Selected)

The code for the DAX measures can be downloaded here from the Finance Dashboard template.
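As a taste of what’s in the template, a running-total measure like “Running Dollars with Sign” might look roughly like this. This is a sketch only; the actual code is in the template download.

Running Dollars with Sign =
CALCULATE (
    [Dollars with Sign],
    FILTER (
        ALL ( DimDate[Date] ),
        -- Accumulate everything up to the current date in context
        DimDate[Date] <= MAX ( DimDate[Date] )
    )
)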

Visualize This 

An easy-to-use, modern, multi-company finance dashboard, available in the cloud and in the mobile apps, that shows your critical data so you can spot trends, share insights, and make smart decisions.

clip_image004

Conclusion

I’m loving the Power BI Service and all the modern cloud-based analytics that it can bring to your dashboards and reports.

Power BI Visuals in On Premise SQL Server Reporting Services

Yesterday, January 17, 2017, Microsoft released the Power BI reports in SQL Server Reporting Services (SSRS) technical preview. You can read all about it here. I downloaded the technical preview, installed it on my system, and deployed a Power BI visual to my SSRS web site. Here are the steps to get it installed on your system.

· Download the SSRS Technical Preview and Power BI Designer for the preview

Power BI SSRS

· The Technical Preview has the following system requirements:

    • Your own VM or server (not one in production use)
    • Windows Server 2012 or later (or Windows 8 or later)
    • .NET Framework 4.5.2 or later
    • SQL Server Database Engine (2008 or later), to store the report server database
    • SQL Server Analysis Services (2012 SP1 CU4 or later), to store your data models

· I first installed the SQLServerReportingServices.exe on my system, accepted the licensing terms and agreement, and then selected the “Install” button.

Power BI SSRS

· After the installation completed, I needed to configure the technical preview of SSRS by selecting the “Configure Report Server” button.

Power BI SSRS

· I provided my SQL Server name to connect to the Reporting Services Configuration Manager.

Power BI SSRS

**NOTE – While the Configuration Manager let me configure the SSRS technical preview web site with my original VM’s computer name of ‘BARRYCROWEL38F0’, I could not publish my Power BI report with that computer name. I shortened the name to 7 characters and removed the numbers to get the Power BI Designer Technical Preview to function with the SSRS web site.

· Design a Power BI visual with the Power BI Designer Technical Preview. In my visual, I connected to a SQL Server Analysis Services (SSAS) database. To publish your report to SSRS, select the ‘Save As’ menu option.

Power BI Technical Preview SSRS

· Follow the screen prompts and supply the report server name.

Power BI Technical Preview SSRS

· Then enter the report name.

Power BI technical preview SSRS

The final product is a Power BI visual with slicer functionality in an on premise SSRS web site.

**NOTE – the current SSRS technical preview does not work with custom visuals or R visuals.

Power BI On Premise

Conclusion

I’m loving this first on premise technical preview and can’t wait to see what’s next.

Dynamic Security – a look at the final product

Introduction

Recently I posted a blog titled “How to implement Analysis Services Dynamic Security based on Microsoft CRM user access rights”. That post talked about how to implement the back-end structure needed to put it in place but didn’t show the result of all that hard work.

The dynamic row-level security (RLS) that we set up inside Analysis Services will work within reports and dashboards built in Reporting Services, Power BI, and Excel. Power BI also has other ways of deploying RLS when not using an Analysis Services connection; a minimal sketch of that approach follows the links below. You can learn more about it here:

· http://sqlkover.com/dynamic-security-in-power-bi/

· http://radacad.com/row-level-security-configuration-in-power-bi-desktop
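For that native Power BI approach, the role definition boils down to a DAX filter expression on a users table; a minimal sketch, with assumed table and column names:

-- Row filter applied to a Users table in a Power BI role (names assumed)
[DomainUser] = USERNAME()

Power BI evaluates the filter per signed-in user, so each person only sees rows tied to their own account.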

The End Result

We built our dashboard in Power BI Designer and published it to our end users. One of the end users, “Tom”, has access to 558 CRM customer accounts within the US. When he accesses the dashboard, he can view only the results he has privileges to see. (See Figure 1 below)

Power BI dashboard image 1

Figure 1 – Tom’s access view

If “Tom” tries to access an area that he doesn’t have privileges to, say Canada, the dashboard allows him to select the filter but changes the chart results to show no data. (See Figure 2 below)

Power BI dashboard image 2

Figure 2 – Tom’s view with Canada selected

Additionally, the chart(s) change based on the end user that’s accessing the dashboard. In Figure 3 below, I show an administrative user accessing the dashboard and the result shown to them when selecting the “Canada” filter.

Power BI Dashboard image 3

Figure 3 – Administrative user view of Canadian Sales

Conclusion

I hope this helps you with your future Analysis Services development and Dashboard projects.

How to implement Analysis Services Dynamic Security based on Microsoft CRM user access rights

Introduction

In Analysis Services implementations, there is always a demand for implementing security. In this post, I’ll describe a dynamic, or attribute-based, security solution in Analysis Services that I built for a customer based on Microsoft CRM salesperson customer access rights. The data-level, or row-level, access is based on CRM security access rights; a SQL script and an SSIS package deploy the user Active Directory account and customer number to a custom CRM table. By leveraging the CRM security already established, we reduce the administrative work needed to make additional security changes in Analysis Services whenever there is a CRM security change.

The Setup

Here are the pieces that we need to deploy to put the security in place:

1. Microsoft CRM table script

2. DataMart Acct_Access table script

3. DataMart User table script

4. Add Dynamics Security role to Cube and define MDX to filter Active Directory user access

5. Create SSIS package to populate and update Acct_Access and User tables in DataMart

1. Microsoft CRM table script

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[Acct_Access](
	[AccountNumber] [varchar](200) NULL,
	[AccountID] [varchar](200) NULL,
	[DomainUser] [varchar](200) NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING ON
GO

2. DataMart Acct_Access Table Script

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Acct_Access](
	[Access_Key] [bigint] IDENTITY(1,1) NOT NULL,
	[AccountNumber] [varchar](200) NULL,
	[AccountID] [varchar](200) NULL,
	[DomainUser] [varchar](200) NULL,
	[Customer_Key] [bigint] NULL,
	[User_Key] [bigint] NULL,
	[Company_Key] [bigint] NULL,
	CONSTRAINT [PK_Access] PRIMARY KEY CLUSTERED
	(
		[Access_Key] ASC
	) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 80) ON [PRIMARY]
) ON [PRIMARY]

3. DataMart User Table Script

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[User](
	[User_Key] [bigint] IDENTITY(1,1) NOT NULL,
	[DomainUser] [varchar](200) NULL,
	CONSTRAINT [PK_User] PRIMARY KEY CLUSTERED
	(
		[User_Key] ASC
	) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 80) ON [PRIMARY]
) ON [PRIMARY]
GO

clip_image002

4. Add Dynamic Security role to Cube and define MDX script to filter Active Directory user access

Once this DataMart and CRM table structure is in place, your next step is to bring it into your Analysis Services solution. Here are the steps to follow inside BIDS/SSDT to leverage these tables:

· Add the new tables (User and Acct_Access) to your Data Source View.

· Create a new dimension based off the User table. Hide all the columns by changing AttributeHierarchyVisible to False; you don’t want users to be able to browse this dimension. Then process the dimension.

· Create a new Measure Group in the cube based off the Acct_Access table and name it Security. Delete all the measures it creates except for the automatically generated Count measure. Hide that remaining measure by selecting it and changing its Visible property to False. This hides the entire Measure Group from your users, but you can still use it when writing MDX.

· Ensure these two have a Regular relationship set up in the Dimension Usage tab of the cube designer, as seen in the screenshot below.

clip_image004

· In the Solution Explorer, right-click on Roles and select New Role. Name the role Dynamic Security and give it Read access on the General, Data Sources, and Cubes tabs. On the Membership tab, add your CRM groups or user accounts to this role.

· On the Dimension Data tab, select the Customer dimension from the dropdown list. Then select the Customer Number attribute from the dropdown list.

clip_image006

· Inside the Dimension Data tab, select the Advanced tab. In the “Allowed member set” area, enter this MDX:

EXISTS(
	{ [Customer].[Customer Number].Members },
	STRTOSET( "[Customer].[Domain User].&[" + USERNAME() + "]" ),
	"Security"
)

clip_image008

5. Create SSIS package to populate and update Acct_Access and User tables in DataMart

With the table structure and the Cube solution complete, we need to create the SSIS package to populate and update the Acct_Access table in our CRM database and then bring the updated data into the DataMart. What we need:

a. First, we create an Execute SQL task and add the script below to delete and repopulate the table with the current CRM customer access data. Change the login and domain names below to fit your environment.

execute as login = 'Contoso\bcrowell'
delete Acct_Access
GO

declare @DomainName varchar(200)

--Make sure you execute the step to create the domain user table with the correct permissions.
execute as login = 'Contoso\bcrowell'
if exists (select * from sysobjects where name = 'DomainUser')
	drop table DomainUser
select DomainName into DomainUser from SystemUser
grant select on DomainUser to [Contoso\Domain Users]

declare SEL_CUR cursor for select DomainName from SystemUser where IsDisabled = 0
open SEL_CUR
fetch next from SEL_CUR into @DomainName
while @@FETCH_STATUS = 0
BEGIN
	SELECT SUSER_NAME(), USER_NAME();
	execute as login = @DomainName
	insert into Acct_Access (AccountNumber, AccountID, DomainUser)
		select AccountNumber, AccountID, @DomainName from FilteredAccount
	fetch next from SEL_CUR into @DomainName
	REVERT
END
close SEL_CUR
deallocate SEL_CUR
drop table DomainUser
GO

b. Add an additional Data Flow task to take the CRM Acct_Access data and populate/update the DataMart’s Acct_Access and User tables.

Conclusion

I’m not the first to deploy some of the techniques used above; I started my learning by reviewing the two blog posts below and modified their approach to work with the security rights already set up in Microsoft CRM.

· http://bidn.com/Blogs/analysis-services-dynamic-security

· http://bifuture.blogspot.com/2011/09/ssas-setup-dynamic-security-in-analysis.html

I hope this helps you with your future Analysis Services and Dashboard development projects.