Supercharge Your Dynamics GP Dashboards & Reports with Power BI

In my last blog post, I walked you through creating a financial dashboard in Power BI with Dynamics GP data. Since then, I have created additional analyses and measures.

[Screenshot: the updated dashboard, with the new report tabs visible at the bottom left]

If you look closely at the bottom left, you will see two additional reports for Sales Summary and Customer Analysis.

Sales Summary

The Dynamics GP data used for this report comes from the Sales Order Processing (SOP), Inventory (IV), and Customer (RM) tables. The Sales Summary uses the same report filters as the Financial Summary report: Year, Month, and Company. The top left visual shows total sales by product class, ranked in descending order. The bottom left visual shows total sales by customer class/sales channel. The top two visuals in the middle and right show total sales and transactions for the selected filters, with a seven-day moving average plotted behind them. The bottom right visual compares cumulative sales against cumulative costs.

[Screenshot: the Sales Summary report]

Customer Analysis

The Customer Analysis report allows you to dig into Dynamics GP sales data by Customer, Year, Month, and Company. By selecting a customer on the left, you see three years of comparative sales, sales detail, last sale date and amount, total sales, sales last year, total profit, and percent sales growth for that customer.

[Screenshot: the Customer Analysis report]

Future Dashboard Developments

In the coming months, I plan to build additional Power BI reports on my Dynamics GP data. These reports will include:

  • Vendor Analysis
  • Inventory Analysis
  • Field Service – Contract and Service Call Analysis

Stay tuned for future developments of these reports and an overview of their functionality.

WOW! A Dynamics GP multi-company Financial Dashboard – Part Deux

Back in October of 2015, I posted a blog about how to modify Jared Hall’s awesome Excel Financial Dashboard, which you can download here. The problem with his solution is that most Dynamics GP environments are multi-company setups, which makes these dashboards impractical: you have to navigate to and open a separate Excel spreadsheet to view each company’s dashboard. So, I took it upon myself to modify the one provided by Jared Hall to work in a multi-company Dynamics GP setup. But that was so 2015!

This is 2017, so let’s create our Financial Dashboard in Power BI. The best part about using Power BI over Excel is that we can make it available through the Power BI service and via the iOS and Android apps, making it a much better option for mobility and availability.

The Parts to the Solution

For my solution, I chose the following tools:

  • A Small DataMart
  • Power BI Designer
  • Power BI Service

The DataMart

I chose to implement a DataMart to simplify importing the tables into Power BI Designer. I created the following tables for my Financial Dashboard (a schema sketch follows the list):

  • DimAccount
  • DimHeader – to summarize the report layout
  • DimDate
  • DimCompany
  • FactFinance – General Ledger data from the GL20000 and GL30000 tables of each company
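
The full table definitions come with the template download, but to make the structure concrete, here is a minimal sketch of two of these tables. Every column name below is an illustrative assumption on my part, not the template’s actual schema.

-- Illustrative sketch only; column names are assumptions, not the template's actual schema.
CREATE TABLE [dbo].[DimCompany](
    [Company_Key] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY,
    [CompanyID] [varchar](10) NOT NULL,    -- Dynamics GP company database ID
    [CompanyName] [varchar](100) NOT NULL
)
GO
CREATE TABLE [dbo].[FactFinance](
    [Finance_Key] [bigint] IDENTITY(1,1) NOT NULL PRIMARY KEY,
    [Account_Key] [int] NOT NULL,    -- references DimAccount
    [Date_Key] [int] NOT NULL,       -- references DimDate
    [Company_Key] [int] NOT NULL,    -- references DimCompany
    [Amount] [money] NOT NULL        -- posted amount from GL20000/GL30000
)
GO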

Dashboard Design

I imported the above tables into Power BI Designer rather than using DirectQuery mode. The Import option allows the full functionality of DAX and, more importantly, DAX’s Time Intelligence functions. Your table relationships should look like the screenshot below after you import the data.

[Screenshot: the table relationships after import]

The DAX Measures

The Profit & Loss Statement layout is handled by several DAX measures, the DimHeader table, and the Sign and Report Sign columns within the Account table. I created the following DAX measures, in the order listed below:

  1. Header Order
  2. HeaderCalcType
  3. Dollars
  4. Dollars with Sign
  5. Dollars with Report Sign
  6. Running Dollars with Sign
  7. Current Period
  8. Cumulative Sales (Selected)

The code for the DAX measures can be downloaded here from the Finance Dashboard template.

Visualize This 

The result: an easy-to-use, modern, multi-company Finance Dashboard, available in the cloud and in the mobile apps, that shows your critical data so you can spot trends, share insights, and make smart decisions.

[Screenshot: the multi-company Finance Dashboard]

Conclusion

I’m loving the Power BI Service and all the modern cloud-based analytics that it can bring to your dashboards and reports.

How to implement Analysis Services Dynamic Security based on Microsoft CRM user access rights

Introduction

In Analysis Services implementations, there is always a demand for security. In this post, I’ll describe a dynamic (attribute-based) security solution in Analysis Services that I built for a customer, based on Microsoft CRM salesperson access rights to customers. The data-level (row-level) access follows the CRM security access rights: a SQL script and an SSIS package deploy each user’s Active Directory account and Customer Number to a custom CRM table. By leveraging the security already established in CRM, we reduce the administrative work of repeating every CRM security change in Analysis Services.

The Setup

Here are the pieces that we need to deploy to put the security in place:

1. Microsoft CRM table script

2. DataMart Acct_Access table script

3. DataMart User table script

4. Add Dynamic Security role to Cube and define MDX to filter Active Directory user access

5. Create SSIS package to populate and update Acct_Access and User tables in DataMart

1. Microsoft CRM table script

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[Acct_Access](
    [AccountNumber] [varchar](200) NULL,
    [AccountID] [varchar](200) NULL,
    [DomainUser] [varchar](200) NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING ON
GO

2. DataMart Acct_Access Table Script

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Acct_Access](
    [Access_Key] [bigint] IDENTITY(1,1) NOT NULL,
    [AccountNumber] [varchar](200) NULL,
    [AccountID] [varchar](200) NULL,
    [DomainUser] [varchar](200) NULL,
    [Customer_Key] [bigint] NULL,
    [User_Key] [bigint] NULL,
    [Company_Key] [bigint] NULL,
    CONSTRAINT [PK_Access] PRIMARY KEY CLUSTERED
    (
        [Access_Key] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 80) ON [PRIMARY]
) ON [PRIMARY]

3. DataMart User Table Script

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[User](
    [User_Key] [bigint] IDENTITY(1,1) NOT NULL,
    [DomainUser] [varchar](200) NULL,
    CONSTRAINT [PK_User] PRIMARY KEY CLUSTERED
    (
        [User_Key] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 80) ON [PRIMARY]
) ON [PRIMARY]
GO


4. Add Dynamic Security role to Cube and define MDX script to filter Active Directory user access

Once this DataMart and CRM table structure is in place, the next step is to bring it into your Analysis Services solution. Here are the steps to follow inside BIDS/SSDT to leverage these tables:

· Add the new tables (User and Acct_Access) to your Data Source View.

· Create a new dimension based off the User table. Hide all of its attributes by changing AttributeHierarchyVisible to False, since users shouldn’t be able to browse this dimension, and then process the dimension.

· Create a new Measure Group in the cube based off the Acct_Access table and name it Security. Delete all the measures it creates except the automatically generated Count measure. Hide that remaining measure by selecting it and changing its Visible property to False. This hides the entire Measure Group from your users, but you can still use it when writing MDX.

· Ensure these two have a Regular relationship set up in the Dimension Usage tab of the cube designer, as seen in the screenshot below.

[Screenshot: the Dimension Usage tab showing the Regular relationship]

· In the Solution Explorer, right-click on Roles and select New Role. Name the role Dynamic Security and give it Read access on the General, Data Sources, and Cubes tabs. On the Membership tab, add your CRM groups or user accounts to this role.

· On the Dimension Data tab, select the Customer dimension from the dropdown list. Then select the Customer Number attribute from the attribute dropdown list.

[Screenshot: the Dimension Data tab selections]

· Inside the Dimension Data tab, select the Advanced tab. In the Allowed member set area, enter this MDX:

EXISTS(
    {[Customer].[Customer Number].Members},
    STRTOSET("[Customer].[Domain User].&[" + username() + "]"),
    "Security"
)

[Screenshot: the Advanced tab with the Allowed member set MDX]

5. Create SSIS package to populate and update Acct_Access and User tables in DataMart

With the table structure and the Cube solution complete, we need to create the SSIS package that populates and updates the Acct_Access table in our CRM database and then brings the updated data into the DataMart. What we need:

a. First, we create a SQL script task and add the script below to delete and repopulate the table with the current CRM customer access data. Change the login and group names (shown here as Contoso\bcrowell and Contoso\Domain Users) to suit your environment.

execute as login = 'Contoso\bcrowell'
delete Acct_Access
GO

declare @DomainName varchar(200)
-- Make sure you execute the step to create the domain user table with the correct permissions.
execute as login = 'Contoso\bcrowell'
if exists (select * from sysobjects where name = 'DomainUser')
    drop table DomainUser
select DomainName into DomainUser from SystemUser
grant select on DomainUser to [Contoso\Domain Users]

-- Loop through every enabled CRM user and capture the accounts each one can see.
declare SEL_CUR cursor for select DomainName from SystemUser where IsDisabled = 0
open SEL_CUR
fetch next from SEL_CUR into @DomainName
while @@FETCH_STATUS = 0
BEGIN
    SELECT SUSER_NAME(), USER_NAME();
    -- Impersonate the CRM user so FilteredAccount returns only the accounts they can access.
    execute as login = @DomainName
    insert into Acct_Access(AccountNumber, AccountID, DomainUser)
        select AccountNumber, AccountID, @DomainName from FilteredAccount
    -- Drop the impersonated context before fetching the next user.
    REVERT
    fetch next from SEL_CUR into @DomainName
END
close SEL_CUR
deallocate SEL_CUR
drop table DomainUser
GO

b. Next, add a Data Flow task to take the CRM Acct_Access data and populate/update the DataMart’s Acct_Access and User tables.
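
As a rough sketch of what that Data Flow needs to accomplish, the T-SQL below performs the same population/update. The database names, and the assumption that the CRM and DataMart databases sit on the same server, are illustrative; the actual package uses an SSIS Data Flow with lookups.

-- Illustrative T-SQL equivalent of the Data Flow task; database names are assumptions.
-- 1) Add any CRM users not yet present in the DataMart User table.
INSERT INTO DataMart.dbo.[User] (DomainUser)
SELECT DISTINCT a.DomainUser
FROM CRM.dbo.Acct_Access AS a
WHERE NOT EXISTS (SELECT 1 FROM DataMart.dbo.[User] AS u
                  WHERE u.DomainUser = a.DomainUser);

-- 2) Reload the DataMart Acct_Access table, resolving the User surrogate key.
--    (Customer_Key and Company_Key lookups omitted for brevity.)
TRUNCATE TABLE DataMart.dbo.Acct_Access;
INSERT INTO DataMart.dbo.Acct_Access (AccountNumber, AccountID, DomainUser, User_Key)
SELECT a.AccountNumber, a.AccountID, a.DomainUser, u.User_Key
FROM CRM.dbo.Acct_Access AS a
JOIN DataMart.dbo.[User] AS u ON u.DomainUser = a.DomainUser;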

Conclusion

I’m not the first to deploy some of the techniques used above; I started my learning by reviewing the two blog posts below and modified their approach to work with the security rights already set up in Microsoft CRM.

· http://bidn.com/Blogs/analysis-services-dynamic-security

· http://bifuture.blogspot.com/2011/09/ssas-setup-dynamic-security-in-analysis.html

I hope this helps you with your future Analysis Services and Dashboard development projects.

Real Time Data Warehouse ETL

Part 2 – Implementing Service Broker as a Real Time ETL Tool – The Code

To implement my Service Broker ETL API, I started with the basic code/framework provided by Eitan Blumin in this blog post: http://www.madeiradata.com/service-broker-asynchronous-triggers/. He provides a good basic framework for setting up Service Broker and an explanation of the installation scripts. To learn more about the basic framework and scripts, please refer to his post.

In this post, I provide his scripts for the basic installation, with some modifications to work with a Dynamics GP database and Solver’s BI360 data warehouse, plus my modified stored procedure, usp_AT_iGLETL, which implements the ETL logic from Dynamics GP’s general ledger table, GL20000, to Solver’s BI360 staging table.

General Workflow

Eitan’s general Service Broker workflow looks like this:

1. The user performs a DML operation on a table (an UPDATE in Eitan’s original example). In my case, with the modifications in place, posting a General Ledger batch within Dynamics GP performs the INSERT operations on the GL20000 table that start the process.

2. An AFTER INSERT trigger on the table fires. The trigger compiles the contents of the INSERTED and DELETED tables into XML parameters, creates a Service Broker message, and sends it to a queue, so the original transaction returns immediately. My modification has the trigger pass the usp_AT_iGLETL stored procedure name into the SB_AT_Fire_Trigger stored procedure (a simplified sketch of the trigger appears after this list).

3. The Service Broker service fires up and processes the messages in the queue independently of the original transaction. It opens a transaction that pulls the message out of the queue, executes the relevant stored procedure with the XML data previously captured from the INSERTED and DELETED tables, and applies the ETL logic within usp_AT_iGLETL.
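
For orientation, here is a heavily simplified sketch of that trigger. The real Step2_GP_GLTable_Trigger script is in the download below; the parameter names on SB_AT_Fire_Trigger are my assumptions, not necessarily Eitan’s actual signature.

-- Simplified sketch of the asynchronous trigger; parameter names are assumptions.
CREATE TRIGGER dbo.trg_GL20000_AsyncETL
ON dbo.GL20000
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Capture the virtual trigger tables as XML so they can travel inside a message.
    DECLARE @InsertedXML xml = (SELECT * FROM inserted FOR XML PATH('row'), ROOT('inserted'), TYPE);
    DECLARE @DeletedXML  xml = (SELECT * FROM deleted  FOR XML PATH('row'), ROOT('deleted'), TYPE);

    -- Hand off to the generic framework, naming the handler procedure that
    -- Service Broker should execute asynchronously with this XML.
    EXEC dbo.SB_AT_Fire_Trigger
        @ProcedureName = N'usp_AT_iGLETL',
        @Inserted = @InsertedXML,
        @Deleted = @DeletedXML;
END
GO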

Service Broker Framework Installation and Usage

The installation scripts provided below implement a generic Service Broker framework, which you can reuse for any table without changing the installed objects (a sketch of the kinds of objects Step 1 creates follows the list). The framework works like this:

1. First, run the “Step1_SB_AT_Installation” script on the Dynamics GP company database where you want the asynchronous triggers.

2. Run the “Step2_GP_GLTable_Trigger” script on the Dynamics GP company database to create the AFTER INSERT trigger on the GL20000 table.

3. Run the “Step3_GP_ETL_Logging” script to create the basic framework for logging the Service Broker ETL process and error logs.

4. Run the “Step4_BI360_f_Trans_GL_Staging_Table” script to create the staging table in the BI360DW database.

5. Run the “Step5_ETL_StoredProcedure” script on the Dynamics GP company database to create the stored procedure, usp_AT_iGLETL, with the ETL logic.
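
For orientation, the kinds of Service Broker objects the Step 1 script sets up look like the following. The object names here are illustrative assumptions; the actual script in the download is authoritative.

-- Illustrative only: object names are assumptions, not the ones Step 1 actually creates.
CREATE MESSAGE TYPE [AsyncTriggerMessage] VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT [AsyncTriggerContract] ([AsyncTriggerMessage] SENT BY INITIATOR);
CREATE QUEUE dbo.AsyncTriggerQueue;
CREATE SERVICE [AsyncTriggerService] ON QUEUE dbo.AsyncTriggerQueue ([AsyncTriggerContract]);
GO

-- An activation procedure drains the queue and runs the handler procedure
-- (usp_AT_iGLETL in this solution) named inside each message.
ALTER QUEUE dbo.AsyncTriggerQueue
WITH ACTIVATION (
    STATUS = ON,
    PROCEDURE_NAME = dbo.SB_AT_ProcessQueue,  -- assumed name for the framework's activation proc
    MAX_QUEUE_READERS = 1,
    EXECUTE AS OWNER
);
GO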

Download Code

You should now have a “Near Real Time ETL” process in place to handle loading data from Dynamics GP into the staging table in Solver’s BI360DW data warehouse database. You could then use the same logic and scripts above to handle loading the data from the staging table into the f_Trans_GL table within the BI360DW database (see the sketch below). In my tests of this implementation, I was able to post a 350-line batch within Dynamics GP and have the staging table loaded within five seconds.
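
A sketch of that staging-to-fact hop follows. The column names, the staging table name, and the SourceRowID lineage key used to detect already-loaded rows are all illustrative assumptions; BI360’s actual f_Trans_GL schema is wider.

-- Sketch only: f_Trans_GL's real columns differ; adjust to the BI360 schema.
INSERT INTO BI360DW.dbo.f_Trans_GL (SourceRowID, Account, Period, Company, Amount)
SELECT s.SourceRowID, s.Account, s.Period, s.Company, s.Amount
FROM BI360DW.dbo.f_Trans_GL_Staging AS s
WHERE NOT EXISTS (
    SELECT 1 FROM BI360DW.dbo.f_Trans_GL AS f
    WHERE f.SourceRowID = s.SourceRowID  -- lineage key carried from GL20000 (assumed)
);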

Conclusion

I hope this helps you with the basic understanding and installation of Service Broker as an ETL tool for loading your data warehouse.