Use Power BI and R to Quickly Identify Business Insights

In statistics, the correlation coefficient, or Pearson’s correlation, is a measure of the strength and direction of the association that exists between two continuous variables. The value of the correlation, r, is always between +1 and –1. To interpret its value, see which of the following values your correlation r is closest to:

  • Exactly –1. A perfect downhill (negative) linear relationship
  • –0.70. A strong downhill (negative) linear relationship
  • –0.50. A moderate downhill (negative) relationship
  • –0.30. A weak downhill (negative) linear relationship
  • 0. No linear relationship
  • +0.30. A weak uphill (positive) linear relationship
  • +0.50. A moderate uphill (positive) relationship
  • +0.70. A strong uphill (positive) linear relationship
  • Exactly +1. A perfect uphill (positive) linear relationship

Don’t make the mistake of thinking that a correlation of –1 is a bad thing, indicating no relationship. Just the opposite is true! A correlation of –1 means the data are lined up in a perfect straight line, the strongest negative linear relationship you can get. The “–” (minus) sign just happens to indicate a negative relationship, a downhill line. Most statisticians like to see correlations beyond at least +0.5 or –0.5 before drawing any conclusions. Additional information on Pearson’s correlation can be found here.

Putting it All to Use

This is all great knowledge, but how can we apply it in a business environment? When I worked for a casino, we kept a daily record of Coin In (gross sales), High Temp, Low Temp and Fuel Price in an Excel spreadsheet and tried to find a correlation between these data points. Try to find the correlation in this data.

Excel Casino data
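
Before bringing the spreadsheet into Power BI, you can sanity-check the numbers in a few lines of plain R. This is a minimal sketch only; the file name and the column names (CoinIn, HighTemp, LowTemp, FuelPrice) are assumptions based on the spreadsheet described above.

    # A minimal sketch: read the daily casino log and compute the Pearson
    # correlation matrix (rounded to two decimals). The file name and column
    # names are assumptions based on the spreadsheet described above.
    casino <- read.csv("casino_daily.csv")

    round(cor(casino[, c("CoinIn", "HighTemp", "LowTemp", "FuelPrice")],
              use = "complete.obs", method = "pearson"), 2)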

With Power BI and R, we can find these associations much faster, with slicers for interactivity to drill into the data points. I first downloaded the R Correlation Plot custom visual located here and then imported the Excel spreadsheet from above into Power BI. I was then able to easily identify the correlations in a matter of minutes.
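
For context, the custom visual runs an R script along the lines of the sketch below. This is not the visual’s actual code, just an illustration of the idea: in a Power BI R visual the selected fields arrive as a data frame named dataset, and the corrplot package (assumed to be installed) draws the circles.

    # A minimal sketch of an R script visual that draws a correlation plot.
    # Power BI exposes the selected fields as a data frame named 'dataset';
    # the corrplot package is assumed to be installed.
    library(corrplot)

    # Keep the numeric fields, compute Pearson correlations, and draw circles:
    # blue for positive, red for negative, sized and shaded by strength.
    nums <- dataset[sapply(dataset, is.numeric)]
    corrplot(cor(nums, use = "complete.obs"), method = "circle")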

Positive correlations are identified with increasingly dark blue circles; negative correlations are red. The stronger the correlation, the bigger and darker the circle. Selecting “Nov” from the calendar slicer, you will notice that the intersection between “Low Temp” and “Coin In” shows a positive correlation (blue circle), so the lower the temperature, the less coin in you will realize for that month.

Power BI and R Correlation Plot

Select “Jul” from the calendar slicer, and notice that the correlation is different.

In this case, the correlation between temperature and coin in is zero.

Power BI and R Correlation Plot image 2

Conclusion

I hope this helps you quickly identify business insights in your organization.

WOW! A Dynamics GP multi-company Financial Dashboard – Part Deux

Back in October of 2015, I posted a blog about how to modify Jared Hall’s awesome Excel Financial Dashboard, which you can download here. The problem with his solution is that most Dynamics GP environments are multi-company setups, making these dashboards not very functional without navigating and opening multiple Excel spreadsheets to view each company’s dashboard. So, I took it upon myself to modify the one provided by Jared Hall to work in a multi-company Dynamics GP setup, but that was so 2015!

This is 2017, so let’s create our Financial Dashboard in Power BI. The best part about using Power BI over Excel is that we can make it available through the Power BI service and via the iOS and Android apps, making it a much better option for mobility and availability.

The Parts to the Solution

For my solution, I chose to use the following tools:

  • A Small DataMart
  • Power BI Designer
  • Power BI Service

The DataMart

I chose to implement a DataMart to simplify the importing of the tables into the Power BI Designer. I created the following tables to implement my Financial Dashboard:

  • DimAccount
  • DimHeader – to summarize the report layout
  • DimDate
  • DimCompany
  • FactFinance – General Ledger data from the GL20000 and GL30000 tables of each company

Dashboard Design

I imported the above tables into Power BI Designer rather than using DirectQuery mode. Using the Import option allows for the full functionality of DAX and, more importantly, the Time Intelligence functions of DAX. Your table relationships should look like the screenshot below after you import the data.

clip_image002

The DAX Measures

The Profit & Loss Statement layout is handled by several DAX measures, the DimHeader table and the Sign and Report Sign columns within the Account table. I created the following DAX measures in the order listed below:

  1. Header Order
  2. HeaderCalcType
  3. Dollars
  4. Dollars with Sign
  5. Dollars with Report Sign
  6. Running Dollars with Sign
  7. Current Period
  8. Cumulative Sales (Selected)

The code for the DAX measures can be downloaded here from the Finance Dashboard template.

Visualize This 

The result is an easy-to-use, modern, multi-company Finance Dashboard, available through the cloud or the mobile apps, that shows your critical data so you can spot trends, share insights, and make smart decisions.

clip_image004

Conclusion

I’m loving the Power BI Service and all the modern cloud-based analytics that it can bring to your dashboards and reports.

WOW! A Dynamics GP multi-company Financial Dashboard

Dynamics GP can deploy some great Excel dashboards in GP2015, and Jared Hall created an awesome Financial Dashboard that you can download here. The problem with all of these solutions is that most Dynamics GP environments are multi-company setups, making these dashboards not very functional without navigating and opening multiple Excel spreadsheets to view each company’s dashboard. So I took it upon myself to modify the one provided by Jared Hall to work in a multi-company Dynamics GP setup.

Changing the Data Connections

The first changes made were to the Dashboard tab: I added a form control, added a new tab with a data connection to the Dynamics SY01500 table for the list of active GP databases, and then assigned that list to the form control (figure 1).

clip_image002

I then added stored procedures in the Dynamics database with three input parameters: the two original parameters from Jared Hall’s Financial Dashboard, @UserDate and @TimeUnit, and an additional parameter, @Database. Using @Database, the stored procedure called the original Microsoft-provided stored procedure in the company’s database and passed @UserDate and @TimeUnit into it to produce the intended results. This solved the KPI queries within the original Financial Dashboard. There were also several OLE DB queries in the Financial Dashboard for Budget, Cash Balances, Current Financial Data and Prior Financial Data. I used the original code from those queries and created individual stored procedures in the Dynamics database with one parameter, @Database. Once the stored procedures had been created, I changed the connections in the Current Financial, Prior Financial, Budget and Cash Flows tabs to use the Dynamics database connection, calling the stored procedure with the @Database parameter mapped to the form control’s output (figure 2).

clip_image004

Refreshing the Pivot Table Cache

That left me with only one obstacle, which I discovered after all of this hard work: my Key Performance Indicators section was not updating when I selected a new company from the drop-down list. What I discovered was that while the financial data was updating in the Current Financial, Prior Financial, Cash Flows and Budget tabs, the pivot table caches were not updating. I solved this by adding a little more VBA to the Financial Dashboard to automatically refresh all pivot table caches whenever the source data changed. To do this, I created a VBA module and added the following code:

    Sub RefreshAllPivots()
        ' Loop through every pivot cache in the workbook and refresh it,
        ' so each pivot table picks up the newly selected company's data.
        Dim PC As PivotCache
        For Each PC In ActiveWorkbook.PivotCaches
            PC.Refresh
        Next PC
    End Sub

I then called the sub from the Financial data sheets whenever the worksheet changed.

Fabulous! A Multi-company Financial Dashboard

What we end up with is a multi-company Financial Dashboard with no need to navigate to different spreadsheets to get a view of each company’s current financial outlook.

clip_image006

Hopefully this helps you start your journey into Excel dashboards in Dynamics GP.

Using Power BI to combine Financial and Social Data for More Powerful Analytics

In my last blog post we talked about the Power BI Designer Preview and why I was lovin’ it. In this post I’m going to use Microsoft’s most recent Power BI tool to connect to my SQL Server instance and analyze my Google Analytics data, marketing data and CRM opportunities.

Connect to your SQL Data

Let’s get started by selecting the Get Data button in the top left corner and then selecting SQL Server.

clip_image002

Supply your SQL Server name and the database that you would like to connect to.

clip_image004

Select which tables you would like to use for your dashboard/report. I’m going to use my CRM, Google Analytics and KTLMarketing tables.

clip_image006

Transforming your Data

At this point we can edit the query before loading the table, by selecting Edit from the bottom of the window, or we can load the table(s) and then select each table to transform the data.

Now that we’ve connected to a data source, we need to adjust the data to meet our needs.

The Query view in Power BI Designer makes ample use of right-click menus, in addition to having tasks available on the ribbon. Most of what you can select in the Transform ribbon is also available by right-clicking an item (such as a column) and choosing from the menu that appears.

When you transform data in the Query view, you’re providing step-by-step instructions (that Query carries out for you) to adjust the data as Query loads and presents it. Clicking on the View tab allows you to enable the formula bar so you can enter or modify the query formulas behind your transformations. The original data source is not affected; only this particular view of the data is adjusted.

The steps you specify are recorded by Query in the Applied Steps navigation pane, and each time this query connects to the data source those steps are carried out so that the data is always shaped the way you specify. This process occurs whenever you use the query in the Power BI Designer, or for anyone who uses your shared query, such as on the Power BI Service. Those steps are captured, sequentially, in the Query Settings pane under Applied Steps. Reversing your applied transformation is as simple as clicking on the ‘x’ to the left of the step.

clip_image008

For starters, I’m going to replace the null values with zero. I just right-click the column header and select Replace Values.

clip_image010

Type null in the Value To Find field and 0 in the Replace With field and then click the OK button.

clip_image012

Additionally, I’m going to remove several columns that were imported with the KTLMarketing table by highlighting them, right-clicking, and selecting Remove Columns.

I’ve completed the transformations to my data that I want to do for now. I can always do additional transformations later by selecting Query and then the table that I want to work with.
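
If you prefer scripting, the same clean-up can also be expressed as a “Run R Script” step in the Query Editor. This is a minimal sketch only: Power BI passes the current table in as a data frame named dataset, and the column names below are hypothetical stand-ins for the KTLMarketing columns.

    # A minimal sketch of the same clean-up as a "Run R Script" transform.
    # 'dataset' is the data frame Power BI passes in; the column names are
    # hypothetical. The 'output' data frame becomes the resulting table.
    output <- dataset

    # Replace nulls (NA) with zero in a numeric column
    output$PageViews[is.na(output$PageViews)] <- 0

    # Drop columns that aren't needed in the report
    output$UnusedColumn1 <- NULL
    output$UnusedColumn2 <- NULL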

Building Your Visual Dashboard/Report

I’m going to start my first report by looking at Google Analytics data. I drag sessions by date, page views by date and page views by channel grouping. I like what I have started but want to compare sessions to page views. To do that I simply select page views by date and drag and drop it on top of sessions by date.

clip_image014

Next I want to add some of our KTL Marketing data to the Google Analytics report. I select the KTLMarketing dataset and drag in the blog post views by author, LinkedIn views by author and Twitter views by author statistics. What I am left with below is a clear perspective of where the traffic to my website is coming from and which employee is providing the most relevant posts for each social media channel that I’m analyzing.

clip_image016

Next I’m going to start a new report by right clicking a blank space in the Reports navigation pane to the left.

clip_image018

Next I’m going to select my actual value by customer, estimated and actual value by opportunity owner, and then count of opportunities by name. I’m going to use this data to analyze who our best customers are, which opportunity owners are best at estimating the opportunity and who is the best performing employee, and then I can use the opportunity name to see if there is any correlation between blog posts and the opportunities.

clip_image020

This is just a start into analytics using the Power BI Designer Preview, and it is specific to our business needs. There is so much more that you can do with it, and the best thing of all is it’s FREE!

How to Analyze and Report on your Data

Part 2 of 2 in Data Warehousing series

Every business, from startups to larger established companies, needs to track details about its customers, products, sales, purchases, social media and website logs, to name a few. By extracting, manipulating and analyzing this data, you can determine key metrics to help you understand more about your customers and grow your business. In part 1 we talked about how to kick-start your data warehousing with BI360, and now we need to know what to do with all this data.

Basically, you can use your Data Warehouse for financial statement reporting and analysis, dashboards and data mining. I previously went through how to use BI360’s One Stop Reporting here, so I will be talking more about data mining.

Data mining can be defined as the process of analyzing large quantities of data through automatic or semi-automatic tasks to extract previously unknown, interesting patterns. Over the years, storage has grown at a faster rate than computing power. As a result, companies have stored enormous amounts of data and have become data-rich but knowledge-poor. The main purpose of data mining is to increase the value of these large datasets by extracting knowledge from them and making the data useful to the business.[1]

In general, there are two types of data mining – predictive and descriptive. Predictive data mining will use variables or fields from a dataset to predict unknown or future values. The goal is to produce a model from a defined dataset that is used to perform classification, estimation, and other data mining tasks. On the other hand, descriptive data mining will focus on finding patterns that describe the data and can be interpreted by data analysts or business users. The goal is to gain understanding of the data by uncovering new relationships, patterns, and important information regarding the dataset.
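
To make the distinction concrete, here is a minimal R sketch on a made-up customer table (the data frame and column names are illustrative only): a regression model stands in for the predictive flavor, and k-means clustering for the descriptive flavor.

    # A made-up customer table for illustration only.
    set.seed(42)
    customers <- data.frame(
      Age    = sample(18:75, 200, replace = TRUE),
      Income = round(runif(200, 20000, 120000)),
      Spend  = round(runif(200, 100, 5000))
    )

    # Predictive: estimate an unknown or future value (Spend) from known attributes.
    model <- lm(Spend ~ Age + Income, data = customers)
    predict(model, newdata = data.frame(Age = 40, Income = 60000))

    # Descriptive: uncover structure already present in the data, for example
    # grouping customers into three segments by Age and Income.
    segments <- kmeans(scale(customers[, c("Age", "Income")]), centers = 3)
    table(segments$cluster)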

Data mining can be valuable in virtually any industry and business scenario. The following are just a few examples of how data mining can improve decision making and give businesses a better return on investment with their Business Intelligence (BI) projects:

  • Recommendation Generation and Retail Floor Optimization:

What products/services should a company offer to its customers? Heat mapping customer traffic patterns can help place offerings. Casinos use this to help optimize game theme selection, denomination selection, placement and removal.

E-commerce websites analyze the purchasing behaviors of their customer population and recommend products based on items in a customer’s cart.

  • Anomaly Detection:

Analyze items that don’t fit a certain pattern. Credit card companies, insurance companies and businesses with POS systems use this method to detect fraud.

  • Churn Analysis:

Which customers are most likely to switch to a competitor? When will they leave and can it be acted on before they do? Based on findings, companies can improve relationships and offer those customers discounts or loyalty incentives.

  • Market Penetration:

How many customers or sales are there per geographic location?

  • Risk Management:

Determine the risk of a loan or mortgage based on the customer profile and the amount they are asking for.

Companies can make decisions based on the cost and risk of the loan by using historical customer data.

  • Customer Segmentation:

How well do companies know their customers? What is their Life Time Value (LTV)?

Determine behavioral and descriptive profiles for customers in order to target marketing campaigns. The behavioral and descriptive profiles of high-LTV customers can then be applied against first-time customers’ profiles to capture their loyalty.

  • Forecasting:

Estimate how much will be sold and/or how much inventory is needed for each week, month, and quarter of a specific year.

Data Mining Tasks

Determining the correct task and algorithm to apply to your dataset is a crucial step toward achieving an accurate and useful data mining model. In some cases, it will be quite obvious which task will be the most accurate to use depending on the nature of your data. More often than not, you will need to explore and combine multiple tasks before arriving at a single solution. The following section describes the basic data mining tasks along with some additional resources and discussion on each:

Classification: Can be used to identify loan applicants as low, medium, or high credit risks based on attributes such as income, credit score, employment history, home ownership, and amount of debt. In this case the target would be the credit risk level, and the other attributes would be the predictors.

Clustering: By using attributes such as Income and Age, three clusters could be created (there could be more than three depending on the data): 1. younger population with low income, 2. middle-aged with higher income, and 3. older customers with lower income.

Association: (also called Market Basket Analysis) Perhaps most famously, Amazon uses these types of analyses to suggest additional items you may be interested in purchasing, based upon other items frequently bought together.

http://www.casinoenterprisemanagement.com/articles/may-2009/turning-market-basket-analysis-action

http://blogs.office.com/2014/10/08/6-cool-ways-use-custom-maps/

Sequence Analysis: Can be used to analyze the sequence of web clicks on a website. The results are probabilities of the next click in the sequence, i.e. if a user clicks ‘News’, there is a 20% chance the next click will be ‘Sports’ and a 30% chance it will be ‘Weather’.

Sentiment Analysis

http://blogs.msdn.com/b/microsoft_business_intelligence1/archive/2012/09/19/the-social-media-summer-games-sentiment-tracking-in-power-view-it-s-a-slam-dunk.aspx

https://cwebbbi.wordpress.com/2014/10/19/sentiment-analysis-in-excel-with-azure-machine-learning-and-power-query/

Social Media Analysis

http://www.digital-warriors.com/social-media-analytics-power-business/

Deviation Analysis: Used to find the rare cases whose behavior does not match the ‘norm’; most commonly used in fraud detection to find the transactions that don’t match the spending habits of the customer.

http://www.journalofaccountancy.com/Issues/2003/Aug/TurnExcelIntoAFinancialSleuth

Regression: Can be used to predict the value of a house based on location, number of rooms, land size, and crime rates.

Forecasting: Based on the sales from last year by month, how many cases of soda will the northeast store sell in January? The output would be the estimated number of cases of soda for January.
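
As a rough illustration of the forecasting task, the base R sketch below fits a simple exponential smoothing model to one year of made-up monthly sales and projects the next month; the numbers are invented for the example.

    # A minimal sketch of the forecasting task in base R; the monthly sales
    # figures (cases of soda, January through December) are made up.
    sales <- ts(c(120, 115, 130, 128, 140, 155, 170, 168, 150, 138, 125, 160),
                start = c(2016, 1), frequency = 12)

    # Fit a non-seasonal exponential smoothing model and project January.
    fit <- HoltWinters(sales, gamma = FALSE)
    predict(fit, n.ahead = 1)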

Next steps

For a quick how to on using SSAS Data Mining check out my previous post here.


[1] http://en.wikipedia.org/wiki/Data_mining

Power BI – Sharing the Wealth

Part 5 of a 5 part series

We have put a lot of hard work into this blog series building a patron analysis, so now it’s time to share the wealth of our analysis insights with the rest of the company. Power BI’s website and mobile approach, with the deployment of Office 365 and refreshable data through a Data Management Gateway, is the perfect solution. The Data Management Gateway has come a long way since the first version, and with version 1.2 it now supports many data source types from Power Query connections.

One of the best parts of the Data Management Gateway is that it makes the connection outbound through the firewall. That means no reverse proxies, custom web services or firewall changes to deal with.

First, I set up a trial of Power BI at http://powerbi.com. After the provisioning was complete, my Office 365 site had a newly created asset called Power BI.

Creating a Data Management Gateway

Open the admin center and create a new gateway. A gateway simply requires a name and description. After you give it a name and description, it will generate a new key for you. Save this key, as it is needed to connect your on-premises client to the gateway. If you lose the key, you can always regenerate a new one. Also, from here you can download the Data Management Gateway client.

clip_image002

Next, you will use this key to connect to the gateway. Launch the Data Management Gateway client on your on-premises server and select register a new gateway.

clip_image004

Select the endpoint you want the Gateway to use for access

clip_image006

Now we have a connected gateway and we can control the service from this screen.

clip_image008

Creating a Data Source on your Office 365 site

Next, we need to create a data source. Navigate to the Power BI admin center by using the Admin -> Power BI link at the top of your Power BI site. Select data sources on the Power BI admin center site, click on the “+” sign, and select SQL Server as the data source.

clip_image010

Select “Enable Cloud Access” on the data source usage screen.

clip_image012

Select “Next” and then enter a connection name and connection string.

· The data source name is case sensitive and should match exactly what’s in the Excel workbook.

After you have entered your connection string, select the “Set Credentials” button.

clip_image014

Enter your credentials and test the connection

clip_image016

Next, I clicked on Credentials. At this point it will launch a new application and ask for your credentials.

clip_image017

Now my connection is complete.

Setting up your Microsoft Power BI site for Office 365

From your Office 365 site, click Sites in the upper right corner of the dashboard page.

clip_image019

In the new site page, click the Team Site tile.

In the left navigation pane, click Site Contents.

clip_image021

In the Site Contents page, click the Power BI tile.

clip_image022

In the Power BI page, drag your Excel workbook to your site.

clip_image024

To start your own free trial of Power BI for Office 365, visit this site: http://www.microsoft.com/en-us/powerbi/default.aspx#fbid=v0V8xzuZXJR

Download components of Power BI integrated into Excel:

  1. Power Query
  2. Power Pivot
  3. Power View
  4. Power Map
  5. Power BI Mobile App (or Mobile BI)

Online Components in Office 365:

  6. Power BI for Office 365
  7. Power Q&A
  8. Data Management and Data Stewardship

As part of the Power BI trial you’ll also receive a free 30-day trial of Office 365 Pro Plus, giving you access to the latest version of Excel. Once you have Excel installed, download and install the Power Query and Power Map add-ins for Excel and learn about the new features with the getting started guide. Until my next post, have fun sharing the wealth!

Power BI’s Power View – A Picture is Worth a Thousand Words

Part 4 of a 5 part series

The adage “A picture is worth a thousand words” refers to the notion that a complex idea can be conveyed with just a single still image. It also appropriately characterizes one of the main goals of data visualization, namely making it possible to absorb large amounts of data quickly.

The point of Power View is to make it very easy to create pretty, interactive data presentations or reports that will make your boss go “Wow, how did you do that? You’re a genius!” Well, OK, maybe not that far, but you’ll definitely look smart. Plus, it can also be used to explore your data visually without worrying about messing anything up, and even to make reports on your own, without IT help.

Power View provides visualization of Power Pivot Data Models and SSAS tabular mode databases. Power View doesn’t work directly on top of relational data or local Excel data, but Excel 2013 can create a data model on the fly when needed.

Microsoft is continually updating their Power BI suite and has released updates to the Power Map feature that include filtering and custom maps.

To see Power View in action, I will be using my Patron data model from part three of this five-part series. Follow these steps:

Insert a Power View sheet

1. Click on the “Insert” tab

2. In the middle of the ribbon, click on the “Power View” button in the Reports section (to the right of the Charts section)

image

3. Next drag the State and TheoWin fields from the right onto the Power View canvas on the left.

image

4. Next click on Column Chart and then 100% Stacked Column

image

5. Continue this process by dragging from the right onto the Power View canvas on the left any additional fields for the visualizations that you want in your report. The results of my report are below.

image

 

Creating Power View Map Reports in Excel 2013

Now let’s create some cool Power View map reports. For this demonstration, I will continue to use the Patron model from my previous blog post as you can see below:

1. Simply drag Sales Amount and Country from the Power View Fields list, and your report should look as shown below:

image

2. To turn the above tabular report into a map report, simply go to the Design tab and then click on the Map icon as shown below:

image

3. That’s all, and here is your first map report for TheoWin (Casino Sales). Hovering over any of the circles will provide the detail information; also, the size of the circle indicates the value relative to the other values in the report.

image

4. To make additional changes to your Power Map, click on the Insert tab and then Map.

image

5. In my example I will use the Radius of Influence to show a heat map of where our casino revenue (Theo Win) is coming from. Click on Settings and then Change Radius of Influence.

image

 

Next Steps

Creating basic Power View and Power Map dashboards/reports is fairly easy and usually doesn’t require any IT resources, except to provide a data model and possibly security access to the database for importing data. To find out more about Power BI, click here to visit Microsoft’s Office site. In my final post in this series we will put it all together by creating an Office 365 Power BI site and sharing our Excel analysis workbook.