May the Force of PowerShell Module dbatools be with you

dbatools is a great PowerShell module that offers 106 ways to help ease the administrative burden faced by SQL Server DBAs and developers. As an IT consultant who works with Microsoft Dynamics products that run on SQL Server, I would suggest that this is a great module for us to know and use too.

For those who may not know, PowerShell was developed by Microsoft for task automation and configuration management. You can learn more about PowerShell at the Microsoft Virtual Academy.

How to Install

There are four ways to install the dbatools module. The PowerShell command that I used was:

Invoke-Expression (Invoke-WebRequest -UseBasicParsing https://dbatools.io/in)

The complete instructions on how to install the module can be found on the dbatools download page: https://dbatools.io/download/
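The download page also covers installing from the PowerShell Gallery. If you are on PowerShell 5 or later (or have PowerShellGet installed), that method is a single command:

Install-Module dbatools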

Getting Started

There are several scenarios where you can use dbatools. The developers' goal with every command is to support:

  • SQL Server 2000 – 2016
  • Express – Datacenter Edition
  • Clustered and stand-alone instances
  • Windows and SQL authentication
  • Default and named instances
  • Multiple instances on one server

You can learn more about dbatools and the Functions associated with the PowerShell module here: https://dbatools.io/getting-started/

Functions to Know for Dynamics

As IT consultants in the Microsoft Dynamics space, we are sometimes tasked with setting up SQL Server and/or migrating the Dynamics databases to the client's new SQL Server. Some of the key function categories I have used in the past to do this are listed below; a short migration sketch follows at the end of this section:

  • Databases
  • Security
  • Server Objects
  • SSIS
  • Management
  • SQL Server Agent
  • Best Practice Commands
  • General Administration Commands

A complete list of the 106 functions in the current version (0.8.693) can be viewed at https://dbatools.io/functions/; follow the link for each function on that page to get the details on its use.
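To give a feel for how these functions fit together, here is a minimal migration sketch. The command names reflect recent dbatools releases (older 0.8.x builds used slightly different names), and the server names, database list, and share path are placeholders you will need to change for your environment:

# Minimal migration sketch -- server names, database list and share path are placeholders;
# parameter names can vary between dbatools versions.
# Copy the Dynamics databases to the new instance via backup/restore over a shared path.
Copy-DbaDatabase -Source OLDSQL01 -Destination NEWSQL01 -Database DYNAMICS, TWO -BackupRestore -SharedPath \\OLDSQL01\SQLMigration

# Bring across the SQL logins and the SQL Server Agent jobs as well.
Copy-DbaLogin -Source OLDSQL01 -Destination NEWSQL01
Copy-DbaAgentJob -Source OLDSQL01 -Destination NEWSQL01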

Conclusion

Until next time, I hope this helps you with your SQL Server Administration and database migrations.

Dynamic Security – a look at the final product

Introduction

Recently I posted a blog titled “How to implement Analysis Services Dynamic Security based on Microsoft CRM user access rights”. That post talked about how to implement the back-end structure needed to put it in place but didn’t show the result of all that hard work.

The dynamic row level security (RLS) that we set up inside Analysis Services will work within reports and dashboards built in Reporting Services, Power BI and Excel. Power BI has other ways of deploying RLS when not using an Analysis Services connection. You can learn more about that here:

· http://sqlkover.com/dynamic-security-in-power-bi/

· http://radacad.com/row-level-security-configuration-in-power-bi-desktop

The End Result

We built our dashboard in Power BI Designer and published it to our end users. One of the end users, “Tom”, has access to 558 CRM customer accounts within the US. When he accesses the dashboard, he can only view the results he has privileges to see. (See Figure 1 below.)


Figure 1 – Tom’s access view

If “Tom” tries to access an area that he doesn’t have privileges to, say Canada, the dashboard allows him to select the filter, but the chart(s) change to show no data. (See Figure 2 below.)


Figure 2 – Tom’s view with Canada selected

Additionally, the chart(s) change based on the end user that’s accessing the dashboard. In Figure 3 below, I show an administrative user accessing the dashboard and the result shown to them when selecting the “Canada” filter.


Figure 3 – Administrative user view of Canadian Sales

Conclusion

I hope this helps you with your future Analysis Services development and Dashboard projects.

How to implement Analysis Services Dynamic Security based on Microsoft CRM user access rights

Introduction

In Analysis Services implementations, there is always a demand for implementing security. In this post, I’ll describe a dynamic, or attribute-based, security solution in Analysis Services that I built for a customer based on Microsoft CRM salesperson customer access rights. The data level, or row level, access is based on CRM security access rights: a SQL script and an SSIS package deploy each user’s Active Directory account and customer number to a custom CRM table. By leveraging the CRM security already established, we reduce the administrative tasks needed to make additional security changes in Analysis Services whenever there is a CRM security change.

The Setup

Here are the pieces that we need to deploy to put the security in place:

1. Microsoft CRM table script

2. DataMart Acct_Access table script

3. DataMart User table script

4. Add Dynamics Security role to Cube and define MDX to filter Active Directory user access

5. Create SSIS package to populate and update Acct_Access and User tables in DataMart

1. Microsoft CRM table script

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[Acct_Access](
    [AccountNumber] [varchar](200) NULL,
    [AccountID] [varchar](200) NULL,
    [DomainUser] [varchar](200) NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING ON
GO

2. DataMart Acct_Access Table Script

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Acct_Access](
    [Access_Key] [bigint] IDENTITY(1,1) NOT NULL,
    [AccountNumber] [varchar](200) NULL,
    [AccountID] [varchar](200) NULL,
    [DomainUser] [varchar](200) NULL,
    [Customer_Key] [bigint] NULL,
    [User_Key] [bigint] NULL,
    [Company_Key] [bigint] NULL,
    CONSTRAINT [PK_Access] PRIMARY KEY CLUSTERED
    (
        [Access_Key] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 80) ON [PRIMARY]
) ON [PRIMARY]

3. DataMart User Table Script

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[User](
    [User_Key] [bigint] IDENTITY(1,1) NOT NULL,
    [DomainUser] [varchar](200) NULL,
    CONSTRAINT [PK_User] PRIMARY KEY CLUSTERED
    (
        [User_Key] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 80) ON [PRIMARY]
) ON [PRIMARY]
GO


4. Add Dynamic Security role to Cube and define MDX script to filter Active Directory user access

Once this DataMart and CRM table structure is in place, your next step is to bring it into your Analysis Services solution. Here are the steps to follow inside BIDS/SSDT to leverage these tables:

· Add the new tables (User and Acct_Access) to your Data Source View.

· Create a new dimension based off the User table. Hide all of its attributes by changing AttributeHierarchyVisible to False; you don’t want users to be able to browse this dimension. Then process the dimension.

· Create a new measure group in the cube based off the Acct_Access table and name it Security. Delete all the measures it creates except for the automatically generated Count measure. Hide the single remaining measure by selecting it and changing its Visible property to False. This hides the entire measure group from your users, but you can still use it when writing MDX.

· Ensure these two have a Regular relationship set up on the Dimension Usage tab of the cube designer, as seen in the screenshot below.

[Screenshot: Dimension Usage tab showing the Regular relationship]

· In the Solution Explorer, right-click on Roles and select New Role.  Name the Role Dynamic Security and give Read access in the General, Data Sources and Cubes tabs.  On the Membership tab add your CRM groups or user accounts to this Role.

· On the Dimension Data tab select the Customer dimension from the dropdown list.  Then select the Customer Number from the dropdown list.

[Screenshot: Dimension Data tab with the Customer dimension and Customer Number attribute selected]

· Inside the Dimension Data tab, select the Advanced tab. In the Allowed member set area, enter the MDX below; it restricts the customer members to those that have a row in the hidden Security measure group for the Windows account connecting to the cube:

EXISTS(
    {[Customer].[Customer Number].Members},
    STRTOSET("[Customer].[Domain User].&[" + username() + "]"),
    "Security"
)

[Screenshot: Advanced tab with the allowed member set MDX]

5. Create SSIS package to populate and update Acct_Access and User tables in DataMart

With the table structure and the Cube solution complete, we need to create the SSIS package to populate and update the Acct_Access table in our CRM database and then bring the updated data into the DataMart. What we need:

a. First, we create an Execute SQL task and add the script below to delete and repopulate the table with the current CRM customer access data. Change the login and group names (shown here as Contoso\bcrowell and Contoso\Domain Users) to suit your environment.

execute as login = 'Contoso\bcrowell'
delete Acct_Access
GO
declare @DomainName varchar(200)
--Make sure you execute the step to create the domain user table with the correct permissions.
execute as login = 'Contoso\bcrowell'
if exists (select * from sysobjects where name = 'DomainUser')
    drop table DomainUser
select DomainName into DomainUser from SystemUser
grant select on DomainUser to [Contoso\Domain Users]
declare SEL_CUR cursor for select DomainName from SystemUser where IsDisabled = 0
open SEL_CUR
fetch next from SEL_CUR into @DomainName
while @@FETCH_STATUS = 0
BEGIN
    SELECT SUSER_NAME(), USER_NAME();
    execute as login = @DomainName
    insert into Acct_Access(AccountNumber, AccountID, DomainUser) select AccountNumber, AccountID, @DomainName from FilteredAccount
    fetch next from SEL_CUR into @DomainName
    REVERT
END
close SEL_CUR
deallocate SEL_CUR
drop table DomainUser
GO
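One assumption worth calling out: EXECUTE AS LOGIN only succeeds if the account running the package has IMPERSONATE permission on each CRM user's login (or is a member of sysadmin). A hedged example, with hypothetical login names:

-- Hypothetical names: allow the SSIS/job account to impersonate a CRM user's login
GRANT IMPERSONATE ON LOGIN::[Contoso\jsmith] TO [Contoso\svc_ssis];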

b. Add an additional Data Flow task to take the CRM Acct_Access data and populate/update the DataMart’s Acct_Access and User tables; a sketch of the equivalent logic follows.
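The exact Data Flow design will vary, so here is only a T-SQL sketch of the load logic it performs. The table and column names come from the scripts above, except for the source CRM database name and the Dim_Customer lookup, which are assumptions about your environment:

-- Sketch of the DataMart load the Data Flow performs.
-- "CRM" and "Dim_Customer" are hypothetical names; adjust to your environment.

-- 1. Add any new Active Directory users to the DataMart User table.
INSERT INTO dbo.[User] (DomainUser)
SELECT DISTINCT s.DomainUser
FROM CRM.dbo.Acct_Access s
WHERE NOT EXISTS (SELECT 1 FROM dbo.[User] u WHERE u.DomainUser = s.DomainUser);

-- 2. Reload the DataMart security table, resolving the surrogate keys.
TRUNCATE TABLE dbo.Acct_Access;

INSERT INTO dbo.Acct_Access (AccountNumber, AccountID, DomainUser, Customer_Key, User_Key)
SELECT s.AccountNumber,
       s.AccountID,
       s.DomainUser,
       c.Customer_Key,   -- hypothetical customer dimension lookup
       u.User_Key
FROM CRM.dbo.Acct_Access s
JOIN dbo.[User] u ON u.DomainUser = s.DomainUser
LEFT JOIN dbo.Dim_Customer c ON c.AccountNumber = s.AccountNumber;  -- hypothetical table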

Conclusion

I’m not the first to deploy some of the techniques used above. I started by reviewing the two blog posts below and modified the approach to work with the security rights already set up in Microsoft CRM.

· http://bidn.com/Blogs/analysis-services-dynamic-security

· http://bifuture.blogspot.com/2011/09/ssas-setup-dynamic-security-in-analysis.html

I hope this helps you with your future Analysis Services and Dashboard development projects.

How to Create GP Workflow 2.0 Custom Reporting

Recently, I completed a GP 2015 R2 upgrade with Workflow set up for vendor and check batch approval. Microsoft has moved Workflow off SharePoint, and the old Workflow version’s reporting could only be accessed through the Workflow web site. Knowing that the client was subject to quarterly audit reviews, I needed to come up with a way for them to produce reports on both current/open and historical/approved workflows. I created the following SQL view in their company database and then used SmartList Designer in GP to create a SmartList based on the view.

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER OFF
GO
CREATE VIEW dbo.vw_WorkFlow_Status
AS
WITH CTE_FINAL (WorkflowInstanceID, Workflow_Name, Workflow_Step_Name, Approver, Workflow_Action, Completion_Date, Completion_Time, Comments)
AS
(select d.WorkflowInstanceID,
    d.Workflow_Name,
    d.Workflow_Step_Name,
    CASE WHEN a.ADDisplayName is null THEN ''
         ELSE a.ADDisplayName
    END as [Assigned Approver],
    CASE WHEN d.Workflow_Action = 1 THEN 'Submit'
         WHEN d.Workflow_Action = 2 THEN 'Resubmit'
         WHEN d.Workflow_Action = 3 THEN 'Approve'
         WHEN d.Workflow_Action = 4 THEN 'Task Complete'
         WHEN d.Workflow_Action = 5 THEN 'Reject'
         WHEN d.Workflow_Action = 6 THEN 'Delegate'
         WHEN d.Workflow_Action = 7 THEN 'Recall'
         WHEN d.Workflow_Action = 8 THEN 'Escalate'
         WHEN d.Workflow_Action = 9 THEN 'Edit'
         ELSE 'Final Approve'
    END as Workflow_Action,
    convert(varchar(10), d.Workflow_Completion_Date, 101) as [Completion_Date],
    right('0' + LTRIM(right(convert(varchar, d.Workflow_Completion_Time, 100), 8)), 7) as Completion_Time,
    d.Workflow_Comments
from dbo.WF30100 d
LEFT JOIN WF40200 a ON d.Workflow_Step_Assign_To = a.UsersListGuid
WHERE d.Workflow_Action = 10)
Select --c.WorkflowInstanceID,
    c.Workflow_Name,
    c.Workflow_Type_Name,
    c.Workflow_Originator,
    c.WfBusObjKey as Approval_Request,
    CASE WHEN c.Workflow_Status = 1 THEN 'Not Submitted'
         WHEN c.Workflow_Status = 2 THEN 'Submitted'
         WHEN c.Workflow_Status = 3 THEN 'No Action Needed'
         WHEN c.Workflow_Status = 4 THEN 'Pending User Action'
         WHEN c.Workflow_Status = 5 THEN 'Recalled'
         WHEN c.Workflow_Status = 6 THEN 'Completed'
         WHEN c.Workflow_Status = 7 THEN 'Rejected'
         WHEN c.Workflow_Status = 8 THEN 'Workflow Ended'
         WHEN c.Workflow_Status = 9 THEN 'Not Activated'
         ELSE 'Deactivated'
    END as Workflow_Status,
    CASE WHEN d.Approver is null THEN ''
         ELSE d.Approver
    END as Approver,
    CASE WHEN d.Completion_Date is null THEN ''
         ELSE d.Completion_Date
    END as Completion_Date,
    CASE WHEN d.Completion_Time is null THEN ''
         ELSE d.Completion_Time
    END as Completion_Time,
    CASE WHEN d.Comments is null THEN ''
         ELSE d.Comments
    END as Comments
FROM dbo.WFI10002 c
LEFT JOIN CTE_FINAL d ON d.WorkflowInstanceID = c.WorkflowInstanceID
GO

Grant Select on dbo.vw_WorkFlow_Status to DYNGRP
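Before building the SmartList, you can sanity-check the view directly in SSMS; for example (the status text values come from the CASE expression in the view):

SELECT TOP 10 Workflow_Name, Approval_Request, Workflow_Status, Approver, Completion_Date
FROM dbo.vw_WorkFlow_Status
WHERE Workflow_Status = 'Pending User Action'
ORDER BY Workflow_Name;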

Once you’re done adding the SQL view, you can create a SmartList report with SmartList Designer; the completed SmartList should look like the screenshot below.

[Screenshot: SmartList Workflow Report]

Dynamics GP SmartList default produces no data

Recently, while working with a client that has eOne Solutions SmartList Builder installed, they came across an issue where their default (*) Sales Line Items SmartList was not producing data when they clicked on it. Dynamics GP would freeze and the process would be stuck in the Process Monitor. When you install the SmartList Builder add-in for Dynamics GP, it copies all of your SmartLists and the favorites you created into eOne Solutions tables. Any Dynamics GP user who clicks the “Modify” button on the SmartList window will duplicate the default SmartList, which causes that SmartList to stop producing data. This button’s functionality should be removed from the general GP users’ security role to keep this from happening in the future.

[Screenshot: GP SmartList 1]

To fix the issue, follow the steps below:

  1. Open SmartList Builder

[Screenshot: GP SmartList 2]

  2. Select “Existing” and then any SmartList ID with “*” at the end (e.g., Sales Line Items* in the screenshot).

[Screenshot: GP SmartList 3]

  3. Select the SmartList and then click the “Delete” button to remove the modified default SmartList causing the issue.

[Screenshot: GP SmartList 4]

  4. Open SmartList and select the default SmartList you were having an issue with; it should now produce data with no issues.

What happened to all of my GP OLE Notes?

When migrating OLE Notes to the Document Attach functionality in GP 2013 and above, not all OLE Notes get migrated by the migration tool. After the GP OLE Notes migration tool has completed, you will see a screen detailing the process and the number of unsuccessfully extracted notes. Through research from other GP OLE Notes blogs, I was able to piece together a manual process to identify, extract, and then attach the document to the appropriate record within GP.

[Screenshot: OLE Notes migration summary]

The Manual Process

Part One – Identifying the Record the OLE Note is associated with

All OLE Notes that weren’t successfully extracted by Microsoft’s migration utility will remain in the original extraction path.

1. To create a list of the files within the folder, follow the directions in this Microsoft KB article: https://support.microsoft.com/en-us/kb/196158. I then created an Admin database on the SQL Server in SSMS and imported the results from the txt file into a table called OLENotes_Missing.
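For reference, here is a minimal sketch of what that staging table can look like. The column names simply mirror those used in the query in step 2; the data types are assumptions you should adjust to match your import.

-- Hypothetical staging table for the imported directory listing.
CREATE TABLE [ADMIN].[dbo].[OLENotes_Missing] (
    DATE1      datetime     NULL,  -- file date from the directory listing
    HEX_REMOVE varchar(255) NULL,  -- OLE note file name (hex-encoded)
    NoteINDEX  bigint       NULL   -- note index converted from the hex file name
);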

2. I ran the following SQL script in SSMS to find all missing OLE Note names and associated Note Indexes and convert the HEX Note name.

SELECT NoteINDEX, DATE1, HEX_REMOVE as [OLENote Name]
FROM [ADMIN].[dbo].[OLENotes_Missing]
Order By DATE1 DESC

3. Take the note index from the first SQL script and run the SQL script below in SSMS to find the associated tables. Replace the note index in the SET @noteindx line with the one you are searching for.

USE [<Production Database>]
DECLARE @noteindx nvarchar(20)
DECLARE @tablename nvarchar(20)
DECLARE @sqlstring nvarchar(MAX)
DECLARE @param nvarchar(100)
DECLARE @print nvarchar(100)
SET @noteindx = '9708539' --Change to the note index that you would like to find
DECLARE notecursor CURSOR FOR
SELECT c.TABLE_NAME FROM INFORMATION_SCHEMA.COLUMNS c
LEFT OUTER JOIN INFORMATION_SCHEMA.TABLES t ON t.TABLE_NAME = c.TABLE_NAME
WHERE c.COLUMN_NAME = 'NOTEINDX' AND t.TABLE_TYPE = 'BASE TABLE'
OPEN notecursor
FETCH NEXT FROM notecursor INTO @tablename
SET @sqlstring = N'SELECT ''' + @tablename + ''' AS TableName, NOTEINDX, DEX_ROW_ID FROM ' + @tablename + ' WHERE NOTEINDX = ' + @noteindx + CHAR(13);
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sqlstring += N' UNION SELECT ''' + @tablename + ''' AS TableName, NOTEINDX, DEX_ROW_ID FROM ' + @tablename + ' WHERE NOTEINDX = ' + @noteindx + CHAR(13);
    FETCH NEXT FROM notecursor INTO @tablename
END
CLOSE notecursor
DEALLOCATE notecursor
EXEC(@sqlstring)

4. The output from the script above will look like the screenshot below.

[Screenshot: query output listing the tables that contain the note index]

5. Disregard TableName SY03900 and substitute the other TableName and DEX_ROW_ID into the SQL script below, then execute it in SSMS. This will give you the record that the OLE Notes document needs to be attached to.

SELECT * from SVC00600 WHERE DEX_ROW_ID = 51259

[Screenshot: record returned by the query]

Part Two – Document Extraction and Attachment

Using 7-Zip, you can manually extract the contents of the file.

1. Open 7-Zip and navigate to the original OLE Note extraction location.

2. Find the note name you wish to open and then drill down into the Embedding folder(s).

[Screenshot: 7-Zip view of the Embedding folder]

3. Open CONTENTS file with Internet Explorer

[Screenshot: CONTENTS file opened in Internet Explorer]

4. Print output to PDF

[Screenshot: printing the output to PDF]

5. Log into GP, find the record identified by the SQL script in step 5 of Part One, and then attach the PDF to that record using the Document Attach functionality (follow steps 1 through 3 in the screenshot below).

[Screenshot: Document Attach window, steps 1 through 3]

A very manual process, but worth the time if you need the documents within your GP system.

Design Best Practices to Build Killer Dashboards

When buying real estate, it’s all about location, location, location. Like real estate, dashboards and data visualizations are all about one thing: data, data, data. With that in mind, here are some of my data visualization design best practices. Make your data visualization:

  • Relative
  • Interactive
  • Simple

Make it Relative

Make sure that the data underlying your dashboard is up-to-date and accurate, and that your selected KPIs, metrics and charts tell your intended story or address current business challenges.

Data can be from this quarter, this week, this hour—whatever the right timeline is for your business. Stale data can lead you to think you’re making fact-based decisions when the data is no longer representative of, or relevant to, your current situation.

Make it Interactive

Your dashboard or data visualization has to be relevant to its intended audience. Dashboards need to start with an audience in mind. Who is the consumer of the dashboard? What are their information needs? What do they already know? What are their experiences and prejudices? As we design the dashboard, understanding the consumers of the dashboard will help us craft a product that they love to use.

A complicating factor is that most dashboards have multiple audiences. In fact, delivering the same dashboard across an entire organization has the potential benefit of getting everyone on the same page. However, a diverse audience is hard to serve well. Therefore, try to prioritize the audiences so conflicts can be more easily handled.

After figuring out your intended audience, choose the right KPIs and metrics and make the dashboard visually engaging, so that everyone is looking at the same page. But once they’re on the same page, viewers will have unique questions about what they see. Design your dashboard so that individual viewers can interact with it to get the answers they seek.

Keep it Simple

This one is critical. No clutter! Resist the temptation to make your dashboard too flashy or over-designed, with gauge-like graphics and widgets. A dashboard or data visualization can also follow the Fibonacci Spiral principle. It is a pretty cool thing to apply to your dashboards; use it wisely, and when it works, it works really well. Not all visualizations have to fit the Fibonacci Spiral, though. For more on the Fibonacci Spiral principle, please see these links:

· https://www.youtube.com/watch?v=yIf1cy0GxUM&feature=youtu.be&t=20m16s

· http://www.makeuseof.com/tag/golden-ratio-photography/

For simple and well-thought-out dashboard designs, choose the right chart to tell your story. So how do you choose the right chart? A few years ago, Dr. Andrew Abela published a useful diagram that helps you decide which charts best fit the data and the problem at hand (please click on the image below to see it in full size).

[Image: Dr. Abela’s chart chooser diagram]

Dr. Abela has also published interesting thoughts about visualization taxonomies and recommended a book, then three years old, by Dan Roam, who also published a Visual Thinking “Codex”. Juice Analytics converted Dr. Abela’s diagram into an online Chart Chooser application, which I suggest reviewing when building your dashboard and selecting a chart to visualize your data.

[Image: Juice Analytics’ online Chart Chooser]

Final Thoughts

Hope this helps you with your future dashboard and data visualization designs.