Powerful Dynamics GP Field Service Reporting with Power BI

On August 23, 2017, Microsoft announced the release of the Power BI solution template for Microsoft Dynamics 365 for Field Service. The template includes a set of reports designed for field service professionals. The solution template offers a fast, guided experience for creating compelling reports on an extensible, scalable, and secure architecture, and it can be customized as needed. This means that instead of spending your time on plumbing, you can spend it on extending and customizing the solution template to meet your organization’s needs.

In this post, we will create the same report functionality inside of Power BI using Dynamics GP Field Service’s service call information. We first need to get the information from the following Field Service tables:

  • SVC00100 – Technician Master
  • SVC00200 – Service Call Master
  • SVC00201 – Service Call Analysis Codes
  • SVC00203 – Service Call Line Detail
  • SVC00901 – Cause Codes Validation
  • SVC00906 – Warranty Validation Codes
  • SVC00907 – Problem Codes Validation
  • SVC00908 – Repair Codes Validation
  • SY03900 – Record Notes Master
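
If you bring these in through Power BI's SQL Server connector with explicit queries rather than picking the tables in the navigator, the source queries are simple straight selects against your company database. Here is a minimal sketch (adjust the database and schema to your environment; only the analysis codes need the view shown below):

        -- Straight imports from the Dynamics GP company database (sketch)
        SELECT * FROM dbo.SVC00100;   -- Technician Master
        SELECT * FROM dbo.SVC00200;   -- Service Call Master
        SELECT * FROM dbo.SVC00203;   -- Service Call Line Detail
        SELECT * FROM dbo.SVC00901;   -- Cause Codes Validation
        SELECT * FROM dbo.SVC00906;   -- Warranty Validation Codes
        SELECT * FROM dbo.SVC00907;   -- Problem Codes Validation
        SELECT * FROM dbo.SVC00908;   -- Repair Codes Validation
        -- SVC00201 and SY03900 are combined through the ServiceCallEquipment view described below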

Almost all of the tables are a straight import of the data, except for the Service Call Analysis Codes table (SVC00201). It contains the Equipment ID, Item Number, Problem, Cause, and Repair Codes, along with three note indexes that hold the service techs’ notes related to the fix. For this data, I created the following view:

        ALTER VIEW [dbo].[ServiceCallEquipment]
        AS
        SELECT [SRVRECTYPE]
              ,[CALLNBR]
              ,[LNITMSEQ]
              ,[EQUIPID]
              ,[ITEMNMBR]
              ,[PRDLINE]
              ,[PROBCDE]
              ,[CAUSECDE]
              ,[RPRCODE]
              ,COALESCE(a.TXTFIELD, '') AS [Note_1]
              ,COALESCE(b.TXTFIELD, '') AS [Note_2]
              ,COALESCE(c.TXTFIELD, '') AS [Note_3]
        FROM [dbo].[SVC00201]
        LEFT OUTER JOIN dbo.SY03900 a ON SVC00201.Note_Index_1 = a.NOTEINDX
        LEFT OUTER JOIN dbo.SY03900 b ON SVC00201.Note_Index_2 = b.NOTEINDX
        -- Join for the third note (assumes a Note_Index_3 column alongside Note_Index_1 and Note_Index_2)
        LEFT OUTER JOIN dbo.SY03900 c ON SVC00201.Note_Index_3 = c.NOTEINDX

Now that we have determined our Service Call data points and imported the data into Power BI, we need to create our table relationships. The final table relationship model should look like this:

Power BI Service Call Table Relationships

With our relationships established, we now need to create our measures to be used in our reports. Here are the DAX measures I’ll be creating:

  • Completed Service Calls

            Completed Service Calls = CALCULATE(COUNTROWS(ServiceCalls), FILTER(ServiceCalls, ServiceCalls[SRVSTAT] = "70C"))

  • Open Service Calls

            Open Service Calls = CALCULATE(COUNTROWS(ServiceCalls), FILTER(ServiceCalls, ServiceCalls[SRVSTAT] <> "70C" && ServiceCalls[SRVSTAT] <> "OOC"))

  • Total Completion Time (in Minutes)

            Total Completion Time = SUMX(FILTER(ServiceCalls, ServiceCalls[SRVSTAT] = "70C"), [Completion Time])

  • Avg Completion Time per Order (in Minutes)

            Avg Completion Time per Order = DIVIDE( [Total Completion Time],[Completed Service Calls],0)

  • Equip Service Call Count

            Equip Service Call Count = COUNTROWS(ServiceCallEquipment)
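
One assumption worth calling out: the [Completion Time] referenced in the Total Completion Time measure is not a stock Field Service field. It is presumed here to be a column or measure that calculates the minutes between the service call's start and completion date/time fields, so define it to match your own data before using these measures.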

Analyzing Open Service Calls

Now it’s time to create our reports. The first report we are going to create is for our Service Calls or Open Work Orders. For this report, we will use the following charts:

  • Active Work Order – Using a Card visual and Open Service Calls measure.
  • Work Orders by Type – Using a Donut Chart and Open Service Calls measure and Service Type.
  • Work Orders by Status – Using a Donut Chart and Open Service Calls measure and Service Status.
  • Open Work Orders by Resource – Using a Stacked Bar Chart and Open Service Calls measure in the Values section and Service Tech and Customer Name in the Axis section of the chart.
  • The final visual on the report is a matrix showing the open work order details associated with each chart above when you click on a chart to cross-filter.

Here is what the Open Work Orders report looks like:

Power BI Service Call Open Orders

Analyzing Completed Work Orders

Time to analyze our completed service call/work orders. For this report, we will use the following charts:

  • Completed Work Order – Using a Card visual and Completed Service Calls measure.
  • Avg Completion Time (in Mins) – Using a Card visual and the Avg Completion Time per Order measure.
  • Avg Completion Time per Order by Service Type – Using a Stacked Column Chart with the Avg Completion Time per Order measure and Service Type.
  • Avg Completion Time per Order by Service Tech – Using a Stacked Column Chart with the Avg Completion Time per Order measure and Service Tech ID.
  • The final visual on the report is a matrix showing the completed work order details associated with each chart above when you click on a chart to cross-filter.

Here is what the Completed Work Orders report looks like:

Power BI Completed Orders

Analyzing Equipment Repairs

Time to analyze our equipment repairs. For this report, we will use the following charts:

  • Equipment Service Call Count by Problem Code – Using a Stacked Bar Chart with the Equip Service Call Count measure in the Values section and Problem Code and Customer Name in the Axis section.
  • Equipment Service Call Count by Repair Code – Using a Stacked Bar Chart with the Equip Service Call Count measure in the Values section and Repair Code and Customer Name in the Axis section.
  • The final visual on the report is a matrix showing the equipment repair details associated with each chart above when you click on a chart to cross-filter.

Here is what the Equipment Repair Analysis report looks like:

Power BI Equipment Repair analysis

Conclusion

Stay tuned for future Power BI developments, an overview of their functionality, and help with building reports like this.

SSIS and PowerShell – A Powerful Combination

PowerShell is a powerful task automation tool from Microsoft. Although SSIS does not provide an Execute PowerShell Script task out of the box, it does have an Execute Process task that can be used to run PowerShell scripts just as easily.

On a recent project, I was tasked with downloading Federal Election Commission (FEC.GOV) Committee and Candidate election data and loading it into Dynamics GP Vendor Maintenance and eOne Solutions Extender tables. To accomplish this, I created a PowerShell script to download the zip files from the FEC.GOV website, called that script from an Execute Process task inside an SSIS package, and then added additional tasks to unzip the files and import them into custom SQL tables for use with a SmartConnect integration into Dynamics GP. The final SSIS package looked like the screenshot below.

SSIS Package Control Flow

Here are the detailed steps to create this SSIS package:

1.  Create the two custom tables

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO

CREATE TABLE [dbo].[FEC_Candidate](
    [Candidate_ID] [varchar](9) NOT NULL,
    [Candidate_Name] [varchar](200) NULL,
    [Party_Affiliation] [varchar](3) NULL,
    [Year_of_Election] [numeric](4, 0) NULL,
    [Candidate_State] [varchar](2) NULL,
    [Candidate_Office] [varchar](1) NULL,
    [Candidate_District] [varchar](2) NULL,
    [Incumbent_Challenger_Status] [varchar](1) NULL,
    [Candidate_Status] [varchar](1) NULL,
    [Principal_Campaign_Committee] [varchar](9) NULL,
    [Street1] [varchar](35) NULL,
    [Street2] [varchar](35) NULL,
    [City] [varchar](35) NULL,
    [State] [varchar](2) NULL,
    [Zip] [varchar](9) NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO

CREATE TABLE [dbo].[FEC_Committee](
    [Committee_ID] [varchar](50) NOT NULL,
    [Committee_Name] [varchar](200) NULL,
    [Treasurer_Name] [varchar](100) NULL,
    [Street1] [varchar](50) NULL,
    [Street2] [varchar](50) NULL,
    [City] [varchar](50) NULL,
    [State] [varchar](50) NULL,
    [Zip] [varchar](50) NULL,
    [Committee_Designation] [varchar](50) NULL,
    [Committee_Type] [varchar](50) NULL,
    [Committee_Party] [varchar](50) NULL,
    [Filing_Frequency] [varchar](50) NULL,
    [Interest_Grp_Category] [varchar](50) NULL,
    [Connected_Org_Name] [varchar](200) NULL,
    [Candidate_ID] [varchar](50) NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO

2.  Delete Extracted FEC Files – Delete the previously extracted files

a.  Add a Script Task to the Control Flow

b.  Add a C# script to delete the previously extracted files – Name the task “Delete Extracted FEC Files” and then edit it, choosing the script language (C# or VB.NET). I used C# for my solution.

Script Task Editor

C# delete files script

public void Main()
{
    // Requires "using System.IO;" at the top of ScriptMain.cs for the FileInfo class
    string directoryPath = @"D:\KTL\FEC_Downloads\Extract";
    string[] oldFiles = System.IO.Directory.GetFiles(directoryPath, "*.txt");

    // Remove each previously extracted text file
    foreach (string currFile in oldFiles)
    {
        FileInfo currFileInfo = new FileInfo(currFile);
        currFileInfo.Delete();
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}

3.  Truncate FEC tables – Truncate the custom tables from the previous download
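
This step is typically an Execute SQL Task with a statement like the following (a minimal sketch, assuming the two custom tables created in step 1):

TRUNCATE TABLE dbo.FEC_Committee;
TRUNCATE TABLE dbo.FEC_Candidate;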

4.  Download FEC files from Website – Attach the PowerShell script below to an Execute Process task to get the updated FEC data

a.  Create the PowerShell script and save it as a .ps1 file

Invoke-WebRequest ftp://ftp.fec.gov/FEC/2018/cm18.zip -OutFile D:\KTL\FEC_Downloads\Archive\Committee.zip

Invoke-WebRequest ftp://ftp.fec.gov/FEC/2018/cn18.zip -OutFile D:\KTL\FEC_Downloads\Archive\Candidate.zip

Invoke-WebRequest ftp://ftp.fec.gov/FEC/2018/ccl18.zip -OutFile D:\KTL\FEC_Downloads\Archive\Link.zip

b.  Add the PowerShell .ps1 to an SSIS Execute Process task – Add the Execute Process task to the Control Flow, enter PowerShell.exe in the Executable field, and enter -F D:\KTL\FEC_Downloads\FEC_Download.ps1 in the Arguments field. The Arguments value will need to change based on your system's file location and script name.

Execute Process for PowerShell script
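
Depending on the execution policy configured on the server, you may also need to add -ExecutionPolicy Bypass ahead of the -F argument so the unsigned local script is allowed to run; adjust this to your environment.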

5.  Foreach loop to Extract the zip files

a.  Add a Variable to the SSIS package – Add a variable to the package named “ZipFullPath”

SSIS Package variable

b.  Add a Foreach Loop to the Control Flow and then add the following to the Collection and Variable Mappings within the Foreach Loop Editor

Foreach Loop folder location

Foreach Loop variable
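
As a rough guide (the screenshots above show the actual values), the Collection would use the Foreach File Enumerator pointed at the folder the zip files were downloaded to, with a *.zip file specification and the fully qualified file name retrieved, and the Variable Mappings tab would map User::ZipFullPath to index 0.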

6.   Add a Script Task inside the Foreach Loop

a.  Inside the Foreach Loop add the Script Task and open the Editor. Use C# with the ReadOnlyVariables shown below.

C# Script Editor

b.  Click on Edit Script and add the following script

public void Main()
{
    // Requires "using System.IO;" and "using System.IO.Compression;" at the top of ScriptMain.cs,
    // plus references to the System.IO.Compression and System.IO.Compression.FileSystem assemblies
    string zipfullpath = Dts.Variables["User::ZipFullPath"].Value.ToString();
    string inputfolder = Dts.Variables["$Package::InputFolder"].Value.ToString();

    // Extract every entry in the current zip file into the input folder
    using (ZipArchive arch = ZipFile.OpenRead(zipfullpath))
    {
        foreach (ZipArchiveEntry entry in arch.Entries)
        {
            entry.ExtractToFile(Path.Combine(inputfolder, entry.FullName));
        }
    }

    // Remove the zip file once its contents have been extracted
    File.Delete(zipfullpath);

    Dts.TaskResult = (int)ScriptResults.Success;
}

The final script should look like this:

C# zip extract script

7. Add a Data Flow task to the Control Flow – Add a Data Flow task containing two data flows: one for the Committee text file and one for the Candidate text file, each loading into the corresponding custom table created in the first step.
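
Each of those data flows would typically pair a Flat File Source pointing at the extracted text file (the FEC bulk files are pipe-delimited, so set the column delimiter accordingly in the flat file connection manager) with an OLE DB Destination mapped to the matching FEC_Committee or FEC_Candidate table. This is a sketch of the likely setup; the screenshot below shows the resulting data flow.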

Data Flow screen shot

Conclusion

I hope this helps show some of the possibilities you can open up by combining PowerShell with SSIS.