The Report Server Execution Log plays a significant role in Reporting Services and is the first place to look when analysing and optimising report performance. Hence, in each version of SQL Server, the ExecutionLog view in the Report Server database has been enhanced to gather more and more information.
If you need more background on the Report Server Execution Log, you can check the following link.
In this blog post, however, I intend to explain how to analyse the report server execution log.
The Execution Log contains the following information:
1. The user who executed the report
2. Report name and report path
3. The instance which executed the report (useful in scale-out deployments)
4. Total execution time (Start Time and End Time)
5. Data Retrieval Time
6. Report Processing Time
7. Report Rendering Time
8. Source of the report (Live, Cache, Snapshot, Session cache, etc.)
9. RequestType (Interactive, Subscription, etc.)
10. Status (Success or any error)
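For reference, the same fields can be queried straight from the view; the sketch below assumes the default ReportServer database name and the SQL 2008 R2 ExecutionLog3 view (on SQL 2008, ExecutionLog2 is similar but some column names differ):

```sql
-- A sketch: the Execution Log fields listed above, newest first.
-- Assumes the default ReportServer catalog database.
SELECT  UserName,                 -- 1. user who executed the report
        ItemPath,                 -- 2. report path (join to Catalog for the name)
        InstanceName,             -- 3. instance that served the request
        TimeStart, TimeEnd,       -- 4. total execution window
        TimeDataRetrieval,        -- 5. data retrieval time (ms)
        TimeProcessing,           -- 6. report processing time (ms)
        TimeRendering,            -- 7. report rendering time (ms)
        Source,                   -- 8. Live, Cache, Snapshot, Session, ...
        RequestType,              -- 9. Interactive, Subscription, ...
        Status                    -- 10. rsSuccess or an error code
FROM    dbo.ExecutionLog3
ORDER BY TimeStart DESC;
```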
For analytics, I can interpret the above information as the following dimensional model:
- Count of the number of executions of the report
- Avg Execution Time taken by the report
- Avg Data Retrieval time taken by the report
- Avg Report Processing time taken by the report
- Avg Report Rendering time taken by the report
The above measures need to be sliced across the following dimensions:
- Report Catalog (DimReport)
- User (DimUser)
- Instance (DimInstance)
- Source of Report (DimSource)
- RequestType (DimRequest)
For the above dimensional model, I could create a separate DW database defining the dimension and fact tables, design an SSIS package to load the data periodically from the Execution Log table into the DW, then build a cube and reports on top of it to interpret and analyse the data and identify:
- The most frequently accessed reports (good candidates for caching/snapshots)
- Reports where data retrieval takes a long time (which require SQL query tuning)
- Reports which take a long time to process (which are too complex and should be simplified)
- Reports which take too long overall (which are slow and need optimisation)
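Even without a DW, these measures can be approximated with a single grouped query against the log. A sketch (the DATEDIFF in milliseconds assumes individual executions shorter than the int limit of roughly 24 days):

```sql
-- A sketch: approximate the fact measures per report, straight from the log.
SELECT  ItemPath,
        COUNT(*)                                       AS Executions,
        AVG(DATEDIFF(MILLISECOND, TimeStart, TimeEnd)) AS AvgTotalMs,
        AVG(TimeDataRetrieval)                         AS AvgDataRetrievalMs,
        AVG(TimeProcessing)                            AS AvgProcessingMs,
        AVG(TimeRendering)                             AS AvgRenderingMs
FROM    dbo.ExecutionLog3
GROUP BY ItemPath
ORDER BY Executions DESC;   -- frequent reports: caching/snapshot candidates
```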
Although the above reports are helpful and important, creating a separate dimensional model, SSIS package, and reports to analyse the report execution log might not be a good idea, for the following reasons:
- Designing the DW, SSIS packages, and reports takes considerable time.
- A separate SQL instance (or adding the above database to an existing instance) adds overhead to the server.
- Only one or two users, the report administrators, need this report; it is not required by a wide variety of users.
So the cost of implementing this solution might far exceed its benefits.
This is where Self-Service BI (PowerPivot, or Power View, which is soon to be released) plays an important role and helps us achieve the same results without spending much time or infrastructure.
To use PowerPivot, you need to install Excel 2010 with the shared Office components, along with the PowerPivot add-in for Excel 2010.
Once installed, you can open the PowerPivot window from Excel and import the following tables from the SQL Server instance hosting the Report Server database:
- ExecutionLog3 (on SQL Server 2008 R2; ExecutionLog2 on SQL 2008)
- Catalog
This creates two tables in the BISM model of your PowerPivot workbook.
Delete the undesired columns from ExecutionLog3, such as ItemAction, Format, and ByteCount, which are not among our measures.
From the Catalog table, keep the Path and Name columns and delete the remaining columns.
This ensures we don't bring in more data than required, thereby improving the performance of our report.
In the Design tab of the PowerPivot window, click Create Relationship and create a relationship between the ExecutionLog3 and Catalog tables, such that the ItemPath column of ExecutionLog3 refers to the Path column of Catalog.
You will now see an empty PivotChart worksheet, along with a PowerPivot Field List pane docked on the right, as shown below.
Expand the ExecutionLog3 table and drag and drop (or simply check) the following columns into the Values section of the PowerPivot Field List.
You will see that they get added as Sum, which is the default aggregate for all values. However, you can right-click these columns in the Values section and change the aggregate to Average, as shown below.
If you notice, for the ExecutionId column the default aggregate added was Count, which is correct and shouldn't be changed, since we need to plot the number of times each report was fetched.
From the Catalog table, drag the Name column into the Axis Fields section of the PowerPivot Field List so that the report name forms the X-axis of the chart.
This is why we imported the Catalog table into PowerPivot: the ExecutionLog3 view stores the path of the report, not the actual report name.
Further, we can add the RequestType column as a Report Filter, since we are interested in analysing only the performance of interactive reports and are less concerned with subscription reports, which take comparatively lower priority.
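In T-SQL terms, the relationship we created is equivalent to the following join (a sketch against the ReportServer database):

```sql
-- A sketch: resolve the report name via the Catalog table,
-- just as the PowerPivot relationship does.
SELECT  c.Name            AS ReportName,
        e.TimeDataRetrieval,
        e.TimeProcessing,
        e.TimeRendering
FROM    dbo.ExecutionLog3 AS e
JOIN    dbo.[Catalog]     AS c
        ON c.Path = e.ItemPath;
```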
Now we can add the horizontal and vertical slicers, which help us analyse the data more interactively.
In our case, the InstanceName and UserName fields form the vertical slicers, while Source and Status form the horizontal slicers, as shown below.
These slicers will help me identify and slice the data in the following ways:
- Which instance receives more requests than the others (useful for scale-out deployments)
- Which user queries which report most often
- Which report requests failed or were aborted (Status column)
- How many requests were served from Cache, Snapshot, Live, or Session
After adding the above, my Pivot Chart Report looks as shown below
If I want to find the reports which were aborted the most during processing, I can simply click rsProcessingAborted in the horizontal slicer for Status, and the graph changes as shown below.
This shows me that two of my reports ended with rsProcessingAborted status. Similarly, you can use the various other slicers to quickly interpret and analyse the report execution log.
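If you want to cross-check the slicer result against the raw log, the same question can be asked in T-SQL (a sketch):

```sql
-- A sketch: reports that hit rsProcessingAborted most often.
SELECT  ItemPath,
        COUNT(*) AS Aborts
FROM    dbo.ExecutionLog3
WHERE   Status = 'rsProcessingAborted'
GROUP BY ItemPath
ORDER BY Aborts DESC;
```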
Finally, you can use the RequestType report filter to show only interactive reports by clicking RequestType at the top of the chart, as shown.
I was able to create this report in 30 minutes, without any knowledge of dimension or fact tables, with the help of the PowerPivot add-in, which is a free downloadable tool.
Using PowerPivot to analyse the Report Server Execution Log is, to me, the most cost-effective, efficient, and smart way to analyse the logs.
Further, this Excel workbook can be published to SharePoint if the same report needs to be accessed by more than one user, and access to the report can then be controlled by SharePoint security.
This is one example of Self-Service BI which can be achieved using PowerPivot, and whose visualisation is further enhanced by Power View, which will be launched with SQL 2012.
Hope this helps !!!
Premier Field Engineer