
Blog Series: Creating BI Solutions with BISM Tabular (Part 3 of 4)

Guest Author: Dan Clark, Senior BI Consultant, Pragmatic Works

Introduction

The goal of this blog series is to expose you to the process needed to create a BISM Tabular Model in SQL Server 2012 and deploy it to an Analysis Server where it can be exposed to client applications. Part 1 covered setting up a tabular model project and importing data into the tabular model, and Part 2 looked at table relations, implementing calculations, and creating measures with DAX. This segment covers time-based analysis, implementing semi-additive measures, and securing the model. Part 4 will conclude the series by looking at how you deploy and connect to the tabular model from a client application.

Time Based Analysis

Comparing and aggregating data over different time periods is one of the most common ways to analyze performance. For example, you may want to sum up year-to-date sales or compare this quarter’s sales to the same quarter last year. DAX makes these calculations easy by providing 35 functions to work with time-based data.

In order to implement time-based functions in DAX, you need to import a date table that contains one distinct row for every date in the date range of your data. Once the date table is added, you mark it as the date table in the model (see Figure 1). This table is then used by DAX to create a set of dates for each calculation.


Figure 1 – Marking the Date Table
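
As a quick illustration of the scenarios mentioned above, here is a rough sketch of two time-based measures. The table and column references (ResellerSales[SalesAmount] and 'Date'[FullDate]) follow those used earlier in this series; the measure names are only illustrative.

SalesYTD := TOTALYTD(SUM(ResellerSales[SalesAmount]), 'Date'[FullDate])

SalesSamePeriodLastYear := CALCULATE(SUM(ResellerSales[SalesAmount]), SAMEPERIODLASTYEAR('Date'[FullDate]))

When a PivotTable is sliced by quarter, the second measure returns the sales for the same quarter of the previous year.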

DAX always calculates a set of dates as a table. This table is then used as a filter argument to the CALCULATE function. Since a user may select multiple dates within a PivotTable, the context for a calculation might be a single date, a contiguous range of dates, or a set of noncontiguous dates. These dates may also need to be shifted to a previous or future set of dates. For example, it is very common to compare sales from previous years, months, or quarters.

As an example, you may want to compare sales from the current month to sales from the previous month. Using the DAX PREVIOUSMONTH function as part of the filter for the CALCULATE function, you can easily calculate the sales for the previous month.

CALCULATE(SUM(ResellerSales[SalesAmount]),PREVIOUSMONTH('Date'[FullDate]))

This can then be used to measure month-over-month sales growth. Figure 2 shows a PowerPivot table with a KPI based on monthly sales growth.


Figure 2 – Monthly Sales Growth
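
A rough sketch of how such a growth measure might be defined follows; the measure names are illustrative and not part of the sample model.

PreviousMonthSales := CALCULATE(SUM(ResellerSales[SalesAmount]), PREVIOUSMONTH('Date'[FullDate]))

SalesGrowthMoM := IF(ISBLANK([PreviousMonthSales]), BLANK(), (SUM(ResellerSales[SalesAmount]) - [PreviousMonthSales]) / [PreviousMonthSales])

The IF(ISBLANK(...)) check guards against dividing by a blank previous-month value for the first month in the data.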

Another example of incorporating date functions is calculating values over a period of time. You may want to look at sales as a running total. The DAX functions TOTALMTD, TOTALQTD, and TOTALYTD make it very convenient to calculate running totals. As an example, the following expression uses DATESMTD as the CALCULATE filter to calculate sales as a running total for the month.

CALCULATE(SUM(ResellerSales[SalesAmount]), DATESMTD('Date'[FullDate]))
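
The same running total can be written more compactly with TOTALMTD, which wraps this CALCULATE/DATESMTD pattern:

TOTALMTD(SUM(ResellerSales[SalesAmount]), 'Date'[FullDate])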

Semi-Additive Measures

A common scenario when analyzing data is a measure that is additive across one dimension but not across another. For example, you may want to look at inventory levels over time. The inventory level for each product is additive across the stores but not additive across time.

To calculate the inventory level for a store, you need to determine the last non-blank value for the time period. The DAX function LASTNONBLANK returns this value, and it is used as a filter in the CALCULATE function to add up the inventory for the time period being analyzed.

CALCULATE([InventoryLevel], LASTNONBLANK('Date'[FullDate], [InventoryLevel]))

One adjustment that needs to be made is accounting for inventory left over from before the time period being analyzed. For example, if you are looking at inventory for 2012, you need to account for inventory left over from 2011. To do this, you use the DATESBETWEEN function to expand the date context to include earlier dates.

DATESBETWEEN('Date'[FullDate], BLANK(), LASTDATE('Date'[FullDate]))

Passing BLANK() as the start date tells DATESBETWEEN to begin with the earliest date recorded in the date table. Combining these expressions gives us a measure of the amount of inventory on hand at any given date.

CALCULATE([InventoryLevel], LASTNONBLANK(DATESBETWEEN('Date'[FullDate], BLANK(), LASTDATE('Date'[FullDate])), [InventoryLevel]))

Figure 3 shows a PowerPivot table for analyzing inventory levels. Notice how the values are additive across stores but not across years.


Figure 3 – Analyzing Inventory Levels

Security

Security in a tabular model is based on roles and permissions. You create a role and assign permissions to it. These permissions define the actions a role member can take and the data they can see. Windows users and groups are assigned to the roles. A user can belong to more than one role, and permissions are cumulative, with the least restrictive permission taking effect. For example, if a user belongs to one role with the Read permission and another role with the Read and Process permission, the Read and Process permission is used. The Role Manager dialog is used to add roles to the tabular model (Figure 4).


Figure 4 – Setting Up Role-Based Security

You can also implement row-level security using a DAX expression that evaluates to true or false. For example, in Figure 5 the role is restricted to US data.


Figure 5 – Implementing Row Level Security
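
The exact filter expression from the figure is not reproduced here, but a row filter is simply a DAX expression defined on a table that returns TRUE for the rows the role is allowed to see. A minimal sketch, assuming a Geography table with a CountryRegionCode column (the table and column names in your model may differ):

='Geography'[CountryRegionCode] = "US"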

If you need to implement dynamic security (security based on the user name rather than a fixed role filter), you can use the DAX USERNAME function, which returns the domain\username of the currently logged-on user. You can then use this value to look up pertinent data such as a department ID and restrict row access by department.
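
A common pattern, sketched here with a hypothetical UserSecurity mapping table and an Employee table that are not part of the sample model, is to look up the current user’s department with LOOKUPVALUE and use it in the row filter:

='Employee'[DepartmentID] = LOOKUPVALUE('UserSecurity'[DepartmentID], 'UserSecurity'[LoginName], USERNAME())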

Testing the Tabular Model

A useful feature of the tabular model designer in a SQL Server Data Tools project is the ability to analyze and test the model in Excel. When you launch the Analyze in Excel feature, you are asked which user or role you want to connect as and which perspective you want to use (Figure 6). Excel then opens, automatically creates a connection to the tabular model, and adds a PivotTable worksheet, providing a convenient environment for testing the model and its security.


Figure 6 – Specifying the Role

After you select the role to run under, Excel displays a PivotTable sheet. The tables and measures of your model appear in the PivotTable field list (Figure 7).


Figure 7 – The Field List in Excel

Adding fields to the areas below the field list populates the PivotTable with data values (Figure 8).


Figure 8 – Viewing the Model Data in a Pivot Table

Summary

The goal of this blog series is to expose you to the process needed to create a BISM Tabular Model in SQL Server 2012 and deploy it to an Analysis Server where it can be exposed to client applications. This segment covered time-based analysis, implementing semi-additive measures, and securing the model. Join me for the final segment of this series, where we will look at how you deploy and connect to the tabular model from a client application.
