IN2BI, Microsoft Business Intelligence
data, stories & insights
• Thu 02 Jan 14

    Many Happy User Peaks

I wrote this blog post in August 2012, but somehow I forgot to post it. At the start of 2014 I wish everyone many happy user peaks and the wisdom to create them…

Recently I have been involved in converting a business intelligence (dashboard) web application, making it more suitable for tablets and other mobile devices. This offers a great opportunity to talk about this great image by Kathy Sierra:

    The_Featuritis_Curve_Kathy_Sierra

    Of course the location of the “Happy User Peak” will differ for every person. The message of the image however is very clear: limit options and features!

When constructing an application for mobile devices you’ll need to limit the features and options even further and make it really, really simple for your users. They don’t want menus, toolbars, lookup textboxes or lists in combo boxes to choose from.

    They just want to click and swipe to investigate the tables and charts presented in the dashboard.

     

    ourscoreboard

• Wed 18 Dec 13

    Run as Different User

    As a consultant I often need to run a program as a different user. For future reference I collected the information about RunAs in this article.

In Windows 7 and Windows Server 2008 R2 you can run an application as a different user by holding down the Shift key and right-clicking the application shortcut.

    runas

To use this functionality in Windows 8 you need some extra steps to configure it. These steps are described by Taylor Gibb in: How to Run Windows 8 Apps as a Different User from the Start Screen

    Another option is to use the ShellRunAs utility that is provided by the Windows Sysinternals website. If you install this utility you can right-click on a shortcut and select “Start as different user”.

    Runas Command

    You can also use the runas command line tool from a command prompt or in a .bat file as in these examples:

%windir%\System32\runas.exe /netonly /user:domain\username "C:\Program Files\Microsoft SQL Server\110\Tools\Binn\VSShell\Common7\IDE\Ssms.exe"

This will prompt you for the password of “domain\username” and open SQL Server Management Studio. The /netonly switch means that the provided identity is only used for remote resources: the remote server will authenticate you as 'domain\username', while the program itself runs under your local account.
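A quick sanity check, assuming you connect to a remote instance from the SSMS window you just opened with Windows authentication, is to run:

SELECT SUSER_SNAME()    AS LoginName      -- login of the current security context
     , ORIGINAL_LOGIN() AS OriginalLogin; -- login that opened the connection

Both should show domain\username when the /netonly credentials were picked up.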

    As an alternative, first ask for the user name and use the input in the runas command:

    set /P user="Type the domain\username: "
%windir%\System32\runas.exe /netonly /user:%user% "C:\Program Files\Microsoft SQL Server\110\Tools\Binn\VSShell\Common7\IDE\Ssms.exe"

• Sun 08 Dec 13

    Create a LightSwitch Application to Maintain Metadata

    Correct metadata is essential for every automation solution. When I started working with BIML I used one table that describes the source to target table mapping. (see the “Creating a Meta Data Driven SSIS Solution with BIML” series)

And that works fine in simple scenarios, like importing staging tables or when you are able to move business rules to a view. Key in these simple scenarios is that the columns in your target table have the same name and data type as the columns in the source.

    Extending the metadata model

    In more recent assignments I added a column mapping table and used T-SQL to simplify the BIML script. With this approach I created more robust SSIS packages that provided extra possibilities, like:

    • Incremental load patterns
    • Converting data types
    • Surrogate key lookups
    • Expressions to calculate new values

    A problem with this approach however is maintaining these column mappings. So I decided to take Visual Studio LightSwitch for a spin and create a simple application to fulfill my maintenance needs.

    LightSwitch

Visual Studio LightSwitch is a rapid application development (RAD) tool that is used to help write data-centric line-of-business (LOB) applications. An important prerequisite of LightSwitch is that you need to have a data model, or construct one using the LightSwitch GUI. Based on this data model you can easily define a set of screens and your application is done. So: no coding necessary (although coding is possible to extend the standard functionality).

    Data Model

    For my data model I decided to add two additional tables:

    • ETLJob
    • Connection

From a BIML perspective the ETLJob table is primarily used to create the master package with Execute Package and Execute SQL tasks. An ETLJob can be either:

• A generated package (with one or more table mappings)
• A manual package
• A SQL statement

    So I ended up with the following model:

    image

    The SQL DDL Statements to create this model:

    CREATE SCHEMA meta
    GO

    CREATE TABLE meta.Connection
    (
          ConnectionID INT IDENTITY NOT NULL PRIMARY KEY
        , ConnectionName VARCHAR(64) NOT NULL
        , ConnectionType VARCHAR(20) NOT NULL
        , ConnectionString VARCHAR(256) NOT NULL
        , CreateInProject BIT NOT NULL DEFAULT(0)
        , DelayValidation BIT NOT NULL DEFAULT(1)
    )
    CREATE TABLE meta.ETLJob
    (
          ETLJobID INT IDENTITY NOT NULL PRIMARY KEY
        , ETLJobName VARCHAR(64) NOT NULL
        , ETLJobType VARCHAR(20) NOT NULL
        , ETLJobGroup VARCHAR(64)
        , SQLConnection INT
            REFERENCES meta.Connection(ConnectionID)
        , SQLStatement VARCHAR(512)
    )
    CREATE TABLE meta.TableMapping
    (
          TableMappingID INT IDENTITY NOT NULL PRIMARY KEY
        , ETLJOB INT NOT NULL
            REFERENCES meta.ETLJob(ETLJobID)
        , TableName VARCHAR(64)
        , TableType VARCHAR(20)
        , TableConnection INT NOT NULL
            REFERENCES meta.Connection(ConnectionID)
        , TableSchema VARCHAR(64)
        , SourceConnection INT
            REFERENCES meta.Connection(ConnectionID)
        , SourceObject VARCHAR(256)
    )
    CREATE TABLE meta.ColumnMapping
    (
          ColumnMappingID INT IDENTITY NOT NULL PRIMARY KEY
        , TableMapping INT NOT NULL
            REFERENCES meta.TableMapping(TableMappingID)
        , ColumnName VARCHAR(64) NOT NULL
        , ColumnType VARCHAR(20) NOT NULL
        , ColumnDataType VARCHAR(20) NOT NULL
        , SourceColumnName VARCHAR(64)
        , SourceColumnDataType VARCHAR(20)
        , Calculation VARCHAR(256)
        , LookupConnection INT 
            REFERENCES meta.Connection(ConnectionID)
        , LookupObject VARCHAR(256)
        , InputColumnName VARCHAR(64)
        , InputColumnDataType VARCHAR(20)
        , OutputColumnName VARCHAR(64)
    )
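
To give an idea of how these tables work together, here is a minimal sketch of the inserts for one generated staging package. All names, connection strings and type values are hypothetical; the columns follow the DDL above.

DECLARE @SourceConn int, @TargetConn int, @Job int, @Mapping int;

-- Two connections: the source system and the staging database
INSERT INTO meta.Connection (ConnectionName, ConnectionType, ConnectionString)
VALUES ('SourceERP', 'OleDb', 'Provider=SQLNCLI10;Server=.\SQL2012;Initial Catalog=ERP;Integrated Security=SSPI;');
SET @SourceConn = SCOPE_IDENTITY();

INSERT INTO meta.Connection (ConnectionName, ConnectionType, ConnectionString)
VALUES ('Staging', 'OleDb', 'Provider=SQLNCLI10;Server=.\SQL2012;Initial Catalog=Staging;Integrated Security=SSPI;');
SET @TargetConn = SCOPE_IDENTITY();

-- One ETL job of type 'Generated' with one table mapping
INSERT INTO meta.ETLJob (ETLJobName, ETLJobType, ETLJobGroup)
VALUES ('Load Customer', 'Generated', 'Staging');
SET @Job = SCOPE_IDENTITY();

INSERT INTO meta.TableMapping (ETLJOB, TableName, TableType, TableConnection, TableSchema, SourceConnection, SourceObject)
VALUES (@Job, 'Customer', 'Staging', @TargetConn, 'stg', @SourceConn, 'dbo.Customer');
SET @Mapping = SCOPE_IDENTITY();

-- Column mappings, including the data types used later by the Biml script
INSERT INTO meta.ColumnMapping (TableMapping, ColumnName, ColumnType, ColumnDataType, SourceColumnName, SourceColumnDataType)
VALUES (@Mapping, 'CustomerID', 'Key', 'Int32', 'Id', 'Int32')
     , (@Mapping, 'FirstName', 'Attribute', 'AnsiString', 'f_name', 'AnsiString');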

     

    Importing the data model in LightSwitch

    To import this data model you only need 6 steps:

1. Create a new database (I named it MetaBase) and execute the aforementioned DDL statements to create the objects and relations.
    2. In Visual Studio create a New Project. Select LightSwitch as template and choose LightSwitch Application (Visual C#). Give the project a name (I used Metabase) and click OK.

      image
    3. In the next screen click Attach to External Data Source.

      image
4. In the Attach Data Source Wizard select Database and click Next.
      image
    5. In the Connection Properties window enter the server name\instance and Select the database you created in step 1. Click OK.

      image 
6. In the Choose your Database Objects step check Tables and keep the default name under “Specify the name of the data source”. Click Finish.

      image 

    You have imported the data model into LightSwitch.

     

    Changing the data model

When you look at your data model in the LightSwitch designer you will notice LightSwitch made some small name changes describing the relations. Do not try to correct these. Instead change the display name in the properties window, as I did for the source connection in this screenshot.

    image

     

    Creating the screens

    Creating screens is even easier than creating the data model.

    1. Right-Click Screens in Solution Explorer and choose Add Screen ..

      image

    2. In the Add New Screen dialog select Details Screen as template. Select MetaBaseData.ETLJob as Screen Data and check ETLJob TableMapping. Click OK.

      image

3. After I created the screen I made some small changes in the designer: moved some fields up or down by dragging and dropping, and changed the number of Lines in the SQL Statement field from 1 to 3.

      image

    I repeated steps 1 and 2 to create additional screens:

Template | Screen Data
Search Data Screen | Connections
Search Data Screen | ETLJobs
Search Data Screen | TableMappings
New Data Screen | Connections
New Data Screen | ETLJobs 2)
New Data Screen | TableMappings 1)
Details Screen | Connections
Details Screen | TableMappings 1)

    1) Add ColumnMappings under Additional data to include
    2) Add TableMappings under Additional data to include

     

    Navigation, Shell and Theme

Right-click Screens and select Edit Screen Navigation from the context menu. Change the screen navigation to match the following image:

    image

    Choose General Properties and change the Shell to LightSwitch Standard Shell and the Theme to LightSwitch Blue Theme.

    image

     

    Conclusion

With Visual Studio LightSwitch you can easily create an application to maintain the metadata for your BIML solution. You create a normalized data model, import it into LightSwitch and start adding screens. And if you like BIML you’ll probably like LightSwitch as well: both make it easier to program your solution.

An additional advantage of storing the metadata in separate tables and creating an application to maintain that data is that you get a great overview of your data lineage and up-to-date documentation. Here are some screenshots of the application with data.

    image
    Search ETLJob screen showing the packages and SQL Statements of our solution.

    image
    An ETL Job detail screen with the associated table mappings.

    image
    A Table Mapping detail screen with the associated column mappings.
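
To make the data lineage point a bit more concrete: a query like the one below (against the model defined above; the output columns are just a suggestion) lists per ETL job which source column feeds which target column.

SELECT j.ETLJobName
     , TargetTable  = tm.TableSchema + '.' + tm.TableName
     , TargetColumn = cm.ColumnName
     , Source       = ISNULL(src.ConnectionName + ': ' + tm.SourceObject
                        + '.' + cm.SourceColumnName, '(calculated)')
FROM meta.ETLJob j
JOIN meta.TableMapping tm  ON tm.ETLJOB = j.ETLJobID
JOIN meta.ColumnMapping cm ON cm.TableMapping = tm.TableMappingID
LEFT JOIN meta.Connection src ON src.ConnectionID = tm.SourceConnection
ORDER BY j.ETLJobName, tm.TableName, cm.ColumnName;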

• Tue 17 Sep 13

    Making your Biml files less complex

The combination of XML and C# code in a Biml file can make Biml files very complex. In web development Microsoft overcame this problem by introducing the code-behind model, with one file for the static (HTML) text and a separate file for the dynamic code. Unfortunately this is not possible with Biml. There are however some ways to make your files less complex:

    1. Separate files
    2. Move code logic to T-SQL

    In this post I’ll briefly describe the first option and use some examples to explain the second option.


    Separate files

You can use the include directive to insert a piece of static XML from another file into your Biml file. John Minkjan has a nice example on his website.
If you need more dynamic content you can also opt for the CallBimlScript function, which allows you to pass properties. Of course you will have to handle these properties in the callee file.


    Move some code logic to T-SQL

A typical Biml solution consists not only of Biml files but also of metadata that describes the packages that you want to build. I typically use a table mapping table and a column mapping table in a SQL Server database. This allows me to create a stored procedure that combines information from both tables in one dataset, with all relevant information in one row.

    For the next examples I will use the following information in this source to target mapping table:

    image


    Using FOR XML PATH(‘’) in T-SQL

    With the “FOR XML PATH(‘’)” syntax you can transform columns into a string. The T-SQL statement:

SELECT srcColumnList =
    (SELECT ', ' + srcColumn
     FROM meta.ColumnMapping
     WHERE srcColumn IS NOT NULL
     FOR XML PATH(''))

returns , Id, f_name, l_name, sex: a string you can almost use as the column list in a source component’s SELECT <column list> FROM <source table> statement. Almost, because you’ll have to remove the leading comma and handle strange characters in column names. So you’ll have to do some STUFF to remove this comma (and wrap the names in QUOTENAME):

SELECT srcColumnList = STUFF(
    (SELECT ', ' + QUOTENAME(srcColumn)
     FROM meta.ColumnMapping
     WHERE srcColumn IS NOT NULL
     FOR XML PATH('')), 1, 2, '')

    which returns the string we need: [Id], [f_name], [l_name], [sex]


    Creating the data conversion task

    Expanding on the previous statement you can create additional dynamic content for your Biml solution. In this case the column definition in  a Data Conversion Task. In Biml you would write:

<DataConversion Name="DC">
  <Columns>
    <Column SourceColumn="f_name" TargetColumn="FirstName"
        DataType="AnsiString" Length="40" CodePage="1252" />
    <!-- … more columns … -->
  </Columns>
</DataConversion>

    To create the column list for this data conversion use the statement:

    SELECT DCColumns = REPLACE(REPLACE(STUFF(
        (SELECT  char(10) + '<Column SourceColumn="' + srcColumn 
            + '" TargetColumn="' + tgtColumn
            + '" DataType="' + tgtDataType
            + CASE WHEN tgtDataType='AnsiString'
            THEN '" Length="' + CAST(tgtLength AS varchar(10))
                 + '" CodePage="1252" />'
            ELSE '" />' END
         FROM meta.ColumnMapping
         FOR XML PATH('')),1,1,''),'&lt;','<'),'&gt;','>')

    Which returns the Biml string we need:

    <Column SourceColumn="Id" TargetColumn="CustomerID"
         DataType="Int32" />
    <Column SourceColumn="f_name" TargetColumn="FirstName"
         DataType="AnsiString" Length="40" CodePage="1252" />
    <Column SourceColumn="l_name" TargetColumn="LastName"
         DataType="AnsiString" Length="40" CodePage="1252" />
    <Column SourceColumn="sex" TargetColumn="Gender"
        DataType="AnsiString" Length="6" CodePage="1252" />

Some remarks about the SQL statement:

• We don’t really need the char(10), but the line break is convenient when we look at the results.
• Since this query uses the FOR XML syntax and XML can’t handle the opening and closing tag characters, SQL Server replaces them with their escape codes. We have to use the REPLACE function to change these escape codes back into the opening and closing tag characters.

     

    Creating the stored procedure

For this example I created the following stored procedure that combines the two metadata tables:

    CREATE PROCEDURE meta.getPackageDetails (@MappingType varchar(50)) AS
    SELECT
          PackageName = t.MappingName
        , TargetTable = t.tgtSchema + '.' + t.tgtTable
        , TargetConnection = t.tgtConnection
        , SourceTable = t.srcSchema + '.' + t.srcTable
        , SourceConnection = t.srcConnection
        , srcColumnList = STUFF((SELECT  ', ' + QUOTENAME(srcColumn)
            FROM meta.ColumnMapping
            WHERE srcColumn is not null
            AND TableMappingID=t.TableMappingID
            FOR XML PATH('')),1,2,'')
        , DCColumns = REPLACE(REPLACE(STUFF(
            (SELECT  char(10) + '<Column SourceColumn="' + srcColumn
            + '" TargetColumn="' + tgtColumn
            + '" DataType="' + tgtDataType
            + CASE WHEN tgtDataType='AnsiString'
            THEN '" Length="' + CAST(tgtLength AS varchar(10))
                + '" CodePage="1252" />'
            ELSE '" />' END
            FROM meta.ColumnMapping
            WHERE TableMappingID=t.TableMappingID
            FOR XML PATH('')),1,1,''),'&lt;','<'),'&gt;','>')
    FROM meta.TableMapping t
    WHERE t.MappingType = @MappingType
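
You can test the procedure from a query window before wiring it into Biml; 'dim' is the mapping type the script below passes in:

-- Returns one row per table mapping, including the generated column lists
EXEC meta.getPackageDetails @MappingType = 'dim';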

    Less complex Biml file

    The Biml file that you need to create the packages:

    <Biml xmlns="http://schemas.varigence.com/biml.xsd">
      <#@ include file="Connection.biml" #>

        <Packages>
    <# string sConn = "Provider=SQLNCLI10;Server=.\\SQL2012;Initial Catalog=BimlSamples;Integrated Security=SSPI;"; #>
    <# string sSQL = string.Format("Exec meta.getPackageDetails {0}", "dim"); #>
    <# DataTable tblPackages = ExternalDataAccess.GetDataTable(sConn, sSQL); #>
        <# foreach (DataRow pkg in tblPackages.Rows){ #>
           
            <Package Name="<#=pkg["PackageName"]#>"
                    ConstraintMode="Linear">
                <Tasks>

            <Dataflow Name="DFT <#=pkg["PackageName"]#>">
              <Transformations>
                <OleDbSource Name="ODS Source"
                  ConnectionName="<#=pkg["SourceConnection"]#>" >
                  <DirectInput>
                    SELECT <#=pkg["srcColumnList"]#>
                    FROM <#=pkg["SourceTable"]#>
                  </DirectInput>
                </OleDbSource>
               
                <DataConversion Name="DC">
                  <Columns>
                    <#=pkg["DCColumns"]#>
                  </Columns>
                </DataConversion>

                <OleDbDestination Name="ODD Target"
                    ConnectionName="<#=pkg["TargetConnection"]#>">
                  <ExternalTableOutput
                     Table="<#=pkg["TargetTable"]#>"/>
                </OleDbDestination>

              </Transformations>
            </Dataflow>
          </Tasks>
            </Package>
            <#}#>
        </Packages>
    </Biml>
    <#@ import namespace="System.Data" #>

    Conclusion

In this blog post I discussed some ways to make your Biml files less complex. You can move (semi-)static content to other files and import them with the include directive or the CallBimlScript function. And you can move parts of complicated Biml code to T-SQL. How much you move to T-SQL will largely depend on your proficiency in C# versus T-SQL.

• Sat 31 Aug 13

    8 Practical BIML Tips

    You can leverage Business Intelligence Markup Language – or BIML – to automate the creation of Microsoft SQL Server Integration Services (SSIS) Packages. The popular open source BIDSHelper project includes Biml functionality, enabling anyone to write and execute BIML code for free. Among professional SSIS Developers BIML is quickly gaining popularity.
    In this post I’ll share some practical tips ….

    1. Better copy and paste experience

The irritating behavior of the XML editor in Visual Studio when copying BIML script, and how to overcome it, has been documented on several blogs. In this overview of tips I couldn’t leave it out.
In Visual Studio / SSDT select Options in the Tools menu.
In the tree view of the Options form expand Text Editor, expand XML and choose Formatting.
Next uncheck both options under Auto Reformat.

    clip_image002


    2. Put directive tags #@..# at the bottom of your BIML file

In many examples (including mine) these directives are placed at the top of the BIML file, which makes sense because this is the default location in software development. However, when Visual Studio opens a file with a directive before the <Biml> XML tag, it doesn’t use the XML editor and we lose the formatting and IntelliSense features in the GUI.
So instead place the directives at the bottom of the file, after the closing </Biml> tag. This will not have any effect on the creation of the packages.

    3. Vanilla Biml file

Before using script in a BIML file, create a working BIML file that can create a package with more than 80% of the expected functionality.
Copy this file, use it as a base and then start scripting.
Why? The combination of XML and code in one document makes it more complicated to select the correct BIML elements and attributes. Also use a small dataset, so that when you test your work only a small number of packages is created.

    4. Test and save often

During development, regularly check your work. Use the options “Check Biml for Errors” or “Generate SSIS Packages” from the context menu. This way you not only test your work but save it as well.
Why? Debugging BIML files is mostly a pain. Error messages are limited and often refer to the wrong row, and small typos can have a huge impact. So you’d better find your errors early in development.

    5. Special XML characters

XML has some special characters that you need to enclose in a CDATA section or replace with their escape codes:

    • double quote " ( &quot; )
    • single quote '  ( &apos; )
    • less than sign  < ( &lt; )
    • greater than sign  >  ( &gt; )
    • ampersand & ( &amp; )

    As an example suppose you have the following BIML:

<DirectInput>
SELECT Name FROM dbo.People WHERE Age > 25
</DirectInput>

then the XML processor will fail at Age > 25. As a remedy change your BIML into:

<DirectInput>
<![CDATA[SELECT Name FROM dbo.People WHERE Age > 25]]>
</DirectInput>

or

<DirectInput>
SELECT Name FROM dbo.People WHERE Age &gt; 25
</DirectInput>

    6. Special C# characters

C# also has some special characters that you will need to escape with a backslash. Most notably:

    • the backslash itself \ ( \\ )
    • single quote ‘ ( \’ )
    • double quote “ ( \” )

As an example, escaping the backslashes in a file location:
string FileName = "C:\\Dir\\File.txt";
or use the verbatim string construction:
string FileName = @"C:\Dir\File.txt";

    7. Learn some basic C#

    C# is the principal language of the .NET framework and is widely used for all sorts of programs: Web Services, Web Applications, Windows Form applications, SSIS Scripts, SQL CLR Stored Procedures etc. An investment in learning some C# will pay off. There is an abundant supply of websites and books with relevant information.

To get you started: read the chapters Basics and Flow control of this tutorial: http://zetcode.com/lang/csharp/

    8. Learn from the samples

    Steal / use the samples on:

• Thu 13 Jun 13

    SQL2012 Windowing Functions In The Data Warehouse–2. Reporting

This is the second post of a diptych on the magical windowing functions in data warehouse scenarios. With these functions you can greatly simplify the T-SQL you write. Many complex queries with CTEs, temp tables and sub queries can be rewritten into simpler, more maintainable and better performing queries. In this post I’ll dive into some possibilities for reporting.

For the examples in this post I’ll use the Contoso Retail Data Warehouse database, a sample data warehouse database provided by Microsoft.

    Year To Date (YTD) Calculations

    On the Internet you’ll find a lot of examples on using the running total technique to calculate year to date values. In this example I need the monthly sales and the YTD sales for every store.

    SELECT CalendarMonth
      , StoreName
      , PeriodSales
  , SalesYTD = SUM(PeriodSales) OVER
     (PARTITION BY StoreName, CalendarYear ORDER BY CalendarMonth)
FROM
      (
      SELECT CalendarYear
        , CalendarMonth
        , StoreName
        , PeriodSales = SUM(sal.SalesAmount)
      FROM FactSales sal
      JOIN DimDate dat ON sal.DateKey = dat.Datekey
      JOIN DimStore sto ON sal.StoreKey = sto.StoreKey
      GROUP BY CalendarYear, CalendarMonth, StoreName
      ) SalesByMonth
    ORDER BY StoreName, CalendarMonth

The sub query “SalesByMonth” aggregates the sales amount for every store per month. The windowing function SUM() OVER() calculates the YTD sales, which results in the required dataset:

    image

     

    The SUM(SUM()) OVER() Construction

Since you can use windowing functions over an aggregate, we don’t need the sub query and we can simplify this query to:

    SELECT CalendarMonth
      , StoreName
      , PeriodSales = SUM(SalesAmount)
  , SalesYTD = SUM(SUM(SalesAmount)) OVER
     (PARTITION BY StoreName, CalendarYear ORDER BY CalendarMonth)
FROM FactSales sal
    JOIN DimDate dat ON sal.DateKey = dat.Datekey
    JOIN DimStore sto ON sal.StoreKey = sto.StoreKey
    GROUP BY CalendarYear, CalendarMonth, StoreName
    ORDER BY StoreName, CalendarMonth

The inner SUM in “SUM(SUM()) OVER() … GROUP BY” is used in conjunction with the GROUP BY clause to calculate the monthly sales first.
The outer SUM is then used in conjunction with the OVER clause to calculate the YTD sales.

     

    Comparing to previous year

Adding the figures of the previous year as a comparison is a common reporting requirement. You can easily realize this with the LAG function, returning the result from 12 months back in time. Building upon our earlier query:

    SELECT *
      , PeriodSalesPrevYear = LAG(PeriodSales,12,0)
          OVER (PARTITION BY StoreName ORDER BY CalendarMonth)
      , YTDSalesPrevYear = LAG(SalesYTD,12,0)
          OVER (PARTITION BY StoreName ORDER BY CalendarMonth)
    FROM
    (
      SELECT CalendarMonth
        , StoreName
        , PeriodSales = SUM(SalesAmount)
        , SalesYTD = SUM(SUM(SalesAmount))
            OVER (PARTITION BY StoreName, CalendarYear ORDER BY CalendarMonth)
      FROM FactSales sal
      JOIN DimDate dat ON sal.DateKey = dat.Datekey
      JOIN DimStore sto ON sal.StoreKey = sto.StoreKey
      GROUP BY CalendarYear, CalendarMonth, StoreName
    ) Base
    ORDER BY StoreName, CalendarMonth

Which results in:

    image

     

    How Do We Do Compared to the other stores?

    In this example I use the RANK() Function to determine the store’s rank in the total monthly sales and the store’s sales as a percentage of the total monthly sales:

    SELECT CalendarMonth
      , StoreName
      , PeriodSales = SUM(SalesAmount)
  , StoreRank = RANK() OVER
     (PARTITION BY CalendarMonth ORDER BY SUM(SalesAmount) DESC)
      , StoreShare = 100*SUM(SalesAmount)/
          SUM(SUM(SalesAmount)) OVER (PARTITION BY CalendarMonth)
    FROM FactSales sal
    JOIN DimDate dat ON sal.DateKey = dat.Datekey
    JOIN DimStore sto ON sal.StoreKey = sto.StoreKey
    GROUP BY CalendarMonth, StoreName

    image

     

    Compare to (Average of) Previous Periods

In a recent client engagement, a report used to audit the monthly invoice process gave us a lot of trouble. The SQL query behind it was very difficult to comprehend and consisted of several sub queries. By using windowing functions our team was able to greatly simplify the query. The requirement, restated in terms of our example, is: give us the current month’s sales, the previous 3 months and the average of those previous 3. This is the resulting query:

    SELECT CalendarMonth
      , StoreName
      , PeriodSales = SUM(SalesAmount)
      , SalesPrevPeriod1 = LAG(SUM(SalesAmount),1,0)
          OVER (PARTITION BY StoreName ORDER BY CalendarMonth)
      , SalesPrevPeriod2 = LAG(SUM(SalesAmount),2,0)
          OVER (PARTITION BY StoreName ORDER BY CalendarMonth)
      , SalesPrevPeriod3 = LAG(SUM(SalesAmount),3,0)
          OVER (PARTITION BY StoreName ORDER BY CalendarMonth)
      , AveragePrevPeriods = AVG(SUM(SalesAmount))
          OVER (PARTITION BY StoreName ORDER BY CalendarMonth
          ROWS BETWEEN 3 PRECEDING AND 1 PRECEDING)
    FROM FactSales sal
    JOIN DimDate dat ON sal.DateKey = dat.Datekey
    JOIN DimStore sto ON sal.StoreKey = sto.StoreKey
    GROUP BY CalendarYear, CalendarMonth, StoreName
    ORDER BY StoreName, CalendarMonth

    image

    I especially like the way you can use the window frame clause to limit the average to 3 periods: ROWS BETWEEN 3 PRECEDING AND 1 PRECEDING
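
As a small standalone illustration of that frame clause (with made-up values), the average below only looks at the three rows before the current one:

SELECT n
     , AvgPrev3 = AVG(1.0 * n) OVER (ORDER BY n
                      ROWS BETWEEN 3 PRECEDING AND 1 PRECEDING)
FROM (VALUES (1), (2), (3), (4), (5)) AS t(n);
-- n = 1 has no preceding rows, so AvgPrev3 is NULL; n = 5 returns (2+3+4)/3 = 3.0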

    More Information:

    My previous post on using Windowing Functions focused on dimensions 

    MSDN (Books on Line) about the OVER Clause

    Introduction blog series on Windowing Functions by Fabiano Amorim on simple talk

Blog post with an instructional video on the SQL 2012 Windowing Functions: Leaving the Windows Open by Jeremiah Peschka

    Book: Microsoft® SQL Server® 2012 High-Performance T-SQL Using Window Functions by Itzik Ben-Gan

• Fri 24 May 13

    SQL2012 Windowing Functions In The Data Warehouse–1. Dimensions

    Windowing functions, introduced in SQL Server 2005 and greatly enhanced in SQL Server 2012, have something magical: Within the context of one row in the result set you have access to the contents of the other rows of the result set.

With these functions you can greatly simplify the T-SQL you write. Many complex queries with CTEs, temp tables and sub queries can be rewritten into simpler, more maintainable and better performing queries. In this post I’ll dive into some possibilities for dimensions.

    In SQL 2008 and earlier

In SQL 2008 and earlier I will typically build a dimension table based upon the Type 2 Slowly Changing Dimension system, and then use a view with a self join to present the user with the historical and/or current attribute values. I will use this small example of a customer dimension:

    image

When Marco moved to Paris on July 20th 1988 and married Jose, a new row was added for both of them with the new attributes, and the EndDate of the old row was changed to the date of the change. This is how attribute changes are handled in a Type 2 Slowly Changing Dimension.

To consume the dimension information I will typically use a view in the model schema as a source for Analysis Services / PowerPivot / reports:

    CREATE VIEW model.Customer AS
    SELECT his.Id
      , his.Number
      , his.Name
      , his.City
      , CurrentCity = cur.City 
      , his.MaritalStatus
      , CurrentMaritalStatus = cur.MaritalStatus
    FROM dim.Customer his
    JOIN dim.Customer cur
      ON his.Number = cur.Number
    WHERE cur.EndDate ='9999-12-31'

Which results in:

    image

     

    In SQL 2012  Using the LAST_VALUE Function

If you’re not working for the Oracle at Delphi, the last value will typically be the current value of an attribute. So in SQL 2012 this view can be replaced with:

    CREATE VIEW model.Customer AS
    SELECT Id
      , Number
      , Name
      , City
      , CurrentCity = LAST_VALUE(City)
        OVER(PARTITION BY Number ORDER BY StartDate
         ROWS BETWEEN UNBOUNDED PRECEDING
         AND UNBOUNDED FOLLOWING)
      , MaritalStatus
      , CurrentMaritalStatus = LAST_VALUE(MaritalStatus)
        OVER(PARTITION BY Number ORDER BY StartDate
         ROWS BETWEEN UNBOUNDED PRECEDING
         AND UNBOUNDED FOLLOWING)
    FROM dim.Customer

Although the LAST_VALUE function seems a bit awkward to write due to the long window frame clause (ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING), it has some nice advantages:

    • No need to use a self join, which will enhance performance.
    • No need to use the EndDate column.
• Much easier to maintain, because the purpose of the LAST_VALUE function will be more obvious to your successors.

     

    SCD Type 0 with the FIRST_VALUE Function

Occasionally you may stumble upon a request/requirement to show the original value of an attribute (e.g. the sale that landed the customer). In that case you can simply add a column using the FIRST_VALUE function:

    FirstCity = FIRST_VALUE(City) 
      OVER(PARTITION BY Number ORDER BY StartDate)

     

    Mapping Queries Using Windowing Functions

When you load fact tables you will want to look up the surrogate keys of the dimensions. In the simplest variant you would use (in SQL 2008):

    SELECT Number, Id FROM dim.Customer
      WHERE EndDate ='9999-12-31'

    In SQL 2012, assuming you will not store the EndDate in your ETL process, you can use:

    SELECT Number, Id FROM
    (SELECT Number, Id, RowNumber = ROW_NUMBER() OVER(PARTITION BY Number ORDER BY StartDate DESC)
    FROM dim.Customer) Sub
    WHERE RowNumber=1

    Unfortunately you will have to use the sub query construct here because it’s not yet possible to use Windowing Functions in the WHERE clause.

     

But Why Would You Not Add an EndDate in the ETL Process?

If you don’t end-date rows, the ETL process gets much easier, faster and less error-prone: you don’t have to distinguish between new and changed rows, you just add both in the same way to the dimension table. And you don’t have to identify and update the ‘old’ rows.

And of course, if you really need the EndDate you can just get it with the new LEAD function:

    EndDate = LEAD(StartDate, 1, '9999-12-31')
        OVER(PARTITION BY Number ORDER BY StartDate)
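
If you do need to expose an EndDate, a minimal sketch of a view on the example dim.Customer table could look like this (the view name is just an example):

CREATE VIEW model.CustomerHistory AS
SELECT Id
  , Number
  , Name
  , City
  , MaritalStatus
  , StartDate
  , EndDate = LEAD(StartDate, 1, '9999-12-31')
      OVER (PARTITION BY Number ORDER BY StartDate)
FROM dim.Customer;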

     

    More Information:

    MSDN (Books on Line) about the OVER Clause

    Jamie Thomson Debunking Kimball Effective Dates part 2 – Windowing Functions

    Introduction blog series on Windowing Functions by Fabiano Amorim on simple talk

Blog post with an instructional video on the SQL 2012 Windowing Functions: Leaving the Windows Open by Jeremiah Peschka

    Book: Microsoft® SQL Server® 2012 High-Performance T-SQL Using Window Functions by Itzik Ben-Gan

• Sat 18 May 13

    Logging in SSIS with BIML 2 – Logging Package Execution in the Control Flow

This is the second post in a series on logging in SSIS. In the first post I described logging packages and tasks within a package by using an event handler. In this post I’ll describe a second method: logging package execution in the control flow. It’s a very common pattern:

    • At the start of the package create a record in an audit table.
    • Use the identity of this row in the dataflow and add it to every row you insert in the target table.
• Use several counters to count the rows passing through the dataflow.
    • At the end of the package update the audit row that we created in the first step with the end time and the row counters.

    Data Lineage

The most important advantage of this method lies in step 2: by adding a reference to the audit row to every row that we insert into the target table, we are able to trace every row back to when and how it was inserted. This has saved my ass several times, by being able to roll back a faulty import from two weeks earlier and repopulate it with the correct data.

    Andy Leonard wrote a nice blog post on how you can set this up in SSIS. In this post I’ll focus on creating a similar package with BIML. My package will look like this:

    image

    The Audit Table

    To store the package execution information we need an audit table. Create it with the following statement:

    CREATE TABLE meta.SsisAudit(
        AuditID int IDENTITY(1,1) NOT NULL
      , PackageName  varchar(50)
      , PackageGUID uniqueidentifier
      , PackageVersionGUID uniqueidentifier
      , ExecutionID uniqueidentifier
      , StartDate datetime
      , Enddate datetime
      , PackageDuration AS DATEDIFF(second, StartDate
            , EndDate) PERSISTED
      , ExtractRowCount int
      , InsertRowCount int
      , UpdateRowCount int
      , IsProcessed bit DEFAULT(0)
      , CONSTRAINT PK_SsisAudit PRIMARY KEY CLUSTERED
        ( AuditID ASC));
    GO

    The Stored Procedures

To log the package execution information we’ll use two stored procedures that write the information to the SsisAudit table. The first one is used at the start of the package and returns the id of the inserted row. The second is used to update this row with the end date and the row counters.

    CREATE PROCEDURE [meta].[uspNewAuditRow]
        @PackageName varchar(50)
      , @PackageGUID uniqueidentifier
      , @PackageVersionGUID uniqueidentifier
      , @ExecutionID uniqueidentifier
      , @StartDate datetime
    AS
    INSERT INTO meta.SsisAudit
      ( PackageName
      , PackageGUID
      , PackageVersionGUID
      , ExecutionID
      , StartDate
      , IsProcessed)
    VALUES
      ( @PackageName
      , @PackageGUID
      , @PackageVersionGUID
      , @ExecutionID
      , @StartDate
      ,0);

    /* Return the id of the inserted row*/
    SELECT CAST(SCOPE_IDENTITY() AS int) AS AuditID
    GO


    CREATE PROCEDURE [meta].[uspUpdateAuditRow] 
        @ExtractRowCount int
      , @InsertRowCount int
      , @UpdateRowCount int
      , @AuditID int
    AS
    UPDATE meta.SsisAudit
    SET EndDate = SYSDATETIME()   
      , ExtractRowCount = @ExtractRowCount
      , InsertRowCount = @InsertRowCount
      , UpdateRowCount = @UpdateRowCount
      , IsProcessed = 1
    WHERE AuditID = @AuditID
    GO
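
A quick manual test of the two procedures from a query window could look like this (all values are made up; the table variable captures the AuditID the first procedure returns):

DECLARE @AuditID int;
DECLARE @ExecID uniqueidentifier = NEWID();
DECLARE @Result TABLE (AuditID int);

-- Create the audit row and capture the returned id
INSERT INTO @Result (AuditID)
EXEC meta.uspNewAuditRow
      @PackageName = 'TestPackage'
    , @PackageGUID = '00000000-0000-0000-0000-000000000001'
    , @PackageVersionGUID = '00000000-0000-0000-0000-000000000002'
    , @ExecutionID = @ExecID
    , @StartDate = '2013-05-18 01:00:00';

SELECT @AuditID = AuditID FROM @Result;

-- Close the audit row with some row counters
EXEC meta.uspUpdateAuditRow
      @ExtractRowCount = 1000
    , @InsertRowCount = 900
    , @UpdateRowCount = 100
    , @AuditID = @AuditID;

SELECT * FROM meta.SsisAudit WHERE AuditID = @AuditID;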

    BIML Snippets

    To create the logging facility in an SSIS Package use the following code snippets in the BIMLScript that creates the package:

    <!--Variables-->
    <Variables>
       <Variable Name="RcExtract"
          DataType="Int32"
           Namespace="User" >0</Variable>
       <Variable Name="RcInsert"
          DataType="Int32"
          Namespace="User" >0</Variable>
       <Variable Name="RcUpdate"
          DataType="Int32"
          Namespace="User" >0</Variable>
       <Variable Name="AuditID"
          DataType="Int32" Namespace="User" >0</Variable>
    </Variables>

    <Tasks>
       <!--Create audit row and get id-->
       <ExecuteSQL Name="SQL Create AuditRow and Get AuditID"
          ConnectionName="metaDB"
          ResultSet="SingleRow">
          <DirectInput>
    EXEC [meta].[uspNewAuditRow] ?,?,?,?,?
          </DirectInput>
          <Parameters>
             <Parameter Name="0" Direction="Input"
                DataType="Int32"
                VariableName="System.PackageName" />
             <Parameter Name="1" Direction="Input"
                DataType="Guid"
                VariableName="System.PackageID" />
             <Parameter Name="2" Direction="Input"
                DataType="Guid"
                VariableName="System.VersionGUID" />
             <Parameter Name="3" Direction="Input"
                DataType="Guid"
                VariableName="System.ExecutionInstanceGUID" />
             <Parameter Name="4" Direction="Input"
                DataType="DateTime"
                VariableName="System.StartTime" />
          </Parameters>
          <Results>
             <Result Name="0"
                VariableName="User.AuditID" />
          </Results>
       </ExecuteSQL>

   <!--In Dataflow:-->
   <Dataflow>
      <Transformations>

         <!--Add meta data-->
         <DerivedColumns Name="DC add meta data">
            <Columns>
               <Column Name="AuditID"
                  DataType="Int32"
                  ReplaceExisting="false">
                  @[User::AuditID]
               </Column>
            </Columns>
         </DerivedColumns>

      </Transformations>
   </Dataflow>

       <!--Update audit row-->
       <ExecuteSQL Name="SQL Update Audit Row"
          ConnectionName="metaDB"
          ResultSet="None">
          <DirectInput>
    EXEC [meta].[uspUpdateAuditRow] ?,?,?,?
          </DirectInput>
          <Parameters>
             <Parameter Name="0" Direction="Input"
                DataType="Int32"
                VariableName="User.AuditID" />
             <Parameter Name="1" Direction="Input"
                DataType="Int32"
                VariableName="User.RcExtract" />
             <Parameter Name="2" Direction="Input"
                DataType="Int32"
                VariableName="User.RcInsert" />
             <Parameter Name="3" Direction="Input"
                DataType="Int32"
                VariableName="User.RcUpdate" />
          </Parameters>
       </ExecuteSQL>

     

    ABOUT BIML

    You can leverage Business Intelligence Markup Language – or BIML – to automate the creation of Microsoft SQL Server Integration Services (SSIS) Packages. BIML is a creation of Varigence, Inc. and is available in proprietary products, open source projects, and has been published as an open language specification. The popular open source BIDSHelper project includes Biml functionality, enabling anyone to write and execute Biml code for free.

    Do you need a head start with the automation of SSIS packages: consider my BIML Workshop.

• Thu 03 Jan 13

    Logging in SSIS with BIML 1 – Logging Task Execution

    The project deployment model in SSIS (2012) offers great default logging functionality. However you may find yourself in need of additional functionality, especially if you use the package deployment model or an earlier version of SQL Server. In this series on logging I dive into 3 methods:

    1. Logging Task Execution using the OnPostExecute Event
      (this post)
    2. Logging Package Execution in the control flow.
    3. Logging Errors using the OnError Event.

In this post I’ll describe the first method: Logging Task Execution using the OnPostExecute Event. This event is fired after the execution of each task and package, and thus provides a great way to log relevant information like package duration and row counts in the data flow.

    The SsisEventLog Table

    To store these events we need a table:

    CREATE TABLE  meta.SsisEventLog (
          EventID int IDENTITY(1,1) NOT NULL
        , ExecutionID uniqueidentifier 
        , EventLogDate datetime 
        , PackageStartDate datetime 
        , PackageDuration AS datediff(second, PackageStartDate
            , EventLogDate) PERSISTED
        , PackageGUID uniqueidentifier 
        , PackageVersionGUID   uniqueidentifier 
        , PackageName  varchar(50)
        , TaskGUID   uniqueidentifier 
        , TaskName   varchar(50)
        , ExtractRowCount int 
        , InsertRowCount int 
        , UpdateRowCount int 
        , CONSTRAINT  PK_SsisEventLog 
            PRIMARY KEY CLUSTERED (EventID)
    )

    Some remarks:

• The PackageDuration column is calculated using the DATEDIFF function. It calculates the difference in seconds between the time the event was fired and the start date of the package.
• If the TaskGUID equals the PackageGUID, the event was fired directly after the package itself was executed.
• The row count columns will be filled by variables we define in the package, and we’ll have to find a way to reset them after logging the event.

To log this information we’ll use a stored procedure that writes the information to the SsisEventLog table and returns a row with 3 columns, all with the value 0, to reset the variables after logging.

    CREATE PROCEDURE [meta].[uspLogEvent]      
          @ExecutionID [uniqueidentifier]
        , @PackageStartDate [datetime]
        , @PackageGUID [uniqueidentifier]
        , @PackageVersionGUID [uniqueidentifier]
        , @PackageName [varchar](50)
        , @TaskGUID [uniqueidentifier]
        , @TaskName [varchar](50)
        , @ExtractRowCount [int]
        , @InsertRowCount [int]
        , @UpdateRowCount [int]
    AS       

    INSERT INTO [meta].[SsisEventLog](
          ExecutionID
        , EventLogDate
        , PackageStartDate
        , PackageGUID
        , PackageVersionGUID
        , PackageName
        , TaskGUID
        , TaskName
        , ExtractRowCount
        , InsertRowCount
        , UpdateRowCount
    )
    VALUES(
          @ExecutionID
        , SYSDATETIME()   
        , @PackageStartDate
        , @PackageGUID
        , @PackageVersionGUID
        , @PackageName
        , @TaskGUID
        , @TaskName
        , @ExtractRowCount
        , @InsertRowCount
        , @UpdateRowCount
    );
    SELECT 0 AS ExtractRowCount
        , 0 AS InsertRowCount
        , 0 AS UpdateRowCount
    ;

    Creating the log event handler in the package

    Before we can create this event handler you have to create three variables of type Int32 with value 0 as in this image:

    OnPostExecuteVariables

To create the event handler in a package, open the Event Handlers tab, select the package in the Executable dropdown box and select OnPostExecute in the Event handler dropdown box.

Next drop an Execute SQL Task from the toolbox on the canvas and use the following settings in the General tab:

    • Name: SQL Log Event
    • ResultSet: Single row
    • ConnectionType: OLE DB
    • Connection: The name of the connection with your log table
• SQLSourceType: Direct input
    • SQLStatement:
      EXEC [meta].[uspLogEvent] ?,?,?,?,?,?,?,?,?,?

     

    OnPostExecute1

In the Parameter Mapping tab add 10 mappings using the following variable names:

• System::ExecutionInstanceGUID
• System::StartTime
• System::PackageID
• System::VersionGUID
• System::PackageName
• System::SourceID
• System::SourceName
• User::RcExtract
• User::RcInsert
• User::RcUpdate

    OnPostExecute2

    In the Result Set tab add 3 results as in the image below.

    OnPostExecute3

    And we are ready with the implementation of the event handler in the SSIS Package.

    The Results

After executing a package with one Execute SQL Task and one Data Flow Task you will see three rows in the SsisEventLog table: two for the tasks and one for the package:

    OnPostExecuteResult

    OnPostExecuteResult2

    Using this method you can now easily troubleshoot any (performance) issues with the package.
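
For example, the package-level rows (where, as noted above, the TaskGUID equals the PackageGUID) give you the slowest package runs at a glance:

SELECT TOP (20)
       PackageName
     , PackageStartDate
     , PackageDuration                 -- seconds, see the computed column above
     , ExtractRowCount, InsertRowCount, UpdateRowCount
FROM meta.SsisEventLog
WHERE TaskGUID = PackageGUID
ORDER BY PackageDuration DESC;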

    Using BIML to implement the event handler

    To implement this event handler with BIML use the following code snippets in the BIMLScript that creates the package:

    <!--Variables-->
    <Variables>
        <Variable Name="RcExtract"
            DataType="Int32"
            Namespace="User" >0</Variable>
        <Variable Name="RcInsert"
            DataType="Int32"
            Namespace="User" >0</Variable>
        <Variable Name="RcUpdate"
            DataType="Int32"
            Namespace="User" >0</Variable>
    </Variables>

    <!--Events-->
    <Events>
    <Event EventType="OnPostExecute"
           Name="OnPostExecute"
           ConstraintMode="Linear">
    <Tasks>
        <ExecuteSQL Name="SQL Log Event"
            ConnectionName="NorthwindSales"
            ResultSet="SingleRow">
            <DirectInput>
EXEC [meta].[uspLogEvent] ?,?,?,?,?,?,?,?,?,?
            </DirectInput>
            <Parameters>
                <Parameter Name="0" Direction="Input"
                    DataType="Guid"
                    VariableName="System.ExecutionInstanceGUID" />
                <Parameter Name="1" Direction="Input"
                    DataType="DateTime"
                    VariableName="System.StartTime" />
                <Parameter Name="2" Direction="Input"
                    DataType="Guid"
                    VariableName="System.PackageID" />
                <Parameter Name="3" Direction="Input"
                    DataType="Guid"
                    VariableName="System.VersionGUID" />
                <Parameter Name="4" Direction="Input"
                    DataType="String"
                    VariableName="System.PackageName" />
                <Parameter Name="5" Direction="Input"
                    DataType="Guid"
                    VariableName="System.SourceID" />
                <Parameter Name="6" Direction="Input"
                    DataType="String"
                    VariableName="System.SourceName" />
                <Parameter Name="7" Direction="Input"
                    DataType="Int32"
                    VariableName="User.RcExtract" />
                <Parameter Name="8" Direction="Input"
                    DataType="Int32"
                    VariableName="User.RcInsert" />
                <Parameter Name="9" Direction="Input"
                    DataType="Int32"
                    VariableName="User.RcUpdate" />
            </Parameters>
            <Results>
                <Result Name="0"
                    VariableName="User.RcExtract" />
                <Result Name="1"
                    VariableName="User.RcInsert" />
                <Result Name="2"
                    VariableName="User.RcUpdate" />
            </Results>
        </ExecuteSQL>
    </Tasks>
    </Event>
    </Events>

    ABOUT BIML

    You can leverage Business Intelligence Markup Language – or BIML – to automate the creation of Microsoft SQL Server Integration Services (SSIS) Packages. BIML is a creation of Varigence, Inc. and is available in proprietary products, open source projects, and has been published as an open language specification. The popular open source BIDSHelper project includes Biml functionality, enabling anyone to write and execute Biml code for free.

    Do you need a head start with the automation of SSIS packages: consider my BIML Workshop.

• Tue 18 Dec 12

    SQL Server, SSIS, SSAS and SSRS on ONE Server

Best practice dictates that we use a separate server for each of these SQL Server services. And this seems logical, because otherwise these services will compete over server resources. In IT we call this competition contention.

    However there are some great reasons to put these services on one box:

• Licensing: SQL Server licensing can be expensive, and you need licenses for every server on which a service runs.
• Better resource utilization: fewer servers means less power usage and lower maintenance and monitoring costs.
• Sometimes the network is the problem, as a lot of data moves from the SQL engine to SSIS or SSAS, resulting in network congestion. If the services run on the same machine, SQL Server uses the Shared Memory protocol, which is faster and causes less network congestion.

In a scenario with all SQL/BI services on one server we need some strategies to diminish the described contention. This is of course a big subject; I can only touch the basics in this post and give you some tips.

    TIP 1: Limit the software and services on the server

This seems logical, but I have been to several sites where developers remote-desktop into the production server to do maintenance and other jobs. This is certainly not a best practice. Better is to not install any client tools on the server and to use a separate development/test server for maintenance.

    TIP 2: Get an overview of your typical daily workload

    image

    In this image I have mapped a typical daily workload to the services needed to perform the workload:

• From 07:00 to 23:00 is the extended working day window. Some users start early, others end late: all are extensively opening SSRS reports and querying SSAS cubes.
• From 23:00 to 01:00 backups are running for all (source) systems. This is the period in which we do our own SSAS and SQL Server backups.
• From 01:00 to 07:00 is our process window.

    In this period:

    • we use SSIS to land data from our source systems into the staging database
    • we use SSIS to load data from our staging database into the data warehouse
    • we’ll process the dimensions and cubes of our SSAS databases.
    • we warm the cache of our SSAS database and start distributing reports using SSRS subscriptions and SSIS for large Excel reports.

    With this knowledge we have a clear understanding of potential contention issues. Furthermore we can schedule to stop and start services on a need to use basis. We can use the operating system commands NET START and NET STOP for this purpose.

    TIP 3 Set Maximum Memory Setting for SQL Server

With the max server memory setting you can place a limit on the amount of buffer pool memory used by SQL Server. Which leads to the question: to what amount should we limit the buffer pool memory?

During our extended working day window we will probably not benefit from a large buffer pool: most queries will be answered by Analysis Services. In the process window, however, we will benefit from a larger buffer pool. We can change this property during the day and schedule these changes by using the sp_configure system stored procedure (a sketch follows the list below). Based on 24 GB of RAM, reserve 4 GB for the operating system and:

    • During working day window set max server memory to 4 GB, reserve 8 GB for SSAS and 8 GB for SSRS.
    • During the process window set max server memory to 10 GB, reserving 10GB for SSIS and 8GB for SSAS.
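
A sketch of the T-SQL you could schedule (with SQL Agent) at the start of each window; the 4096 MB value corresponds to the working day setting above:

-- Working day window: limit the SQL Server buffer pool to 4 GB
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 4096;
RECONFIGURE;
-- At the start of the process window run the same statements with 10240 MB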

    TIP 4 Set SSAS Memory Settings

The SSAS memory settings are available in the msmdsrv.ini file and in the properties window of the server. If a value is between 0 and 100 it means a percentage of total available physical memory; above 100 it means bytes. Change these settings to:

    • Memory\HardMemoryLimit: from the default of 0 to 32
    • Memory\TotalMemoryLimit: from the default of 80 to 28
• Memory\LowMemoryLimit: from the default of 65 to 24

This means that Analysis Services will start freeing up memory once it has reached the LowMemoryLimit threshold of 24% of physical memory. This process gets more aggressive as it reaches the other thresholds.
While you’re at it, change some other memory properties:

    • OLAP\Process\BufferMemoryLimit from 60 to 20
    • OLAP\Process\AggregationMemoryLimitMax from 80 to 14
    • OLAP\Process\AggregationMemoryLimitMin from 10 to 4

    TIP 5 Set SSRS Memory Settings

In the RSReportServer.config file add the WorkingSetMaximum property and set it to 8 GB (for our 24 GB example); the value is specified in kilobytes:
<WorkingSetMaximum>8000000</WorkingSetMaximum>
Restart the Reporting Services service, so that the other memory-related properties that are based on WorkingSetMaximum get set.

    TIP 6 Monitor and adapt the memory setting

Use the settings in tips 3, 4 and 5 as a starting point, but monitor memory usage and adjust these properties to match your workload.

    TIP 7 Use Windows System Resource Manager to restrict CPU utilization

With Windows System Resource Manager (WSRM) you can create custom resource allocation policies to restrict resources (CPU, RAM) for an application. You can map your workload windows to policies and define these policies, e.g.:

• a policy ProcessData, which runs when Analysis Services is processing data: set the CPU percentage for SQL Server to 45% and for SSAS to 45%
• a policy ProcessIndex, which runs when Analysis Services is processing aggregations: set the CPU percentage for SQL Server to 10% and for SSAS to 80%

    wsrm_resource_allocation

    Next you can setup calendar rules to apply the different policies to the correct time frames.

However, this has the drawback that you have two calendar schemes, one in SQL Agent and one in WSRM, that you have to keep in sync. Furthermore it’s possible that there aren’t any CPU contention issues at all. My advice would be to monitor CPU usage and only plan resource allocation policies, if necessary, based on the monitoring results.

    TIP 8 Don’t write to the same disk you’re reading from

In the daily process window we will typically load large amounts of data into our data warehouse:

    • We load data from the source system in a staging database.
    • From the staging database we load the data in the data warehouse.
    • From the data warehouse we load the Analysis Services cubes.

In this scenario, put the data warehouse database files on a different drive than the staging database and the Analysis Services cubes.
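
A minimal sketch of what that could look like when the databases are created (drive letters, database names and the absence of extra files and filegroups are all hypothetical):

-- Staging files on D:, data warehouse files on other drives
CREATE DATABASE Staging
ON PRIMARY (NAME = Staging_data, FILENAME = 'D:\SQLData\Staging.mdf')
LOG ON     (NAME = Staging_log,  FILENAME = 'D:\SQLLog\Staging.ldf');

CREATE DATABASE DataWarehouse
ON PRIMARY (NAME = DWH_data, FILENAME = 'E:\SQLData\DataWarehouse.mdf')
LOG ON     (NAME = DWH_log,  FILENAME = 'F:\SQLLog\DataWarehouse.ldf');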

    Conclusion

In this post I focused on diminishing contention issues when running SQL Server, Analysis Services, Integration Services and Reporting Services on one server for your BI solution. The key is understanding your workload and the role the different services play in it, as well as the properties you can tweak to diminish contention.

    References