IN2BI, Microsoft Business Intelligence
data, stories & insights
  • Mon 28 Jul 14

    Truncate all Tables

    There are many reasons for me to blog. One of the less vain ones is to recover and reuse code I wrote earlier. In my current project we need to test, test and retest the initial load of the data warehouse.

    For this purpose I wrote this T-SQL script that:

    • Drops all foreign key constraints;
    • Truncates all the tables;
    • Recreates all the foreign key constraints that were dropped earlier (a simplified sketch of the whole script follows below).
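
    The script itself isn't included here, but a simplified sketch of the idea looks like this (my sketch assumes single-column foreign keys; a real script also has to handle composite keys and the delete/update rules):

    -- Save a DROP and a CREATE statement for every foreign key constraint
    SELECT
          DropStmt   = N'ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(fk.parent_object_id))
                     + N'.' + QUOTENAME(OBJECT_NAME(fk.parent_object_id))
                     + N' DROP CONSTRAINT ' + QUOTENAME(fk.name) + N';'
        , CreateStmt = N'ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(fk.parent_object_id))
                     + N'.' + QUOTENAME(OBJECT_NAME(fk.parent_object_id))
                     + N' ADD CONSTRAINT ' + QUOTENAME(fk.name)
                     + N' FOREIGN KEY (' + QUOTENAME(pc.name) + N') REFERENCES '
                     + QUOTENAME(OBJECT_SCHEMA_NAME(fk.referenced_object_id))
                     + N'.' + QUOTENAME(OBJECT_NAME(fk.referenced_object_id))
                     + N' (' + QUOTENAME(rc.name) + N');'
    INTO #fk
    FROM sys.foreign_keys fk
    JOIN sys.foreign_key_columns fkc ON fkc.constraint_object_id = fk.object_id
    JOIN sys.columns pc ON pc.object_id = fkc.parent_object_id AND pc.column_id = fkc.parent_column_id
    JOIN sys.columns rc ON rc.object_id = fkc.referenced_object_id AND rc.column_id = fkc.referenced_column_id

    DECLARE @sql nvarchar(max)

    -- 1. Drop all foreign key constraints
    SELECT @sql = (SELECT DropStmt + CHAR(10) FROM #fk FOR XML PATH(''), TYPE).value('.','nvarchar(max)')
    EXEC sp_executesql @sql

    -- 2. Truncate all the tables
    SELECT @sql = (SELECT N'TRUNCATE TABLE ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N';' + CHAR(10)
                   FROM sys.tables t
                   JOIN sys.schemas s ON t.schema_id = s.schema_id
                   FOR XML PATH(''), TYPE).value('.','nvarchar(max)')
    EXEC sp_executesql @sql

    -- 3. Recreate the foreign key constraints dropped in step 1
    SELECT @sql = (SELECT CreateStmt + CHAR(10) FROM #fk FOR XML PATH(''), TYPE).value('.','nvarchar(max)')
    EXEC sp_executesql @sql
    /* Simplified sketch of the truncate-all-tables script */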
  • Mon 21 Jul 14

    Viewing or Saving the Compiled Biml File(s)

    Some time ago a Biml enthusiast asked me for the location of the temporary files that BIDSHelper creates when you generate SSIS packages. I couldn't help him and explained that the Biml engine first compiles the Biml documents, applying any BimlScript in memory, and then creates the SSIS packages. The intermediate step isn't persisted to disk.

    Debugging Biml Files Made Easier

    Obviously he needed this intermediate result to better debug his Biml files and improve his development efforts. Recently I learned a simple trick to create the intermediate results he wanted, and I'd like to share it with you in this blog post:

    • Add a new Biml File to your project and name it SaveCompiledBiml.biml
    • Replace the contents of this file with the next code block

    Some remarks:

    Change the directory and filename of the sFile variable to match your environment / wishes.
    In this example I have used tier 5. If you have files with a higher tier change the template tier directive in the SaveCompiledBiml file.  
    (BIML files are compiled in the order of their "tier", the files with the lowest tier first and those with the highest last. The default tier is 0 for files without BimlScript and 1 for files with BimlScript.)

    Do you want to improve the quality and speed of your ETL-development?

    Biml is quickly gaining popularity as the way to automate the creation of SSIS packages, saving time and money while improving quality. Are you interested in automating SSIS packages with Biml? Consider my Biml Workshop to get a head start.

  • Sun 04 May 14

    Building a data warehouse while the source systems are still in development

    Some years ago the client of a contractor I worked for made some major investments in their IT landscape. They replaced their ERP and CRM systems, and of course BI/Reporting was part of the acceptance criteria for the new systems. It was a big project and a disaster! ERP, CRM and BI consultants from the contractor were running in and out, discussing requirement details with the client and adapting the systems to these wishes. For the BI team it was especially hard: when we built something on Monday, chances were slim that it still worked on Thursday. We depended upon the ERP and CRM teams to communicate their changes and provide us with correct test data, and there was no love wasted between the teams. I was very glad when the opportunity arose to leave this war zone and move on. And I did.

    Trending

    Nevertheless it seems to be becoming a trend to build or adapt the BI system before a new source system has been launched, and to make it part of the acceptance criteria for the new system. This offers the client an easy way to verify the new system by comparing the reports they already work with. In my previous three projects (part of) the source system was still in development, and I would probably have gone crazy keeping up with all the changes if I hadn't been able to automate the work.

    Agility is the name of the game

    image

    In these situations I have found that (as in an agility drill) you need three things:

    • An Alert System to inform you of changes in the source systems.
    • A Data Warehouse Automation Framework.
    • A Testing Framework.

    The rest of this post focuses on the last one: the testing framework.

     

    Testing Framework

    A testing framework has a complicated ring to it. But if you keep it simple and focus on the results the users expect to see it will be easy and the advantages are great:

    • You'll notice any errors early in the process.
    • You can correct the errors before the users start testing.
    • You can pinpoint any changes made in the source system that ruin your reports.
    • You'll gain confidence in your solution and sleep better.

    I’ll describe a method that has worked for me. Key to success herein is starting early!

    First get hard copies and digital versions of the reports the users are using now. Ask them to highlight the important measures. Start entering the tests in an Excel spreadsheet and use the digital versions of the reports to copy any relevant data. I use the following format, keeping the values in the testName column unique:

    testName             testType  criterion1  criterion2  expectedValue  expression
    InternetSales201401  DWH       201404      Internet    125.035

    Now the difficult/laborious part: start entering SQL statements in the expression column that will return the expected value. Use [criterion1] and/or [criterion2] as placeholders in these statements. The values in the criterion1 and criterion2 columns will replace these placeholders at execution time. You will then be able to copy the expression to similar tests with different values for the criteria. Example expression:

    SELECT CAST(SUM(f.sales) AS int)
    FROM factSales f
    INNER JOIN dimDate d ON f.DateKey = d.DateKey
    INNER JOIN dimChannel ch ON f.ChannelKey = ch.ChannelKey
    WHERE d.YearMonth = [criterion1]
      AND ch.ChannelName = '[criterion2]'
    /* SQL code of example test expression */
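
    To see what a test will actually execute, you can resolve the placeholders yourself; this is the same REPLACE construction the SSIS package below uses to build its list of tests:

    SELECT testName
      , testQuery = REPLACE(REPLACE(expression,'[criterion1]',criterion1),'[criterion2]',criterion2)
      , expectedValue
    FROM DataTests
    WHERE testType = 'DWH'
    /* SQL code that resolves the placeholders in the test expressions */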

    Testing Framework: Tables

    Import your Excel spreadsheet into a SQL Server table with the following definition:

    CREATE TABLE DataTests
    (
          testName nvarchar(64)
        , testType nvarchar(20)
        , criterion1 nvarchar(128)
        , criterion2 nvarchar(128)
        , expectedValue int
        , expression nvarchar(1024)
    )
    /* SQL code to create the DataTests table */

    Create an additional table to store the result of the tests that were performed. Use the following definition:

    CREATE TABLE DataTestResults
    (
          testName nvarchar(64)
        , testDate datetime DEFAULT GETDATE()
        , actualValue int
        , expectedValue int
    )
    /* SQL code to create the DataTestResults table */
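
    With both tables in place, reviewing a test run is a simple query away. A hypothetical example (not part of the framework itself):

    SELECT r.testName
      , r.testDate
      , r.expectedValue
      , r.actualValue
      , result = CASE WHEN r.actualValue = r.expectedValue THEN 'PASS' ELSE 'FAIL' END
    FROM DataTestResults r
    WHERE r.testDate = (SELECT MAX(testDate) FROM DataTestResults x WHERE x.testName = r.testName)
    ORDER BY result, r.testName
    /* SQL code to review the latest result of every test */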

    Testing Framework: SSIS Package

    Now create an SSIS Package that will perform the tests. It uses:

    • An Execute SQL Task to get a list of tests
    • A For Each Loop Container to loop through that list, containing:
      • An Execute SQL Task to execute every test
      • An Execute SQL Task to log the result of every test

     

    image

    This package can be described with the following BIML (Business Intelligence Markup Language) snippet:

    <Packages>
      <Package Name="PKG TestData" ConstraintMode="Linear">
        <Variables>
          <Variable Name="TestList" DataType="Object" />
          <Variable Name="testQuery" DataType="String" />
          <Variable Name="testName" DataType="String" />
          <Variable Name="expectedValue" DataType="Int32">0</Variable>
          <Variable Name="actualValue" DataType="Int32">0</Variable>
        </Variables>
        <Tasks>
          <!-- Get list with tests to be performed -->
          <ExecuteSQL Name="SQL GetTestList" ConnectionName="META" ResultSet="Full">
            <DirectInput>
              SELECT testName
                , testQuery = REPLACE(REPLACE(expression,'[criterion1]',criterion1),'[criterion2]',criterion2)
                , expectedValue
              FROM DataTests
              WHERE testType='DWH'
            </DirectInput>
            <Results>
              <Result Name="0" VariableName="User.TestList" />
            </Results>
          </ExecuteSQL>
          <!-- Loop through tests and perform them -->
          <ForEachAdoLoop Name="FELC Test" SourceVariableName="User.TestList" ConstraintMode="Linear">
            <VariableMappings>
              <VariableMapping Name="0" VariableName="User.testName" />
              <VariableMapping Name="1" VariableName="User.testQuery" />
              <VariableMapping Name="2" VariableName="User.expectedValue" />
            </VariableMappings>
            <Tasks>
              <!-- Perform test -->
              <ExecuteSQL Name="SQL Execute Test" ConnectionName="DWH" ResultSet="SingleRow">
                <VariableInput VariableName="User.testQuery" />
                <Results>
                  <Result Name="0" VariableName="User.actualValue" />
                </Results>
              </ExecuteSQL>
              <!-- Log test result -->
              <ExecuteSQL Name="SQL Log Test Result" ConnectionName="META" ResultSet="None">
                <DirectInput>
                  INSERT INTO DataTestResults (testName,actualValue,expectedValue) VALUES (?,?,?)
                </DirectInput>
                <Parameters>
                  <Parameter Name="0" Direction="Input" DataType="String" VariableName="User.testName" />
                  <Parameter Name="1" Direction="Input" DataType="Int32" VariableName="User.actualValue" />
                  <Parameter Name="2" Direction="Input" DataType="Int32" VariableName="User.expectedValue" />
                </Parameters>
              </ExecuteSQL>
            </Tasks>
          </ForEachAdoLoop>
        </Tasks>
      </Package>
    </Packages>
    /* BIML code to create the testing SSIS package */

    Conclusion

    In this blog post I discussed the components of an agile system that are necessary when you build a data warehouse where the source systems are still in development:

    • An Alert System to inform you of changes in the source systems.
    • A Data Warehouse Automation Framework.
    • A Testing Framework.

    I ended with a simple implementation of a testing framework that worked for me. Does this help you, or did you implement something similar? I'd love to hear from you!

  • Thu 02 Jan 14

    Many Happy User Peaks

    I wrote this blog post in August 2012, but somehow I forgot to post it. At the start of 2014 I wish everyone many happy user peaks and wisdom in creating them…

    Recently I have been involved in converting a business intelligence (dashboard) web application, making it more suitable for tablets and other mobile devices. This offers a great opportunity to talk about this great image by Kathy Sierra:

    image: The Featuritis Curve by Kathy Sierra

    Of course the location of the “Happy User Peak” will differ for every person. The message of the image however is very clear: limit options and features!

    When constructing an application for mobile devices you'll need to limit the features and options even further and make it really, really simple for your users. They don't want menus, toolbars, lookup textboxes or lists in combo boxes to choose from.

    They just want to click and swipe to investigate the tables and charts presented in the dashboard.

     

    ourscoreboard

  • Wed 18 Dec 13

    Run as Different User

    As a consultant I often need to run a program as a different user. For future reference I collected the information about RunAs in this article.

    In Windows 7 and Windows Server 2008 R2 you can run an application as a different user by holding down the Shift key and right-clicking the application shortcut.

    runas

    To use this functionality in Windows 8 you need some extra steps to configure it. These steps are described by Taylor Gibb in: How to Run Windows 8 Apps as a Different User from the Start Screen.

    Another option is to use the ShellRunAs utility that is provided by the Windows Sysinternals website. If you install this utility you can right-click on a shortcut and select “Start as different user”.

    Runas Command

    You can also use the runas command line tool from a command prompt or in a .bat file as in these examples:

    %windir%\System32\runas.exe /netonly /user:domain\username "C:\Program Files\Microsoft SQL Server\110\Tools\Binn\VSShell\Common7\IDE\Ssms.exe"

    This will prompt you for the password of "domain\username" and open SQL Server Management Studio. The /netonly parameter means that the provided identity will only be used for remote resources; the remote site will authenticate you as 'domain\username'.

    As an alternative, first ask for the user name and use the input in the runas command:

    set /P user="Type the domain\username: "
    %windir%\System32\runas.exe /netonly /user:%user% "C:\Program Files\Microsoft SQL Server\110\Tools\Binn\VSShell\Common7\IDE\Ssms.exe"

  • Sun 08 Dec 13

    Create a LightSwitch Application to Maintain Metadata

    Correct metadata is essential for every automation solution. When I started working with BIML I used one table that described the source-to-target table mapping (see the "Creating a Meta Data Driven SSIS Solution with BIML" series).

    And that works fine in most simple scenarios, like importing staging tables or when you are able to move business rules to a view. Key in these simple scenarios is that the columns in your target table have the same names and data types as the source columns.
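
    For example, moving a business rule to a view keeps the generated package a straight column-for-column copy. A hypothetical sketch (the view and column names are made up):

    -- The business rule (deriving FullName) lives in a view,
    -- so the package only copies columns with matching names and data types.
    CREATE VIEW stg.vCustomer
    AS
    SELECT CustomerID
      , FullName = FirstName + ' ' + LastName
      , City
    FROM stg.Customer
    /* Hypothetical example of a business rule moved to a view */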

    Extending the metadata model

    In more recent assignments I added a column mapping table and used T-SQL to simplify the BIML script. With this approach I created more robust SSIS packages that provided extra possibilities, like:

    • Incremental load patterns
    • Converting data types
    • Surrogate key lookups
    • Expressions to calculate new values

    A problem with this approach however is maintaining these column mappings. So I decided to take Visual Studio LightSwitch for a spin and create a simple application to fulfill my maintenance needs.

    LightSwitch

    Visual Studio LightSwitch is a rapid application development (RAD) tool that is used to write data-centric line of business (LOB) applications. An important prerequisite of LightSwitch is that you need to have a data model or construct one using the LightSwitch GUI. Based on this data model you can easily define a set of screens and your application is done. So: no coding necessary (although coding is possible to extend the standard functionality).

    Data Model

    For my data model I decided to add two additional tables:

    • ETLJob
    • Connection

    From a BIML perspective the ETLJob table is primarily used to create the master package with Execute Package and Execute SQL tasks.  An ETLJob can be either:

    • A generated package (with one or more table mappings)
    • A manually built package
    • A SQL statement

    So I ended up with the following model:

    image

    The SQL DDL Statements to create this model:

    CREATE SCHEMA meta
    GO

    CREATE TABLE meta.Connection
    (
          ConnectionID INT IDENTITY NOT NULL PRIMARY KEY
        , ConnectionName VARCHAR(64) NOT NULL
        , ConnectionType VARCHAR(20) NOT NULL
        , ConnectionString VARCHAR(256) NOT NULL
        , CreateInProject BIT NOT NULL DEFAULT(0)
        , DelayValidation BIT NOT NULL DEFAULT(1)
    )
    CREATE TABLE meta.ETLJob
    (
          ETLJobID INT IDENTITY NOT NULL PRIMARY KEY
        , ETLJobName VARCHAR(64) NOT NULL
        , ETLJobType VARCHAR(20) NOT NULL
        , ETLJobGroup VARCHAR(64)
        , SQLConnection INT
            REFERENCES meta.Connection(ConnectionID)
        , SQLStatement VARCHAR(512)
    )
    CREATE TABLE meta.TableMapping
    (
          TableMappingID INT IDENTITY NOT NULL PRIMARY KEY
        , ETLJOB INT NOT NULL
            REFERENCES meta.ETLJob(ETLJobID)
        , TableName VARCHAR(64)
        , TableType VARCHAR(20)
        , TableConnection INT NOT NULL
            REFERENCES meta.Connection(ConnectionID)
        , TableSchema VARCHAR(64)
        , SourceConnection INT
            REFERENCES meta.Connection(ConnectionID)
        , SourceObject VARCHAR(256)
    )
    CREATE TABLE meta.ColumnMapping
    (
          ColumnMappingID INT IDENTITY NOT NULL PRIMARY KEY
        , TableMapping INT NOT NULL
            REFERENCES meta.TableMapping(TableMappingID)
        , ColumnName VARCHAR(64) NOT NULL
        , ColumnType VARCHAR(20) NOT NULL
        , ColumnDataType VARCHAR(20) NOT NULL
        , SourceColumnName VARCHAR(64)
        , SourceColumnDataType VARCHAR(20)
        , Calculation VARCHAR(256)
        , LookupConnection INT 
            REFERENCES meta.Connection(ConnectionID)
        , LookupObject VARCHAR(256)
        , InputColumnName VARCHAR(64)
        , InputColumnDataType VARCHAR(20)
        , OutputColumnName VARCHAR(64)
    )
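
    Some hypothetical example rows to illustrate the three ETLJob types (the names and type values are made up for this illustration):

    INSERT INTO meta.ETLJob (ETLJobName, ETLJobType, ETLJobGroup, SQLConnection, SQLStatement)
    VALUES ('Load dimCustomer', 'Generated', 'Dimensions', NULL, NULL)                   -- generated package (table mappings)
         , ('Import BudgetFile', 'Manual', 'Staging', NULL, NULL)                        -- manually built package
         , ('Truncate Staging', 'SQL', 'Staging', 1, 'EXEC stg.TruncateStagingTables')   -- plain SQL statement
    /* Hypothetical example rows for the ETLJob table */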

     

    Importing the data model in LightSwitch

    To import this data model you only need 6 steps:

    1. Create a new database (I named it MetaBase) and execute the aforementioned DDL statements to create the objects and relations.
    2. In Visual Studio create a New Project. Select LightSwitch as template and choose LightSwitch Application (Visual C#). Give the project a name (I used Metabase) and click OK.

      image
    3. In the next screen click Attach to External Data Source.

      image
    4. In the Attach Data Source Wizard form select Database and click next.
      image
    5. In the Connection Properties window enter the server name\instance and Select the database you created in step 1. Click OK.

      image 
    6. Check Tables, keep the default name under Specify the name of the data source in the Choose your Database Objects step and click Finish.

      image 

    You have imported the data model into LightSwitch.

     

    Changing the data model

    When you look at your data model in the LightSwitch designer you will notice that LightSwitch made some small name changes describing the relations. Do not try to correct these; instead change the display name in the Properties window, as I did for the source connection in this screenshot.

    image

     

    Creating the screens

    Creating screens is even easier than creating the data model.

    1. Right-Click Screens in Solution Explorer and choose Add Screen ..

      image

    2. In the Add New Screen dialog select Details Screen as template. Select MetaBaseData.ETLJob as Screen Data and check ETLJob TableMapping. Click OK.

      image

    3. After I created the screen I made some small changes in the designer: Moved some fields up or down by dragging and dropping and changed the number of Lines in the SQL Statement field from 1 to 3.

      image

    I repeated steps 1 and 2 to create additional screens:

    Template             Screen Data
    Search Data Screen   Connections
    Search Data Screen   ETLJobs
    Search Data Screen   TableMappings
    New Data Screen      Connections
    New Data Screen      ETLJobs 2)
    New Data Screen      TableMappings 1)
    Details Screen       Connections
    Details Screen       TableMappings 1)

    1) Add ColumnMappings under Additional data to include
    2) Add TableMappings under Additional data to include

     

    Navigation, Shell and Theme

    Right-click Screens and select Edit Screen Navigation from the context menu. Change the screen navigation to match the following image:

    image

    Choose General Properties and change the Shell to LightSwitch Standard Shell and the Theme to LightSwitch Blue Theme.

    image

     

    Conclusion

    With Visual Studio LightSwitch you can easily create an application to maintain the meta data for your BIML solution. You create a normalized data model. Import it in LightSwitch and start adding screens. And if you like BIML you’ll probably like LightSwitch as well: Both make it easier to program your solution.

    An additional advantage of storing the metadata in separate tables and creating an application to maintain that data is that you get a great overview of your data lineage and up-to-date documentation. Here are some screenshots of the application with data.

    image
    Search ETLJob screen showing the packages and SQL Statements of our solution.

    image
    An ETL Job detail screen with the associated table mappings.

    image
    A Table Mapping detail screen with the associated column mappings.
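
    As an illustration of the data lineage point: because the metadata lives in plain tables, a simple query is all it takes to list the source-to-target mapping per column. A sketch:

    SELECT TargetTable = tm.TableSchema + '.' + tm.TableName
      , TargetColumn = cm.ColumnName
      , SourceObject = tm.SourceObject
      , SourceColumn = cm.SourceColumnName
      , cm.Calculation
    FROM meta.TableMapping tm
    JOIN meta.ColumnMapping cm ON cm.TableMapping = tm.TableMappingID
    ORDER BY TargetTable, TargetColumn
    /* SQL code for a simple column-level lineage overview */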

  • Tue 17 Sep 13

    Making your Biml files less complex

    The combination of XML and C# code in a Biml file can make Biml files very complex. In web development Microsoft overcame this problem by introducing the code-behind model with one file for the static (html) text and a separate file for the dynamic code. Unfortunately this is not possible with Biml. There are however some ways to make your files less complex:

    1. Separate files
    2. Move code logic to T-SQL

    In this post I’ll briefly describe the first option and use some examples to explain the second option.


    Separate files

    You can use the include directive to insert a piece of static XML from another file into your Biml file. John Minkjan has a nice example on his website.
    If you need more dynamic content you can also opt for the CallBimlScript function, which allows you to use properties. Of course you will have to handle these properties in the callee file.


    Move some code logic to T-SQL

    A typical Biml solution consists not only of Biml files but also of metadata that describes the packages you want to build. I typically use a table mapping table and a column mapping table in a SQL Server database. This allows me to create a stored procedure that combines information from both tables into one dataset with all relevant information in one row.

    For the next examples I will use the following information in this source to target mapping table:

    image


    Using FOR XML PATH(‘’) in T-SQL

    With the “FOR XML PATH(‘’)” syntax you can transform columns into a string. The T-SQL statement:

    SELECT srcColumnList =
        (SELECT ', ' + srcColumn
         FROM meta.ColumnMapping
         WHERE srcColumn IS NOT NULL
         FOR XML PATH(''))

    returns the string ", Id, f_name, l_name, sex", which you can almost use as a column list in the SELECT <column list> FROM <source table> statement of a source component. Almost … because you'll have to remove the first comma and handle strange characters in column names. So use STUFF (and QUOTENAME) to do just that:

    SELECT srcColumnList = STUFF((SELECT  ', ' + QUOTENAME(srcColumn) FROM meta.ColumnMapping WHERE srcColumn is not null
    FOR XML PATH('')),1,2,'')

    which returns the string we need: [Id], [f_name], [l_name], [sex]


    Creating the data conversion task

    Expanding on the previous statement you can create additional dynamic content for your Biml solution, in this case the column definitions of a Data Conversion task. In Biml you would write:

    <DataConversion Name="DC">
      <Columns>
        <Column SourceColumn="f_name" TargetColumn="FirstName"
                DataType="AnsiString" Length="40" CodePage="1252" />
        <…more columns …>
      </Columns>
    </DataConversion>

    To create the column list for this data conversion use the statement:

    SELECT DCColumns = REPLACE(REPLACE(STUFF(
        (SELECT  char(10) + '<Column SourceColumn="' + srcColumn 
            + '" TargetColumn="' + tgtColumn
            + '" DataType="' + tgtDataType
            + CASE WHEN tgtDataType='AnsiString'
            THEN '" Length="' + CAST(tgtLength AS varchar(10))
                 + '" CodePage="1252" />'
            ELSE '" />' END
         FROM meta.ColumnMapping
         FOR XML PATH('')),1,1,''),'&lt;','<'),'&gt;','>')

    Which returns the Biml string we need:

    <Column SourceColumn="Id" TargetColumn="CustomerID"
         DataType="Int32" />
    <Column SourceColumn="f_name" TargetColumn="FirstName"
         DataType="AnsiString" Length="40" CodePage="1252" />
    <Column SourceColumn="l_name" TargetColumn="LastName"
         DataType="AnsiString" Length="40" CodePage="1252" />
    <Column SourceColumn="sex" TargetColumn="Gender"
        DataType="AnsiString" Length="6" CodePage="1252" />

    Some remarks about the SQL statement:

    • We don’t really need the char(10), but the line break is convenient when we look at the results.
    • Since this query uses the FOR XML syntax and XML can't handle the opening and closing tag signs, SQL Server replaces them with their escape codes. We have to use the REPLACE function to change these escape codes back into the opening and closing tag signs.

     

    Creating the stored procedure

    For this example I would create the following stored procedure that combines the two metadata tables:

    CREATE PROCEDURE meta.getPackageDetails (@MappingType varchar(50)) AS
    SELECT
          PackageName = t.MappingName
        , TargetTable = t.tgtSchema + '.' + t.tgtTable
        , TargetConnection = t.tgtConnection
        , SourceTable = t.srcSchema + '.' + t.srcTable
        , SourceConnection = t.srcConnection
        , srcColumnList = STUFF((SELECT  ', ' + QUOTENAME(srcColumn)
            FROM meta.ColumnMapping
            WHERE srcColumn is not null
            AND TableMappingID=t.TableMappingID
            FOR XML PATH('')),1,2,'')
        , DCColumns = REPLACE(REPLACE(STUFF(
            (SELECT  char(10) + '<Column SourceColumn="' + srcColumn
            + '" TargetColumn="' + tgtColumn
            + '" DataType="' + tgtDataType
            + CASE WHEN tgtDataType='AnsiString'
            THEN '" Length="' + CAST(tgtLength AS varchar(10))
                + '" CodePage="1252" />'
            ELSE '" />' END
            FROM meta.ColumnMapping
            WHERE TableMappingID=t.TableMappingID
            FOR XML PATH('')),1,1,''),'&lt;','<'),'&gt;','>')
    FROM meta.TableMapping t
    WHERE t.MappingType = @MappingType
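
    To check the dataset that the Biml script below will loop over, you can simply execute the procedure, for example:

    EXEC meta.getPackageDetails @MappingType = 'dim'
    /* SQL code to inspect the metadata for the 'dim' mapping type */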

    Less complex Biml file

    The Biml file that you need to create the packages:

    <Biml xmlns="http://schemas.varigence.com/biml.xsd">
      <#@ include file="Connection.biml" #>

        <Packages>
        <# string sConn = "Provider=SQLNCLI10;Server=.\\SQL2012;Initial Catalog=BimlSamples;Integrated Security=SSPI;"; #>
        <# string sSQL = string.Format("Exec meta.getPackageDetails {0}", "dim"); #>
        <# DataTable tblPackages = ExternalDataAccess.GetDataTable(sConn, sSQL); #>
        <# foreach (DataRow pkg in tblPackages.Rows){ #>
           
            <Package Name="<#=pkg["PackageName"]#>"
                    ConstraintMode="Linear">
                <Tasks>

            <Dataflow Name="DFT <#=pkg["PackageName"]#>">
              <Transformations>
                <OleDbSource Name="ODS Source"
                  ConnectionName="<#=pkg["SourceConnection"]#>" >
                  <DirectInput>
                    SELECT <#=pkg["srcColumnList"]#>
                    FROM <#=pkg["SourceTable"]#>
                  </DirectInput>
                </OleDbSource>
               
                <DataConversion Name="DC">
                  <Columns>
                    <#=pkg["DCColumns"]#>
                  </Columns>
                </DataConversion>

                <OleDbDestination Name="ODD Target"
                    ConnectionName="<#=pkg["TargetConnection"]#>">
                  <ExternalTableOutput
                     Table="<#=pkg["TargetTable"]#>"/>
                </OleDbDestination>

              </Transformations>
            </Dataflow>
          </Tasks>
            </Package>
            <#}#>
        </Packages>
    </Biml>
    <#@ import namespace="System.Data" #>

    Conclusion

    In this blog post I discussed some ways to make your Biml files less complex. You can move (semi-)static content to other files and import them with the include directive or the CallBimlScript function. And you can move parts of complicated Biml code to T-SQL. How much you move to T-SQL will largely depend on your proficiency in C# versus T-SQL.

  • Sat 31 Aug 13

    8 Practical BIML Tips

    You can leverage Business Intelligence Markup Language – or BIML – to automate the creation of Microsoft SQL Server Integration Services (SSIS) Packages. The popular open source BIDSHelper project includes Biml functionality, enabling anyone to write and execute BIML code for free. Among professional SSIS Developers BIML is quickly gaining popularity.
    In this post I’ll share some practical tips ….

    1. Better copy and paste experience

    The irritating behavior of the XML editor in Visual Studio when copying BIML script, and how to overcome it, has been documented on several blogs. In this overview of tips I couldn't leave it out.
    In Visual Studio / SSDT … Select Options in the Tools menu.
    In the Treeview of the Options form expand Text Editor, expand  XML and choose Formatting.
    Next uncheck both options under Auto Reformat.

    clip_image002


    2. Put directive tags #@..# at the bottom of your BIML file

    In many examples (including mine) these directives are placed at the top of the BIML file, which makes sense because this is the default location in software development. However, when Visual Studio opens a file with a directive before the <Biml> xml tag it doesn't use the XML editor and we lose the formatting and IntelliSense features in the GUI.
    So instead place the directives at the bottom of the file. After the closing </Biml> tag. This will not have any effect on the creation of packages.

    3. Vanilla Biml file

    Before using script in a BIML file, create a working BIML file that can create a package with more than 80% of the expected functionality.
    Copy this file, use it as a base and then start scripting.
    Why? The combination of XML and code in one document makes it more complicated to select the correct BIML elements and attributes. Also use a small dataset, so that when you test your work only a small number of packages is created.

    4. Test and save often

    During development, regularly check your work. Use the options "Check Biml for Errors" or "Generate SSIS Packages" from the context menu. This way you not only test your work but save it as well.
    Why? Debugging BIML files is mostly a pain. Error messages are limited and often refer to the wrong row, and small typos can have a huge impact. So you'd better find your errors early in development.

    5. Special XML characters

    XML has some special characters that you need to enclose in a CDATA section or replace with their escape codes:

    • double quote " ( &quot; )
    • single quote '  ( &apos; )
    • less than sign  < ( &lt; )
    • greater than sign  >  ( &gt; )
    • ampersand & ( &amp; )

    As an example suppose you have the following BIML:

    <DirectInput>
    SELECT Name FROM dbo.People WHERE Age > 25
    </DirectInput>

    then the XML processor will fail at Age > 25. As a remedy change your Biml into:

    <DirectInput>
    <![CDATA[SELECT Name FROM dbo.People WHERE Age > 25]]>
    </DirectInput>

    or

    <DirectInput>
    SELECT Name FROM dbo.People WHERE Age &gt; 25
    </DirectInput>

    6. Special C# characters

    C# also has some special characters that you will need to escape with a backslash. Most notably:

    • the backslash itself \ ( \\ )
    • single quote ‘ ( \’ )
    • double quote “ ( \” )

    As an example, escaping the backslashes in a file location:
    string FileName = "C:\\Dir\\File.txt";
    or use the verbatim string construction:
    string FileName = @"C:\Dir\File.txt";

    7. Learn some basic C#

    C# is the principal language of the .NET framework and is widely used for all sorts of programs: Web Services, Web Applications, Windows Form applications, SSIS Scripts, SQL CLR Stored Procedures etc. An investment in learning some C# will pay off. There is an abundant supply of websites and books with relevant information.

    To get you started: read the chapters Basics and Flow control of this tutorial: http://zetcode.com/lang/csharp/

    8. Learn from the samples

    Steal / use the samples on:

  • Thu 13 Jun 13

    SQL2012 Windowing Functions In The Data Warehouse–2. Reporting

    This is the second post of a diptych on the magical windowing functions in data warehouse scenarios. With these functions you can greatly simplify the T-SQL you write. Many complex queries with CTEs, temp tables and sub queries can be rewritten into simpler, more maintainable and better performing queries. In this post I'll dive into some possibilities for reporting.

    For the examples in this post I'll use the Contoso Retail Data Warehouse database, a sample data warehouse database provided by Microsoft.

    Year To Date (YTD) Calculations

    On the Internet you’ll find a lot of examples on using the running total technique to calculate year to date values. In this example I need the monthly sales and the YTD sales for every store.

    SELECT CalendarMonth
      , StoreName
      , PeriodSales
      , SalesYTD = SUM(PeriodSales) OVER 
         (PARTITION BY StoreName, CalendarYear ORDER BY CalendarMonth)
    FROM
      (
      SELECT CalendarYear
        , CalendarMonth
        , StoreName
        , PeriodSales = SUM(sal.SalesAmount)
      FROM FactSales sal
      JOIN DimDate dat ON sal.DateKey = dat.Datekey
      JOIN DimStore sto ON sal.StoreKey = sto.StoreKey
      GROUP BY CalendarYear, CalendarMonth, StoreName
      ) SalesByMonth
    ORDER BY StoreName, CalendarMonth

    The sub query "SalesByMonth" aggregates the sales amount for every store per month. The windowing function SUM() OVER() calculates the YTD sales, which results in the required dataset:

    image

     

    The SUM(SUM()) OVER() Construction

    Since you can use windowing functions over an aggregate, we don't need the sub query and can simplify this query to:

    SELECT CalendarMonth
      , StoreName
      , PeriodSales = SUM(SalesAmount)
      , SalesYTD = SUM(SUM(SalesAmount)) OVER 
         (PARTITION BY StoreName, CalendarYear ORDER BY CalendarMonth)
    FROM FactSales sal
    JOIN DimDate dat ON sal.DateKey = dat.Datekey
    JOIN DimStore sto ON sal.StoreKey = sto.StoreKey
    GROUP BY CalendarYear, CalendarMonth, StoreName
    ORDER BY StoreName, CalendarMonth

    The second (inner) SUM in SUM(SUM()) OVER() … GROUP BY is used in conjunction with the GROUP BY clause to calculate the monthly sales first.
    The first (outer) SUM is then used in conjunction with the OVER clause to calculate the YTD sales.

     

    Comparing to previous year

    Adding the figures of the previous year as a comparison is a common reporting requirement. You can easily realize this by using the LAG function to return the results from 12 months back. Building upon our earlier query:

    SELECT *
      , PeriodSalesPrevYear = LAG(PeriodSales,12,0)
          OVER (PARTITION BY StoreName ORDER BY CalendarMonth)
      , YTDSalesPrevYear = LAG(SalesYTD,12,0)
          OVER (PARTITION BY StoreName ORDER BY CalendarMonth)
    FROM
    (
      SELECT CalendarMonth
        , StoreName
        , PeriodSales = SUM(SalesAmount)
        , SalesYTD = SUM(SUM(SalesAmount))
            OVER (PARTITION BY StoreName, CalendarYear ORDER BY CalendarMonth)
      FROM FactSales sal
      JOIN DimDate dat ON sal.DateKey = dat.Datekey
      JOIN DimStore sto ON sal.StoreKey = sto.StoreKey
      GROUP BY CalendarYear, CalendarMonth, StoreName
    ) Base
    ORDER BY StoreName, CalendarMonth

    Which results into:

    image

     

    How Do We Do Compared to the other stores?

    In this example I use the RANK() Function to determine the store’s rank in the total monthly sales and the store’s sales as a percentage of the total monthly sales:

    SELECT CalendarMonth
      , StoreName
      , PeriodSales = SUM(SalesAmount)
      , StoreRank = RANK() OVER
          (PARTITION BY CalendarMonth ORDER BY SUM(SalesAmount) DESC)
      , StoreShare = 100*SUM(SalesAmount)/
          SUM(SUM(SalesAmount)) OVER (PARTITION BY CalendarMonth)
    FROM FactSales sal
    JOIN DimDate dat ON sal.DateKey = dat.Datekey
    JOIN DimStore sto ON sal.StoreKey = sto.StoreKey
    GROUP BY CalendarMonth, StoreName

    image

     

    Compare to (Average of) Previous Periods

    In a recent client engagement a report that was used to audit the monthly invoice process gave us a lot of trouble. The SQL query behind it was very difficult to comprehend and consisted of several sub queries. By using windowing functions our team was able to greatly simplify the query. The requirement can be restated/simplified to our example as: give us the current month's sales, the previous 3 months and the average of those previous 3. This is the resulting query:

    SELECT CalendarMonth
      , StoreName
      , PeriodSales = SUM(SalesAmount)
      , SalesPrevPeriod1 = LAG(SUM(SalesAmount),1,0)
          OVER (PARTITION BY StoreName ORDER BY CalendarMonth)
      , SalesPrevPeriod2 = LAG(SUM(SalesAmount),2,0)
          OVER (PARTITION BY StoreName ORDER BY CalendarMonth)
      , SalesPrevPeriod3 = LAG(SUM(SalesAmount),3,0)
          OVER (PARTITION BY StoreName ORDER BY CalendarMonth)
      , AveragePrevPeriods = AVG(SUM(SalesAmount))
          OVER (PARTITION BY StoreName ORDER BY CalendarMonth
          ROWS BETWEEN 3 PRECEDING AND 1 PRECEDING)
    FROM FactSales sal
    JOIN DimDate dat ON sal.DateKey = dat.Datekey
    JOIN DimStore sto ON sal.StoreKey = sto.StoreKey
    GROUP BY CalendarYear, CalendarMonth, StoreName
    ORDER BY StoreName, CalendarMonth

    image

    I especially like the way you can use the window frame clause to limit the average to 3 periods: ROWS BETWEEN 3 PRECEDING AND 1 PRECEDING

    More Information:

    My previous post on using Windowing Functions focused on dimensions 

    MSDN (Books on Line) about the OVER Clause

    Introduction blog series on Windowing Functions by Fabiano Amorim on simple talk

    Blog post with an instructional video on the SQL 2012 Windowing Functions: Leaving the Windows Open by Jeremiah Peschka

    Book: Microsoft® SQL Server® 2012 High-Performance T-SQL Using Window Functions by Itzik Ben-Gan

  • Fri 24 May 13

    SQL2012 Windowing Functions In The Data Warehouse–1. Dimensions

    Windowing functions, introduced in SQL Server 2005 and greatly enhanced in SQL Server 2012, have something magical: Within the context of one row in the result set you have access to the contents of the other rows of the result set.

    With these functions you can greatly simplify the T-SQL you write. Many complex queries with CTEs, temp tables and sub queries can be rewritten into simpler, more maintainable and better performing queries. In this post I'll dive into some possibilities for dimensions.

    In SQL 2008 and earlier

    In SQL 2008 and earlier I will typically build a dimension table based upon the Type 2 Slowly Changing Dimensions system. And then use a view with a self join to present the user with the historical and/or current attribute values. I will use this small example of a customer dimension:

    image

    When Marco moved to Paris on July 20th 1988 and married Jose, a new row was added for both of them with the new attributes and the EndDate of the old row was changed to the date of the change. This is how attribute changes are handled for Type 2 Slowly Changing Dimensions.

    To consume the dimension information I will typically use a view in the model schema as a source for Analysis Services / PowerPivot / Report:

    CREATE VIEW model.Customer AS
    SELECT his.Id
      , his.Number
      , his.Name
      , his.City
      , CurrentCity = cur.City 
      , his.MaritalStatus
      , CurrentMaritalStatus = cur.MaritalStatus
    FROM dim.Customer his
    JOIN dim.Customer cur
      ON his.Number = cur.Number
    WHERE cur.EndDate ='9999-12-31'

    Which will result into:

    image

     

    In SQL 2012  Using the LAST_VALUE Function

    If you're not working for the Oracle at Delphi, the last value will typically be the current value of an attribute. So in SQL 2012 this view can be replaced with:

    CREATE VIEW model.Customer AS
    SELECT Id
      , Number
      , Name
      , City
      , CurrentCity = LAST_VALUE(City)
        OVER(PARTITION BY Number ORDER BY StartDate
         ROWS BETWEEN UNBOUNDED PRECEDING
         AND UNBOUNDED FOLLOWING)
      , MaritalStatus
      , CurrentMaritalStatus = LAST_VALUE(MaritalStatus)
        OVER(PARTITION BY Number ORDER BY StartDate
         ROWS BETWEEN UNBOUNDED PRECEDING
         AND UNBOUNDED FOLLOWING)
    FROM dim.Customer

    Although the LAST_VALUE function seems a bit awkward to write due to the long window frame clause (ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING), it has some nice advantages:

    • No need to use a self join, which will enhance performance.
    • No need to use the EndDate column.
    • Much easier to maintain. Because the purpose of the LAST_VALUE function will be more obvious for your successors.

     

    SCD Type 0 with the FIRST_VALUE Function

    Occasionally you may stumble upon a request/requirement to show the original value of an attribute (e.g. the sales that landed the customer). In that case you can simply add a column using the FIRST_VALUE function:

    FirstCity = FIRST_VALUE(City) 
      OVER(PARTITION BY Number ORDER BY StartDate)

     

    Mapping Queries Using Windowing Functions

    When you load fact tables you will want to look up the surrogate keys of the dimensions. In the simplest variant you would use (in SQL 2008):

    SELECT Number, Id FROM dim.Customer
      WHERE EndDate ='9999-12-31'

    In SQL 2012, assuming you will not store the EndDate in your ETL process, you can use:

    SELECT Number, Id FROM
    (SELECT Number, Id, RowNumber = ROW_NUMBER() OVER(PARTITION BY Number ORDER BY StartDate DESC)
    FROM dim.Customer) Sub
    WHERE RowNumber=1

    Unfortunately you will have to use the sub query construct here because it’s not yet possible to use Windowing Functions in the WHERE clause.

     

    But why Would you not add an EndDate in the ETL Process?

    If you don't end-date rows, the ETL process gets much easier, faster and less error-prone: you don't have to distinguish between new and changed rows, you just add both in the same way to the dimension table. And you don't have to identify and update the 'old' rows.
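
    A sketch of such an insert-only load, using the columns of the customer example above (and assuming a staging table stg.Customer with the same columns):

    -- Add a new row for every customer that is new or whose attributes changed;
    -- existing rows are never updated or end-dated.
    INSERT INTO dim.Customer (Number, Name, City, MaritalStatus, StartDate)
    SELECT s.Number, s.Name, s.City, s.MaritalStatus, CAST(GETDATE() AS date)
    FROM stg.Customer s
    LEFT JOIN (SELECT *, RowNumber = ROW_NUMBER() OVER(PARTITION BY Number ORDER BY StartDate DESC)
               FROM dim.Customer) cur
      ON cur.Number = s.Number AND cur.RowNumber = 1
    WHERE cur.Number IS NULL
       OR cur.Name <> s.Name
       OR cur.City <> s.City
       OR cur.MaritalStatus <> s.MaritalStatus
    /* Sketch of an insert-only dimension load without end-dating */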

    And of course if you really need the EndDate you can just get it with the new LEAD function:

    EndDate = LEAD(StartDate, 1, '9999-12-31')
        OVER(PARTITION BY Number ORDER BY StartDate)

     

    More Information:

    MSDN (Books on Line) about the OVER Clause

    Jamie Thomson Debunking Kimball Effective Dates part 2 – Windowing Functions

    Introduction blog series on Windowing Functions by Fabiano Amorim on simple talk

    Blog post with an instructional video on the SQL 2012 Windowing Functions: Leaving the Windows Open by Jeremiah Peschka

    Book: Microsoft® SQL Server® 2012 High-Performance T-SQL Using Window Functions by Itzik Ben-Gan