SSIS Performance Tuning

This is my much-overdue follow-up to the presentation I gave to the New York SQL Server User Group. In this post I am going to provide some additional resources to supplement the presentation; check out the following blog posts on different performance tuning techniques that can be used for SSIS.

SSIS Performance Tuning the For Each Loop. Retain Same Connection.

https://mitchellpearson.com/2015/01/12/ssis-performance-tuning-the-for-each-loop-retain-same-connection/

Use File Cache and the Cache Transform to help mitigate network issues.

https://mitchellpearson.com/2014/06/30/use-file-cache-and-the-cache-transform-to-help-mitigate-network-issues/

Create efficient Range Lookups with the Task Factory Advanced Lookup Transform

https://mitchellpearson.com/2014/06/27/create-efficient-range-lookups-with-the-task-factory-advanced-lookup-transform/

Load thousands of files in SSIS lightning fast! MultiFlatFile Connection Manager.

https://mitchellpearson.com/2014/06/27/load-thousands-of-files-in-ssis-lighting-fast-multiflatfile-connection-manager/

Why are Asynchronous components so bad in SSIS?

http://tinyurl.com/ycnvr5ef

Using cache transforms in SSIS to improve Performance (Part 1)

https://mitchellpearson.com/2014/06/20/using-cache-transforms-in-ssis-to-improve-performance-part-1/

Use the aggregate transform in SSIS to outperform T-SQL!

https://mitchellpearson.com/2014/06/22/use-the-aggregate-transform-in-ssis-to-improve-performance-of-t-sql/


Move Files in SSIS Using the Execute Process Task

Recently I found myself redesigning an ETL process for my client to improve performance. The client receives hundreds of thousands of files daily, pushed incrementally throughout the day. We wanted to take advantage of the Multi Flat File connection manager to process these files quickly and efficiently. So far so good, right?

Not sure what the Multi Flat File Connection Manager is? Check out my blog post here: https://mitchellpearson.com/2014/06/27/load-thousands-of-files-in-ssis-lighting-fast-multiflatfile-connection-manager/

First, in order to really use this connection manager, I needed to move the files from the original “landing zone” (the directory where files are being pushed) to a controlled working directory. So how do I do this quickly and efficiently? Glad you asked. With an Execute Process Task and the Windows move command, of course!

Walkthrough:

  1. Create a new SSIS Package.
  2. Pull the Execute Process Task into the control flow.
  3. Open the Execute Process Task and click on the Process tab (on the left).
  4. For Executable, point to the Windows command prompt, found here:
    1. C:\Windows\System32\cmd.exe
  5. Finally, it’s time to issue our Windows move command. Here is the breakdown in 5 parts (the assembled arguments are shown after this list):
    1. /C – required when running Windows commands from the Execute Process Task
    2. move – the Windows move command itself, read more here:
    3. /Y – suppresses prompting to confirm you want to overwrite an existing destination file.
    4. Source directory with wildcard filter: C:\Blogs\1000_Files_Test\Test*
    5. Destination directory: C:\Blogs\1000_Files_Test_Move\
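Putting those five parts together, the task properties from my example look roughly like this (a sketch based on the values above, not a copy of the screenshot; adjust the paths and wildcard to your own directories):

    Executable: C:\Windows\System32\cmd.exe
    Arguments:  /C move /Y C:\Blogs\1000_Files_Test\Test* C:\Blogs\1000_Files_Test_Move\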

image

Thanks and enjoy!

Upgrading Script Tasks in SSIS. Target of Invocation on Script Task.

Recently I was working on a project with @SQLDusty (Dustin Ryan) and we were upgrading some projects from SQL Server 2012 to 2014. One of the script tasks we were using to dynamically create and process cube partitions was failing. The worst part was that the error message was completely generic, so it wasn’t immediately apparent what was wrong. After a little troubleshooting we were able to find the root cause and solve the issue.

Below is the error message we were getting from the SSISDB:

image

The problem was that our .NET reference for Analysis Management Objects was 11.0 (2012) and it needed to be 12.0 (2014).

Here is a screenshot of the .NET reference as seen in the solution explorer.

image

This screenshot of the properties window displays the version number for my .NET reference. Also pay close attention to the Description box; this description is what you look for when adding the Microsoft.AnalysisServices reference to your script task.

image

So now all we need to do to fix this problem is remove the current reference and add the correct reference back in.

image

1) From the solution explorer inside your script task component, right-click on References and select “Add Reference…”

image

2) From the new window that opens up, select Assemblies and then Extensions:

image

3) Finally select Analysis Management Objects 12.0.0.0 as seen below and add it to your references.

image
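If you want to double-check which AMO version the script is now compiled against, a quick sanity check inside the script task’s Main() might look like this (an illustrative sketch only; remove it once you have confirmed the reference):

            // Quick sanity check: show the version of the referenced AMO assembly.
            // Expect 12.0.x.x for SQL Server 2014.
            System.Windows.Forms.MessageBox.Show(
                typeof(Microsoft.AnalysisServices.Server).Assembly.GetName().Version.ToString());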

This fixed our problem and if you are also going through the upgrade process I hope this helps you as well!

Download Files in SSIS with an HTTP or HTTPS connection.

Recently I wanted to download historical stock information from the internet and store this information into a data warehouse. Using the Task Factory Download File Task I was able to dynamically download thousands of csv files with historical stock information and trends for my data warehouse.

The Download File Task allows you to download files over an HTTP connection from inside an SSIS package, and it also supports HTTPS; I will show you where to enter credentials.

If you want to walk through this example and don’t currently have Task Factory you can download a free 14 day trial here: Free Trial Download

Demo Time:

So in this example I am going to download a CSV file from Yahoo Finance with historical price information for FedEx.

  • Create a new package in SSIS and pull the Task Factory Download File Task into the control flow.
  • Now we need to create our HTTP Connection manager.
  • Inside the Download File Task select “Create New HTTP Connection…”

image

  • Creating the connection manager is very easy, especially if you are not using HTTPS.
    • Server URL: (This is the domain that is hosting the file.)
    • Use Credentials: (Select this box and enter your username and password if credentials are required.)
    • Test Connection: (Test the connection before proceeding.)
  • For Server URL I used: http://finance.yahoo.com
  • Click OK. The connection manager has now been created successfully.

image

  • Now we need the exact location of the file we want to download.

image

  • Now that we have the exact location of the file we can finish setting up the Download File Task.
  • Paste the file location into the section “Enter Path of File To Download:”
    • Note I am simply specifying a Text Value for this walkthrough. You can use variables and expressions with this task to make it as dynamic as you need (a sketch of such expressions appears below, after the final screenshot).

image

  • The final step in this process is to specify the location where the file will be stored once downloaded. Once again I am using “Text Value”.
  • “Enter File Path For Local File:” Enter the specific file location. (Screenshot)

image
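The expressions I have in mind would look something like the sketch below. The variable name and URL pattern here are hypothetical placeholders, not the actual values from my screenshots; substitute your own host, path, and local folder (note that backslashes are escaped in SSIS expression strings):

    Enter Path of File To Download:  "http://<your host>/<path>/" + @[User::strTickerSymbol] + ".csv"
    Enter File Path For Local File:  "C:\\Downloads\\" + @[User::strTickerSymbol] + ".csv"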

Now execute the package and there you go!

image

Thanks for looking!

Build a notification framework to send SMS text messages and emails in SSIS!

Of the over 40 components in Task Factory, the TF Advanced Email and SMS Task is one of my favorites. In this blog I am going to discuss the rich features this task offers and provide a walkthrough of setting it up to send an email or SMS text message upon package failure.

If you like this blog and you want to try out a free trial of Task Factory you can find it here: http://pragmaticworks.com/Products/Task-Factory/TrialDownload

Feature Highlights

  • Supports Email and SMS messages.
  • Can send emails with attachments.
  • Built-in HTML editor for HTML emails.
  • Allows use of variables as placeholders in the Email Body so variables can be automatically replaced when an email is sent!
  • Expressions can be set on properties to make the package dynamic.

Note: We will be using SMTP connection information to set up our email connection manager. Generally a basic Bing/Google search will return the connection information required; for example, search for “Yahoo SMTP connection info” or “Gmail SMTP connection info”.

Ok let’s get started with our walkthrough.

  1. Please open up a new SSIS Package.
  2. Next click on the Event Handlers tab at the top.
    1. Don’t change the executable, by default it is set at the package level.
    2. Also leave OnError for the Event Handler.

image

  3. Next click on the hyperlink “Click here to create an ‘OnError’ event handler for executable ‘<Executable Name>’”.

image

  4. Now pull in the Task Factory Advanced Email and SMS Task.
  5. Next we will set up a new SMTP Connection Manager.
    1. Select the drop-down box next to SMTP Connection and then select “Create New Email Connection”.

image

  6. On the general tab fill out the following information:
    • Protocol Type: (Must be SMTP)
    • Mail Server: (The mail server will differ by mail provider. I’m using Office 365 settings; typical values are shown after this list.)
    • User Name:
    • Password:
  7. On the advanced tab fill out the following:
    • Server Port: (Once again, this changes per mail provider.)
    • Type of encrypted connection: TLS, SSL or NONE
    • Timeout in Seconds: (Default is 60)
  8. Finally, test the connection to validate and click OK to close.
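For reference, the Office 365 values I used are essentially the standard published SMTP settings (verify them against your provider’s current documentation, as they can change):

    Mail Server: smtp.office365.com
    Server Port: 587
    Type of encrypted connection: TLS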

image

image

image

Setting up SMS Providers and Emails

In this section I will demo how easy it is to set up the component to send either an SMS or Email message. Note all of this can be set up dynamically using variables and expressions!

  1. Select the dropdown for “To:” and select either email or SMS. Here I selected email and then manually entered the email address.
    • Please note you can send emails to multiple recipients by separating them with a semicolon (;).

image

  2. OK, next let’s override our email option and select SMS.
    • First, enter the phone number.
    • Next, select the provider (the “Provider” field is required for SMS messages).
    • Finally, select the add icon; see the screenshot below:

image

Dynamically set the body of the email and subject using expressions:

In this section we are going to use expressions for the body and subject of the email. Note that expressions can be set on virtually every property of this component.

  1. Select the “Expressions” tab found at the bottom left of the TF Advanced Email editor.

image

  2. From the properties window select “Subject” from the drop-down.
  3. In the “Expression” box select the ellipsis button to open the expression editor.
  4. Here is where we can use the SSIS expression language and variables to make our subject dynamic at run time. (See the third screenshot below, followed by an example expression.)

image

image

image
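For example, a subject expression for this OnError event handler might look like the sketch below. It only uses standard SSIS system variables that are available in the OnError scope; adjust it to taste.

    "Package " + @[System::PackageName] + " failed in task " + @[System::SourceName]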

  5. Once you have completed your expression for the Subject, click OK to close the expression editor. Now from the properties window select “Message” and once again select the ellipsis button to open the expression editor; a sketch of a message expression follows below.

image
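The message body can be built the same way; here is a minimal sketch using system variables (again, adjust as needed):

    "The package failed on " + @[System::MachineName] + ". Error: " + @[System::ErrorDescription]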

Now when I execute the package, I will receive an email or text message if, and only if, an OnError event occurs in the package!

Here is the email I received upon failure of the package, pretty cool!

image

Thanks for looking!

Get a list of subfolders in SSIS with a Script Task!

Recently I needed to get a list of subfolders in SSIS. I also only wanted to bring back the subfolders that had a date appended to the end. So let’s walk through how to do this in a Script Task using C#.

image

First we need to create two variables in SSIS:

image

Next bring a script task into the control flow and open it up for editing.

  • Select C# for the script language.
  • For ReadWriteVariables, select objDirectoryList.

image

Now select Edit Script

  • Under the section “public void Main()” enter the following code:
    • Of course replace the directory location with your directory.
// Populate the object variable with the subdirectories of the root folder that match the "*20*" filter.
Dts.Variables["objDirectoryList"].Value =
       System.IO.Directory.GetDirectories(@"C:\Blogs\RootFolder\", "*20*",
           System.IO.SearchOption.AllDirectories);   // use SearchOption.TopDirectoryOnly to search only the immediate subfolders
Dts.TaskResult = (int)ScriptResults.Success;

Essentially all I’m doing here is populating the object variable objDirectoryList with all the subdirectories in the folder C:\Blogs\RootFolder\. Notice that I have also added a filter here, “*20*”. If you do not wish to have a filter and want to bring back all subdirectories, just replace my filter with “*”.

Finally, we can now iterate through this object variable in a Foreach Loop Container (FELC).
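A minimal sketch of that Foreach Loop configuration, assuming a string variable named strCurrentDirectory (a hypothetical name) to receive each folder path:

    Enumerator:        Foreach From Variable Enumerator
    Variable:          User::objDirectoryList
    Variable Mappings: User::strCurrentDirectory mapped to Index 0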

image

Below you can see the two directories that are returned from the script task above.

image

image

Thanks for looking, hope this helps.

SSIS Performance Tuning the For Each Loop. Retain Same Connection.

With the Retain Same Connection property I was recently able to more than double the performance of my SSIS package for a client. The package was looping over hundreds of thousands of files and logging the file names into a table to be later processed.

Retain Same Connection is a property setting found on connection managers. By default this property is set to false, which means that each time the connection manager is used, the connection is opened and subsequently closed. However, in a situation like mine this can significantly degrade overall performance, as the package has to open and close that connection hundreds of thousands of times. In this blog I’m going to set up a very basic example and walk through setting this property on the connection manager.

Here I have set up two connection managers. Both connection managers point to the same database. It’s important to note that if you are using project-level connection managers in SQL Server 2012, setting this property inside any one package will persist across all packages. That is why I created two connection managers.

image

Inside the package I am simply using a Foreach Loop Container to loop through a list of files in a directory, and then I load the file names into a table using an Execute SQL Task. For demo purposes I have two examples in one package.

image

  1. The FELC on the left takes 30 Seconds to run.
  2. The FELC on the right takes 11 Seconds to run.

Let’s now discuss how and where we can set this property.

Right-click on your connection manager, found in the Connection Managers pane inside the package, and select Properties.

image

Inside the properties window find the property “RetainSameConnection” and set the value to “True”. Now the connection will remain open for the duration of the package.

image

As always thanks for looking!

Download Emails and Email attachments in SSIS.

Ever need to download emails and attachments using SSIS? If so, you have come to the right place. Task Factory, a suite of custom SSIS components by Pragmatic Works, has a component called the Email Source. The component supports both POP3 and IMAP and is really easy to set up, so let’s get started!

There is also full support for filtering messages based on sender, recipient (To), subject, date received, body, and priority. This way you can filter down to only the messages you want to load.

If you are one of the poor unfortunate souls who don’t yet have Task Factory, you can give it a try with a free trial download here: http://pragmaticworks.com/Products/Task-Factory

Walkthrough:

  1. Create a new package and bring in a Data Flow Task.
  2. Inside the Data Flow Task, pull in the TF Email Source component.
  3. Open the TF Email Source. Once open, we need to create our connection manager. You can choose between POP3 and IMAP for reading emails. I prefer IMAP because it allows you to choose specifically which folder in your mailbox you want to read from. More on this later.

image

  4. On the general tab of the connection manager fill in the following items:
        1. Protocol Type
        2. Mail Server
        3. User Name
        4. Password

image

  5. On the advanced tab of the connection manager fill in the following items:
        1. Server Port
        2. Encrypted Connection
        3. Root Folder Path (the folder to read from)

image

  6. Click OK to close out the connection manager.
  7. Next, select the location to save attachments (“Attachments Directory”).
  8. I created a variable to store the message IDs in. I won’t be using this object variable in this blog.
  9. Finally, if we want to add a filter we can do that here as well.

image

  10. Now simply connect the Email Source to a destination component.

Does the cube partition exist? Script Task in SSIS explained!

The first necessary step in dynamically creating cube partitions is to determine whether the partition already exists. If the partition does not exist, we create it; if it does exist, we simply process the existing partition. Recently I gave a free webinar online at PragmaticWorks.com and a few of the attendees wanted more details on the script task I used, so here we go. It’s also important to note that I used a daily partitioning strategy for the webinar. This is important because the naming convention for my partitions included the partition name and the date that I used as a filter. For example:

image

In this blog post I will walk you through setting up this script task step by step. Please see below. First, the setup:

  • Create the following variables; if you are in 2012 or 2014, create these as package parameters. We will use these parameters to establish our connection to Analysis Services and then build out the partition name dynamically.

image

  • Now we need to create another variable. This variable will be used in our control flow to determine whether we should create the partition or process it; we will get to that part in a bit.

image

The Script Task: (Setup)

Now that we have created our variables and have that part set up, we are going to build our script task.

  • Step 1) Drag a script task component from the toolbox into your control flow.
  • Step 2) Open the task for editing and choose C# as the script language.
  • Step 3) ReadOnlyVariables: Select all the variables from Step 1 in the setup section.
  • Step 4) ReadWriteVariables: Select the variable intDoesPartitionExists
  • Step 5) Finally, click the Edit Script button.

The Script Task: (Code)

  • Step 1) Add the Microsoft.AnalysisServices reference.
          1. From the solution explorer, right-click on References and select “Add Reference…”.
          2. From the Add Reference window, select the .NET tab.
          3. Find and select Analysis Management Objects and click OK.
  • Step 2) Add Microsoft.AnalysisServices as a namespace (using Microsoft.AnalysisServices;). Screenshot below:

image

  • Step 3) Finally, we can add our code. In this section we are going to use the parameters previously created to connect to our cube and then check whether a partition exists. For this blog we simply hardcoded these values for demonstration purposes, but typically we would set them dynamically. If the partition exists, I populate the variable intDoesPartitionExists with a value of 1 for true; if it does not exist, I populate it with 0 for false. I then use this variable in the control flow to determine whether to process or create partitions.
  • Step 4) Find the section titled “public void Main()” and paste the following code where it says “// TODO: Add your code here”.
// TODO: Add your code here

            string serverName = "";
            string databaseName = "";
            string cubeName = "";
            string measuregroupName = "";
            string partitionname = "";
            string Date = "";


            //Retrieve the server, database, cube, etc. details from the variables
            serverName = Dts.Variables["ServerName"].Value.ToString();
            databaseName = Dts.Variables["DatabaseName"].Value.ToString();
            cubeName = Dts.Variables["CubeName"].Value.ToString();
            measuregroupName = Dts.Variables["MeasureGroupName"].Value.ToString();
            partitionname = Dts.Variables["MeasureGroupPartitionName"].Value.ToString();
            Date = Dts.Variables["Date"].Value.ToString();

            //Build the connection string
            string ConnectionString = "Data Source=" + serverName + ";Catalog=" + databaseName + ";";

            //Connect to the server and select the database, cube, etc.
            Server srv = new Server();
            srv.Connect(ConnectionString);
            Database db = srv.Databases.FindByName(databaseName);
            Cube cb = db.Cubes.FindByName(cubeName);
            MeasureGroup mg = cb.MeasureGroups.FindByName(measuregroupName);

            if (mg.Partitions.ContainsName(partitionname + "_" + Date))
            {
                Dts.Variables["intDoesPartitionExists"].Value = 1;
            }
            else
            {
                Dts.Variables["intDoesPartitionExists"].Value = 0;
            }

            srv.Disconnect();
            srv.Dispose();
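As a sketch of how that flag is used downstream: one common setup is to put expressions on the precedence constraints leading out of the script task, along these lines (using the variable created above):

    Path to the create-partition logic:  @[User::intDoesPartitionExists] == 0
    Path to the process-partition logic: @[User::intDoesPartitionExists] == 1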

You can find my webinar on Processing SSAS with SSIS at the following link: http://pragmaticworks.com/Training/FreeTraining/ViewWebinar/WebinarID/1714

Thank you for looking!

Processing SSAS Cubes with SSIS – Q&A

For those of you who attended my webinar on Processing SSAS cubes with SSIS, thank you! For those of you who didn’t, the video is still available here: http://pragmaticworks.com/Training/FreeTraining/ViewWebinar/WebinarID/1714.

Below are a few of the questions I received in the webinar. There will be a couple of follow up blogs for the more detailed questions.

Q: Can you please share the script used to generate the partitions?

A: I will post a follow-up blog in the next couple of days with step-by-step instructions. Sharing my exact script would not be helpful. I will update this post with the additional blog.

Q: How are you stepping through your SSIS steps within package execution?

A: Breakpoints! I used breakpoints to pause the package during execution so I could explain exactly what was going on. To enable breakpoints on a task inside SSIS, simply right-click on the task and then select Edit Breakpoints. I showed this at the end of the webinar during Q&A.

Q: What are our best logging options if we do all of our cube processing with script tasks instead of the native components?

A: Your best logging options are going to be setting up a Profiler trace or Extended Events. I briefly explained setting up a trace at the end of my webinar.

Q: Does the C# script task need the connection string information to query SSAS for the partition information?

A: Yes. In my webinar I showed how to dynamically create partitions for your cube inside of SSIS. Before creating them, I used a script task to see if the partition already existed. I have posted a follow-up blog below:

https://mitchellsql.wordpress.com/2014/11/18/does-cube-partition-exists-script-task-in-ssis-explained/

Q: How are the aggregation designs handled on the dynamic partitions?

A: This is handled in the XMLA by specifying the Aggregation ID. I showed this at the end of my webinar during the Q&A section, and I will also include this in a follow-up blog on creating the XMLA for dynamic partitions.

Q: Is this applicable to tabular cube processing?

A: Yes. The Analysis Services Processing Task processes tabular models, cubes, dimensions, and mining models.

Once again thank you for attending our free webinar. Until next time!