Create efficient Range Lookups with the Task Factory Advanced Lookup Transform

There are three generally accepted methods to perform the tricky range lookup in SSIS. A great blog post that outlines all three of these methods and how they perform can be found here. Per the post, the best performing solution, scripting, is unfortunately the one that is arguably the most difficult to set up. Fortunately for us, we can get all the performance of scripting without all the work!

Scenario:

We need to perform a lookup against the segment table based on the weekday and time that the transaction took place. For example, assume a transaction took place on Tuesday at 15:30:00. This transaction would need to be mapped to a SegmentKey of 3. Screenshot of the segment table below:

[Screenshot: segment table]
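Since the screenshot may not come through, here is a hypothetical T-SQL recreation of the segment table. The column names are inferred from the lookup criteria later in this post, and the sample row (including its time range) is an illustrative assumption only:

    -- Hypothetical recreation of the segment table (names and values assumed)
    CREATE TABLE dbo.Segment (
        SegmentKey  INT,
        [DayOfWeek] VARCHAR(10),
        BeginTime   TIME,
        EndTime     TIME
    );

    -- With this sample row, a Tuesday 15:30:00 transaction falls inside
    -- the BeginTime/EndTime range and maps to SegmentKey 3.
    INSERT INTO dbo.Segment (SegmentKey, [DayOfWeek], BeginTime, EndTime)
    VALUES (3, 'Tuesday', '15:00:00', '17:59:59');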

Now that we have identified our business requirement, let’s look at how we can use the Advanced Lookup Transform in Task Factory to solve this tricky problem.

  1. For this example we will need two data flow tasks. The first data flow task will load our segment table into the TF Advanced Lookup Cache Transform.
  2. Inside the first data flow task I bring in an OLE DB Source and select my Segment table.
  3. Next I bring in the TF Advanced Lookup Cache transform and connect the two transforms. (Screenshot 1)

[Screenshot 1]

  4. Open the TF Cache transform and select “Create New Lookup Cache Transform”. (Screenshot 2)

[Screenshot 2]

  5. Click the Input Parameter tab at the top. We are going to add two parameters, one for the day and one for the time of the actual transaction. In the bottom dialog box we define the conditions for when a lookup match should occur. For this example we have three criteria (expressed as T-SQL after the screenshot below):
     1. First, the day of the transaction must equal the day in the Segment table.
     2. Second, the begin time must be less than or equal to the time of the transaction.
     3. Finally, the end time must be greater than or equal to the time of the transaction.

[Screenshot: lookup criteria in the cache connection manager]
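For clarity, the three criteria above correspond to the following T-SQL join predicate (the table and column names are assumed from the scenario, not taken from the product):

    SELECT t.*, s.SegmentKey
    FROM dbo.Transactions t
    JOIN dbo.Segment s
        ON  s.[DayOfWeek] = t.[DayOfWeek]   -- criterion 1: days match
        AND s.BeginTime  <= t.[Time]        -- criterion 2: begin time <= transaction time
        AND s.EndTime    >= t.[Time]        -- criterion 3: end time >= transaction time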

  6. Once the criteria have been entered, click OK twice to close out of the cache connection manager.
  7. Now in our second data flow we will pull in our transactional data and perform our lookup. For this example I have written a simple select statement to simulate a transaction:
     SELECT 'Tuesday' AS [DayOfWeek], '15:30:00' AS [Time]
  8. Now that the source has been set up, we need to bring in the TF Advanced Lookup Transform and open it for editing.
  9. Select our cache connection manager from the drop down and then map our input columns to our input parameters. Screenshot below.

[Screenshot]

  10. Now execute the package!

[Screenshot: package execution results]

Thanks for looking.


Load thousands of files in SSIS lightning fast! MultiFlatFile Connection Manager.

Recently I was working on a project where we needed to process upward of 12,000 files per day. As most would expect, we used the Foreach Loop container in SSIS to loop through all 12,000 files. The result? A whopping 77 minutes to run. WOW! (In case you were wondering, that’s WOW as in that’s AWFUL, not wow that’s great.)

I assume the excessive time here is due to the time it takes to dynamically set the flat file connection manager and close/reopen the connection on each iteration. I don’t know exactly how long this process takes, but hypothetically let’s say the loop gets through 3 files per second.

Now some simple math: 12,000 files ÷ 3 files per second = 4,000 seconds, and 4,000 ÷ 60 ≈ 67 minutes.

So how did we get around this problem? We tried two different design patterns. The first, and the one I will be blogging about here, is the “MultiFlatFile” connection manager. The second is more complicated to set up but extremely efficient; it can be found here: Anthony Martin.

Let’s performance test the MultiFlatFile connection manager!

For this post I created three packages.

  • The first package generates 1,000 flat files for testing.
  • The second package iterates through all 1,000 files with a Foreach Loop. Runtime: 22 minutes.
  • The third package uses the MultiFlatFile connection manager to bring in all the files in a single pass. Runtime: 13 seconds.

The Foreach Loop got utterly destroyed in this contest. The MultiFlatFile connection manager was able to process all 1,000 files in only 13 seconds! Let’s look at how to set this up.

  1. Right click in the connection manager window and select New Connection > MULTIFLATFILE > click Add.
  2. Set up the General screen like a typical Flat File connection; you will need to hardcode an existing file for now.
  3. Click OK to close the connection manager editor, verify the connection manager is selected, and then open the Properties window.
  4. In the Properties window, scroll down to Expressions, click the ellipsis “…”, and select “ConnectionString” from the drop down menu.
  5. Next click the ellipsis to the right to open the Expression editor. This part is critical, because the expression we set here determines which files are extracted from the directory. For this simple example my expression was "C:\\Blogs\\1000_Flat_Files\\Test*" (don’t miss the asterisk here!). The connection manager will now retrieve all files that start with “Test”. Screenshots for the setup below, followed by a more flexible, variable-driven version of the expression.

[Screenshots: MultiFlatFile connection manager setup]
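As a side note, the hardcoded folder in the expression can come from a package variable so the directory is configurable at runtime. A minimal sketch, assuming a hypothetical string variable User::SourceFolder that holds the folder path without a trailing slash:

    @[User::SourceFolder] + "\\Test*"

In the SSIS expression language each backslash in a string literal must be escaped, which is why the separator appears as "\\".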

This is an awesome alternative to the Foreach Loop and it’s much easier to set up; however, it has some obvious drawbacks in flexibility. If you discover this pattern does not perform as well as you hoped, or is not flexible enough, please review the blog post I linked above by Anthony Martin. It takes a little more setup but offers incredible flexibility and is the best performing solution!

As always, thanks for looking!

SSIS Tips and Tricks, the Select * counter punch.

Working with SSIS and the BI stack from Microsoft, I have discovered a few tips and tricks that come in quite handy. To my surprise, a lot of these are not very well known in the BI world. In this series I am going to try to post one or two SSIS tips and tricks on a weekly basis.

As you are well aware, the SELECT * syntax in SQL Server should not be used inside of SSIS. This is because the metadata in SSIS is very rigid; if you add or remove columns in the table behind your select statement, it can break other transforms in the pipeline that were defined with very specific metadata properties.

So if you are like many people who use SSIS, it’s likely you Alt+Tab over to SQL Server Management Studio, view the table, and write your select syntax there with all the applicable columns for your query. Then you copy that syntax and Alt+Tab back over to BIDS/SSDT and paste.

It’s unfortunate that there isn’t some kind of tool inside SSIS that will assist us with our incredibly difficult predicament. Just kidding, of course there is; if there wasn’t, I wouldn’t be blogging about it! Let’s demonstrate.

First – Open up your OLE DB Source component in your data flow and write your SELECT * FROM table statement. For this example I used SELECT * FROM Sales.Customer in the AdventureWorks2012 database.

Next click “Build Query…” found on the right side of the OLE DB Source Editor.

[Screenshots: OLE DB Source Editor and the Query Builder]
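What Build Query does is expand the asterisk into an explicit column list, locking in the metadata. Against Sales.Customer in AdventureWorks2012, the generated statement should look roughly like this (the builder’s exact formatting may differ):

    SELECT CustomerID, PersonID, StoreID, TerritoryID, AccountNumber, rowguid, ModifiedDate
    FROM Sales.Customer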

Click OK and you are done! Thanks for looking, and check back for more time-saving tips and tricks.

Why are Asynchronous components so bad in SSIS?

If you have spent any time with SSIS, you have heard that asynchronous components, also known as “blocking” components, are bad for performance and should be avoided where possible. You have also heard some analogy or another that explained how these components “block” the data in the data flow and slow down performance. But I’m here to tell you that it gets much worse.

The data flow sends data down the pipeline in buffers. Each buffer will first utilize memory if available, and the memory captured for that buffer of records is released once the data flow is done with that buffer. For example, once the records in a buffer have been loaded to the destination, the buffer and its corresponding memory are released. That memory can now be used for the next buffer of records coming down the pipeline in the data flow task. This is where the true evil of asynchronous components comes in.

What happens if a buffer is not released because it got caught up at some blocking transform in the data flow? In a worst-case scenario your machine/server becomes starved for memory, and SSIS is forced to use disk instead of memory. Too many buffers spooled to disk can exponentially increase your package run time. Let me demonstrate.

For this example I have created two packages. Each package uses a Sort transform, but in a different location.

Package One: This package takes an unknown amount of time to run; I cut it off after approximately 68 minutes. The first screenshot shows the package in its current state, and the second and third screenshots give some insight into why it is taking so long to run.

Screenshot 1:

[Screenshot: data flow with the Sort placed before the Lookup]

The following screenshot is from SSMS, taken after running the package from the Integration Services catalog and then running the following query against the SSISDB:

    SELECT * FROM [catalog].[dm_execution_performance_counters] (NULL)
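The counter to watch here is “Buffers spooled”: anything above zero means SSIS has started writing buffers to disk. A narrower version of the same query (same table-valued function, filtered to that one counter):

    SELECT counter_name, counter_value
    FROM [catalog].[dm_execution_performance_counters] (NULL)
    WHERE counter_name = 'Buffers spooled'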

Screenshot 2:

[Screenshot: query results from the SSISDB]

Screenshot 3: (Here we see the execution performance from the SSISDB)

[Screenshot]

Unfortunately, there are times when you just have no choice and must use asynchronous components in SSIS. One option to consider: can the blocking transform be placed in a more strategic location to minimize the negative impact on performance? For example, in package one the Sort transform was placed before the Lookup. The Lookup only forwards non-matching records to the destination, so why do the matching records need to be sorted? If matching records don’t need to be sorted, then we can move the blocking transform to after the Lookup and keep everything in memory.
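Another option worth mentioning, though it is not what these demo packages do: when the rows come from a relational source, the sort can often be pushed into the source query so the blocking Sort transform disappears entirely. A minimal sketch with a hypothetical table and sort key:

    -- Let the database engine sort instead of the SSIS Sort transform
    SELECT CustomerID, FirstName, LastName
    FROM dbo.Customer
    ORDER BY CustomerID;

You then tell SSIS the source output is already ordered (IsSorted = True on the output and SortKeyPosition on the key column in the source’s Advanced Editor), so downstream components that need sorted input can rely on it.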

Package Two: (This package completes in 1 minute and 20 seconds)

Screenshot 1:

[Screenshot: data flow with the Sort placed after the Lookup]

Screenshot 2:

[Screenshot]

Screenshot 3:

[Screenshot]

Thanks for looking!

Use the Aggregate transform in SSIS to outperform T-SQL!

I know I got your attention with that title. If you speak to any experienced BI developer, you will always be informed that you should perform all of your aggregations in T-SQL at the source where possible, not with the SSIS components. To this point I agree, well, most of the time anyway. There are, however, times when using SSIS will outperform T-SQL if all the stars align. Let me explain.

Recently I was using a CTE and the ROW_NUMBER function in T-SQL to bring back only distinct customers from a list of transactions. This result set was then used to insert inferred members into the customer dimension, if the customer did not yet exist of course. Once the result set returned the list of distinct customers, we performed a lookup against the customer dimension and loaded the non-matching customers into it. A sketch of the original query pattern follows.
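This is a minimal sketch of the T-SQL pattern described above; the table and column names are hypothetical stand-ins, not the client’s schema:

    -- Keep one row per customer, then use the result to find new customers
    WITH RankedCustomers AS (
        SELECT CustomerID,
               ROW_NUMBER() OVER (PARTITION BY CustomerID
                                  ORDER BY TransactionDate DESC) AS rn
        FROM dbo.FactTransactions
    )
    SELECT CustomerID
    FROM RankedCustomers
    WHERE rn = 1;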

Now let me note, the T-SQL query honestly wasn’t that bad; it was only taking around 3 minutes to run. The bigger problem was that the query used tempdb, and this was causing locking issues with a more advanced design pattern (partition switching) that the SQL Server engine could not resolve.

So, we turned to good ole SSIS to help us out.

  • First, bring all records in from the source exactly as they are, all 30 million transactions.
  • Next, perform the lookup to the customer dimension and redirect non-matching rows to the Aggregate transform.
  • Finally, use the Aggregate transform to group the new customers and then load the distinct values into the customer dimension.

[Screenshot: data flow for the inferred-member load]

This works because we filter out all the existing customers before doing the aggregation in SSIS, and therefore only a small percentage of the records require aggregation. The data flow task to load inferred members runs in approximately 80 seconds, whereas the original T-SQL query alone was taking over 3 minutes. Logically, the pattern is equivalent to the T-SQL below, but it never touches tempdb.
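A rough T-SQL equivalent of what the data flow accomplishes, again with hypothetical names; the point of the SSIS version is that the anti-join and the de-duplication happen in the pipeline’s memory instead of tempdb:

    INSERT INTO dbo.DimCustomer (CustomerID, IsInferred)
    SELECT DISTINCT t.CustomerID, 1
    FROM dbo.FactTransactions t
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.DimCustomer d
                      WHERE d.CustomerID = t.CustomerID);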

Thanks for looking!

Using cache transforms in SSIS to improve Performance (Part 1)

There are quite a few different scenarios for when cache transforms can be used to improve usability and performance inside SSIS. In this post I am going to quickly outline a scenario where cache connection managers can greatly improve performance when loading a fact table.

During the pre-execution phase of package execution, SSIS “caches” the lookups inside your package into memory. Unfortunately, if you are using OLE DB connections in your Data Flow Task, this caching happens serially (one after the other). Typically this isn’t a big deal, because you may only have one or two large dimension lookups while all the others are very small and very fast. But what if that isn’t the case? What if you have 5, 6, 7, or more dimensions, all with millions of records, that must be cached before the package begins execution? Here is how we can get around this little issue.

[Screenshot: control flow with multiple data flow tasks populating the caches]

  • Inside the data flow tasks we will set each source to point to our Cache Transforms. Each Cache Transform requires a cache connection manager to store the data.

[Screenshot: source connected to a Cache Transform]

  • Below is a screenshot of the connection managers created by the Cache transforms:

[Screenshot]

  • Now that the cache connection managers have been set up, you will need to point each of your lookup transforms to the cache connection manager instead of the default OLE DB connection manager.

[Screenshot]

  • Now to the last step. Click on the Connection tab and point to the cache connection manager that stores the information for your lookup:

[Screenshot]

  • All other configuration steps are the same as normal. Thanks for looking, and I hope this helped.

Dynamically set secure FTP connection string and password

Note: This walkthrough is specific to the Secure FTP component available in Task Factory. Task Factory is a product offered by Pragmatic Works consisting of 40+ custom SSIS components for development.

Recently I worked with a client who had an interesting business problem. This client processed documents daily for hundreds of clients and then uploaded the files to each client’s secure FTP site. They needed to set up the Secure FTP component so that the connection could be changed dynamically based on the file being processed and uploaded. In this walkthrough I will explain how the connection string for your FTP site can be set dynamically.

  • Part 1 – Building the connection string.
  • Part 2 – Setting an expression on the TF Secure FTP Connection Manager

(Note: This tutorial is modifying the connection string of an existing Task Factory sFTP connection manager. If you want to follow along please set up a basic TF Secure FTP task and create a connection manager.)

Part 1 – Building the TF Secure FTP connection string.

There are 18 parts to the connection string for the Secure FTP connection. However, for this tutorial we are only concerned with 5 of the 18 sections; we will leave the others set to their default values. What are the 5 sections we are concerned with? I’m glad you asked!

  1. Connection Type
     • 1 = SFTP – SSH FTP
     • 2 = FTPS – FTP over implicit TLS or SSL
     • 3 = FTPES – FTP over explicit TLS or SSL
     • 4 = FTP
  2. Host
  3. User Name
  4. Password
  5. Port

Ok, let’s get started. For this example I created 5 variables, all with a data type of String. For demonstration purposes I hardcoded the values for each variable. For the case referenced above, the client populated these variables from a SQL table that stored the connection information. Since the screenshot of the variables is not reproduced here, a sketch of them follows.
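The variable names below are the ones used in the connection string expression in Section A; the values are placeholders only, not real connection details:

    strConnectionType = 1                  (1 = SFTP – SSH FTP)
    strHost           = sftp.example.com   (placeholder host)
    strPort           = 22                 (common SSH FTP port)
    strUserName       = ftpuser            (placeholder)
    strPassword       = **********         (placeholder)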

  1. Create the five variables sketched above.
  2. Add these variables to the connection string. (Connection string text below.)
  3. If you used the same variable names as this tutorial, you can simply copy the connection string below. (Section A)
  4. Here is a screenshot of where we changed our connection string.
  5. I replaced each hard-coded value in the connection string with " + @[User::VariableName] + ".

Section A

    "ConnectionType=" + @[User::strConnectionType] + ";Host=" + @[User::strHost] + ";Port=" + @[User::strPort] + ";Username=" + @[User::strUserName] + ";Password=" + @[User::strPassword] + ";Timeout=60;DefaultRemoteDirectory=;UseProxy=False;ReuseConnection=False;ProxyType=0;ProxyHost=;ProxyPort=0;ProxyUsername=;UseBinaryTransfer=True;UsePassiveMode=True;IgnoreServerIP=True;TransferBufferSize=0;SSLVersion=2;"
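With the placeholder variable values sketched earlier, this expression would evaluate to something like the following (shown only to make the structure visible):

    ConnectionType=1;Host=sftp.example.com;Port=22;Username=ftpuser;Password=**********;Timeout=60;DefaultRemoteDirectory=;UseProxy=False;ReuseConnection=False;ProxyType=0;ProxyHost=;ProxyPort=0;ProxyUsername=;UseBinaryTransfer=True;UsePassiveMode=True;IgnoreServerIP=True;TransferBufferSize=0;SSLVersion=2;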

Part 2 – Setting an expression on the TF Secure FTP Connection Manager

  1. Under Connection Managers highlight your FTP Connection and hit F4 to open the properties window.

     

  2. In the properties window highlight “Expressions” and click the ellipsis button on the right.

     

  3. In the Property Expressions Editor choose ConnectionString for the Property.
  4. Next click the Ellipsis on the left to open up the Expression Builder.
  5. Now copy and paste the connection string into the Expression box.

     

  6. Evaluate the expression to verify it works.
  7. Click ok to close the Expression Builder.
  8. Click ok again to close the Property Expressions Editor.
  9. All Done!

Conclusion: Whether you are uploading files to multiple FTP sites or just want to set the password dynamically because it changes periodically, this method will help. Thanks for looking!

Follow me on twitter! @MitchellSQL