SSIS Performance Tuning

This is my much-overdue follow-up post to the presentation I gave to the New York SQL Server User Group. In this post I am going to provide some additional resources to supplement the presentation; check out the following blog posts on different performance tuning techniques that can be used with SSIS.

SSIS Performance Tuning the For Each Loop: Retain Same Connection.

https://mitchellpearson.com/2015/01/12/ssis-performance-tuning-the-for-each-loop-retain-same-connection/

Use File Cache and the Cache Transform to help mitigate network issues.

https://mitchellpearson.com/2014/06/30/use-file-cache-and-the-cache-transform-to-help-mitigate-network-issues/

Create efficient Range Lookups with the Task Factory Advanced Lookup Transform

https://mitchellpearson.com/2014/06/27/create-efficient-range-lookups-with-the-task-factory-advanced-lookup-transform/

Load thousands of files in SSIS lightning fast! MultiFlatFile Connection Manager.

https://mitchellpearson.com/2014/06/27/load-thousands-of-files-in-ssis-lighting-fast-multiflatfile-connection-manager/

Why are Asynchronous components so bad in SSIS?

http://tinyurl.com/ycnvr5ef

Using cache transforms in SSIS to improve Performance (Part 1)

https://mitchellpearson.com/2014/06/20/using-cache-transforms-in-ssis-to-improve-performance-part-1/

Use the aggregate transform in SSIS to outperform T-SQL!

https://mitchellpearson.com/2014/06/22/use-the-aggregate-transform-in-ssis-to-improve-performance-of-t-sql/


Use the aggregate transform in SSIS to outperform T-SQL!

I know I got your attention with that title. Speak to any experienced BI developer and you will be told to perform all of your aggregations in T-SQL at the source where possible, rather than in the SSIS components. On this point I agree, most of the time anyway. There are, however, times when SSIS will outperform T-SQL if all the stars align. Let me explain.

Recently I was using a CTE and the ROW_NUMBER function in T-SQL to bring back only the distinct customers from a list of transactions. This result set was then used to insert inferred members into the customer dimension, provided the customer did not already exist. Once the query returned the list of distinct customers, we performed a lookup against the customer dimension and loaded the non-matching customers into it.
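
For context, here is a minimal sketch of that original approach. The table and column names (FactTransactions, DimCustomer, CustomerAlternateKey, and so on) are hypothetical stand-ins rather than the actual warehouse schema.

    -- Minimal sketch of the original T-SQL approach (hypothetical schema).
    -- The CTE numbers each transaction per customer; rn = 1 keeps one row
    -- per customer, and NOT EXISTS filters out customers that already
    -- exist in the dimension.
    WITH DistinctCustomers AS
    (
        SELECT
            t.CustomerAlternateKey,
            ROW_NUMBER() OVER (PARTITION BY t.CustomerAlternateKey
                               ORDER BY t.TransactionDate DESC) AS rn
        FROM dbo.FactTransactions AS t
    )
    INSERT INTO dbo.DimCustomer (CustomerAlternateKey, IsInferredMember)
    SELECT dc.CustomerAlternateKey, 1
    FROM DistinctCustomers AS dc
    WHERE dc.rn = 1
      AND NOT EXISTS
      (
          SELECT 1
          FROM dbo.DimCustomer AS d
          WHERE d.CustomerAlternateKey = dc.CustomerAlternateKey
      );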

Now let me note that the T-SQL query honestly wasn't that bad; it was only taking around 3 minutes to run. The bigger problem was that the query was using tempdb, and this was causing locking issues with a more advanced design pattern (partition switching) that the SQL Server engine could not resolve.
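
For anyone unfamiliar with the term, partition switching moves a fully loaded staging partition into the target table as a metadata-only operation. A rough sketch follows, again with hypothetical table names:

    -- Hypothetical illustration of the partition switch pattern mentioned
    -- above: move partition 5 of the staging table into the fact table.
    -- Both tables must share the same structure and partition scheme, and
    -- the switch takes a brief schema modification lock on both tables,
    -- which is where blocking from other activity becomes a problem.
    ALTER TABLE dbo.FactTransactions_Staging
    SWITCH PARTITION 5
    TO dbo.FactTransactions PARTITION 5;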

So, we turned to good ole SSIS to help us out.

  • First, bring all records over from the source exactly as they are, all 30 million transactions.
  • Next, perform a lookup against the customer dimension and redirect the non-matching rows to an aggregate transform (a sketch of the lookup's reference query follows this list).
  • Finally, use the aggregate transform to group the new customers and then load the distinct values into the customer dimension.
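
As a rough guide, the lookup's reference query only needs the business key used for matching; anything else just bloats the cache. The query below is a sketch with a hypothetical dimension table and column name, not the exact query from this project.

    -- Hypothetical reference query for the Lookup transform (full cache):
    -- return only the column needed to match incoming transactions to
    -- existing customers, keeping the in-memory cache as small as possible.
    SELECT d.CustomerAlternateKey
    FROM dbo.DimCustomer AS d;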

[Image: data flow for loading inferred members]

This works because we filter out all of the existing customers before doing the aggregation in SSIS, so only a small percentage of the records require aggregation. The data flow task to load inferred members runs in approximately 80 seconds, whereas the original T-SQL query alone was taking over 3 minutes.

Thanks for looking!