For those of you who attended my webinar on Processing SSAS cubes with SSIS, thank you! For those of you who didn’t, the video is still available here: http://pragmaticworks.com/Training/FreeTraining/ViewWebinar/WebinarID/1714.
Below are a few of the questions I received during the webinar. There will be a couple of follow-up blogs for the more detailed questions.
Q: Can you please share the script used to generate the partitions?
A: I will post a follow-up blog in the next couple of days with step-by-step instructions, since sharing my exact script on its own would not be helpful. I will update this post with a link to the additional blog.
Q: How are you stepping through your SSIS tasks during package execution?
A: Breakpoints! I used breakpoints to pause the package mid-execution so I could explain exactly what was going on. To enable a breakpoint on a task in SSIS, right-click the task and select Edit Breakpoints. I showed this at the end of the webinar during Q&A.
Q: What are our best logging options if we do all of our cube processing with script tasks instead of the native components?
A: Your best logging options are going to be a Profiler trace or Extended Events. I briefly walked through setting up a trace at the end of the webinar.
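If you go the Extended Events route, you create the session by sending an XMLA Create command to the SSAS instance. The sketch below follows the documented pattern for an SSAS XEvents trace; the trace ID, the event list, and the output file path are placeholder assumptions you would adjust for your own monitoring needs:

```xml
<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine"
        xmlns:ddl300_300="http://schemas.microsoft.com/analysisservices/2011/engine/300/300">
  <ObjectDefinition>
    <Trace>
      <!-- Placeholder trace name/ID -->
      <ID>XEvent_CubeProcessing</ID>
      <Name>XEvent_CubeProcessing</Name>
      <ddl300_300:XEvent>
        <event_session name="xeas"
                       dispatchLatency="1" maxEventSize="4" maxMemory="4"
                       memoryPartitionMode="none"
                       eventRetentionMode="allowSingleEventLoss"
                       trackCausality="true">
          <!-- Capture processing progress events -->
          <event package="AS" name="ProgressReportBegin" />
          <event package="AS" name="ProgressReportEnd" />
          <!-- Placeholder output path -->
          <target package="Package0" name="event_file">
            <parameter name="filename" value="C:\Temp\CubeProcessing.xel" />
          </target>
        </event_session>
      </ddl300_300:XEvent>
    </Trace>
  </ObjectDefinition>
</Create>
```

A matching Delete command with the same trace ID stops the session when you are done collecting.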
Q: Does the C# script task need the connection string information to query SSAS for the partition information?
A: Yes. In my webinar I showed how to dynamically create partitions for your cube inside SSIS. Before creating a partition, I used a script task to check whether it already existed. I have posted a follow-up blog below.
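The check itself can be done with the AMO library (Microsoft.AnalysisServices.dll) referenced from the script task. The following is a minimal sketch of that idea, not the exact code from the webinar; the method name and all object names are placeholders:

```csharp
// Sketch: check whether a partition already exists in a measure group,
// assuming the AMO assembly (Microsoft.AnalysisServices.dll) is referenced.
using Microsoft.AnalysisServices;

public static bool PartitionExists(string connectionString,
    string databaseName, string cubeName,
    string measureGroupName, string partitionId)
{
    Server server = new Server();
    try
    {
        // Connection string comes from the package, e.g.
        // "Data Source=localhost;Initial Catalog=MyOlapDb" (placeholder).
        server.Connect(connectionString);

        Database db = server.Databases.FindByName(databaseName);
        if (db == null) return false;

        Cube cube = db.Cubes.FindByName(cubeName);
        if (cube == null) return false;

        MeasureGroup mg = cube.MeasureGroups.FindByName(measureGroupName);
        if (mg == null) return false;

        // Contains() matches on the partition's ID.
        return mg.Partitions.Contains(partitionId);
    }
    finally
    {
        server.Disconnect();
    }
}
```

The result can be written to an SSIS variable so a precedence constraint decides whether the package goes on to create the partition.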
Q: How are the aggregation designs handled on the dynamic partitions?
A: This is handled in the XMLA by specifying the AggregationDesignID on the partition. I showed this at the end of the webinar during the Q&A section, and I will also cover it in a follow-up blog on creating the XMLA for dynamic partitions.
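To make that concrete, here is a hedged sketch of an XMLA Create command for a new partition that assigns an existing aggregation design. Every ID, the query, and the date range are placeholder assumptions, not values from the webinar:

```xml
<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ParentObject>
    <!-- Placeholder database/cube/measure group IDs -->
    <DatabaseID>MyOlapDb</DatabaseID>
    <CubeID>MyCube</CubeID>
    <MeasureGroupID>Sales</MeasureGroupID>
  </ParentObject>
  <ObjectDefinition>
    <Partition xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <ID>Sales_201501</ID>
      <Name>Sales_201501</Name>
      <Source xsi:type="QueryBinding">
        <DataSourceID>MyDataSource</DataSourceID>
        <QueryDefinition>
          SELECT * FROM dbo.FactSales
          WHERE DateKey BETWEEN 20150101 AND 20150131
        </QueryDefinition>
      </Source>
      <StorageMode>Molap</StorageMode>
      <!-- Reuse the measure group's existing aggregation design -->
      <AggregationDesignID>AggregationDesign</AggregationDesignID>
    </Partition>
  </ObjectDefinition>
</Create>
```

Because the new partition points at the existing AggregationDesignID, its aggregations are built during processing just like the partitions designed in SSDT.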
Q: Is this applicable to tabular cube processing?
A: Yes. The Analysis Services Processing Task can process tabular models, cubes, dimensions, and mining models.
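Under the covers, the task sends a process command to the server. A minimal XMLA version looks like the sketch below (the database ID is a placeholder); note that newer tabular compatibility levels use a JSON-based TMSL refresh command instead of XMLA:

```xml
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <!-- Placeholder: the tabular database (model) to process -->
      <DatabaseID>MyTabularModel</DatabaseID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
</Batch>
```

You can scope the Object element down to a single table or partition the same way, swapping ProcessFull for a lighter option such as ProcessData where appropriate.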
Once again thank you for attending our free webinar. Until next time!