Azure Data Factory – Copy Data Activity

In this blog, we are going to review the Copy Data activity. The Copy Data activity can be used to copy data among data stores located on-premises and in the cloud.

In part four of my Azure Data Factory series, I showed you how to use the If Condition activity to compare the output parameters from two separate activities. We determined that if the last modified date of the file (located in blob storage) was greater than the last load time, then the file should be processed. Now we are going to proceed with that processing using the Copy Data activity. If you are new to this series, you can view the previous posts here.

If Condition Activity – If True activities

As mentioned in the intro, we are going to pick up where we left off in the last blog post. Remember that the If Condition activity evaluates an expression and returns either true or false. If the result is true, one set of activities is executed; if the result is false, another set of activities is executed. In this blog post we are going to focus on the If True activities.
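
Behind the designer, Azure Data Factory stores this logic as JSON. Here is a rough sketch of what the If Condition activity looks like in JSON; the activity names and the expression are illustrative placeholders based on the scenario from part four, not the exact values from that post:

{
    "name": "If Condition1",
    "type": "IfCondition",
    "typeProperties": {
        "expression": {
            "value": "@greater(activity('Get Metadata1').output.lastModified, activity('Lookup1').output.firstRow.LastLoadDate)",
            "type": "Expression"
        },
        "ifTrueActivities": [],
        "ifFalseActivities": []
    }
}

The Copy Data activity we build below will end up inside the ifTrueActivities array.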

  • First, select the If Condition activity and then click on Activities in the properties pane.
  • Next, click on the box “Add If True Activity”.

This will launch a new designer window, and this is where we will be configuring our Copy Data activity.

Copy Data Activity – setup and configuration

Fortunately, the Copy Data activity is rather simple to set up and configure. This is a good thing, since it is likely you will be using this activity quite often. The first thing we need to do is add the activity to our pipeline: select the Copy Data activity from the Data Transformation category and add it to the pipeline.

Now we need to set up the source and the sink datasets, and then map those datasets. You can think of the sink dataset as the destination for all intents and purposes.

Select the Copy Data activity and then click on the Source tab found in the properties window. For the dataset, I am simply going to choose the same dataset we used in the first blog in this series.

Creating the Sink dataset in Azure Data Factory

Now it’s time to set up the Sink (destination) dataset. My destination dataset in this scenario is going to be an Azure SQL Database. I am going to click on Sink and then click on + New.

When you click on + New, this will launch a new window with all your data store options. I am going to choose Azure SQL Database.

This will launch a new tab in Azure Data Factory where you can configure your dataset. In the properties window, select Connection. Here you will select your linked service, which is your connection to your database. If you haven’t already created a linked service to your database, click on + New. Clicking + New will open a new window where you can input the properties for your linked service.
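
For reference, the JSON behind an Azure SQL Database linked service looks roughly like the sketch below; the server, database, and credential values are placeholders I made up for illustration:

{
    "name": "AzureSqlLinkedService",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": "Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=myuser;Password=<password>;"
        }
    }
}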

Once the linked service has been created, I will select the table I want to load. All available tables will show up in the drop-down list. I will select my dbo.emp table from the drop-down.
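
Viewed as JSON, the resulting sink dataset would look something like this; the dataset and linked service names are my placeholders, while dbo.emp is the table from this example:

{
    "name": "AzureSqlTable1",
    "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": {
            "referenceName": "AzureSqlLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "tableName": "dbo.emp"
        }
    }
}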

The final step for the dataset is to import the schema; this will be used when we perform the mapping between the source and sink datasets. Next, I am going to click on the Schema tab and select Import schema. The import actually returns a column that I don’t need, so I will remove that column from the schema as well.
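
After the import (and after removing the unneeded column), the schema shows up in the dataset JSON as a structure array along these lines; the column names here are my guesses for illustration, since my table simply holds first and last names:

"structure": [
    { "name": "FirstName", "type": "String" },
    { "name": "LastName", "type": "String" }
]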

Copy Data activity – Mapping

Now that the source and sink datasets have been configured, it’s time to finish configuring the Copy Data activity by mapping the two datasets. I will navigate back to the pipeline and click on the Copy Data activity. From the properties window I will select the Mapping tab and then click Import schemas, which pulls in the schemas from both the source and the destination. This step also attempts to map the source columns to the destination columns automatically, so it is important to verify that the mapping is correct. My source dataset was a file without column headers, so Azure Data Factory automatically generated the column names Prop_0 and Prop_1 for my first name and last name columns.
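
Putting it all together, the finished Copy Data activity looks roughly like the following JSON; the dataset reference names, the sink column names FirstName and LastName, and the source/sink types are my assumptions for illustration:

{
    "name": "Copy Data1",
    "type": "Copy",
    "inputs": [ { "referenceName": "SourceBlobDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "AzureSqlTable1", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "BlobSource" },
        "sink": { "type": "SqlSink" },
        "translator": {
            "type": "TabularTranslator",
            "columnMappings": "Prop_0: FirstName, Prop_1: LastName"
        }
    }
}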

This was a simple application of the Copy Data activity. In a future blog post I will show you how to parameterize the datasets to make this process dynamic. Thanks for reading my blog!

Quick Tips – Updating Parameters from the PBI Service

In this quick tip I want to share how you can update Parameters in Power BI from the service. Previously, this was not an option and Parameters could only be updated from Power BI Desktop.

Updating Parameters in Power BI

To update parameters, navigate to the Datasets section in the Power BI Service. Next, click on the schedule refresh icon. Updating parameters is done from the schedule refresh window.

Click on Parameters in the schedule refresh window. Once expanded, you will see the available parameters; simply update them and you’re done!

Thanks for reading!

Quick Tips – Export data from Power BI using R

One of my favorite tricks in Power BI is to use the R integration to quickly export data out of Power BI. This is also a very popular question among Power BI enthusiasts: once users realize the true capabilities and ease of the Power Query Editor for transforming and cleaning data, they want to clean up their files using Power BI, but the challenge then becomes how to get the data out.

Prerequisite – R

If you want to test this method out, you will need to install R on your machine. For your convenience, here is the link:

https://mran.microsoft.com/open/

Export data from Power BI using R

Like all of my quick tip blogs, this will be quick.

You will export data from the Power Query Editor. To launch the Power Query Editor, click Edit Queries on the Home ribbon in Power BI.

Inside the Power Query Editor, click on the Transform ribbon and then click on R.

Once you click on the large letter R, a new window will open. Now you just have to type in some basic code. The pattern is write.csv(dataset, <destination>); here it is with a concrete (made-up) destination path filled in, where dataset is the name Power BI automatically gives the incoming data:

write.csv(dataset, "C:\\Temp\\output.csv", row.names = FALSE)  # example path; replace with your own folder

That’s it! Click OK and check your folder for the file. Do notice that I used two backslash characters in the path; this is required because the backslash is an escape character in R strings. The other question I get is whether I can write the results to an Excel file or to a SQL Server table, and the answer is… yes! R has packages that make particular tasks easier, and if you want to write to an Excel file or a SQL Server table you will need to install those packages. My hope is to write some follow-up posts on this topic.
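
As a quick illustration of what that could look like, here is a minimal sketch; the writexl, DBI, and odbc packages, along with all connection details and table names, are my own example choices and not from the original post:

# install.packages(c("writexl", "DBI", "odbc"))  # one-time setup

# Write the same data to an Excel file (note the doubled backslashes again)
writexl::write_xlsx(dataset, "C:\\Temp\\output.xlsx")

# Write the data to a SQL Server table (server, database, and table are placeholders)
con <- DBI::dbConnect(odbc::odbc(),
                      Driver   = "SQL Server",
                      Server   = "localhost",
                      Database = "StagingDB",
                      Trusted_Connection = "Yes")
DBI::dbWriteTable(con, "PowerBIExport", dataset, overwrite = TRUE)
DBI::dbDisconnect(con)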

Thanks for checking out my blog!