Tip/Trick

Transferring Data from a Text File Hosted on an Azure Blob Storage onto an Azure SQL Managed Instance using Azure Data Factory

10 Jun 2021 · CPOL · 3 min read
How to transfer data using Azure Data Factory

Problem Statement

Sravan works as an Azure SQL DBA with the UniversalCollege Group. Developers at UniversalCollege want Sravan to load the student data held in a text file on Azure Blob Storage into an Azure SQL Managed Instance named universalcollege.database.windows.net.

Solution

Sravan decides to perform this task using Azure Data Factory, a cloud-based ETL service for building data-movement and data-transformation pipelines. Over the next couple of articles, we are going to explore Azure Data Factory in depth.
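Although this walkthrough uses the graphical Copy Data tool, Data Factory persists every pipeline it builds as a JSON definition. Below is a minimal sketch of the kind of Copy activity this task produces; the pipeline and dataset names (CopyStudentData, StudentTextFile, StudentTable) are illustrative assumptions, not names the wizard generates automatically:

```json
{
  "name": "CopyStudentData",
  "properties": {
    "activities": [
      {
        "name": "CopyFromBlobToSql",
        "type": "Copy",
        "inputs": [ { "referenceName": "StudentTextFile", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "StudentTable", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```

Everything the wizard asks for in the screens that follow (source connection, file, destination connection, table) ends up in a definition of roughly this shape.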

At present, the text file sits inside a container in an Azure Blob Storage account, as shown below:

Image 1

It contains the following data, where the first line, sid,sname,gender, is the header row:

1,Sravan K,Male
2,Rahul,Male
3,Niharika,Female
4,Amandeep,Female
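The header line and four data rows above are exactly what the Copy Data tool will read. As a quick local sanity check, the same delimited content can be parsed with Python's standard csv module (the in-memory string below simply mirrors the file's contents):

```python
import csv
import io

# Contents of the text file hosted in the blob container
# (the first line is the header row: sid,sname,gender).
raw = """sid,sname,gender
1,Sravan K,Male
2,Rahul,Male
3,Niharika,Female
4,Amandeep,Female
"""

# DictReader uses the header line as the keys for each row.
students = list(csv.DictReader(io.StringIO(raw)))

print(len(students))         # 4
print(students[0]["sname"])  # Sravan K
```

Each row maps cleanly onto the three columns of the destination Student table, which is why the wizard's default column mapping works later on.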

This data needs to be loaded into the Student table in the college database, which in turn is hosted on the PaaS instance named universalcollege.database.windows.net.

Now let’s log on to the Azure portal (portal.azure.com) and create a Data Factory: go to All services -> Integration -> Data Factories, as shown in the screen capture below:

Image 2

Click on + Create as shown in the screen capture below:

Image 3

Enter the required information as shown in the screen capture below, and then click the Review + create button.

Image 4

Under Git configuration, make sure you select the checkbox named Configure Git later, as shown below:

Image 5

Once all the validations pass, click the Create button as shown in the screen capture below:

Image 6

Click the UniversalCollege data factory as shown below:

Image 7

Click on Author & Monitor as shown in the screen capture below:

Image 8

This opens the Azure Data Factory home page as shown below:

Image 9

Click on Copy data as shown below:

Image 10

Give the task an appropriate name and then click the Next> button as shown in the screen capture below:

Image 11

In the source data store step, select Azure Blob Storage from the drop-down and then click + Create new connection as shown in the screen capture below:

Image 12

In the New linked service pane, select Azure Blob Storage as shown in the screen capture below:

Image 13

Enter the required information and then click on Test Connection as shown in the screen capture below:

Image 14

If the test connection succeeds, the source is configured correctly; click the Create button.

Image 15
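For reference, the linked service created in the steps above is persisted by Data Factory as a small JSON document, roughly like the sketch below. The linked-service name and the placeholder account/key values are assumptions; never put real storage keys in source control:

```json
{
  "name": "AzureBlobStorageLinkedService",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
    }
  }
}
```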

Click on the Next> button as shown below:

Image 16

Choose the required file and then click on the Next> button as shown in the screen capture below:

Image 17

Click on the Next> button as shown in the screen capture below:

Image 18

In the destination data store step, select Azure SQL Database from the drop-down and then click + Create new connection as shown in the screen capture below:

Image 19

In the New linked service pane, select Azure SQL Database and then click the Continue button as shown in the screen capture below:

Image 20

Enter the required information and then click on Test Connection as shown in the screen capture below:

Image 21

If the test connection succeeds, we are good to proceed.

Image 22

Click on the Create button as shown below:

Image 23
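As with the source, the destination linked service is stored as a JSON definition. A hedged sketch, assuming SQL authentication against the college database (the placeholders stand in for real credentials):

```json
{
  "name": "AzureSqlDatabaseLinkedService",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "Server=tcp:universalcollege.database.windows.net,1433;Database=college;User ID=<user>;Password=<password>;Encrypt=True;Connection Timeout=30"
    }
  }
}
```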

Click on the Next> button as shown below:

Image 24

Click on the Next> button as shown below:

Image 25

Click on the Next> button as shown below:

Image 26

Click on the Next> button as shown below:

Image 27

Click on the Next> button as shown below:

Image 28

Click on the Finish button as shown below:

Image 29

The screen capture below confirms that the data was inserted successfully into the Student table on the destination PaaS database.

Image 30
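To double-check from the database side, a simple query against the destination table should return the same four rows that were in the text file (the dbo schema is an assumption):

```sql
-- Run against the college database on universalcollege.database.windows.net
SELECT sid, sname, gender
FROM dbo.Student;
```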

I hope you found something interesting to learn.

History

  • 10th June, 2021: Initial version

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


Written By
Architect, Capgemini India Private Limited
India
Satnam Singh is a DBA Manager with Capgemini in India, with around 14 years of experience in Microsoft SQL Server technology. His main areas of expertise are T-SQL, high-availability solutions, migrations, upgrades, and performance tuning. During his free time, when he is not talking about SQL, Satnam loves spending time with his family. He lives with his parents in Mumbai, India.
