How to bring on-premise SQL Server data into Azure Fabric Lakehouse

The on-premises data gateway is one of the key services the Fabric platform uses to bring your on-premises data into the Azure cloud. The data movement is secure and encrypted.

You can review my earlier post https://splaha.blogspot.com/2025/03/use-on-premise-gateway-in-fabric-and.html where I explain how to configure the on-premises data gateway (Step 1 and Step 2).

In this example, I'm going to use the same data connection to pull my on-premises SQL tables into the Lakehouse using a custom join SQL.

The entire post is self-explanatory, with step-by-step screenshots. I hope this will help you understand each individual step and execute the same at your end.


Let's start with my on-premises SQL Server details. As you can see in the picture below, the database Employee has two tables, EMP and DEPT. I'm going to use the same SQL in Fabric to pull the data.
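For reference, the custom SQL I'll paste into Fabric later is a simple join between the two tables. A sketch is below — the column names assume the classic EMP/DEPT sample schema (joined on DEPTNO), so adjust them to match your own tables:

```sql
-- Hypothetical join between EMP and DEPT; column names assume
-- the classic EMP/DEPT sample schema and may differ in your database.
SELECT e.EMPNO,
       e.ENAME,
       e.JOB,
       d.DNAME,
       d.LOC
FROM   dbo.EMP  AS e
JOIN   dbo.DEPT AS d
       ON e.DEPTNO = d.DEPTNO;
```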


I assume you have already created the on-premises data gateway using my earlier post. If not, open Fabric >> Settings >> Manage connections and gateways. If you already have the on-premises data gateway, then go to the next step - Creation of Dataflow.



Creation of Dataflow: Go to your workspace >> click + New item >> select Dataflow Gen2



Select More >> SQL Server





Select the Data gateway


Specify your server details and database name; I place my custom SQL in the SQL statement box.



Make sure you enter the correct credentials, as in the picture below.
Click Next to let Fabric connect to your local on-premises data and bring it into Fabric.

You can see how Fabric uses functions to pull your data. You can apply filters and sorting against the pulled data.
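Besides filtering and sorting in the dataflow editor, you can also push those operations down into the custom SQL statement itself, so SQL Server does the work before the data leaves your network. A hedged sketch, again assuming the EMP/DEPT sample column names:

```sql
-- Example of filtering and sorting pushed into the custom SQL
-- (hypothetical column names from the EMP/DEPT sample schema).
SELECT e.ENAME, e.JOB, d.DNAME
FROM   dbo.EMP  AS e
JOIN   dbo.DEPT AS d ON e.DEPTNO = d.DEPTNO
WHERE  e.JOB = 'CLERK'          -- filter rows at the source
ORDER  BY d.DNAME, e.ENAME;     -- sort at the source
```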


Click on the Data destination + icon to select your destination (where this data is going to be saved).

In the following example, I choose Lakehouse, as I want to store the table there.



Click Next. If you want to change the connection credentials, you can click on Edit connection and update your connection there.


Inside the Lakehouse, select the database and specify your table name. You have the option to append the data to an existing table or create a new table. For the first time, we will definitely go ahead and create a new table, as below.



This is a standard ETL mapping UI. Map the source columns and data types to the destination column names and data types. Once done, click Save settings.


You did it :)



Now, if you go to your Lakehouse and click on the newly created table, you can see the data pulled from your on-premises store.
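You can also run a quick sanity check against the Lakehouse SQL analytics endpoint. The table name below is just a placeholder — use whatever destination table name you specified in the mapping step:

```sql
-- Quick sanity check from the Lakehouse SQL analytics endpoint;
-- EMP_DEPT is a hypothetical name - replace it with your table.
SELECT TOP (10) *
FROM   dbo.EMP_DEPT;
```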



I hope you like this post. Comments, requests, and advice are always welcome :)
