Use the On-Premises Data Gateway in Fabric to copy data from on-prem SQL Server to Microsoft Fabric

The on-premises data gateway is an application installed on your laptop/computer/server that connects to the cloud and relays data back and forth between on-premises sources and cloud services. The connection is secure, and data encryption is in place.

Here is more about the on-premises data gateway - https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-onprem?toc=%2Ffabric%2Fdata-factory%2Ftoc.json

Its architecture - https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-onprem-indepth?toc=%2Ffabric%2Fdata-factory%2Ftoc.json


Step 1

First things first - you need to install the on-premises data gateway on your local machine.

Second, open the on-premises data gateway and make sure the credential you use to configure it is also a Windows credential on your local machine - in other words, if you sign in with a Microsoft Entra ID or Azure account, that account should also be a local Windows account (in most cases it is).

The initial setup is done. Once installed, search for "on-premises data gateway" in your local Windows search and make sure the app opens as shown below.



Step 2

Now, the second step: configure a data connection in Fabric using your on-premises data gateway.

Go to Fabric >> click Settings >> Manage connections and gateways >> New

Here, you'll find the on-premises data gateway you just created in Step 1 with your Azure Entra ID/cloud ID. Select that gateway and provide the other details, such as the SQL Server name and user ID (most likely your Azure Entra ID/Windows ID, though you can use a service principal as well).

Click on create to create this connection in Fabric.


Fabric will validate the provided information and then establish the connection from the Azure cloud to your local SQL Server. If everything goes well, it will display a green success bar; otherwise, the error information will be displayed there.


Below is my local SQL Server name, and I picked the EasyDate database to connect to. Once the connection is created, the last step is to use it to get the data from the local server.
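Before you fill in the connection dialog, it can help to double-check the values you are about to type. Here is a minimal sketch (plain Python; the server name `localhost\SQLEXPRESS` and database `EasyDate` are examples from this walkthrough, and the helper function is hypothetical, not a Fabric API) that just assembles the fields the "New connection" dialog asks for:

```python
# Hypothetical helper: collect the values you will enter in the Fabric
# "New connection" dialog. Server/database names below are examples only.

def build_connection_settings(server: str, database: str, use_windows_auth: bool = True) -> dict:
    """Return the fields the Fabric connection dialog asks for."""
    return {
        "Connection type": "SQL Server",
        "Server": server,                # e.g. localhost\SQLEXPRESS
        "Database": database,            # e.g. EasyDate
        "Authentication": "Windows" if use_windows_auth else "Service principal",
    }

settings = build_connection_settings(r"localhost\SQLEXPRESS", "EasyDate")
for key, value in settings.items():
    print(f"{key}: {value}")
```

Writing the values down like this first makes it easy to spot a typo in the server or database name before Fabric's validation step fails.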



Step 3

Now, as the last step, you are going to use the on-premises data gateway connection in Fabric.

Go to your Workspace in Fabric and click on New and then select Copy Job. 


These steps are pretty self-explanatory: select the connection first, make sure you provide the right database name, and go to the next step.


For example, in the image below, the left-hand side shows the Fabric UI with the database and table information it was able to retrieve, and on the right is my local SQL Server Management Studio, which confirms that Fabric is pulling the right information from my local SQL Server.


I kept the image below because I faced a huge data delay while moving on to the data destination tab. In fact, I had to cancel the operation and start again from the beginning of Step 3. So, just keep in mind that Fabric is new and you just created a new gateway connection :)

As the destination, I chose the lakehouse and gave it a name. Click Create and connect.



This is pretty standard ETL stuff - remember the SSIS/SSRS days and the data-mapping UI. First it will ask for your destination table name, then it shows your source data (column names and types) and lets you choose the destination column names and data types.
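To make the mapping step concrete, here is an illustrative sketch of the kind of source-to-destination type mapping involved. The table of type equivalents below is an assumption based on common SQL Server to lakehouse (Delta) conventions, not an official Fabric reference - always verify against what the Copy job mapping screen actually shows:

```python
# Illustrative sketch: a plausible mapping from SQL Server column types to
# lakehouse (Delta) types. The exact mapping Fabric applies may differ;
# verify it in the Copy job mapping screen.

SQL_TO_LAKEHOUSE = {
    "int": "int",
    "bigint": "bigint",
    "varchar": "string",
    "nvarchar": "string",
    "datetime": "timestamp",
    "date": "date",
    "bit": "boolean",
    "decimal": "decimal",
}

def map_columns(source_columns):
    """Map (name, sql_type) pairs to (name, lakehouse_type) pairs.

    Unknown types fall back to "string" in this sketch.
    """
    return [(name, SQL_TO_LAKEHOUSE.get(sql_type, "string"))
            for name, sql_type in source_columns]

print(map_columns([("CustomerId", "int"), ("JoinedOn", "datetime")]))
```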


Once done, click Next.



Now, choose what kind of copy operation it should be - full, incremental, or stream copy. Choose the one you want and click Next.
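To make the full-versus-incremental choice concrete, here is a minimal sketch in plain Python with in-memory rows. The watermark-column approach shown is a common way incremental copies are implemented in general, and is an assumption here rather than a description of Fabric's internal change tracking:

```python
# Sketch of the difference between a full copy and an incremental copy
# using a "watermark" (the highest key already copied). Pure in-memory
# illustration; Fabric's actual incremental logic may differ.

def full_copy(source_rows):
    """Full copy: take everything, every run."""
    return list(source_rows)

def incremental_copy(source_rows, last_watermark, key=lambda r: r["id"]):
    """Incremental copy: only rows newer than the last watermark."""
    new_rows = [r for r in source_rows if key(r) > last_watermark]
    new_watermark = max([key(r) for r in new_rows], default=last_watermark)
    return new_rows, new_watermark

rows = [{"id": 1}, {"id": 2}, {"id": 3}]
copied, watermark = incremental_copy(rows, last_watermark=1)
print(copied, watermark)  # only ids 2 and 3 are copied; watermark advances to 3
```

A full copy is simpler but re-moves everything on every run; the incremental variant only transfers rows it has not seen, which matters once the source table grows.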



Job creation is complete now. Click Save and Run.



Once you run it, you will be able to see the job running in the Fabric UI.



Once job finishes, it will display the completion message.



Step 4

Validation steps: go to your workspace in Fabric and review the newly created job and the table in the lakehouse. Click on the table.



You can simply create a new query in Fabric to pull the data from the lakehouse. Pretty simple :)
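One simple validation idea is to compare row counts on both sides - run `SELECT COUNT(*)` against the source table in SQL Server Management Studio and against the lakehouse table in a Fabric query, then compare. The sketch below uses in-memory stand-in counts (the helper is hypothetical, purely for illustration):

```python
# Sketch: validate the copy by comparing source and destination row counts.
# In practice both numbers would come from SELECT COUNT(*) on each side.

def validate_copy(source_count: int, destination_count: int) -> str:
    if source_count == destination_count:
        return f"OK: {destination_count} rows copied"
    return f"Mismatch: source={source_count}, destination={destination_count}"

print(validate_copy(1000, 1000))  # prints "OK: 1000 rows copied"
```

Matching counts do not prove the data is identical, but a mismatch is a quick, cheap signal that the copy job needs a closer look.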



Enjoy Fabric :)
