Microsoft Fabric - Use Lakehouse to upload source data

Microsoft Fabric comes with multiple capabilities (workloads), and one of them is Data Engineering, where you bring your data to next-generation AI. In the Data Engineering experience, you load your data, perform operations on it to process it, and finally present your refined data in a nice way using Power BI capabilities.

Tables and Files hold your data in the Data Engineering landscape. A Table holds data in a structured table format, while under Files you can upload your raw data file(s) in CSV/JSON/Parquet format. The Upload option is there to bring your file(s) into Microsoft Fabric.

Uploading files in the Lakehouse




Files uploaded in the Lakehouse



A simple way to display the records is to create a Notebook and drag the file into it; Fabric will create the code for you :)
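The generated cell usually looks roughly like the sketch below (the file name is only a placeholder, swap in the file you uploaded):

# Rough sketch of the kind of code Fabric generates for a dragged-in CSV file
df = spark.read.format("csv").option("header", "true").load("Files/bing_covid_19_data.csv")

display(df)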



Click on the table data, and options are there to display the data in different formats. For example, you can view your data as a chart (like a bar or pie chart).
 
Bar Format


Pie Format



In a Notebook you can also use SQL tricks to work on the data. One example is the GROUP BY clause; for instance, grouping the patient data by country.

df = spark.sql("SELECT country_region, COUNT(*) AS record_count FROM LahaLakeHouse.bing_covid_19_data GROUP BY country_region")

display(df)



Here is an example below of how to read the data from Files and display it in the Notebook using Pandas.
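A minimal sketch of such a Pandas cell, assuming the CSV sits under Files and the Lakehouse is attached as the default (the path is illustrative):

# Read the uploaded CSV from the Lakehouse Files section with Pandas
import pandas as pd

df = pd.read_csv("/lakehouse/default/Files/bing_covid_19_data.csv")

display(df.head())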


Another example below shows how to read the data from Files and display it in the Notebook using Scala.
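A minimal sketch of such a Scala cell (for example in a %%spark cell), again with an illustrative file path:

// Read the uploaded CSV from Files using Spark's Scala API
val scalaDf = spark.read.option("header", "true").csv("Files/bing_covid_19_data.csv")

display(scalaDf)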







Fabric is Fun!
