
About Microsoft Fabric with a demo

Microsoft Fabric

Microsoft has introduced a new offering called Microsoft Fabric. This feature is currently in public preview. Microsoft Fabric is mainly used for data analytics.

It is an all-in-one platform for gathering data from multiple sources, such as SQL, Data Lake, Dataverse, Excel, and CSV. After the data is fetched from the corresponding sources, it all lands in a single lakehouse in table format.

It offers plenty of flexibility as a data analytics platform. We can implement a lakehouse, a data warehouse, or combine the two to transform and simplify the data for presentation in Power BI.

1. Data Engineering
  • Data Engineering enables users to design, build, and maintain the infrastructure and systems that allow their organizations to collect, store, process, and analyze large volumes of data.
2. Data Science
  • The Data Science experience empowers users to complete end-to-end data science workflows for data enrichment and business insights.
  • At a high level, the process involves these steps:
    • Problem formulation and ideation
    • Data discovery and pre-processing
    • Experimentation and modeling
    • Enrichment and operationalization
    • Gaining insights
3. Data Warehouse
  • A unified product that addresses every aspect of the data estate by offering a complete, SaaS-ified data, analytics, and AI platform.
4. Real-Time Analytics
  • Real-Time Analytics is a fully managed big-data analytics platform optimized for streaming, time-series data. It includes a dedicated query language (KQL) and engine with exceptional performance for searching structured, semi-structured, and unstructured data.
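Real-Time Analytics queries are written in KQL, but the kind of time-series work it does can be roughly illustrated in plain Python. This sketch filters hypothetical device events to a one-hour window and counts events per device; the data and window are made up for illustration only.

```python
# Hypothetical streaming events; in Fabric these would live in a KQL database.
from datetime import datetime, timedelta

events = [
    {"device": "A", "ts": datetime(2023, 6, 1, 12, 0)},
    {"device": "B", "ts": datetime(2023, 6, 1, 12, 5)},
    {"device": "A", "ts": datetime(2023, 6, 1, 13, 30)},  # outside the window
]

# Keep only events in a one-hour window, like a KQL 'where ts between (...)'.
start = datetime(2023, 6, 1, 12, 0)
window = [e for e in events if start <= e["ts"] < start + timedelta(hours=1)]

# Count events per device, like a KQL 'summarize count() by device'.
counts = {}
for e in window:
    counts[e["device"]] = counts.get(e["device"], 0) + 1

print(counts)  # {'A': 1, 'B': 1}
```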


Steps to get started with Microsoft Fabric
1. Sign in to your Power BI account, or sign up for a free trial if you don't have one yet.
2. Create a Fabric workspace in which to build and implement an end-to-end lakehouse for your organization.
3. Create a lakehouse. This includes an optional section implementing the medallion architecture, i.e. the bronze, silver, and gold layers.
4. Ingest, transform, and load data into the lakehouse. Load data from the bronze, silver, and gold zones as Delta Lake tables. You can also explore OneLake, the OneCopy of your data across lake mode and warehouse mode.
5. Connect to your lakehouse using the TDS/SQL endpoint and create a Power BI report using Direct Lake to analyze sales data across different dimensions.
6. Optionally, orchestrate and schedule the data ingestion and transformation flow with a pipeline.
7. Clean up resources by deleting the workspace and other items.
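The medallion architecture mentioned in step 3 can be sketched in plain Python, with hypothetical rows standing in for the Fabric delta tables: bronze holds raw ingested data, silver holds cleaned and typed data, and gold holds a business-level aggregate.

```python
# Bronze: raw ingested rows, kept as-is (including a malformed record).
bronze = [
    {"order_id": "1001", "region": "West", "amount": "250.0"},
    {"order_id": "1002", "region": "East", "amount": "120.5"},
    {"order_id": "1003", "region": "West", "amount": ""},  # bad row
]

# Silver: cleaned and typed — drop rows with missing amounts, cast to float.
silver = [
    {**row, "amount": float(row["amount"])}
    for row in bronze
    if row["amount"]
]

# Gold: business-level aggregate, e.g. total sales per region.
gold = {}
for row in silver:
    gold[row["region"]] = gold.get(row["region"], 0.0) + row["amount"]

print(gold)  # {'West': 250.0, 'East': 120.5}
```

In Fabric itself each layer would typically be a Delta Lake table populated by a notebook or dataflow rather than in-memory lists.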

1. Select the Data Engineering option from the Fabric portal: Power BI (microsoft.com).

2. Create a new workspace.

3. Name the new workspace 'Demo'.

4. Create a new lakehouse.

    5. 

6. Select New Dataflow Gen2. We can also use the other three techniques to ingest data from the source.

7. Fabric supports multiple data sources; here we can select Import from Excel.

8. Upload the Excel file from the local machine or from any cloud storage.

9. Select the appropriate table and click the 'Create' button.


10. Now we can see the Dataflow Gen2. We can add more flows to it and also perform data transformations.
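Dataflow Gen2 transformations are authored in Power Query (M). As a rough stand-in, this plain-Python sketch mirrors two common steps on hypothetical rows from the Excel source: renaming a column and adding a computed column.

```python
# Hypothetical rows as they might arrive from the imported Excel table.
rows = [
    {"Name": " Contoso ", "Qty": 3, "UnitPrice": 10.0},
    {"Name": "Fabrikam", "Qty": 2, "UnitPrice": 4.5},
]

transformed = [
    {
        "Customer": r["Name"].strip(),       # rename Name -> Customer, trim text
        "Total": r["Qty"] * r["UnitPrice"],  # add a computed column
    }
    for r in rows
]

print(transformed)
```

In the dataflow editor these would be point-and-click steps, with the equivalent M code generated for you.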

11. Choose the destination where we want to push the data. Here I have selected the lakehouse.


12. Select the lakehouse inside the Demo workspace and click Next.

13. Save the settings.


14. Click 'Publish'.

15. Once we publish the flow, the components below are generated automatically inside the new lakehouse we created.


16. Open the properties of Dataflow1 to rename the flow.

17. Click the lakehouse to open the tables.

18. We can see the imported table.

19. In the same way, we can import data from SQL, Dataverse, Data Lake, OData, etc. into this lakehouse.

20. We can use the SQL endpoint to create stored procedures, functions, and views that transform the data for the visualized report in Power BI.
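Fabric's SQL endpoint speaks T-SQL; as a local stand-in, this sqlite3 sketch shows the idea from step 20 of wrapping a transformation in a view that a Power BI report could then query. The table name and data are hypothetical.

```python
import sqlite3

# In-memory database standing in for the lakehouse SQL endpoint.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("West", 250.0), ("East", 120.5), ("West", 80.0)],
)

# A view that pre-aggregates sales per region for reporting.
con.execute("""
    CREATE VIEW sales_by_region AS
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
""")

rows = con.execute(
    "SELECT region, total FROM sales_by_region ORDER BY region"
).fetchall()
print(rows)  # [('East', 120.5), ('West', 330.0)]
```

Against the real SQL endpoint you would run the equivalent CREATE VIEW in T-SQL and point the Power BI dataset at the view.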

