# Configure Fabric OneLake Data Sources in NCC Portal
Follow this guide to set up and manage Fabric OneLake data sources in the NCC Portal.
## Prerequisites
Before you begin, ensure you have the following:
- Lakehouse GUID
- Workspace GUID
> **TIP**
> The OneLake Lakehouse you want to access must be accessible by the Owner of the pipelines.
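Both GUIDs can usually be read from the lakehouse URL in the Fabric portal, where the workspace GUID follows `/groups/` and the lakehouse GUID follows `/lakehouses/`. The sketch below (using a made-up URL) shows one way to pull them out of a copied address, in case you want to script it:

```python
import re

# Hypothetical URL copied from the Fabric portal address bar; the workspace
# and lakehouse GUIDs typically appear after /groups/ and /lakehouses/.
url = (
    "https://app.fabric.microsoft.com/groups/"
    "11111111-2222-3333-4444-555555555555/lakehouses/"
    "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
)

match = re.search(r"/groups/([0-9a-f-]{36})/lakehouses/([0-9a-f-]{36})", url)
if match:
    workspace_guid, lakehouse_guid = match.groups()
    print("Workspace GUID:", workspace_guid)
    print("Lakehouse GUID:", lakehouse_guid)
```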
## Step 1: Set Up the Fabric OneLake Connection
No connection needs to be created in Fabric for this data source.
## Step 2: Add a Data Source
NCC supports two types of OneLake data sources:
- OneLake Files: Extract files from lakehouses in OneLake.
- OneLake Tables: Extract tables from lakehouses in OneLake.
To add a data source:
- In the NCC Portal, select Tenant Settings > Data Sources.
- Choose Add DataSource.
- Complete the following fields:
| Field | Description | Example/Default Value |
|---|---|---|
| Name | Data source name in NCC | |
| Data Source Type | Type of data source | Onelake |
| Namespace | Prefix for storing data in Lakehouses | |
| Code | Identifier for pipelines | FILES or TABLES |
| Description | Description of data source | |
| Connection | Name of the connection in Fabric | |
| Environment | NCC environment for the data source | Development |
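For reference, here is the same configuration written down as a plain record before entering it in the portal. The field names mirror the table above and the values are purely illustrative (they match the example at the end of this guide):

```python
# Illustrative only: values you might enter in the NCC Portal for a
# OneLake TABLES data source, mirroring the fields in the table above.
data_source = {
    "name": "InSpark_Sales",               # data source name in NCC
    "data_source_type": "Onelake",
    "namespace": "InSpark_Sales",          # prefix for storing data in lakehouses
    "code": "TABLES",                      # FILES or TABLES
    "description": "Onelake connection to InSpark_Sales",
    "connection": "",                      # no connection needs to be created in Fabric
    "environment": "Development",
}
print(data_source)
```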
## Step 3: Create a Landing Zone Entity
To create a landing zone entity:
- Go to Landing Zone Entities.
- Select New Entity.
- Enter the required details:
| Field | Description | Example/Default Value |
|---|---|---|
| Pipeline | Not used | |
| Data Source | Data source for the connection | Set in the previous step |
| Source schema | Path for the file for FILES; schema name in the Lakehouse for TABLES | |
| Source name | Filename with extension for FILES; table name for TABLES | |
| Incremental | Extract data incrementally | False |
| Has encrypted columns | Indicates whether the table contains sensitive data | False |
| Entity value | Database, SourceLakehouseGuid, and SourceWorkspaceGuid are required. Entity values reference | |
| Lake house | Lakehouse for storing data | LH_Data_Landingzone |
| File path | File path for data storage | Filled automatically |
| File name | File name for data storage | Filled automatically |
| File type | Expected file type | Parquet for TABLES; otherwise Json, Csv, Parquet, Xlsx, Txt, or Xml |
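The Entity value field holds the OneLake location of the source as name/value pairs. A minimal sketch of the three required values, using placeholder GUIDs (the Example section below shows them in context):

```python
# The three entity values required by a OneLake landing zone entity,
# with placeholder GUIDs; take the real ones from your Fabric workspace.
entity_values = {
    "Database": "LH_InSpark_Sales",        # name of the source lakehouse
    "SourceLakehouseGuid": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
    "SourceWorkspaceGuid": "11111111-2222-3333-4444-555555555555",
}

for name, value in entity_values.items():
    print(f"{name} = {value}")
```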
## Step 4: Create a Bronze Zone Entity
To create a bronze zone entity:
- Go to Bronze Zone Entities.
- Select New Entity.
- Enter the following information:
| Field | Description | Example/Default Value |
|---|---|---|
| Pipeline | Orchestrator pipeline for parsing | PL_BRZ_COMMAND |
| Landing zone entity | Landing zone entity to be parsed | Set in the previous step |
| Entity value | Optional. Entity values reference | |
| Column mappings | Optional. Column mapping info | |
| Lake house | Lakehouse for storing data | LH_Bronze_Layer |
| Schema | Schema for storing data | dbo |
| Name | Table name for storing data | Filled automatically |
| Primary keys | Unique identifier fields (Case sensitive) | |
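Conceptually, the primary keys determine which landing zone rows update which bronze rows, which is why their casing must match the source columns exactly. The sketch below illustrates that idea with a generic Delta merge in PySpark; it is not the PL_BRZ_COMMAND implementation, and the table name, file path, and `id` key column are assumptions taken from the example later in this guide:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

# Conceptual sketch only (not the NCC PL_BRZ_COMMAND pipeline):
# merge new landing zone rows into a bronze Delta table on the primary key.
spark = SparkSession.builder.getOrCreate()

# Assumed landing zone location (File path/File name from Step 3, Parquet).
landing_df = spark.read.parquet("Files/InSpark_Sales/total_sales")

bronze = DeltaTable.forName(spark, "LH_Bronze_Layer.dbo.total_sales")
(
    bronze.alias("tgt")
    .merge(landing_df.alias("src"), "tgt.id = src.id")  # 'id' must match the source column case exactly
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```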
## Step 5: Create a Silver Zone Entity
To create a silver zone entity:
- Go to Silver Zone Entities.
- Select New Entity.
- Provide the following details:
| Field | Description | Example/Default Value |
|---|---|---|
| Pipeline | Orchestrator pipeline for parsing | PL_SLV_COMMAND |
| Bronze layer entity | Bronze layer entity to be parsed | Set in the previous step |
| Entity value | Optional. Entity values reference | |
| Lake house | Lakehouse for storing data | LH_Silver_Layer |
| Schema | Schema for storing data | dbo |
| Name | Table name for storing data | Filled automatically |
| Columns to exclude | Comma-separated columns to exclude (Case sensitive) | |
| Columns to exclude from history | Comma-separated columns to exclude from compare (Case sensitive) | |
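Conceptually, columns excluded from history are left out when the pipeline decides whether a row has changed. The sketch below illustrates that idea with a generic hash comparison in PySpark; it is not the PL_SLV_COMMAND implementation, and the table name and the `last_synced_at` column are assumptions for illustration only:

```python
from pyspark.sql import SparkSession, functions as F

# Conceptual sketch only: detect changed rows while ignoring columns
# excluded from the history compare ('last_synced_at' is a made-up column).
spark = SparkSession.builder.getOrCreate()

df = spark.read.table("LH_Bronze_Layer.dbo.total_sales")

exclude_from_history = {"last_synced_at"}   # case sensitive, as in NCC
compare_columns = [c for c in df.columns if c not in exclude_from_history]

# Hash over the compared columns; rows whose hash changes would get a new version.
df_with_hash = df.withColumn(
    "row_hash",
    F.sha2(F.concat_ws("||", *[F.col(c).cast("string") for c in compare_columns]), 256),
)
df_with_hash.show(5)
```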
## Example
The company InSpark has a lakehouse in Fabric named InSpark_Sales. This lakehouse contains a table total_sales in the schema dbo that the company wants to extract. The configuration would be:
### Data Source
| Field | Value |
|---|---|
| Name | InSpark_Sales |
| Data Source Type | Onelake |
| Namespace | InSpark_Sales |
| Code | TABLES |
| Description | Onelake connection to InSpark_Sales |
| Connection | |
| Environment | Development |
### Landing Zone Entity
| Field | Value |
|---|---|
| Pipeline | PL_LDZ_COPY_FROM_ADF_PIPELINE |
| Data Source | InSpark_Sales |
| Source schema | dbo |
| Source name | total_sales |
| Incremental | False |
| Entity value | |
| Lake house | LH_Data_Landingzone |
| File path | InSpark_Sales |
| File name | total_sales |
| File type | Parquet |
Example Entity value:
| Name | Value |
|---|---|
| Database | LH_InSpark_Sales |
| SourceLakehouseGuid | 9x999x9x-99x9-9999-99x9-99x99999xxx9 |
| SourceWorkspaceGuid | 9x999x9x-99x9-9999-99x9-99x99999xxx9 |
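Together, the two GUIDs identify the source lakehouse in OneLake. As a sanity check, they can be combined into the table's OneLake ABFS path; the sketch below assumes the common `abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>/Tables/<schema>/<table>` layout and reuses the placeholder values from the table above:

```python
# Sketch: build the OneLake ABFS path of the source table from the entity
# values above (placeholder GUIDs, not real identifiers).
source_workspace_guid = "9x999x9x-99x9-9999-99x9-99x99999xxx9"
source_lakehouse_guid = "9x999x9x-99x9-9999-99x9-99x99999xxx9"
schema, table = "dbo", "total_sales"

# For lakehouses without schema support, the path omits the schema segment.
onelake_path = (
    f"abfss://{source_workspace_guid}@onelake.dfs.fabric.microsoft.com/"
    f"{source_lakehouse_guid}/Tables/{schema}/{table}"
)
print(onelake_path)
```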
### Bronze Zone Entity
| Field | Value |
|---|---|
| Pipeline | PL_BRZ_COMMAND |
| Landing zone entity | InSpark_Sales/total_sales |
| Entity value | |
| Column mappings | |
| Lake house | LH_Bronze_Layer |
| Schema | dbo |
| Name | total_sales |
| Primary keys | id |
### Silver Zone Entity
| Field | Value |
|---|---|
| Pipeline | PL_SLV_COMMAND |
| Bronze layer entity | dbo.total_sales |
| Entity value | |
| Lake house | LH_Silver_Layer |
| Schema | dbo |
| Name | total_sales |
| Columns to exclude | |
| Columns to exclude from history | |