Configure Azure Blob Storage Data Sources in NCC Portal

This article describes how to connect Azure Blob Storage to the NCC Portal and configure entities for data ingestion and processing.

Prerequisites

Before you start, gather the following information:

  • Server name
  • Username
  • Password

Step 1: Connect Azure Blob Storage

To connect Azure Blob Storage to your Fabric tenant:

  1. Follow the instructions in Connect to Azure Data Lake Storage Gen2 to create a new connection.
  2. If required, create a private endpoint.
  3. Name your connection using the CON_NCC prefix. This naming convention helps you manage connections in the NCC Portal.

Step 2: Add a Data Source

  1. In the NCC Portal, go to Tenant Settings > Data Sources.
  2. Select Add DataSource.
  3. Complete the following fields:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Name | Data source name in NCC | |
| Data Source Type | Type of data source | ADLS |
| Namespace | Prefix for storing data in Lakehouses | |
| Code | Identifier for pipelines | ADLS_01 |
| Description | Description of the data source | |
| Connection | Name of the connection in Fabric | Set in the previous step |
| Environment | NCC environment for the data source | Development |
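
The NCC Portal is configured entirely through the UI, so no code is required here. Purely as an illustration, the completed fields can be pictured as a simple key-value structure; the values below are taken from the worked example later in this article:

```python
# Illustrative only: NCC data sources are created in the portal UI,
# this dict just documents the shape of the Step 2 fields.
data_source = {
    "Name": "InSpark_Sales",
    "Data Source Type": "ADLS",
    "Namespace": "InSpark_Sales",       # prefix used when storing data in Lakehouses
    "Code": "ADLS_01",                  # identifier used by pipelines
    "Description": "ADLS connection to InSpark_Sales",
    "Connection": "NCC_ADLS_SALES",     # Fabric connection created in Step 1
    "Environment": "Development",
}
```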

Step 3: Create a Landing Zone Entity

  1. Go to Landing Zone Entities.
  2. Select New Entity.
  3. Fill in the required details:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Pipeline | Not used | |
| Data Source | Data source for the connection | Set in the previous step |
| Source schema | Path to the file you want to extract from ADLS | |
| Source name | Name of the file to extract, including the file extension | |
| Incremental | Extract data incrementally | False |
| Has encrypted columns | Indicates whether the table contains sensitive data | False |
| Entity value | Optional. See the entity values reference | |
| Lake house | Lakehouse for storing data | LH_Data_Landingzone |
| File path | File path for data storage | Filled automatically |
| File name | File name for data storage | Filled automatically |
| File type | Expected file type | E.g. Json, Csv, Parquet, Xlsx, Txt, Xml |
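
Source schema and Source name together locate the file on the storage account. As a minimal sketch of how the two fields combine into a full ADLS Gen2 URI (the account and container names below are hypothetical placeholders; NCC resolves the actual location through the Fabric connection):

```python
# Illustrative only: "myaccount" and "mycontainer" are hypothetical placeholders.
source_schema = "home/sales"       # folder path on the storage account
source_name = "total_sales.csv"    # file name, including the extension

# ADLS Gen2 paths follow the abfss://<container>@<account>.dfs.core.windows.net/<path> scheme.
full_path = (
    f"abfss://mycontainer@myaccount.dfs.core.windows.net/"
    f"{source_schema}/{source_name}"
)
print(full_path)
# abfss://mycontainer@myaccount.dfs.core.windows.net/home/sales/total_sales.csv
```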

TIP

  • See the data encryption guide for how to apply encryption to your sensitive data.

Step 4: Create a Bronze Zone Entity

  1. Go to Bronze Zone Entities.
  2. Select New Entity.
  3. Enter the following information:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Pipeline | Orchestrator pipeline for parsing | PL_BRZ_COMMAND |
| Landing zone entity | Landing zone entity to be parsed | Set in the previous step |
| Entity value | Optional. See the entity values reference | |
| Column mappings | Optional. Column mapping information | |
| Lake house | Lakehouse for storing data | LH_Bronze_Layer |
| Schema | Schema for storing data | dbo |
| Name | Table name for storing data | Filled automatically |
| Primary keys | Unique identifier fields (case sensitive) | |
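
Primary keys controls which incoming rows are treated as updates to existing bronze rows. The actual pipeline logic is internal to NCC; the pandas sketch below only illustrates the matching semantics, keyed on a hypothetical id column:

```python
import pandas as pd

# Existing bronze table and a newly parsed batch; "id" plays the role of Primary keys.
bronze = pd.DataFrame({"id": [1, 2], "amount": [100, 200]})
incoming = pd.DataFrame({"id": [2, 3], "amount": [250, 300]})

# Rows whose key already exists are replaced; new keys are appended.
# Note that the column name must match exactly: primary keys are case sensitive.
merged = (
    pd.concat([bronze, incoming])
    .drop_duplicates(subset=["id"], keep="last")
    .reset_index(drop=True)
)
print(merged)  # id 1 keeps amount 100, id 2 is updated to 250, id 3 is appended
```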

Step 5: Create a Silver Zone Entity

  1. Go to Silver Zone Entities.
  2. Select New Entity.
  3. Provide the following details:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Pipeline | Orchestrator pipeline for parsing | PL_SLV_COMMAND |
| Bronze layer entity | Bronze layer entity to be parsed | Set in the previous step |
| Entity value | Optional. See the entity values reference | |
| Lake house | Lakehouse for storing data | LH_Silver_Layer |
| Schema | Schema for storing data | dbo |
| Name | Table name for storing data | Filled automatically |
| Columns to exclude | Comma-separated list of columns to exclude (case sensitive) | |
| Columns to exclude from history | Comma-separated list of columns to exclude from the history comparison (case sensitive) | |
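
Columns to exclude from history removes columns from the row comparison, so rows that differ only in those columns do not produce new history versions. The real comparison is internal to NCC; this sketch, with a hypothetical LoadDate column, only illustrates the idea:

```python
import pandas as pd

current = pd.DataFrame({"id": [1], "amount": [100], "LoadDate": ["2024-01-01"]})
new = pd.DataFrame({"id": [1], "amount": [100], "LoadDate": ["2024-01-02"]})

exclude_from_history = ["LoadDate"]  # comma separated in the portal, case sensitive

# Compare only the columns that participate in history tracking.
compare_cols = [c for c in new.columns if c not in exclude_from_history]
changed = not current[compare_cols].equals(new[compare_cols])
print(changed)  # False: only LoadDate differs, so no new history version is created
```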

Example configuration

The company InSpark has an Azure Data Lake Storage connection in Fabric called NCC_ADLS_SALES. On this storage account, there is a file named total_sales.csv that the company wants to extract. The configuration would look like this:

Data Source

| Field | Value |
| --- | --- |
| Name | InSpark_Sales |
| Data Source Type | ADLS |
| Namespace | InSpark_Sales |
| Code | ADLS_01 |
| Description | ADLS connection to InSpark_Sales |
| Connection | NCC_ADLS_SALES |
| Environment | Development |

Landing Zone Entity

| Field | Value |
| --- | --- |
| Pipeline | PL_LDZ_COPY_FROM_ADLS_01 |
| Data Source | InSpark_Sales |
| Source schema | home/sales |
| Source name | total_sales.csv |
| Incremental | False |
| Entity value | |
| Lake house | LH_Data_Landingzone |
| File path | InSpark_Sales |
| File name | total_sales |
| File type | Csv |
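
Before saving the entity, you can verify that Source schema and Source name point at a real file. Below is a minimal sketch using the azure-storage-file-datalake package; the account URL, container name, and credential are placeholders for whatever sits behind the NCC_ADLS_SALES connection:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Hypothetical account and container; replace with the values behind NCC_ADLS_SALES.
service = DataLakeServiceClient(
    account_url="https://myaccount.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
file_system = service.get_file_system_client("mycontainer")

# Source schema + Source name from the landing zone entity above.
file_client = file_system.get_file_client("home/sales/total_sales.csv")
print(file_client.exists())  # True if the file is reachable at that path
```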

Bronze Zone Entity

| Field | Value |
| --- | --- |
| Pipeline | PL_BRZ_COMMAND |
| Landing zone entity | InSpark_Sales/total_sales |
| Entity value | |
| Column mappings | |
| Lake house | LH_Bronze_Layer |
| Schema | dbo |
| Name | total_sales |
| Primary keys | id |

Example Entity value:

| Name | Value |
| --- | --- |
| ColumnDelimiter | , |
| CompressionType | none |
| Encoding | UTF-8 |
| EscapeCharacter | \ |
| FirstRowIsHeader | 1 |
| RowDelimiter | \r\n |
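
These settings are standard delimited-text options. As a quick sanity check outside NCC (not how the pipeline itself parses the file), roughly equivalent options in pandas would be:

```python
import pandas as pd

# Mirrors the entity value above: comma delimiter, UTF-8 encoding, backslash escape,
# and first row as header (pandas detects the \r\n row delimiter on its own).
df = pd.read_csv(
    "total_sales.csv",
    sep=",",
    encoding="utf-8",
    escapechar="\\",
    header=0,          # FirstRowIsHeader = 1
    compression=None,  # CompressionType = none
)
print(df.head())
```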

Silver Zone Entity

| Field | Value |
| --- | --- |
| Pipeline | PL_SLV_COMMAND |
| Bronze layer entity | dbo.total_sales |
| Entity value | |
| Lake house | LH_Silver_Layer |
| Schema | dbo |
| Name | total_sales |
| Columns to exclude | |
| Columns to exclude from history | |

Next steps