Configure PostgreSQL Data Sources in NCC Portal

This article guides you through connecting a PostgreSQL database to the NCC Portal, configuring data sources, and creating entities for data management.

Prerequisites

Before you begin, gather the following details:

  • Server name
  • Database name
  • Username (or Client ID)
  • Password (or Client Secret)
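
Before configuring anything in the portal, it can help to confirm that these details actually reach the database. The following is a minimal sketch for the username/password case using the psycopg2 library; the server, database, and credential values are placeholders for your own details.

```python
import psycopg2  # pip install psycopg2-binary

# Placeholder values; replace with your own server, database, and credentials.
conn = psycopg2.connect(
    host="your-server.postgres.database.azure.com",
    dbname="your_database",
    user="your_username",
    password="your_password",
    sslmode="require",  # a safe default for managed PostgreSQL services
)

with conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone()[0])  # prints the server version if the connection works

conn.close()
```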

Step 1: Connect PostgreSQL to Your Fabric Tenant

To establish a connection:

  1. Visit Connect to PostgreSQL Server and follow the instructions to create a new connection.
  2. If required, create a private endpoint.
  3. Name your connection using the CON_NCC prefix. This convention helps maintain visibility and organization within the NCC Portal.

Step 2: Add a Data Source in NCC Portal

  1. In NCC Portal, navigate to Tenant Settings > Data Sources.
  2. Select Add DataSource.
  3. Complete the following fields:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Name | Name of the source | PostgreSQL database |
| Data Source Type | Type of data source | POSTGRESQL |
| Namespace | Prefix for storing data in Lakehouses | |
| Code | Identifier for pipelines | POSTGRESQL_01 |
| Description | Description of data source | |
| Connection | Name of the connection in Fabric | Set in previous step |
| Environment | NCC environment for the data source | Development |
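
As a mental model, a data source is just this small set of key/value settings. The sketch below mirrors the fields above as a plain Python dictionary with placeholder values; it is illustrative only, and the portal form is the actual interface.

```python
# Illustrative only: these settings are entered in the NCC Portal form, not via code.
data_source = {
    "name": "your_source_name",        # Name of the source
    "data_source_type": "POSTGRESQL",  # Type of data source
    "namespace": "your_namespace",     # Prefix for storing data in Lakehouses
    "code": "POSTGRESQL_01",           # Identifier for pipelines
    "description": "PostgreSQL connection for your source",
    "connection": "CON_NCC_...",       # Fabric connection created in Step 1
    "environment": "Development",      # NCC environment
}
```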

Step 3: Create a Landing Zone Entity

  1. Go to Landing Zone Entities.
  2. Select New Entity.
  3. Enter the required details:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Pipeline | Not used | |
| Data Source | Data source for connection | Set in previous step |
| Source schema | Schema name in PostgreSQL | |
| Source name | Table or view name in PostgreSQL | |
| Incremental | Extract data incrementally | False |
| Has encrypted columns | Indicate if the table contains sensitive data | False |
| Entity value | Optional. Entity values reference | |
| Lake house | Lakehouse for storing data | LH_Data_Landingzone |
| File path | File path for data storage | Is filled automatically |
| File name | File name for data storage | Is filled automatically |
| File type | Expected file type | Parquet |

TIP

  • See the incremental extraction documentation for how to configure the source incrementally.
  • See the data encryption documentation for how to apply encryption to your sensitive data.
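
Once the landing-zone pipeline has run, the extract should appear as a Parquet file under the configured file path and file name. A quick way to sanity-check the output from a Fabric notebook is to read it back; the sketch below assumes the notebook has LH_Data_Landingzone attached as the default lakehouse, and the path and file names are placeholders for the values the portal filled in.

```python
import pandas as pd

# Assumed location: Files/<file path>/<file name>.parquet in the attached
# LH_Data_Landingzone lakehouse; adjust to match your configured values.
landing_path = "/lakehouse/default/Files/your_namespace/your_table.parquet"

df = pd.read_parquet(landing_path)
print(df.shape)   # row/column counts of the extracted table
print(df.head())  # first few rows as a spot check
```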

Step 4: Create a Bronze Zone Entity

  1. Navigate to Bronze Zone Entities.
  2. Select New Entity.
  3. Provide the following information:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Pipeline | Orchestrator pipeline for parsing | PL_BRZ_COMMAND |
| Landing zone entity | Landing zone entity to be parsed | Set in previous step |
| Entity value | Optional. Entity values reference | |
| Column mappings | Optional. Column mapping info | |
| Lake house | Lakehouse for storing data | LH_Bronze_Layer |
| Schema | Schema for storing data | dbo |
| Name | Table name for storing data | Is filled automatically |
| Primary keys | Unique identifier fields (case sensitive) | |
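
Conceptually, the bronze step parses the landing-zone file into a managed table in LH_Bronze_Layer, keyed on the primary keys you list. The NCC pipeline (PL_BRZ_COMMAND) performs this for you; the PySpark sketch below is only a rough illustration of that idea, run in a Fabric notebook where `spark` is predefined, with table and path names as placeholders — it is not the pipeline's actual implementation.

```python
from delta.tables import DeltaTable

# Placeholder names; the NCC pipeline performs this step, this is only an illustration.
landing_path = "Files/your_namespace/your_table.parquet"
bronze_table = "LH_Bronze_Layer.dbo.your_table"

incoming = spark.read.parquet(landing_path)

if spark.catalog.tableExists(bronze_table):
    # Upsert on the configured primary key column(s), here assumed to be "id".
    (DeltaTable.forName(spark, bronze_table)
        .alias("target")
        .merge(incoming.alias("source"), "target.id = source.id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())
else:
    incoming.write.format("delta").saveAsTable(bronze_table)
```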

Step 5: Create a Silver Zone Entity

  1. Go to Silver Zone Entities.
  2. Select New Entity.
  3. Enter the following details:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Pipeline | Orchestrator pipeline for parsing | PL_SLV_COMMAND |
| Bronze layer entity | Bronze layer entity to be parsed | Set in previous step |
| Entity value | Optional. Entity values reference | |
| Lake house | Lakehouse for storing data | LH_Silver_Layer |
| Schema | Schema for storing data | dbo |
| Name | Table name for storing data | Is filled automatically |
| Columns to exclude | Comma-separated columns to exclude (case sensitive) | |
| Columns to exclude from history | Comma-separated columns to exclude from compare (case sensitive) | |
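
The two exclusion fields serve different purposes: "Columns to exclude" drops columns from the silver table entirely, while "Columns to exclude from history" keeps them but ignores them when comparing rows against existing silver data. The PySpark sketch below illustrates that distinction in plain terms; it is not the NCC pipeline's code, and the table and column names are placeholders.

```python
# Placeholder column lists; in the portal these are comma-separated, case-sensitive values.
columns_to_exclude = ["internal_note"]           # dropped from the silver table entirely
columns_to_exclude_from_history = ["load_date"]  # kept, but ignored when detecting changes

bronze_df = spark.table("LH_Bronze_Layer.dbo.your_table")

# Drop excluded columns before writing to the silver layer.
silver_df = bronze_df.drop(*columns_to_exclude)

# Columns actually used to decide whether a row has changed.
compare_columns = [c for c in silver_df.columns if c not in columns_to_exclude_from_history]
```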

Example Configuration

The company InSpark has a PostgreSQL database connection in Fabric named NCC_POSTGRESQL_SALES for the database InSpark_Sales. The table total_sales in the schema dbo is to be extracted. The configuration is as follows:

Data Source

| Field | Value |
| --- | --- |
| Name | InSpark_Sales |
| Data Source Type | POSTGRESQL |
| Namespace | InSpark_Sales |
| Code | POSTGRESQL_01 |
| Description | PostgreSQL connection to InSpark_Sales |
| Connection | NCC_POSTGRESQL_SALES |
| Environment | Development |

Landing Zone Entity

| Field | Value |
| --- | --- |
| Pipeline | PL_LDZ_COPY_FROM_POSTGRESQL_01 |
| Data Source | InSpark_Sales |
| Source schema | dbo |
| Source name | total_sales |
| Incremental | False |
| Entity value | |
| Lake house | LH_Data_Landingzone |
| File path | InSpark_Sales |
| File name | total_sales |
| File type | Parquet |

Bronze Zone Entity

| Field | Value |
| --- | --- |
| Pipeline | PL_BRZ_COMMAND |
| Landing zone entity | InSpark_Sales/total_sales |
| Entity value | |
| Column mappings | |
| Lake house | LH_Bronze_Layer |
| Schema | dbo |
| Name | total_sales |
| Primary keys | id |

Silver Zone Entity

| Field | Value |
| --- | --- |
| Pipeline | PL_SLV_COMMAND |
| Bronze layer entity | dbo.total_sales |
| Entity value | |
| Lake house | LH_Silver_Layer |
| Schema | dbo |
| Name | total_sales |
| Columns to exclude | |
| Columns to exclude from history | |
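
After the pipelines have run with this configuration, you can check the result from a Fabric notebook. A minimal sketch, assuming the silver lakehouse tables are queryable by name from the notebook's Spark session (adjust the table reference if your lakehouse does not use schema-qualified names):

```python
# Count the rows that landed in the silver table for the example above.
result = spark.sql("SELECT COUNT(*) AS row_count FROM LH_Silver_Layer.dbo.total_sales")
result.show()
```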

Next steps