This article describes how to connect to and manage Oracle data sources in the NCC Portal.
Prerequisites
Before you start, make sure you have the following details:
- Server name
- Database name
- Username (or Client ID)
- Password (or Client Secret)
Step 1: Connect to Oracle Database
To connect your Oracle Database to your Fabric tenant:
- Visit Connect to Oracle Database and follow the instructions to create a new connection.
- If required, create a private endpoint.
- Name your connection using the CON_NCC prefix. This naming convention helps ensure visibility and manageability within the NCC Portal.
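Before you create the connection in Fabric, you can optionally verify the details from the prerequisites with a quick connectivity check. The sketch below uses the python-oracledb package; the host, port, service name, and credentials are placeholder values, and this check is not required by the NCC Portal.

```python
# Optional connectivity check with python-oracledb (all values are placeholders).
import oracledb

connection = oracledb.connect(
    user="NCC_USER",                                     # username or client ID
    password="example-secret",                           # password or client secret
    dsn="oracle-server.example.com:1521/InSpark_Sales",  # host:port/service name
)

with connection.cursor() as cursor:
    cursor.execute("SELECT 1 FROM dual")
    print(cursor.fetchone())  # (1,) confirms the database is reachable

connection.close()
```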
Step 2: Register a Data Source
- In the NCC Portal, navigate to Tenant Settings > Data Sources.
- Select Add DataSource.
- Complete the following fields:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Name | Name of the source Oracle database | |
| Data Source Type | Type of data source | ORACLE |
| Namespace | Prefix for storing data in Lakehouses | |
| Code | Identifier for pipelines | ORACLE_01 |
| Description | Description of data source | |
| Connection | Name of the connection in Fabric | Set in previous step |
| Environment | NCC environment for the data source | Development |
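For reference, the same registration can be written down as plain data before you enter it in the portal. The snippet below is only an illustration of the fields in the table above, not an NCC Portal API call; the values mirror the example later in this article.

```python
# Illustrative only: the Data Source fields from the table above as a Python dict.
# This mirrors what you enter in the NCC Portal form; it is not a portal API.
data_source = {
    "Name": "InSpark_Sales",
    "Data Source Type": "ORACLE",
    "Namespace": "InSpark_Sales",
    "Code": "ORACLE_01",
    "Description": "Oracle connection to InSpark_Sales",
    "Connection": "NCC_ORACLE_SALES",   # Fabric connection created in step 1
    "Environment": "Development",
}
```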
Step 3: Create a Landing Zone Entity
- Go to Landing Zone Entities.
- Select New Entity.
- Fill in the required details:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Pipeline | Not used | |
| Data Source | Data source for connection | Set in previous step |
| Source schema | Schema name in Oracle | |
| Source name | Table or view name in Oracle | |
| Incremental | Extract data incrementally | False |
| Has encrypted columns | Indicate if the table contains sensitive data | False |
| Entity value | Optional. Entity values reference | |
| Lake house | Lakehouse for storing data | LH_Data_Landingzone |
| File path | File path for data storage | Filled automatically |
| File name | File name for data storage | Filled automatically |
| File type | Expected file type | Parquet |
TIP
- Click here to learn how to configure incremental loading.
- Click here to learn how to apply data encryption to sensitive data.
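After the landing zone copy has run, you can check the landed file from a Fabric notebook attached to the landing zone Lakehouse. A minimal sketch, assuming LH_Data_Landingzone is the notebook's default Lakehouse and the file landed under Files/&lt;File path&gt;/&lt;File name&gt; with a .parquet extension (the path, name, and extension follow the example later in this article and are assumptions):

```python
# Minimal check of a landed file in a Fabric notebook (PySpark).
# Assumes LH_Data_Landingzone is the default Lakehouse and the file path,
# file name, and .parquet extension match the example in this article.
df = spark.read.parquet("Files/InSpark_Sales/total_sales.parquet")

df.printSchema()   # column names and types as written by the copy pipeline
print(df.count())  # number of extracted rows
```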
Step 4: Create a Bronze Zone Entity
- Go to Bronze Zone Entities.
- Select New Entity.
- Enter the following information:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Pipeline | Orchestrator pipeline for parsing | PL_BRZ_COMMAND |
| Landing zone entity | Landing zone entity to be parsed | Set in previous step |
| Entity value | Optional. Entity values reference | |
| Column mappings | Optional. Column mapping info | |
| Lake house | Lakehouse for storing data | LH_Bronze_Layer |
| Schema | Schema for storing data | dbo |
| Name | Table name for storing data | Filled automatically |
| Primary keys | Unique identifier fields (case sensitive) | |
NOTE:
While this connector can copy data types from source to destination, some types are cast to DECIMAL because Oracle precision may exceed Parquet file limits. The following data types are converted to DECIMAL: NUMBER, DECIMAL, and FLOAT.
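If downstream logic needs narrower numeric types than DECIMAL, you can cast the affected columns after reading the data. A minimal PySpark sketch; the column names are hypothetical and the file path follows the example in this article:

```python
# Illustrative only: Oracle NUMBER, DECIMAL, and FLOAT columns may land as DECIMAL.
# Cast them after reading if a narrower type is needed; column names are hypothetical.
from pyspark.sql import functions as F

df = spark.read.parquet("Files/InSpark_Sales/total_sales.parquet")

df = (
    df.withColumn("quantity", F.col("quantity").cast("int"))          # whole numbers
      .withColumn("unit_price", F.col("unit_price").cast("double"))   # fractional values
)
df.printSchema()
```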
Step 5: Create a Silver Zone Entity
- Go to Silver Zone Entities.
- Select New Entity.
- Provide the following details:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Pipeline | Orchestrator pipeline for parsing | PL_SLV_COMMAND |
| Bronze layer entity | Bronze layer entity to be parsed | Set in previous step |
| Entity value | Optional. Entity values reference | |
| Lake house | Lakehouse for storing data | LH_Silver_Layer |
| Schema | Schema for storing data | dbo |
| Name | Table name for storing data | Filled automatically |
| Columns to exclude | Comma-separated columns to exclude (case sensitive) | |
| Columns to exclude from history | Comma-separated columns to exclude from comparison (case sensitive) | |
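Conceptually, columns listed in Columns to exclude from history are left out of the comparison between the bronze data and the current silver table, so a change in only those columns does not produce a new history record. The sketch below is not the NCC implementation; the table references, primary key, and excluded column names are assumptions based on the example in this article:

```python
# Conceptual sketch of "Columns to exclude from history" (not the NCC implementation).
bronze_df = spark.read.table("LH_Bronze_Layer.dbo.total_sales")  # assumed table reference
silver_df = spark.read.table("LH_Silver_Layer.dbo.total_sales")  # assumed table reference

excluded = {"etl_run_id", "extracted_at"}   # hypothetical audit columns to ignore
primary_key = "id"                          # primary key defined on the bronze entity

compare_cols = [c for c in bronze_df.columns if c not in excluded and c != primary_key]

# Rows whose tracked columns differ from the current silver version.
changed = (
    bronze_df.alias("b")
    .join(silver_df.alias("s"), on=primary_key)
    .where(" OR ".join(f"b.{c} <> s.{c}" for c in compare_cols))
    .select("b.*")
)
changed.show()
```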
Example
The company InSpark has a Fabric connection named NCC_ORACLE_SALES to the Oracle database InSpark_Sales. The database contains a table total_sales in the schema dbo that the company wants to extract. The configuration would be as follows:
Data Source
| Field | Value |
| --- | --- |
| Name | InSpark_Sales |
| Data Source Type | ORACLE |
| Namespace | InSpark_Sales |
| Code | ORACLE_01 |
| Description | Oracle connection to InSpark_Sales |
| Connection | NCC_ORACLE_SALES |
| Environment | Development |
Landing Zone Entity
| Field | Value |
| --- | --- |
| Pipeline | PL_LDZ_COPY_FROM_ORACLE_01 |
| Data Source | InSpark_Sales |
| Source schema | dbo |
| Source name | total_sales |
| Incremental | False |
| Entity value | |
| Lake house | LH_Data_Landingzone |
| File path | InSpark_Sales |
| File name | total_sales |
| File type | Parquet |
Bronze Zone Entity
| Field | Value |
| --- | --- |
| Pipeline | PL_BRZ_COMMAND |
| Landing zone entity | InSpark_Sales/total_sales |
| Entity value | |
| Column mappings | |
| Lake house | LH_Bronze_Layer |
| Schema | dbo |
| Name | total_sales |
| Primary keys | id |
Silver Zone Entity
| Field | Value |
| --- | --- |
| Pipeline | PL_SLV_COMMAND |
| Bronze layer entity | dbo.total_sales |
| Entity value | |
| Lake house | LH_Silver_Layer |
| Schema | dbo |
| Name | total_sales |
| Columns to exclude | |
| Columns to exclude from history | |
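Once all entities are configured and the pipelines have run, you can verify the end result from a Fabric notebook. A minimal check, assuming the Lakehouse, schema, and table names from the example above:

```python
# Quick verification that the example table arrived in the silver layer.
# The three-part name follows the example configuration; adjust to your environment.
spark.sql("SELECT * FROM LH_Silver_Layer.dbo.total_sales LIMIT 10").show()
```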
Next steps