Configure AFAS Data Sources in NCC Portal

This article describes how to configure an AFAS data source in the NCC Portal, including connection setup, entity creation, and best practices.

Prerequisites

Before you begin, ensure you have the following:

  • AFAS endpoint
  • AFAS token

Step 1: Connect to AFAS Server

No connection needs to be created in Fabric for this data source; the extraction notebook connects to AFAS directly using the endpoint and token from the prerequisites.

Step 2: Add a Data Source

  1. In the NCC Portal, select Tenant Settings > Data Sources.
  2. Select Add DataSource.
  3. Complete the following fields:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Name | Name of the data source in NCC | |
| Data Source Type | Type of data source | NOTEBOOK |
| Namespace | Prefix for storing data in Lakehouses | |
| Code | Identifier for pipelines | NB |
| Description | Description of the data source | |
| Connection | Name of the connection in Fabric | (not required) |
| Environment | NCC environment for the data source | Development |

Step 3: Create a Landing Zone Entity

  1. Navigate to Landing Zone Entities.
  2. Select New Entity.
  3. Fill in the required details:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Pipeline | Not used | |
| Data Source | Data source for the connection | Set in previous step |
| Source schema | Enter AFAS | AFAS |
| Source name | Name of the data product in AFAS to extract | |
| Incremental | Extract data incrementally | False |
| Has encrypted columns | Check if the table has sensitive data | False |
| Entity value | Optional. Entity values reference | |
| Lake house | Lakehouse for storing data | LH_Data_Landingzone |
| File path | File path for data storage | Filled automatically |
| File name | File name for data storage | Filled automatically |
| File type | Expected file type | Parquet |
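When Incremental is set to True, each run extracts only rows added or changed since the previous run. As a rough, generic illustration of that idea (a watermark-based filter, not the actual NCC pipeline code; the function and field names are hypothetical):

```python
# Illustrative watermark-based incremental extract; the helper and the
# "ModifiedOn" field are hypothetical, not part of the NCC pipelines.
def extract_incremental(rows, last_watermark):
    """Return rows modified after last_watermark, plus the new watermark."""
    new_rows = [r for r in rows if r["ModifiedOn"] > last_watermark]
    new_watermark = max((r["ModifiedOn"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

rows = [
    {"Mdw": "1001", "ModifiedOn": "2024-01-05"},
    {"Mdw": "1002", "ModifiedOn": "2024-02-10"},
]
# Only the row changed after the stored watermark is extracted.
changed, watermark = extract_incremental(rows, "2024-01-31")
```

The new watermark is persisted and passed in on the next run, so already-loaded rows are skipped.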

Entity Values:

  • CustomParametersJSON: {"Environment":"xxx", "Token": "xxx", "KeyColumns": "ClientId,ChildId", "KeyName": "xxx", "EncryptionKeyVault": "xxx"}
  • NotebookName: NB_NCC_LDZ_COPY_FROM_REST_AFAS

IMPORTANT
Add your AFAS environment to the CustomParametersJSON, and store the AFAS token in the secret AFAS-SIAN-TOKEN.
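Because the CustomParametersJSON value must be a single valid JSON object, assembling it programmatically avoids quoting mistakes. A minimal sketch (the "xxx" values are placeholders from the table above; substitute your own environment, key columns, and Key Vault details):

```python
import json

# Placeholder values ("xxx") as in the documentation; replace with your
# own AFAS environment, token, key columns, and Key Vault details.
params = {
    "Environment": "xxx",
    "Token": "xxx",
    "KeyColumns": "ClientId,ChildId",  # comma-separated, no spaces
    "KeyName": "xxx",
    "EncryptionKeyVault": "xxx",
}
custom_parameters_json = json.dumps(params)
print(custom_parameters_json)
```

Paste the printed string into the CustomParametersJSON entity value.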

TIP

  • See the guide on configuring incremental loading.
  • See the guide on applying data encryption to sensitive data.

Step 4: Create a Bronze Zone Entity

  1. Go to Bronze Zone Entities.
  2. Select New Entity.
  3. Enter the following information:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Pipeline | Orchestrator pipeline for parsing | PL_BRZ_COMMAND |
| Landing zone entity | Landing zone entity to be parsed | Set in previous step |
| Entity value | Optional. Entity values reference | |
| Column mappings | Optional. Column mapping info | |
| Lake house | Lakehouse for storing data | LH_Bronze_Layer |
| Schema | Schema for storing data | dbo |
| Name | Table name for storing data | Filled automatically |
| Primary keys | Unique identifier fields (case sensitive) | |

Step 5: Create a Silver Zone Entity

  1. Go to Silver Zone Entities.
  2. Select New Entity.
  3. Provide the following details:
| Field | Description | Example/Default Value |
| --- | --- | --- |
| Pipeline | Orchestrator pipeline for parsing | PL_SLV_COMMAND |
| Bronze layer entity | Bronze layer entity to be parsed | Set in previous step |
| Entity value | Optional. Entity values reference | |
| Lake house | Lakehouse for storing data | LH_Silver_Layer |
| Schema | Schema for storing data | dbo |
| Name | Table name for storing data | Filled automatically |
| Columns to exclude | Comma-separated columns to exclude (case sensitive) |
| Columns to exclude from history | Comma-separated columns to exclude from comparison (case sensitive) | |
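Columns to exclude from history keep technical columns (such as load timestamps) from triggering a new history record: the silver load compares the current and incoming row while ignoring those columns. A minimal sketch of that comparison (a hypothetical helper, not the actual PL_SLV_COMMAND logic):

```python
# Hypothetical change-detection helper illustrating "Columns to exclude
# from history": excluded columns are ignored when comparing versions.
def has_changed(current, incoming, exclude_from_history):
    keys = (set(current) | set(incoming)) - set(exclude_from_history)
    return any(current.get(k) != incoming.get(k) for k in keys)

current = {"Mdw": "1001", "Aantal_FTE": 0.8, "LoadDate": "2024-01-01"}
incoming = {"Mdw": "1001", "Aantal_FTE": 0.8, "LoadDate": "2024-02-01"}
# Only LoadDate differs, and it is excluded, so no new history record.
print(has_changed(current, incoming, ["LoadDate"]))
```

Because the column names are compared case-sensitively, they must match the table's column names exactly.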

Example

The company InSpark wants to extract the employees data product from its AFAS environment. The configuration is as follows:

Data Source

| Field | Value |
| --- | --- |
| Name | InSpark |
| Data Source Type | NOTEBOOK |
| Namespace | InSpark_Sales |
| Code | NB |
| Description | Custom notebooks for InSpark |
| Connection | |
| Environment | Development |

Landing Zone Entity

| Field | Value |
| --- | --- |
| Pipeline | PL_LDZ_COPY_FROM_ADF_PIPELINE |
| Data Source | InSpark_employees |
| Source schema | AFAS |
| Source name | employees |
| Incremental | False |
| Entity value | |
| Lake house | LH_Data_Landingzone |
| File path | InSpark_AFAS |
| File name | employees |
| File type | Json |

Example Entity value:

| Name | Value |
| --- | --- |
| CustomParametersJSON | {"Environment":"xxx", "Token": "xxx", "KeyColumns": "ClientId,ChildId", "KeyName": "xxx", "EncryptionKeyVault": "xxx"} |
| NotebookName | NB_NCC_LDZ_COPY_FROM_REST_AFAS |

Bronze Zone Entity

| Field | Value |
| --- | --- |
| Pipeline | PL_BRZ_COMMAND |
| Landing zone entity | InSpark_AFAS/employees |
| Entity value | |
| Column mappings | See the example below |
| Lake house | LH_Bronze_Layer |
| Schema | dbo |
| Name | employees |
| Primary keys | Mdw |

Example Column mappings:

[
  {
   "DataType": "Decimal",
   "SourceColumn": "`rows`.`Aantal_FTE`",
   "TargetColumn": "Aantal_FTE"
  },
  {
   "DataTypeOverride": "String",
   "SourceColumn": "`rows`.`Mdw`",
   "TargetColumn": "Mdw"
  }
]
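Conceptually, each mapping pulls a nested field out of the landed JSON and writes it to a flat target column, casting to DataType (or to DataTypeOverride when present). A simplified Python sketch of that behavior (illustrative only, not the actual PL_BRZ_COMMAND implementation):

```python
from decimal import Decimal

# Simplified illustration of applying a column mapping; the real bronze
# pipeline logic may differ.
CASTS = {"Decimal": Decimal, "String": str}

def apply_mapping(record, mapping):
    # SourceColumn like "`rows`.`Aantal_FTE`": strip backticks, walk the path.
    path = [p.strip("`") for p in mapping["SourceColumn"].split(".")]
    value = record
    for key in path:
        value = value[key]
    data_type = mapping.get("DataTypeOverride") or mapping.get("DataType")
    return mapping["TargetColumn"], CASTS[data_type](value)

record = {"rows": {"Aantal_FTE": "0.8", "Mdw": 1001}}
mappings = [
    {"DataType": "Decimal", "SourceColumn": "`rows`.`Aantal_FTE`", "TargetColumn": "Aantal_FTE"},
    {"DataTypeOverride": "String", "SourceColumn": "`rows`.`Mdw`", "TargetColumn": "Mdw"},
]
row = dict(apply_mapping(record, m) for m in mappings)
# row == {"Aantal_FTE": Decimal("0.8"), "Mdw": "1001"}
```

Note how the override on Mdw forces the numeric source value to a string, matching its use as a primary key.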

Silver Zone Entity

| Field | Value |
| --- | --- |
| Pipeline | PL_SLV_COMMAND |
| Bronze layer entity | dbo.employees |
| Entity value | |
| Lake house | LH_Silver_Layer |
| Schema | dbo |
| Name | employees |
| Columns to exclude | |
| Columns to exclude from history | |

Next steps