Microsoft Fabric Connector

Overview

You can connect SAP systems with Microsoft Fabric using the ASAPIO Integration Add-on.

Required components of ASAPIO Integration Add-ons:

Add-on/component name | Type
ASAPIO Integration Add-on – Framework | Base component (required)
ASAPIO Integration Add-on – Connector for Microsoft® Azure® | Additional package

 

Pre-requisites for Microsoft® Fabric® services

Note
The following settings are specific to the connector for Microsoft® Fabric®.

A Microsoft® Fabric® and Azure® account is required, with access to the following service:

  • Lakehouse

 

Steps required to establish connectivity

To establish connectivity with the Microsoft Fabric platform, proceed with the following activities and refer to the corresponding documentation articles.

  1. Create RFC destinations to Microsoft Fabric platform in SAP system settings
  2. Set-up connection instance to Microsoft Fabric platform in ASAPIO Integration Add-on
  3. Endpoint configuration (Lakehouse)
    Please see article “Authentication options” for an overview of supported authentication options for the different endpoints.
  4. See the article “Send outbound message” for a simple example to test the connectivity

 

Authentication options

The Fabric service offers OAuth via Microsoft Entra ID for authorization.

To learn how to configure it, see the sections “Create RFC destination for OAuth Authentication” and “Save Secret in SAP Secure Store” below.

Create RFC destinations

Create RFC destination for messaging endpoints

Create a new RFC destination of type “G” (HTTP Connection to External Server).

  • Transaction: SM59
  • Create new destination of type “G”
  • Specify Target Host: onelake.dfs.fabric.microsoft.com

Add the certificates for the created destination to the certificate list selected in tab Logon & Security (see “Add Certificates to Trust Store” below).

 

Create RFC destination for OAuth Authentication

To use OAuth authentication with Entra ID, configure an RFC destination specifying the OAuth token endpoint:

  • Transaction: SM59
  • Create new destination of type “G”
  • Specify Target Host: login.microsoftonline.com
  • Specify Path Prefix: <tenantID>/oauth2/token

Add the certificates for the created destination to the certificate list selected in tab Logon & Security, as described in the next section.
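For reference, this destination serves the standard Entra ID client-credentials flow. The sketch below shows the equivalent token request in plain Python using the requests library; it is not part of the SAP configuration, but can help verify that the app registration and client secret are valid. The tenant ID, client ID, client secret and the resource value are placeholders/assumptions for illustration only.

import requests

# Placeholder values – replace with your Entra ID tenant, app registration and secret.
tenant_id = "<tenantID>"
client_id = "<clientID>"
client_secret = "<clientSecret>"

# Same endpoint as configured in the RFC destination above.
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"

payload = {
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret,
    # Audience for OneLake/Azure Storage style access (assumption; adjust to your setup).
    "resource": "https://storage.azure.com/",
}

response = requests.post(token_url, data=payload)
response.raise_for_status()
print("Token received:", response.json()["access_token"][:20], "...")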

Add Certificates to Trust Store

  • Transaction: STRUST
  • Select Certificate List as used in RFC destination created above
  • Click button Import certificate (1)
  • Click button Add to Certificate List (2)

Set-up connection instance

Create the connection instance customizing that ties together the RFC destination created earlier and the cloud connector type:

  • Transaction: SPRO
  • Go to ASAPIO Cloud Integrator – Connection and Replication Object Customizing
  • Or go directly to transaction: /ASADEV/68000202
  • Add New Entry and specify:
    • Field Instance: a name for this connection instance
    • Field RFC Dest. (Upload): the RFC destination created for the messaging endpoint (see article “Create RFC destinations”)
    • Field ISO Code: the code page to use
    • Field Cloud Type: AZURE (or the name you chose when adding the connector)

Set-up Error Type Mapping

Create an entry in section Error Type Mapping and specify at least the following mappings:

Resp. Code | Message Type
201 | Success
202 | Success

Save Secret in SAP Secure Store

For OAuth authentication, a secret has to be stored in the system’s SAP Secure Store:

  • Client Secret for OAuth

Enter the secret in the SAP Secure Store:

  • Transaction: SPRO
  • Go to ASAPIO Cloud Integrator – Set the cloud connection password
  • Or go directly to transaction: /ASADEV/SCI_TPW
  • Select the created Cloud Instance
  • Enter the client secret in the Cloud Shared Secret field and execute

Send outbound message

Create Message Type

 

Example: the steps below use the Material Change event. Choose any other suitable example if required.

 

For each object to be sent via ACI you have to create a message type:

  • Transaction: WE81
  • Add New Entry and specify:
    • Message Type: unique name for the integration
    • Description: description of the purpose

Activate Message Type

The created message type has to be activated:

  • Transaction: BD50
  • Add New Entry and specify:
    • Message Type: the created message type
    • Active: tick the checkbox

 

Create Payload Design

Go to the ASAPIO Payload Designer with transaction /ASADEV/DESIGN. There you can create a new payload using Create payload (Shift+F4).

  • Now create a payload with the following criteria:
    • View: Payload name
    • Application: data model application
    • Version: Version of payload

Press the Join Builder button to add your table, DB view or CDS view:

Use the button Insert table (Shift+F1).

Go back and save.

Create Outbound Object configuration

  • Transaction: SPRO
  • Go to ASAPIO Cloud Integrator – Connection and Replication Object Customizing
  • Or go directly to transaction: /ASADEV/68000202
  • Select the created Connection
  • Go to section Outbound Objects
  • Add New Entry and specify:
    • Object: name of the outbound configuration
    • Extraction Func. Module: /ASADEV/ACI_GEN_PDVIEW_EXTRACT
    • File Name: name with file format
    • Message Type: the created message type
    • Load Type: Incremental Load
    • Trace: activate for testing purposes
    • Payload View Name: name of payload design
    • Payload View Version: version of payload design

Set up ‘Business Object Event Linkage’

Link the configuration of the outbound object to a Business Object event:

  • Transaction: SWE2, or go to Event Linkage on the left side
  • Add New Entry and specify:
    • Object Category: BO BOR Object Type
    • Object Type: Business Object Type sending the event
    • Event: the event to react to
    • Receiver Type: the message type of the outbound object (this is the link to the Add-on configuration)
    • Receiver Call: Function Module
    • Receiver Function Module: /ASADEV/ACI_EVENTS_TRIGGER
    • Linkage Activated: tick the checkbox

Set-up target endpoint in ‘Header Attributes’

Configure the Lakehouse endpoint:

  • Go to section Header Attributes
  • Add New Entry and specify:
Header Attribute | Header Attribute Value
ACI_HTTP_METHOD | PATCH
FABRIC_FILE_PATH | your path to the file folder, e.g. /Files
FABRIC_LAKEHOUSE | your Lakehouse ID specified in Fabric
FABRIC_WORKSPACE | your Workspace ID specified in Fabric
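For orientation, these three attributes together identify the OneLake target location. The snippet below only illustrates how such a OneLake path is typically composed from workspace ID, Lakehouse ID and file path; the actual URL construction is performed by the connector, and the IDs and the file name shown are placeholders.

# Placeholder IDs – use the workspace and Lakehouse GUIDs from the Fabric portal.
fabric_workspace = "<workspace-id>"
fabric_lakehouse = "<lakehouse-id>"
fabric_file_path = "/Files"

# Illustrative OneLake target path (assumption about how the pieces fit together):
# https://onelake.dfs.fabric.microsoft.com/<workspace-id>/<lakehouse-id>/Files/<file name>
target = (
    "https://onelake.dfs.fabric.microsoft.com/"
    f"{fabric_workspace}/{fabric_lakehouse}{fabric_file_path}/aci_sales_order_20240101_120000_ab12cd.csv"
)
print(target)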

Activating the change pointer information with header attribute (Optional)

Configure the change pointer info:

  • Go to section Header Attributes
  • Add New Entry and specify:
Header Attribute | Header Attribute Value
ACI_HTTP_METHOD | PATCH
ACI_CP_INFO | X

Information can be transferred from the change pointer to the payload.
The following information can be used:

Information field | Description
ACICPIDENT | Change pointer ID
ACITABNAME | Table name (Event Object Type)
ACITABKEY | Composed of: Event Type, Mandt, Entry Key
ACICRETIME | Creation time
ACIACTTIME | Activation time
ACICDCHGID | Change Indicator

Customize the change pointer information

Fields can be renamed or omitted:

  • Go to section Field Mapping
  • Add New Entry and specify:
Target structure | Target field | Default value
FIELD_RENAME | ACICDCHGID | ChangeIndicator
SKIP_FIELD | ACITABKEY |

 

Test the outbound event creation

In the example above, pick any test sales order in transaction VA02 and force a change event, e.g. by changing the requested delivery date at header level.

Set-up Packed Load/Initial load (split large data)

Create payload

Note
Use a Payload Design created in transaction /ASADEV/DESIGN (see section “Create Payload Design” above).

 

Create Outbound Object configuration

  • Transaction: SPRO
  • Go to ASAPIO Cloud Integrator – Connection and Replication Object Customizing
  • Or go directly to transaction: /ASADEV/68000202
  • Select the created Connection
  • Go to section Outbound Objects
  • Add New Entry and specify:
    • Object: name of the outbound configuration
    • Extraction Func. Module: /ASADEV/ACI_GEN_PDVIEW_EXTRACT
    • File Name: name with file format
    • Message Type: the created message type (optional)
    • Load Type: Packed Load
    • Trace: activate for testing purposes
    • Payload View Name: name of payload design
    • Payload View Version: version of payload design

 

Set-up ‘Header Attributes’

  • Go to section Header Attributes of the outbound object created previously
  • Add New Entry and specify the header attributes and values

 

Header attribute | Header attribute value | Example
ACI_PACK_BDCP_COMMIT | Flag for change pointer creation. If set, change pointers are generated for every entry, and a message type has to be maintained in the outbound object. Caution: this may heavily impact performance. | X
ACI_PACK_TABLE | Name of the table to take the key fields from. This is typically different from the DB view specified in ‘ACI_VIEW’, as packages are built based on the header object only, while the DB view typically also contains sub-objects. | VBAK
ACI_PACK_TRY | Number of attempts to get a new resource from the server group (typically choose a very high number here). | 100000
ACI_PACK_RETRY_TIME | Time in seconds during which the framework attempts to get a new resource from the server group. Note: this is not yet released, use ACI_PACK_TRY instead. |
ACI_PACK_WHERE_COND | Condition that is applied to the table defined in ‘ACI_PACK_TABLE’. |
ACI_PACK_SIZE | Number of entries to send per package. | 500
ACI_PACK_KEY_LENGTH | Length of the key to use from the ACI_PACK_TABLE (e.g. MANDT + MATNR). | 13
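To make the interplay of these attributes more tangible, the following is a purely conceptual Python sketch of what a packed load does: read the keys of the header table (ACI_PACK_TABLE), optionally restricted by ACI_PACK_WHERE_COND, cut them to ACI_PACK_KEY_LENGTH, and dispatch them in packages of ACI_PACK_SIZE entries. The actual implementation is ABAP inside the Add-on; all names and values here are placeholders for illustration only.

from typing import Iterable, List

def build_packages(keys: Iterable[str], key_length: int, pack_size: int) -> List[List[str]]:
    """Conceptual illustration of packed-load packaging (not the Add-on's ABAP code).

    keys       – primary keys read from the table named in ACI_PACK_TABLE
                 (optionally filtered by ACI_PACK_WHERE_COND)
    key_length – ACI_PACK_KEY_LENGTH: number of leading key characters used for packaging
    pack_size  – ACI_PACK_SIZE: number of entries per package
    """
    # Reduce each key to its leading part and de-duplicate while keeping order.
    seen, prefixes = set(), []
    for key in keys:
        prefix = key[:key_length]
        if prefix not in seen:
            seen.add(prefix)
            prefixes.append(prefix)

    # Split the distinct key prefixes into packages of pack_size entries each.
    return [prefixes[i:i + pack_size] for i in range(0, len(prefixes), pack_size)]

# Example with made-up keys of length 13, split into packages of 2 entries.
example_keys = ["1000000012345", "1000000012346", "1000000012347"]
print(build_packages(example_keys, key_length=13, pack_size=2))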

 

Execute the initial load

Warning
Depending on the amount of data, this can stress the SAP system servers immensely.
Always consult your Basis team for the correct server group to use!

  • Transaction: /ASADEV/ACI
  • Select the Connection and hit enter
  • Select Upload Type: P
  • Select Replication Object 
  • Select a Servergroup (this is mandatory)

Monitoring, traces and logs

The ASAPIO Integration Add-on Monitor allows monitoring of all outbound and inbound messages of the Add-on, with the following features:

  1. View statistical and graphical analysis of data volume, times, and errors
  2. Logging of HTTP return codes and messages
  3. Logging of requests (RAW data) can be switched on/off in application customizing (IMG)
  4. Retransmission control through SAP change pointers (to ensure event delivery) if errors occur
  5. Notification and/or escalation to system administrators (or through SAP Workflow) if errors occur

You can find detailed information here: Monitoring Documentation

Import CSV files into Lakehouse

A generic Fabric notebook can be used to automatically import CSV files into a Lakehouse database.

Create a Fabric Notebook

Create a new notebook in your Lakehouse and use the example code below.

  • Go to your Fabric Workspace
  • Use New and Import notebook
  • Upload the ASAPIO Notebook file

Configure the Notebook

To use the notebook, first adapt the file name pattern, the path to the CSV files and the path to the Delta table.

Variable to change | Description
path_to_csv_files | Path of the CSV file folder
path_to_delta_table | Path of the Delta table (database)
pattern | Regular expression for the CSV file names; adapt the file-name part to match the File Name configured in SAP.
pattern_with_groups | Same pattern with capture groups (used for sorting); adapt it in the same way.
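As an illustration, assume the File Name configured in the SAP outbound object were, say, material_master instead of aci_sales_order (a hypothetical name), while the date/time/suffix part of the default naming shown in the example code is kept. The two patterns would then be adapted like this:

import re

# Hypothetical file name prefix configured in the SAP outbound object ("File Name" field).
# Only this prefix changes; the date/time/suffix part matches the default naming used in the example code below.
pattern = re.compile(r'material_master_\d{8}_\d{6}_[a-f0-9]{6}\.csv')
pattern_with_groups = re.compile(r'material_master_(\d{8})_(\d{6})_([a-f0-9]{6})\.csv')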

Schedule the Notebook as a job

You can schedule the notebook as a job, e.g. hourly.

  • Go to tab Run and use Schedule
  • Configure to your plan and use Apply

 

Example code:

from pyspark.sql import SparkSession
from delta.tables import DeltaTable
import re

# Initialize a SparkSession with Hive support enabled.
spark = SparkSession.builder.appName("LakehouseUpdater").enableHiveSupport().getOrCreate()

# Specify the path to the CSV files.
path_to_csv_files = "abfss://AsapioDemoWorkspace@onelake.dfs.fabric.microsoft.com/AsapioDemoLakehouse.Lakehouse/Files/SalesOrder"

# Specify the path to the Delta table.
path_to_delta_table = "abfss://AsapioDemoWorkspace@onelake.dfs.fabric.microsoft.com/AsapioDemoLakehouse.Lakehouse/Tables/aci_sales_order"

# Compile a regex pattern to filter the file names.
pattern = re.compile(r'aci_sales_order_\d{8}_\d{6}_[a-f0-9]{6}\.csv')

# Define the pattern with groups for sorting based on date and time.
pattern_with_groups = re.compile(r'aci_sales_order_(\d{8})_(\d{6})_([a-f0-9]{6})\.csv')

# Retrieve a list of files (mssparkutils is provided by the Fabric notebook runtime).
files = mssparkutils.fs.ls(path_to_csv_files)

# Collect the files whose names match the expected pattern.
filtered_files = [file for file in files if pattern.match(file.name)]

# Sort the filtered files chronologically based on the date/time groups in the file name.
filtered_files.sort(key=lambda file: pattern_with_groups.match(file.name).groups())

# Create a DeltaTable object for the specified Delta table.
delta_table = DeltaTable.forPath(spark, path_to_delta_table)

# Loop over each file in the filtered, sorted list.
for file in filtered_files:

    # Load the content of the file into an RDD.
    rdd = spark.sparkContext.textFile(file.path)

    # Retrieve the first line, which contains the key field names.
    first_line = rdd.first()

    # Extract the field names from the first line (format: "<prefix>=FIELD1,FIELD2,...").
    _, field_names_str = first_line.split("=")
    field_names = field_names_str.split(",")

    # Construct the merge condition based on the key field names.
    condition_format = " AND ".join([f"old_data.{field} = new_data.{field}" for field in field_names])

    # Create a new RDD without the first line.
    rdd_without_first_line = rdd.filter(lambda line: line != first_line)

    # Convert the RDD to a DataFrame, now that the non-CSV-compliant first line is removed.
    df = spark.read.csv(rdd_without_first_line, header=True, inferSchema=True)

    # Perform a merge operation between the existing Delta table and the new data,
    # updating existing records and inserting new ones as necessary.
    (delta_table
        .alias("old_data")
        .merge(df.alias("new_data"), condition_format)
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )

    # Remove the processed file.
    mssparkutils.fs.rm(file.path, True)