Friday, July 25, 2025

Raise Fault Policy | API Management

What is Raise Fault Policy..?

The RaiseFault policy allows you to create custom messages in case of error conditions. This policy returns a FaultResponse to the requesting application if it encounters an error condition.

 A FaultResponse can consist of HTTP headers, query parameters, and a message payload. These elements can be populated using variables. This enables you to send customized FaultResponses that are specific to the error conditions.

During execution, the RaiseFault policy transfers the message flow to the default ErrorFlow, which in turn returns the designated FaultResponse to the requesting application.

When the message flow switches to the default ErrorFlow, no further policy processing occurs. All remaining processing steps are bypassed, and the FaultResponse is returned directly to the requesting app.

Example:

We will create an API Proxy, specify the endpoint URL, and then try to access it from Postman using a username/password.

If the login credentials are correct, we will get the response from the endpoint; otherwise, we will get the custom error message from the RaiseFault policy.

API Proxy Creation:


End Point URL: https://dummy.restapiexample.com/api/v1/employees

(replace it with your URL)

Save it and go to Policies

Keep Basic Authentication Policy and Raise Fault Policy as shown below..



For Basic Authentication Policy:

<BasicAuthentication async='true' continueOnError='false' enabled='true' xmlns='http://www.sap.com/apimgmt'>
	<Operation>Decode</Operation>
	<IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
	<User ref='current.username'></User>
	<Password ref='current.password'></Password>
	<Source>request.header.Authorization</Source>
</BasicAuthentication>

For Raise Fault Policy:

<RaiseFault async="true" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
    <FaultResponse>
        <Set>
            <Headers/>
            <Payload contentType="application/json">{"status" : "Error", "message" : "401 Invalid User or Password", "Suggestion" : "Try with correct user name / password"} </Payload>
            <StatusCode>401</StatusCode>
            <ReasonPhrase>Unauthorized</ReasonPhrase>
        </Set>
    </FaultResponse>
    <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
</RaiseFault>



Specify Condition String:
(current.username != "YOUR_USERNAME")  OR (current.password != "YOUR_PASSWORD")
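With this condition in place, any request whose credentials do not match should receive roughly the FaultResponse defined above, for example:

HTTP/1.1 401 Unauthorized

{"status" : "Error", "message" : "401 Invalid User or Password", "Suggestion" : "Try with correct user name / password"}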


Save it and deploy it.

Now test it from Postman with correct login credentials.


With incorrect logins


Based on your requirement, you can customize RaiseFault Policy.

That's it.

Source: SAP Community

Thanks for reading :-)

Wednesday, July 2, 2025

Solace(EventMesh Replica) Setup

In this blog, I am going to explain how to set up a Solace (EventMesh replica) environment, establish connectivity with CPI, and walk through a small Pub/Sub case study.

First, setup a Solace account with your mail account.

https://console.solace.cloud/login

Upon successful login,

Click on Cluster Manager and create a service ..(it's a self-explanatory process).

Click on service-name..


Go to Manage, where you can see Clients, Queues, etc.

Click on Queues, where you can create and manage queues.


Click on edit to change the settings..


Click on Queue and then create a Topic if you want it.

Connection details, which should be used to connect from CPI


Now, create a small iFlow ..

Specify the connection parameters for the AdvancedEventMesh connector.
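For reference, the parameters typically needed here look roughly like the following; take the actual values from the "Connect" tab of your Solace service (everything below is a placeholder, not a real host or credential):

Host: tcps://<your-service-hostname>:55443
Message VPN: <your-message-vpn>
Authentication: Basic
Username: <your-solace-client-username>
Password: <stored as a secure parameter / credential in CPI>
Queue Name: <the queue created above>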



Now, save and deploy the iFlow.

Upon execution, you can see the payload/messages in the queue (Solace).


That's all for Publishing part.

                                                                           *********

Now, build a Subscription iFlow.


Specify the connection parameters as used on the publishing side...


You need to deploy the iFlow and test it.

Thanks for reading :-)

Tuesday, July 1, 2025

Value - Mapping

In this blog, I am going to explain what Value Mapping is, why it is important, and how to implement it.

Value Mapping in SAP CPI is a critical feature that enables seamless and consistent data transformation between disparate systems in an integration landscape.

  • In integration scenarios, you frequently encounter cases where a "country code" might be "US" in one system (source) and "USA" in another (target).
  • Similarly, "material type" could be "ROH" in SAP ERP and "Raw Material" in a manufacturing execution system. 

Without a centralized mapping, data becomes inconsistent, leading to errors, incorrect reporting, and operational inefficiencies.

Various features of Value-Mapping:

  • Centralized Management and Governance: Value Mappings are stored as separate artifacts within integration packages, providing a structured way to manage and govern conversion rules. This centralization aids in documentation, auditing, and ensuring adherence to enterprise-wide data standards.
  • Improved Maintainability and Agility: Value Mappings allow for dynamic updates without necessarily changing the iFlow's core logic. Business users or functional consultants (with appropriate access) can often manage these mappings directly in the CPI tenant, empowering them to react quickly to changes without requiring full developer involvement for every minor adjustment.
  • Enhanced Reusability: Value Mappings are reusable artifacts. Once defined, they can be used across any message mapping within the same integration package, promoting consistency and reducing development effort. If a value changes (e.g., a new country code is introduced), you only need to update the Value Mapping artifact once, and all dependent iFlows automatically inherit the change upon deployment.
  • Bidirectional Mapping: Value Mapping in CPI supports bidirectional lookups.  This means you can not only map from a source value to a target value but also, if needed, reverse the lookup from the target value back to its original source value, which can be useful in certain integration scenarios or for reconciliation purposes.
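In addition to the features above, a deployed Value Mapping can also be consumed programmatically from a Groovy script step, which is handy when you need the same conversion outside a message mapping. Below is a minimal sketch; the agency and identifier names (SourceAgency/TargetAgency, Country) and the property names are my own assumptions and must match whatever you maintain in the Value Mapping artifact:

import com.sap.gateway.ip.core.customdev.util.Message
import com.sap.it.api.ITApiFactory
import com.sap.it.api.mapping.ValueMappingApi

def Message processData(Message message) {
    def api = ITApiFactory.getApi(ValueMappingApi.class, null)

    // look up the target value for an incoming country name
    def source = (message.getProperty('countryName') ?: 'INDIA') as String
    def mapped = api.getMappedValue('SourceAgency', 'Country', source, 'TargetAgency', 'Country')

    // fall back to a default when no entry exists (similar to the "Use Default Value" option)
    message.setProperty('countryCode', mapped ?: '-NA-')
    return message
}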
First, create the Value Mapping artifact:
Under Package <sample> > Artifacts > Add > Value Mapping



Then, save and deploy it.

Create a simple iflow ...





Under Message Mapping, specify the source metadata and target metadata; here I used the same metadata for both.

<?xml version="1.0" encoding="utf-8"?>
<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="emprecords">
    <xs:complexType>
      <xs:sequence>
        <xs:element maxOccurs="unbounded" name="emprecord">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="empid" type="xs:unsignedShort" />
              <xs:element name="emplname" type="xs:string" />
  <xs:element name="empfname" type="xs:string" />
              <xs:element name="empage" type="xs:unsignedByte" />
              <xs:element name="emporigin" type="xs:string" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>



Select Conversion > Value Mapping and specify the Source Agency / Target Agency and their identifiers.


On Failure defines what happens when no matching value is found for the incoming field.
You can select one of the below options based on your requirement:
  • Use Key
  • Use Default Value
  • Throw Exception

Now, test the iFlow with the input given below and you should get the output shown.

Input Data/Payload:

<?xml version="1.0"?>
<emprecords>
    <emprecord>
        <empid>4001</empid>
        <emplname>Paul</emplname>
        <empfname>David</empfname>
        <empage>20</empage>
        <emporigin>INDIA</emporigin>
    </emprecord>
    <emprecord>
        <empid>4002</empid>
        <emplname>PP</emplname>
        <empfname>Ramesh</empfname>
        <empage>20</empage>
        <emporigin>Germany</emporigin>
    </emprecord>
    <emprecord>
        <empid>4003</empid>
        <emplname>DD</emplname>
        <empfname>Kiran</empfname>
        <empage>20</empage>
        <emporigin>United States</emporigin>
    </emprecord>
    <emprecord>
        <empid>4004</empid>
        <emplname>DD</emplname>
        <empfname>Desh</empfname>
        <empage>20</empage>
        <emporigin>India</emporigin>
    </emprecord>
    <emprecord>
        <empid>4005</empid>
        <emplname>DD</emplname>
        <empfname>Raju</empfname>
        <empage>20</empage>
        <emporigin>Swiss</emporigin>
    </emprecord>
    <emprecord>
        <empid>4006</empid>
        <emplname>DD</emplname>
        <empfname>Pavan</empfname>
        <empage>20</empage>
        <emporigin>Brazil</emporigin>
    </emprecord>
</emprecords>


Output Data/Payload:

<?xml version="1.0" encoding="UTF-8"?>

<emprecords>
    <emprecord>
        <empid>4001</empid>
        <emplname>Paul</emplname>
        <empfname>David</empfname>
        <empage>20</empage>
        <emporigin>IN</emporigin>
    </emprecord>
    <emprecord>
        <empid>4002</empid>
        <emplname>PP</emplname>
        <empfname>Ramesh</empfname>
        <empage>20</empage>
        <emporigin>GM</emporigin>
    </emprecord>
    <emprecord>
        <empid>4003</empid>
        <emplname>DD</emplname>
        <empfname>Kiran</empfname>
        <empage>20</empage>
        <emporigin>-NA-</emporigin>
    </emprecord>
    <emprecord>
        <empid>4004</empid>
        <emplname>DD</emplname>
        <empfname>Desh</empfname>
        <empage>20</empage>
        <emporigin>IN</emporigin>
    </emprecord>
    <emprecord>
        <empid>4005</empid>
        <emplname>DD</emplname>
        <empfname>Raju</empfname>
        <empage>20</empage>
        <emporigin>-NA-</emporigin>
    </emprecord>
    <emprecord>
        <empid>4006</empid>
        <emplname>DD</emplname>
        <empfname>Pavan</empfname>
        <empage>20</empage>
        <emporigin>BLR</emporigin>
    </emprecord>
</emprecords>

Sample Output:


Note: We can achieve the same thing with the Fix Values feature. However, with Fix Values you need to update the iFlow (and redeploy it) whenever you add, update, or delete a key/value pair, so there will be downtime. Use Value Mapping instead.

Under Conversions> Fix Values or Value Mapping


Thanks for reading :-)

Optimizing Resource Consumption

In this blog, I am going to explain the process of inspecting resource consumption in SAP CPI by using the Inspect Resource Consumption dashboard. 

By analyzing key metrics such as database connections, data store usage, and monitoring storage, you can identify potential issues and then optimize performance.

Dashboard Overview: 

The Inspect Resource Consumption dashboard provides a snapshot of essential metrics and it displays various tiles that represent the consumption levels of different resources in your tenant. Each tile shows the overall consumption of a resource over the past 24 hours.


Click on any tile (connections/data store) for detailed information about consumption..



Database Resources

Inspect the usage of integration resources associated with the tenant database and with the database connection pool.

  • Connections: you can inspect resource usage of the database connections caused by integration flows.
  • Data Store: you can inspect resource usage of the tenant database caused by integration flows using data store operations steps.
  • Transactions: you can inspect resource usage of the database transactions caused by integration flows.
  • Monitoring Storage: you can inspect resource usage of the monitoring database storage caused by integration flows.

System Resources
Inspect system resource usage caused by your active integration flows.
  • Memory: you can inspect resource usage of system memory caused by integration flows.
  • Temporary Storage: you can inspect the storage usage of temporary files.
Content Resources
Inspect content resource usage caused by your integration artifacts.
  • Content Size: you can inspect the file size that integration artifacts occupy.
  • Integration Flows: you can inspect the integration flows with the highest resource consumption.
Important Points:
  • Resources should be monitored during load / performance testing.
  • The Level 2 / support team should monitor the resources in the PROD tenant regularly.

Thanks for reading :-)

SFTP to DB (Multiple line items)

In this blog, I am going to explain how to retrieve the data/payload from SFTP and send it to DB (PostgreSQL).



Step 1:

Establish connectivity with SFTP

Get the SFTP server, port, username, and password from the infra team / client team.

In my example, 

Location: demo.wftpserver.com

Username: demo

Password: demo

FTP Port: 21

FTPS Port: 990

SFTP Port: 2222





Schedule it as per the requirement.
The time zone is very important if you want to schedule the flow at a particular time.



Step 2:

Establish connectivity with DB (PostgreSQL)

Refer to my blog on how to establish the connectivity, including sample queries:

JDBC - blog


Sender Side:

Employee XML

<?xml version="1.0"?>

<emprecords>

  <emprecord>

    <empid>4001</empid>

    <emplname>Paul</emplname>

    <empfname>David</empfname>

    <empage>20</empage>

  </emprecord>

  <emprecord>

    <empid>4002</empid>

    <emplname>PP</emplname>

    <empfname>Ramesh</empfname>

    <empage>20</empage>

  </emprecord>

  <emprecord>

    <empid>4003</empid>

    <emplname>DD</emplname>

    <empfname>Kiran</empfname>

    <empage>20</empage>

  </emprecord>

</emprecords>

Employee XSD

<?xml version="1.0" encoding="utf-8"?>

<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">

  <xs:element name="emprecords">

    <xs:complexType>

      <xs:sequence>

        <xs:element maxOccurs="unbounded" name="emprecord">

          <xs:complexType>

            <xs:sequence>

              <xs:element name="empid" type="xs:unsignedShort" />

              <xs:element name="emplname" type="xs:string" />

  <xs:element name="empfname" type="xs:string" />

              <xs:element name="empage" type="xs:unsignedByte" />

            </xs:sequence>

          </xs:complexType>

        </xs:element>

      </xs:sequence>

    </xs:complexType>

  </xs:element>

</xs:schema>

Receiver Side:

Receiver JDBC - XSD

<?xml version="1.0" encoding="utf-8"?>

<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">

  <xs:element name="root">

    <xs:complexType>

      <xs:sequence>

        <xs:element name="StatementName">

          <xs:complexType>

            <xs:sequence>

              <xs:element name="dbTableName">

                <xs:complexType>

                  <xs:sequence>

                    <xs:element name="table" type="xs:string" />

                    <xs:element maxOccurs="unbounded" name="access">

                      <xs:complexType>

                        <xs:sequence>

                          <xs:element name="empid" type="xs:unsignedByte" />

                          <xs:element name="emplname" type="xs:string" />

                          <xs:element name="empfname" type="xs:string" />

                          <xs:element name="empage" type="xs:unsignedByte" />

                        </xs:sequence>

                      </xs:complexType>

                    </xs:element>

                  </xs:sequence>

                  <xs:attribute name="action" type="xs:string" use="required" />

                </xs:complexType>

              </xs:element>

            </xs:sequence>

          </xs:complexType>

        </xs:element>

      </xs:sequence>

    </xs:complexType>

  </xs:element>

</xs:schema>

Message Mapping:

Upload the sender-side XSD as the source payload structure and the target-side XSD as the target structure...


Note: The DB action must be specified as INSERT, UPDATE, or DELETE.
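For reference, after the mapping, the message handed to the JDBC receiver should look roughly like this; the table name "employee" and the INSERT action are assumptions, so use your actual table and action:

<root>
  <StatementName>
    <dbTableName action="INSERT">
      <table>employee</table>
      <access>
        <empid>4001</empid>
        <emplname>Paul</emplname>
        <empfname>David</empfname>
        <empage>20</empage>
      </access>
      <access>
        <empid>4002</empid>
        <emplname>PP</emplname>
        <empfname>Ramesh</empfname>
        <empage>20</empage>
      </access>
      <access>
        <empid>4003</empid>
        <emplname>DD</emplname>
        <empfname>Kiran</empfname>
        <empage>20</empage>
      </access>
    </dbTableName>
  </StatementName>
</root>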




Now, deploy the iflow and test it.

Note: Link all the steps, double-check the connection parameters, and ensure that the source and target systems are up and running; otherwise, you may encounter connectivity-related exceptions.

Thanks for reading :-)

 


Monday, June 30, 2025

Transaction Handling

In this blog, we will see what Transaction Handling is and how to use it.

Transaction-Handling:

It refers to the management of a series of operations so they behave as a single unit of work — where either all steps succeed or the process handles failures gracefully to maintain data integrity.

When transaction handling is enabled, locks are held on resources (e.g., database records, queue entries) for the duration of the transaction. While this is necessary for consistency, effective transaction management encourages short, well-defined transactions. This minimizes the time resources are locked, reducing contention and improving overall system performance and scalability.

Options:

  • No Transaction: Each step is committed independently. Only suitable for very simple, non-critical scenarios where data consistency is not paramount.
  • Required for JDBC: This ensures that all operations interacting with CPI's internal database (e.g., Data Store operations, Write Variables used with Data Stores) are part of a single transaction. If one fails, all are rolled back.
  • Required for JMS: Similarly, this applies transactional behavior to operations involving JMS queues managed by CPI.

Main Iflow:

Local / Child Iflow:


Transactional processing means that the message (as defined by the steps contained in a process) is processed within one transaction.

For example, consider a process with a Data Store Write operation. If transaction handling is activated, the Data Store entry is only committed if the whole process is executed successfully. In an error case, the transaction is rolled back and the Data Store entry isn't written. If transaction handling is deactivated, the Data Store entry is committed directly when the integration flow step is executed. In an error case, the Data Store entry is nevertheless persisted (and not removed or rolled back).

Note:

  • It's crucial to understand that CPI's transaction handling primarily covers its internal resources (like Data Stores and JMS queues). 
  • It does not provide a distributed transaction coordinator (like a two-phase commit) across heterogeneous external systems (e.g., atomically committing changes in an SAP ERP system and an external Salesforce system). 

For such complex cross-system consistency, you often need to implement patterns like Saga patterns, compensating transactions, or ensure idempotent operations on the external systems.

In conclusion, transaction handling in SAP CPI is a critical enabler for building reliable, fault-tolerant, and data-consistent integration solutions that can gracefully handle errors and maintain the integrity of your business data across interconnected systems.


Source: community.sap.com

Thanks for reading :-)


Data Store | Importance

In this blog, I am going to explain what the Data Store is, its use cases, and its various operations.

Data Store plays an important role in handling persisted messages or data during integration flows. It allows temporary or long-term storage of messages in a structured way. 

Importance and use cases:

Message Persistence

  • Stores messages that may need to be retried in case of failure (e.g., during asynchronous communication).
  • Useful in scenarios where message delivery guarantees (like at-least-once delivery) are required.

Decoupling Processing Logic

  • Enables decoupling between sender and receiver systems.
  • Data can be collected from multiple sources and stored, then processed later in batches or at scheduled times.

Reliable Integration

  • Acts as a temporary queue or buffer, improving the reliability of the integration process.
  • Helps in recovery from transient failures, since messages can be reprocessed from the data store.

Error Handling and Retry Mechanism

  • Failed messages can be stored for later inspection or manual/automated retry.
  • Greatly enhances monitoring and troubleshooting capability.

Audit and Tracking

  • Supports audit trails, allowing you to track message payloads for compliance or debugging.
  • Messages stored can be indexed with metadata (e.g., ID, timestamp) for searching and reporting.



Considerations:
  • Storage Limits: Be mindful of the overall storage limit (e.g., 32 GB) for data on your CPI tenant. Data stores are for temporary storage, not for long-term archiving or acting as a persistent database.
  • Performance: While data stores offer performance benefits through decoupling, excessive or inefficient use of SELECT operations on very large data stores can impact performance.
  • Security: CPI provides options to encrypt stored messages for enhanced security.

Types of operations:

  1. Write
  2. Get
  3. Select
  4. Delete
Write Operation: It stores entries in the data store.



  • Data Store Name: Name of the data store where you want to store the message.
  • Visibility: Indicates if the data store is only visible within the integration flow where it is defined or Global - to all the integration flows that are deployed on that tenant.
  • Entry ID: A unique identifier to identify the entry. It can be picked from a header or from the message using XPath. If nothing is provided, the system generates a random GUID for this entry.
  • Retention Threshold for Alerting (in d): Defines the number of days within which the entry must be read; otherwise, an alert is generated in the cloud platform account. Default: 2 days.
  • Expiration Period (in d): The number of days (from when the entry is written into the data store) after which the entry must be deleted from the data store.
  • Encrypt Stored Message: Check this if you want to encrypt the message before storing.
  • Overwrite Existing Message:  Check this if you want to overwrite a message (if one already exists) with the same Entry ID in this data store.

Get Operation: To retrieve one entry from the data store



  • Data Store Name:  The name of the data store from where the entry needs to be retrieved.
  • Visibility: indicates if the data store is visible only to this integration flow or to all the integration flows on the tenant.
  • Entry ID: the unique identifier of the entry you want to retrieve from the data store. If this is left empty, the last entry that was added to the data store is retrieved. It can also be picked from a header or from the message using XPath.
  • Delete On Completion: check this if you want to delete the entry from the data store after retrieving.


Select Operation: To retrieve multiple messages from the data store in one bulk.


  • Data Store Name: The name of the data store from where the entries need to be retrieved
  • Visibility: indicates if the data store is visible only to this integration flow or to all the integration flows on the tenant.
  • Number of Polled Messages: enter the number of messages that you would like to retrieve from the data store.
  • Delete On Completion: check this if you want to delete the entry from the data store after retrieving.


Delete Operation: To delete messages from the data store.


  • Data Store Name:  The name of the data store from where an entry needs to be deleted.
  • Visibility: indicates if the data store is visible only to this integration flow or to all the integration flows on the tenant.
  • Entry ID: the unique identifier of the entry you want to delete from the data store. If this is left empty, the last entry that was added to the data store is deleted. It can also be picked from a header or from the message using XPath.

Notes:

  • Never use the name sap_global_store for your data store, because it is used by the system to store variables created by the Write Variable step.
  • If the message processing fails, the transaction is rolled back and the entry added to the data store will be deleted.
  • If you try to write an entry to the data store with an entry ID that already exists and "Overwrite Existing Message" is not checked, the message processing will fail.

Thanks for reading :-)

Groovy - Script

Introduction

Groovy Script is an important component of the SAP Integration Suite, especially in SAP Cloud Platform Integration (CPI). It's a dynamic scripting language for the Java platform that enhances the productivity of integration developers.

Purpose: Groovy Script is used to implement custom logic during the integration process or mapping. It provides flexibility to manipulate messages, transform data formats, and handle complex integration flows.

Integration: It seamlessly integrates with Java, allowing developers to use existing Java libraries and frameworks in their scripts.

Features: Groovy offers features like easy-to-read syntax, error handling, and native support for XML and JSON, making it ideal for integration jobs.

Use Cases: Common use cases in SAP CPI include message mapping, data transformation, content enrichment and dynamic routing.

Why Groovy Script.?

  • Instead of Groovy script, you could also choose JavaScript. But Groovy is built on the Java platform, allowing seamless integration with Java libraries and functionality, which is crucial in Java-heavy SAP environments.
  • Groovy has excellent built-in support for XML and JSON, simplifying parsing and manipulation of these data formats in integration processes.
  • Groovy provides superior error handling, especially useful in managing integration-specific exceptions.

Common use cases..
  • Dynamic Header and Property manipulation
  • Content transformation like XML to JSON, XML to CSV or vice-versa
  • Conditional Logic
  • Payload Enrichment or cleanup
  • Custom Logging
  • Data Mapping and Lookup
  • Custom Error Handling
  • Signature Generation
  • Date and Time Calculation
  • Complex math calculation

Build a simple flow, place a Groovy Script step in the iFlow, and copy/paste the following script:
import com.sap.gateway.ip.core.customdev.util.Message
 
def Message processData(Message message) {
    //code here
    message.setBody('Hello World')
    return message
}


Then, deploy the flow and test it.
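As a further illustration of the use cases listed above (dynamic header/property manipulation and custom logging), here is a minimal sketch; the header and property names are my own examples, not fixed CPI names:

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // read the incoming body and headers
    def body = message.getBody(String) ?: ''
    def headers = message.getHeaders()

    // set a property and a header dynamically (names are illustrative)
    message.setProperty('originalBodyLength', String.valueOf(body.length()))
    message.setHeader('X-Source-System', headers.get('SourceSystem') ?: 'UNKNOWN')

    // attach a custom entry to the message processing log (MPL) for troubleshooting
    def messageLog = messageLogFactory.getMessageLog(message)
    if (messageLog != null) {
        messageLog.addAttachmentAsString('IncomingPayload', body, 'text/plain')
    }
    return message
}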

Ref.. for more examples:


XSLT - Mapping

In this blog, we will see what XSLT mapping is and how to use it in SAP CPI.

What is XSLT?
XSLT stands for Extensible Stylesheet Language Transformations.
XSLT is a language designed for transforming XML documents. In SAP Cloud Integration, it allows for dynamic modifications to messages, including altering headers and properties. This capability is particularly useful for applying conditional logic or handling complex transformations that go beyond simple field mapping.

Benefits of XSLT Mapping in SAP CPI

  • Data Consistency: Data consistency can be achieved by XSLT mapping, which is one of its main benefits. You can preserve the integrity of your data as it flows across various systems and apps by establishing precise rules for transformation.
  • Flexibility: Data transformations are flexible when using XSLT mapping. The mapping rules can be adjusted and changed as your company's needs.
  • Integration Capabilities: The smooth integration of SAP CPI with multiple systems is a key feature. This integration is made possible through XSLT mapping, which makes sure that data is formatted correctly for each target system.
  • Efficiency: A crucial component of XSLT mapping is efficiency. Processes for transforming data are streamlined, which lowers the chance of mistakes.

Example:
On the target side (in the output), we want employees whose age is not mentioned to be excluded from the XML. In this case we use XSLT mapping.

Incoming payload is:

<root>
<employee>
<name>Jay</name>
<age>20</age>
</employee>
<employee>
<name>Kiran</name>
</employee>
<employee>
<name>Kisan</name>
<age>21</age>
</employee>
</root>

XSLT:
<xsl:stylesheet version="2.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<xsl:template match="node()|@*">
<xsl:copy>
<xsl:apply-templates select="node()|@*"/>
</xsl:copy>
</xsl:template>
<xsl:template match = "employee[not(Age)]"/>
</xsl:stylesheet>


Save the iflow and deploy it. Test it from Postman.



In the response payload, an employee's details are excluded only when <age> is missing from the incoming payload.
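For the sample input above, the expected response payload would be the following (Kiran's record, which has no <age>, is removed):

<root>
<employee>
<name>Jay</name>
<age>20</age>
</employee>
<employee>
<name>Kisan</name>
<age>21</age>
</employee>
</root>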

Source: community.sap.com

Thanks for reading :-)

Externalization / External Parameters - How to use it

Externalization refers to the practice of storing configuration parameters, credentials, and other dynamic values outside the integration flow (iFlow) itself. This approach provides several important benefits and is crucial for maintainable, secure, and scalable integration design.

Various Features:

Promotes Reusability

  • By externalizing values like endpoints, credentials, or message headers, you can reuse the same iFlow across different environments (Dev, Test, Prod) without modifying the iFlow logic.
  • This minimizes duplication and development effort.

Environment-Specific Configurations

  • Externalized parameters allow you to configure iFlows differently for each environment using configuration values defined in the “Configuration” tab of the iFlow.
  • Example: Different API keys, URLs, or proxy settings for Dev and Prod.

Separation of Logic and Configuration

  • Keeps business logic clean and separate from configuration data.
  • Makes integration flows easier to understand and maintain.

Simplifies Maintenance and Updates

  • Updates to configuration (e.g., endpoint changes) don’t require redeploying or modifying the iFlow.
  • You can just change the externalized value in the runtime configuration.

Improves Security

  • Sensitive values like passwords and tokens can be stored securely using:
    • Secure Parameters (in CPI)
    • Credential Artifacts (e.g., User Credentials or OAuth2 Client Credentials)

  • Reduces the risk of exposing credentials in code.

Supports Parameterization

  • Externalization enables dynamic behavior based on parameters (like setting headers or file names).
  • You can pass values into iFlows during runtime via headers or query parameters and map them using externalized parameters.

Facilitates DevOps and CI/CD

  • Makes it easier to integrate with DevOps pipelines by avoiding hardcoded values.
  • Configuration files or scripts can inject the right values during deployment.


How Externalization Works in SAP CPI:

When designing an iFlow, you can mark certain fields (e.g., an adapter's "Address" field, properties in a Content Modifier, or parameters in a script) as "externalized." This creates a parameter that can then be configured with a "Default Value" during design time. Once the iFlow is deployed, you can access the "Configure" view for that iFlow in the CPI tenant and provide "Configured Values" for these externalized parameters. These configured values override the default values at runtime for that specific tenant.
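As a small illustration: after externalizing a field, the adapter shows a token instead of the literal value, and that token is what you configure per tenant. The parameter name and URLs below are my own examples:

Address (in the adapter): {{Receiver_OData_Address}}
Default Value (design time): https://dev-system.example.com/odata/v2/Products
Configured Value (Configure view in PROD): https://prod-system.example.com/odata/v2/Products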

Let's build a small iFlow..


Click on OData connection..

Click on Externalize, as shown above..


Then, externalize the connection parameters that change based on the target environment (QA, PROD, etc.) and leave the remaining parameters (such as Timeout, CSRF Protected, etc.) as they are.

Then, save the iFlow and download it.


Now, go to the QA or PROD environment and upload the iFlow.

Select Upload and specify Integration Flow and then click on Add.


Then, click on "Configure" option and select Receiver (drop down) and enter the values as per the target environment (QA or PROD).


Recommendations:

1. Externalize all the fields of the integration flow that you envisioned and provide the appropriate default values.

2. Be sure not to provide the tenant/landscape-specific value as a default parameter value.

3. Validate the default value of parameters through validation checks. Saving the integration flow will run validation checks.

4. Always provide the tenant/landscape-specific value in the Configure view.

5. Before downloading the integration flow or exporting the content package, always leverage the benefit of the Externalized Parameters view to compare the default and configured value of parameters for quality assurance. Update the default parameter value from the Externalized Parameters view or externalization editor for any correction.

6. Download the integration flow with Default Values Only if you do not want to reuse the configurations of the source system while importing into the target system.

7. Download the integration flow from source system with Merged Configured and Default Values if you want to reuse the configurations of the source system while importing into the target system.


Source: community.sap.com

Thanks for reading :-)

 

APIM (API Management) - APIKey Policy

What is an API-M?

In SAP CPI (Cloud Platform Integration), APIM stands for API Management, which is a capability within the SAP Integration Suite. It is designed to help organizations create, publish, and manage APIs that connect applications and systems both within and outside the enterprise. APIM provides a centralized platform to secure APIs with policies (such as authentication, rate limiting, IP whitelisting), monitor API usage, control access, and expose APIs through developer portals for external consumption..

Key components: 

API Provider: Represents backend systems or services, such as an iFlow endpoint alias (which performs basic transformations or gets the data from backend systems like a DB, ECC, S/4HANA, Salesforce, or others) or any other system that exposes APIs.

API Proxy: The actual API interface for consumers / third-party systems, created to apply policies, transformations, and security before exposing the backend API URL.

API Product: A collection or bundle of one or more API proxies made available together to developers.

Security and Monitoring: Features like OAuth2, API keys, rate limiting, IP whitelisting, and built-in protections against attacks to secure APIs, plus analytics and monitoring to track API performance and usage.



Case Study:

1. Create an iFlow that calls the backend system to fetch the data, either by using any connector or an HTTP call.


Sample third party URL: (dummy)
https://dummy.restapiexample.com/api/v1/employees
(response is subject to URL availability)
Method : GET

Now, deploy it and test it.

2. Create API Provider 
Go to Configure > API, then go to API Provider and click on "Create".



Specify the details as shown below, save it and test the connection
(API  - Service Key details..)

3. Create API Proxy
    Click on Create 



Select API from drop down list and click on "Discover"


Select the required one and click on Next
Select OAuth and specify PRT Service Keys - Client ID/ Client-Secret, Token URL ..

Proxy URL will be created as shown below..




Now test the end-point from Postman...



This means anyone can access it. We need to protect it by using an application key (APIKey)...

For that, we need to create a Product..

4. Go to Engage and click Create..



Enter the Name, Title, Quota, and its interval as required.




Go to APIs, select the API, and then click on OK.
Then save it and publish it to the Developer Hub.



After successful Publish... you can see it in HUB



Click on the product, then create New Subscription for Application..



Specify the name and save it to get APIKey


This key is unique and identifies the end consumer of the API.



In order to accept requests based on the APIKey, we need to apply the "Verify API Key" policy.

Go to API Proxy, select the API and click on policies..



Accepting the API key through a header

 <!--Specify in the APIKey element where to look for the variable containing the api key--> 
<VerifyAPIKey async='true' continueOnError='false' enabled='true' 
xmlns='http://www.sap.com/apimgmt'>
    <APIKey ref='request.header.APIKey'/>
</VerifyAPIKey>

Accepting the API key through a query parameter

 <!--Specify in the APIKey element where to look for the variable containing the api key--> 
<VerifyAPIKey async='true' continueOnError='false' enabled='true' 
xmlns='http://www.sap.com/apimgmt'>
     <APIKey ref='request.queryparam.APIKey'/>
</VerifyAPIKey>

Save it and deploy the API Proxy.

Now test the API from Postman, passing the APIKey either as a header or as a query parameter...
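In plain terms, the two variants look like this; replace the host/path with your API proxy URL and the key with the application key from the subscription above:

As query parameter: GET https://<proxy-host>/<proxy-base-path>?APIKey=<application-key>
As header: GET https://<proxy-host>/<proxy-base-path> with request header APIKey: <application-key>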

APIKey as queryparam


APIKey as header


That's all.

If you want to see list of subscriptions for the Product..

Go to HUB, select the product..


You can also test from the API portal, and you can debug to identify issues with respect to policies/conversions, which is useful for complex flows.

Source: SAP Community.

Thanks for reading :-)
