Friday, July 25, 2025

Raise Fault Policy | API Management

What is the Raise Fault Policy?

The RaiseFault policy allows you to create custom messages in case of error conditions. This policy returns a FaultResponse to the requesting application if it encounters an error condition.

 A FaultResponse can consist of HTTP headers, query parameters, and a message payload. These elements can be populated using variables. This enables you to send customized FaultResponses that are specific to the error conditions.

During execution, the RaiseFault policy transfers the message flow to the default ErrorFlow, which in turn returns the designated FaultResponse to the requesting application.

When the message flow switches to the default ErrorFlow, no further policy processing occurs. All remaining processing steps are bypassed, and the FaultResponse is returned directly to the requesting app.

Example:

We will create an API proxy, specify the endpoint URL, and then try to access it from Postman using a username/password.

If the login credentials are correct, we will get a response from the endpoint; otherwise, we will get the custom error message from the RaiseFault policy.

API Proxy Creation:


Endpoint URL: https://dummy.restapiexample.com/api/v1/employees

(replace it with your URL)

Save it and go to Policies

Add the Basic Authentication policy and the Raise Fault policy as shown below.



For Basic Authentication Policy:

<BasicAuthentication async='true' continueOnError='false' enabled='true' xmlns='http://www.sap.com/apimgmt'>
	<Operation>Decode</Operation>
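	<!-- Decode extracts the username and password from the Basic Authorization header into the flow variables current.username and current.password -->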
	<IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
	<User ref='current.username'></User>
	<Password ref='current.password'></Password>
	<Source>request.header.Authorization</Source>
</BasicAuthentication>
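
When the client sends Basic credentials, they arrive Base64-encoded in the Authorization header, and the Decode operation above writes them into current.username and current.password. For example, with the (hypothetical) credentials demo/demo, Postman would send:

Authorization: Basic ZGVtbzpkZW1v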

For Raise Fault Policy:

<RaiseFault async="true" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
    <FaultResponse>
        <Set>
            <Headers/>
            <Payload contentType="application/json">{"status" : "Error", "message" : "401 Invalid User or Password", "Suggestion" : "Try with the correct username/password"}</Payload>
            <StatusCode>401</StatusCode>
            <ReasonPhrase>Unauthorized</ReasonPhrase>
        </Set>
    </FaultResponse>
    <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
</RaiseFault>



Specify Condition String:
(current.username != "YOUR_USERNAME")  OR (current.password != "YOUR_PASSWORD")


Save it and deploy it.

Now test it from Postman with the correct login credentials; you should get the employee data back from the endpoint.


With incorrect credentials, the RaiseFault condition evaluates to true and the custom fault is returned.
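
Based on the policy above, the fault response should look roughly like this (the body mirrors the Payload element; the exact headers may vary):

HTTP/1.1 401 Unauthorized
Content-Type: application/json

{"status" : "Error", "message" : "401 Invalid User or Password", "Suggestion" : "Try with the correct username/password"}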


You can customize the RaiseFault policy based on your requirements.

That's it.

Source: SAP Community

Thanks for reading :-)

Wednesday, July 2, 2025

Solace (EventMesh Replica) Setup

In this blog, I am going to explain how to set up a Solace (EventMesh replica) environment, establish connectivity with CPI, and walk through a small pub/sub case study.

First, sign up for a Solace account using your email address.

https://console.solace.cloud/login

Once you have logged in successfully, click on Cluster Manager and create a service (it's a self-explanatory process).

Click on the service name.


Go to Manage, where you can see Clients, Queues, etc.

Click on Queues, where you can create and manage queues.


Click on Edit to change the settings.


Click on the queue and then add a topic subscription if you need one.

Note down the connection details, which will be used to connect from CPI:


Now, create a small publishing iFlow.

Specify the connection parameters for the AdvancedEventMesh adapter, as sketched below.
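
As a rough sketch, assuming basic authentication (the exact field names can vary by adapter version, and every value below is a placeholder to be taken from the Solace service's connection details):

Host: tcps://<your-solace-host>:<smf-tls-port>
Message VPN: <your-message-vpn>
Authentication: Basic
Username: <client-username>
Password: <deployed credential alias for the client password>
Destination (Queue): <queue created above>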



Now, save and deploy the iFlow.

Upon execution, you can see the payload/messages in the Solace queue.


That's all for the publishing part.

                                                                           *********

Now, build a Subscription iFlow.


Specify the same connection parameters as on the publishing side (same host, message VPN, and credentials), with the queue created earlier as the source of the sender adapter.


You need to deploy the iFlow and test it.

Thanks for reading :-)

Tuesday, July 1, 2025

Value Mapping

In this blog, I am going to explain what value mapping is, why it is important, and how to implement it.

Value Mapping in SAP CPI is a critical feature that enables seamless and consistent data transformation between disparate systems in an integration landscape.

  • In integration scenarios, you frequently encounter cases where a country code might be "US" in one system (source) and "USA" in another (target).
  • Similarly, "material type" could be "ROH" in SAP ERP and "Raw Material" in a manufacturing execution system. 

Without a centralized mapping, data becomes inconsistent, leading to errors, incorrect reporting, and operational inefficiencies.

Key features of Value Mapping:

  • Centralized Management and Governance: Value Mappings are stored as separate artifacts within integration packages, providing a structured way to manage and govern conversion rules. This centralization aids in documentation, auditing, and ensuring adherence to enterprise-wide data standards.
  • Improved Maintainability and Agility: Value Mappings allow for dynamic updates without necessarily changing the iFlow's core logic. Business users or functional consultants (with appropriate access) can often manage these mappings directly in the CPI tenant, empowering them to react quickly to changes without requiring full developer involvement for every minor adjustment.
  • Enhanced Reusability: Value Mappings are reusable artifacts. Once defined, they can be used across any message mapping within the same integration package, promoting consistency and reducing development effort. If a value changes (e.g., a new country code is introduced), you only need to update the Value Mapping artifact once, and all dependent iFlows automatically inherit the change upon deployment.
  • Bidirectional Mapping: Value Mapping in CPI supports bidirectional lookups.  This means you can not only map from a source value to a target value but also, if needed, reverse the lookup from the target value back to its original source value, which can be useful in certain integration scenarios or for reconciliation purposes.

First, create the Value Mapping artifact:
Under Package (sample) > Artifacts > Add > Value Mapping



Then, save and deploy it.

Create a simple iFlow.





Under Message Mapping, specify the source metadata and target metadata; here I used the same XSD for both.

<?xml version="1.0" encoding="utf-8"?>
<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="emprecords">
    <xs:complexType>
      <xs:sequence>
        <xs:element maxOccurs="unbounded" name="emprecord">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="empid" type="xs:unsignedShort" />
              <xs:element name="emplname" type="xs:string" />
  <xs:element name="empfname" type="xs:string" />
              <xs:element name="empage" type="xs:unsignedByte" />
              <xs:element name="emporigin" type="xs:string" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>



Select Conversions > Value Mapping and specify the Source Agency / Target Agency and their identifiers.


On Failure controls what happens when no mapping entry is found for an incoming value.
You can select one of the options below, based on your requirement (see the example after this list):
  • Use Key
  • Use Default Value
  • Throw Exception
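
For reference, here is a sketch of the Value Mapping entries that would produce the output shown further below. The agency and identifier names are placeholders I chose for illustration; only the value pairs and the default are implied by the test data:

Source Agency: SourceSystem, Identifier: CountryName
Target Agency: TargetSystem, Identifier: CountryCode

  INDIA   -> IN
  India   -> IN
  Germany -> GM
  Brazil  -> BLR

On Failure: Use Default Value, with -NA- as the default for unmapped values (e.g., United States, Swiss).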

Now, test the iFlow with the input given below; you should get the output shown after it.

Input Data/Payload:

<?xml version="1.0"?>
<emprecords>
    <emprecord>
        <empid>4001</empid>
        <emplname>Paul</emplname>
        <empfname>David</empfname>
        <empage>20</empage>
        <emporigin>INDIA</emporigin>
    </emprecord>
    <emprecord>
        <empid>4002</empid>
        <emplname>PP</emplname>
        <empfname>Ramesh</empfname>
        <empage>20</empage>
        <emporigin>Germany</emporigin>
    </emprecord>
    <emprecord>
        <empid>4003</empid>
        <emplname>DD</emplname>
        <empfname>Kiran</empfname>
        <empage>20</empage>
        <emporigin>United States</emporigin>
    </emprecord>
    <emprecord>
        <empid>4004</empid>
        <emplname>DD</emplname>
        <empfname>Desh</empfname>
        <empage>20</empage>
        <emporigin>India</emporigin>
    </emprecord>
    <emprecord>
        <empid>4005</empid>
        <emplname>DD</emplname>
        <empfname>Raju</empfname>
        <empage>20</empage>
        <emporigin>Swiss</emporigin>
    </emprecord>
    <emprecord>
        <empid>4006</empid>
        <emplname>DD</emplname>
        <empfname>Pavan</empfname>
        <empage>20</empage>
        <emporigin>Brazil</emporigin>
    </emprecord>
</emprecords>


Output Data/Payload:

<?xml version="1.0" encoding="UTF-8"?>

<emprecords>
    <emprecord>
        <empid>4001</empid>
        <emplname>Paul</emplname>
        <empfname>David</empfname>
        <empage>20</empage>
        <emporigin>IN</emporigin>
    </emprecord>
    <emprecord>
        <empid>4002</empid>
        <emplname>PP</emplname>
        <empfname>Ramesh</empfname>
        <empage>20</empage>
        <emporigin>GM</emporigin>
    </emprecord>
    <emprecord>
        <empid>4003</empid>
        <emplname>DD</emplname>
        <empfname>Kiran</empfname>
        <empage>20</empage>
        <emporigin>-NA-</emporigin>
    </emprecord>
    <emprecord>
        <empid>4004</empid>
        <emplname>DD</emplname>
        <empfname>Desh</empfname>
        <empage>20</empage>
        <emporigin>IN</emporigin>
    </emprecord>
    <emprecord>
        <empid>4005</empid>
        <emplname>DD</emplname>
        <empfname>Raju</empfname>
        <empage>20</empage>
        <emporigin>-NA-</emporigin>
    </emprecord>
    <emprecord>
        <empid>4006</empid>
        <emplname>DD</emplname>
        <empfname>Pavan</empfname>
        <empage>20</empage>
        <emporigin>BLR</emporigin>
    </emprecord>
</emprecords>

Sample Output:


Note: We can achieve the same result with the Fixed Values feature as well. However, with Fixed Values you have to update the iFlow whenever you add, update, or delete a key/value pair, and then redeploy it, which means downtime. So use Value Mapping instead.

Under Conversions > Fixed Values or Value Mapping


Thanks for reading :-)

Optimizing Resource Consumption

In this blog, I am going to explain the process of inspecting resource consumption in SAP CPI by using the Inspect Resource Consumption dashboard. 

By analyzing key metrics such as database connections, data store usage, and monitoring storage, you can identify potential issues and then optimize performance.

Dashboard Overview: 

The Inspect Resource Consumption dashboard provides a snapshot of essential metrics and it displays various tiles that represent the consumption levels of different resources in your tenant. Each tile shows the overall consumption of a resource over the past 24 hours.


Click on any tile (e.g., Connections or Data Store) for detailed information about its consumption.



Database Resources

Inspect the usage of integration resources associated with the tenant database and with the database connection pool.

  • Connections: you can inspect resource usage of the database connections caused by integration flows.
  • Data Store: you can inspect resource usage of the tenant database caused by integration flows using data store operations steps.
  • Transactions: you can inspect resource usage of the database transactions caused by integration flows.
  • Monitoring Storage: you can inspect resource usage of the monitoring database storage caused by integration flows.

System Resources

Inspect system resource usage caused by your active integration flows.

  • Memory: you can inspect resource usage of system memory caused by integration flows.
  • Temporary Storage: you can inspect the storage usage of temporary files.

Content Resources

Inspect content resource usage caused by your integration artifacts.

  • Content Size: you can inspect the file size that integration artifacts occupy.
  • Integration Flows: you can inspect the integration flows with the highest resource consumption.

Important Points:
  • Resources should be monitored during load/performance testing.
  • The Level 2/support team should monitor the resources in the PROD tenant regularly.

Thanks for reading :-)

SFTP to DB (Multiple line items)

In this blog, I am going to explain how to retrieve the data/payload from SFTP and send it to DB (PostgreSQL).



Step 1:

Establish connectivity with SFTP

Get the SFTP server, port, username, and password from the infra team/client team.

In my example, 

Location: demo.wftpserver.com

Username: demo

Password: demo

FTP Port: 21

FTPS Port: 990

SFTP Port: 2222





Schedule it as per the requirement.
The time zone is very important if you want to schedule the run at a particular time.



Step 2:

Establish connectivity with DB (PostgreSQL)

Refer to my blog on how to establish the connectivity and for sample queries:

JDBC - blog


Sender Side:

Employee XML

<?xml version="1.0"?>

<emprecords>

  <emprecord>

    <empid>4001</empid>

    <emplname>Paul</emplname>

    <empfname>David</empfname>

    <empage>20</empage>

  </emprecord>

  <emprecord>

    <empid>4002</empid>

    <emplname>PP</emplname>

    <empfname>Ramesh</empfname>

    <empage>20</empage>

  </emprecord>

  <emprecord>

    <empid>4003</empid>

    <emplname>DD</emplname>

    <empfname>Kiran</empfname>

    <empage>20</empage>

  </emprecord>

</emprecords>

Employee XSD

<?xml version="1.0" encoding="utf-8"?>

<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">

  <xs:element name="emprecords">

    <xs:complexType>

      <xs:sequence>

        <xs:element maxOccurs="unbounded" name="emprecord">

          <xs:complexType>

            <xs:sequence>

              <xs:element name="empid" type="xs:unsignedShort" />

              <xs:element name="emplname" type="xs:string" />

  <xs:element name="empfname" type="xs:string" />

              <xs:element name="empage" type="xs:unsignedByte" />

            </xs:sequence>

          </xs:complexType>

        </xs:element>

      </xs:sequence>

    </xs:complexType>

  </xs:element>

</xs:schema>

Receiver Side:

Receiver JDBC - XSD

<?xml version="1.0" encoding="utf-8"?>

<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">

  <xs:element name="root">

    <xs:complexType>

      <xs:sequence>

        <xs:element name="StatementName">

          <xs:complexType>

            <xs:sequence>

              <xs:element name="dbTableName">

                <xs:complexType>

                  <xs:sequence>

                    <xs:element name="table" type="xs:string" />

                    <xs:element maxOccurs="unbounded" name="access">

                      <xs:complexType>

                        <xs:sequence>

                          <xs:element name="empid" type="xs:unsignedByte" />

                          <xs:element name="emplname" type="xs:string" />

                          <xs:element name="empfname" type="xs:string" />

                          <xs:element name="empage" type="xs:unsignedByte" />

                        </xs:sequence>

                      </xs:complexType>

                    </xs:element>

                  </xs:sequence>

                  <xs:attribute name="action" type="xs:string" use="required" />

                </xs:complexType>

              </xs:element>

            </xs:sequence>

          </xs:complexType>

        </xs:element>

      </xs:sequence>

    </xs:complexType>

  </xs:element>

</xs:schema>

Message Mapping:

Upload the sender-side XSD as the source structure and the receiver-side (JDBC) XSD as the target structure. Map each emprecord to an access element, map the child fields one-to-one, and supply the table name and the action attribute as constants.


Note: The DB action should be specified as INSERT, UPDATE, or DELETE via the action attribute on dbTableName, as in the example below.
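
For reference, here is a sketch of the target payload the message mapping should produce for the sample input above. The table name emp is a placeholder for your actual PostgreSQL table; the structure follows the receiver XSD:

<?xml version="1.0" encoding="UTF-8"?>
<root>
  <StatementName>
    <dbTableName action="INSERT">
      <table>emp</table>
      <access>
        <empid>4001</empid>
        <emplname>Paul</emplname>
        <empfname>David</empfname>
        <empage>20</empage>
      </access>
      <access>
        <empid>4002</empid>
        <emplname>PP</emplname>
        <empfname>Ramesh</empfname>
        <empage>20</empage>
      </access>
      <access>
        <empid>4003</empid>
        <emplname>DD</emplname>
        <empfname>Kiran</empfname>
        <empage>20</empage>
      </access>
    </dbTableName>
  </StatementName>
</root>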




Now, deploy the iFlow and test it.

Note: Link all the steps, double-check the connection parameters, and ensure that the source and target systems are up and running; otherwise you may encounter connectivity-related exceptions.

Thanks for reading :-)

 

