SAP PO to Integration Suite | Migration Assessment Tool

SAP Integration Suite is a modern solution designed to streamline the integration of SAP and non-SAP systems. It extends its reach to integrate cloud and on-premises applications, data sources and APIs.

SAP encourages customers to transition older integration solutions (like PI/PO) to a more unified, cloud-centric solution, which is the Integration Suite. This means we now have to consider migration assessments from PI/PO to Integration Suite.

Fortunately, SAP provides a ready to use Migration Assessment tool with the Integration Suite subscription.

High level process flow of how we can achieve the Migration Assessment

Pre-requisite: an application subscription to Integration Suite in your BTP subaccount.

Preparation: The assessment runs against an SAP PO system hosted on-premises, while the assessment tool is part of Integration Suite, a cloud-hosted service. You must therefore establish a valid and secure connection between the two; this is where the SAP Cloud Connector comes in.

SAP Cloud Connector is a service installed on-premises; it connects to the BTP subaccount and is then leveraged by cloud/SaaS applications. The diagram below provides a general overview:

On-Premise setup and configurations:

Once installed, you can configure the Cloud Connector by logging in at https://localhost:8443, initially with the user name "Administrator" and password "manage"; the password must be changed on initial logon.

Provide the BTP subaccount details where you have your Integration Suite subscription and save.

Check for successful connection with BTP subaccount on the top bar of the Cloud Connector logon page.

Now, we will have to create a new connection with the backend SAP PI/PO system.

In the Cloud Connector, on the left side bar, click ‘Cloud to On-Premise’ and click on ‘add’ system

From the drop-down options select ‘SAP Process Integration’

Next, provide the protocol (HTTP/HTTPS)

Next, provide PI/PO host and port information, note that this must be accessible from Cloud connector host.

Next, provide a virtual host and port, which will be mapped to the actual ones

Next, choose the option to use Virtual host and port

Next, check Internal host and finish the configuration

If the check result shows 'Reachable', the connection from the Cloud Connector to SAP PO is working. Otherwise, troubleshooting may be required.

In addition to this, the following PO services must be exposed for the migration assessment (service names listed below):

/CommunicationChannelInService
/IntegratedConfigurationInService
/SenderAgreementInService
/AlertRuleInService
/IntegratedConfiguration750InService
/ValuemappingInService
/ConfigurationScenarioInService
/BPMFacadeBeanImplService
/ReceiverAgreementInService
/rep/read/ext
/dir/read/ext
/rep/support/SimpleQuery

Add each of these services as a resource, selecting 'Path and All Sub-Paths'.

With this, the Cloud Connector configuration is complete.

Configurations and Setup on cloud:

To check the connectivity from Integration Suite, log in to Integration Suite, click ‘monitor’ from the left pane and select ‘integrations’

Go to the ‘Cloud Connector’ tab, provide location ID that was defined during Cloud Connector setup and click on ‘Send’

If you get a success message like the one below, the connection between the Cloud Connector and Integration Suite is working, which also means the on-premise PO system is reachable.

Now that the connectivity is set up, we can proceed with the assessment.

Please make sure that you have a valid SAP PO user setup with the following roles/authorizations and that you have the user credentials.

SAP_XI_API_DISPLAY_J2EE

SAP_XI_API_DEVELOP_J2EE

SAP_XI_MONITOR_J2EE

Let us now explore how we can perform the assessment on PI/PO environment.

Data extraction:

On the Integration Suite, select ‘Access Migration Scenario’ -> ‘create requests’

On the left-hand panel, select 'Settings' to add the on-premise PO system.

Fill in the fields, using the PO user credentials created earlier and the location ID of the Cloud Connector.

Now select the system and run a connection test; look for the 'successfully connected' message.

Once that is done:

Click 'Request' -> 'Data Extraction'

Create a data extraction request.

Provide a request name of your choice and select the system created in the previous step.

As soon as you click 'Create', the extraction starts; look for the 'In Progress' status.

The extraction takes a while; wait for the 'Completed' status and monitor the logs in the meantime.

Evaluation:

Once the extraction is completed, go to 'Scenario Evaluation'.

Click on 'Create'.

The following fields are filled based on the data extraction run in the previous step, especially the 'data extraction request' name.

Click 'Create' and wait for the evaluation to reach 'Completed' status.

Download the artefacts

Assessment result interpretation:

The artefacts comprise an Excel workbook and a PDF document.

The Excel workbook contains two sheets: 'Evaluation by integration scenario' and 'Full evaluation results'.

Let us look at 'Evaluation by integration scenario' first. It lists all the integration scenarios available in the PO system.

Notice the three attributes: 'Weight', 'T-shirt size' and 'Assessment category'.

The 'Assessment category' indicates how easy or difficult it is to migrate an integration scenario to Integration Suite. The categories are defined as follows:

  • Ready to migrate: can be migrated automatically using the migration tool, some post migration efforts may still be required.
  • Adjustments required: Can be migrated partially by the tool and remaining adjustments will have to be performed before activating the scenario.
  • Evaluation Required: means that the tool could not evaluate the migration scenario and that manual assessment is required.

'Weight' and 'T-shirt size' are indicators of the effort required for the migration.

T-shirt sizing is derived from the rules below.

Looking at the 'Full evaluation results' sheet, we see a set of rules evaluated against each integration scenario.

Each rule indicates the interfaces involved, and each rule carries its own weight.

The sum of the rule weights gives the weight of the integration scenario.

Another important point: if even one rule is categorized as 'Evaluation required', the whole integration scenario is marked as 'Evaluation required' and should be handled accordingly; the same applies to 'Adjustments required' cases.
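To make the aggregation logic concrete, here is a minimal, illustrative Groovy sketch of how per-rule results could be rolled up into a scenario weight, category and T-shirt size. The rule names, weights, category propagation thresholds and size boundaries are assumptions for illustration only; the actual rules and thresholds are those defined by the Migration Assessment tool.

// Illustrative only: roll up per-rule results into a scenario-level result.
// Rule names, weights and T-shirt thresholds below are made up for this example.
def rules = [
    [name: 'Sender adapter supported',  weight: 2, category: 'Ready to migrate'],
    [name: 'Mapping needs adjustment',  weight: 5, category: 'Adjustments required'],
    [name: 'ccBPM process detected',    weight: 9, category: 'Evaluation required']
]

// Scenario weight is the sum of the individual rule weights
def scenarioWeight = rules*.weight.sum()

// The strictest rule category wins for the whole integration scenario
def order = ['Ready to migrate', 'Adjustments required', 'Evaluation required']
def scenarioCategory = rules*.category.max { order.indexOf(it) }

// Hypothetical T-shirt sizing thresholds
def tShirtSize = scenarioWeight <= 5 ? 'S' : scenarioWeight <= 12 ? 'M' : 'L'

println "Weight=${scenarioWeight}, Category=${scenarioCategory}, Size=${tShirtSize}"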

The PDF from the assessment also provides plenty of detail.

The summary shows the estimated effort and a view of the assessment categories.

a pie chart view of the assessment categories

The report will also provide Sender and Receiver adapters count that will be migrated and that may not be supported.

Estimations by effort classification

Tabular view of the efforts required for each T shirt size – against the assessment categories

In the report, recommendations are also provided for the migration approach, e.g. reference templates from Integration Suite can be used.

SAP Event Mesh: S4 HANA On-Prem and SAP CI (BTP IS) through WebSocket and Webhooks

Introduction

What is Event Mesh?

SAP Event Mesh allows applications to communicate through asynchronous events. Experience greater agility and scalability when you create responsive applications that work independently and participate in event-driven business processes across your business ecosystem. In this decoupled integration the producers of the events will not know the consumers of the events.

Publish business events from SAP and non-SAP sources across hybrid landscapes from the digital core to extension applications through event-driven architecture. Consume business events from SAP and non-SAP sources throughout SAP’s event-driven ecosystem including SAP Extension Suite, SAP Integration Suite, and selected inbound enabled SAP backends. Achieve reliable data transmission for extension and integration scenarios through decoupled communication.

What is Websocket?

WebSocket is a stateful protocol used for client-server communication. It enables two-way communication, so the application can receive data without having to request it from the server, and the connection stays alive until it is terminated by either party (client or server). Data is also sent and received much faster than with plain HTTP, since an HTTP request carries roughly 2,000 bytes of header overhead whereas a WebSocket frame adds only about 2 bytes. WebSockets support multiple data types. In SAP CI, the AMQP adapter supports the WebSocket transport protocol.

What is Webhook?

Webhooks are stateless: each event notification is independent and carries all the necessary information about the event. They are used for server-to-server communication, commonly for smaller requests and tasks on top of standard API calls, and follow a POST/push mechanism. In an event-driven architecture, when the producer publishes events to the respective topics, the queues subscribed to those topics push the event data to the consumer via the webhook, instead of the consumer having to pull the events.

What is Queue?

A queue stores a message until it is received by the subscribed consumer. A queue does not process messages; it acts as the first consumer and simply stores the message/event. When an event is published to a queue, only one subscriber of that queue receives the message. Queues are managed and persistent: when there are no consumers connected, messages remain stored in the queue.

What is Topic?

Topics are not managed and not persistent. A topic is created on the fly when we start publishing messages to it, and destroyed when there are no consumers listening to it. If the consumer of a topic is not running when a message is published, that event is not received. One event producer can publish to one topic, and multiple queues can be subscribed to the same topic.

Prerequisites:

  1. BTP cockpit access with Event Mesh subscription.
  2. S4 HANA access in tcodes STRUST and SPRO
  3. SAP BTP IS Tenant Access with required roles/roles collection having Enterprise Messaging*.

Step-1: Add Service Plans in the Entitlements of subaccount for Event mesh along with creation of instance and subscription.

Select the standard and default service plans for the Event Mesh in the Entitlement

“Standard” for Event Mesh subscription and “default” for instance creation

In Instances and Subscriptions, create the "standard" Event Mesh subscription and a "default" plan instance on top of the Cloud Foundry space and runtime environment.

While creating the service instance, the instance name and the emname in the JSON descriptor should be the same, and the namespace must follow the required format, e.g. a/b/c.

Step-2: Creation of Service Key for eminstance created

The clientid, clientsecret, tokenendpoint, protocol and uri should be noted for the later configuration steps.

Step-3: Creation of new Queue in Instance of Event Mesh

Step-4: Test publishing and consuming messages on a queue. [Optional]

Select the queuename and message client name for publishing and consuming the messages.

Step-5: Creation of OAuth2 client credentials in Security Material of Integration Suite

Use the TokenEndPoint URL and ClientID and Client Secret from the Event Mesh service Key created.

Step-6: Produce the events from SAP CI into Event Mesh

A new IFlow needs to be created, with either an HTTPS sender (for triggering from Postman) or a timer-based start event with a sample message in a Content Modifier, and an AMQP receiver.

For demo purposes we will use an HTTPS sender and an AMQP receiver with the WebSocket message protocol. The host is the uri from the service key, the port is 443, the path is the protocol value from the service key, and the credential name is taken from the security material.

We can publish on topics or the queue from the AMQP(Websocket) of CI.

For a topic, use topic:topicname (this topic name must be subscribed to by the queue created in Event Mesh); for a queue, use queue:queuename.

Test from Postman using the deployed IFLOW endpoint with the clientid and clientsecret of the Process Integration Runtime service key. This is only to understand how events are published; it is not a real-life scenario.

Check the CI Message Monitor for successful message to Event Mesh

Check in the Event Mesh Queue with messages count increased from 0 to 1 along with queue size.

Step-7: Consume the events from Event Mesh into CI

Consuming the events from Event Mesh into CI can be done either with websocket(AMQP) or webhook.

In this case we will create a WebSocket (AMQP) connection by configuring a new IFLOW with the host and path from the Event Mesh service key, port 443, and the credential name from the security material.

Specify the queue name as queue:queuename. When consuming events from CI it is always best to consume from a queue rather than a topic, because topics are not persisted while queues are, which gives us guaranteed delivery. As the message is consumed in JSON format, we can convert it into XML as required, for example with the Groovy sketch shown below.
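As an example, a small Groovy script along the following lines could perform the JSON-to-XML conversion in the consumer IFlow. This is a sketch only; it assumes a flat key/value JSON event payload and is an alternative to the standard JSON-to-XML converter step.

import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.JsonSlurper
import groovy.xml.MarkupBuilder

def Message processData(Message message) {
    // Parse the JSON event consumed from the Event Mesh queue
    def event = new JsonSlurper().parseText(message.getBody(java.lang.String) as String)

    // Build a simple XML representation (assumes a flat key/value payload)
    def writer = new StringWriter()
    def builder = new MarkupBuilder(writer)
    builder.Event {
        event.each { key, value -> builder."${key}"(value) }
    }

    message.setBody(writer.toString())
    return message
}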

Once deployed the message in the queue will be consumed by the CI IFLOW

Step-8: Create Topic and upload the certificate to produce the Events from S4 HANA onto Event Mesh

The overall design for S4 HANA to publish and CI to consume can be done in two ways

A. Websocket Communication

B. Webhook Communication

WebSocket communication was already covered in the previous steps, so let us now look at webhook communication.

On top of the queue created in Event Mesh, we need to add a topic subscription.

Download the certificate from the token endpoint, along with the clientid and clientsecret generated from the Event Mesh service key.

Upload the certificate in S/4HANA transaction STRUST under SSL Client SSL Client (Standard) and add it to the certificate list.

Step-9: Create a AMQP Channel in SPRO tcode

Goto SAP Reference IMG=>ABAP Platform=>Enterprise Event Enablement=>Administration=>Channel Connection Settings=>Manage Channel and Parameters

Use the service key option to create the AMQP channel

Check the connection and activate the channel

Step-10: Create outbound binding on the created AMQP channel

Select the created channel and click on outbound binding and provide the topic name.

Step-11: Create webhook in the SAP Event Mesh

In order to create the webhook we need any HTTPs URL to be build and deployed. In our case we will create a new IFLOW with HTTPS as sender in CI itself.

Create a webhook with the queue name, the deployed CI IFLOW endpoint, and authentication with the client id and client secret of the CI Process Integration Runtime.

Testing:

Testing the Event from S4 HANA onto Event Mesh which will push through the WEBHOOK on CI.

Goto SAP Reference IMG=>ABAP Platform=>Enterprise Event Enablement=>Test Tools=>Produce Test Events

Select the channel name created and trigger the Event.

Check the CI Monitoring whether the event triggered from S4 HANA to Event Mesh was pushed onto CI through Webhook

Question 1: What happens when an event triggered from S/4HANA (the producer) is published to one topic in Event Mesh, and there are both a WebSocket queue consumer in CI and a webhook queue consumer in CI with the same topic/queue details? (Ideally this will not be the case in a real landscape, but we explore it here for testing.)

Answer: Because the queue consumer is in continuous consumption mode, it picks up the message more quickly than the webhook, which has to push the message to the configured HTTPS endpoint.

Evidence***

The deployed consumer IFLOW is in continuous consumption mode. Also the consumer should be subscribed from Queue instead of Topic for persistence.

Question 2: What happens when there are errors (mapping/runtime) while the consumer receives messages from the Event Mesh queue? Is the message deleted from the queue immediately, or does it persist until successful consumption?

Answer: The message remains in the queue until the consumer receives and processes it successfully. During that time the consumer continuously retries.

Evidence***

When the consumer IFLOW was intentionally made to fail with a content exception, the message in the queue was still persisted and the IFLOW automatically retried almost 8,770 times in less than a minute.

Integration of SAP CI (BTP IS) with IBM MQ through AMQP

Introduction

What is IBM MQ?

IBM MQ is a family of message-oriented middleware products that IBM launched in December 1993. It was originally called MQSeries, and was renamed WebSphere MQ in 2002 to join the suite of WebSphere products. In April 2014, it was renamed IBM MQ. IBM MQ supports the exchange of information between applications, systems, services and files by sending and receiving message data via messaging queues. This simplifies the creation and maintenance of business applications.

What is SAP CI(BTP-IS)?

Cloud Integration(BTP-IS) is a set of services and tools provided by SAP on its cloud-based Business Technology Platform (BTP) to enable integration between different systems, applications, and data sources. The key benefit of CI(BTP IS) is that it enables organizations to quickly and easily integrate their systems, data, and applications without the need for extensive coding or custom development. This helps to streamline business processes, reduce costs, and improve operational efficiency.

How IBM MQ can be integrated?

IBM MQ provides messaging and queuing capabilities across multiple modes of operation: point-to-point and publish/subscribe. IBM MQ has queue managers (QM) in which different types of queues are created. A QM can be connected directly, via a client channel definition table, or via an intermediate queue manager. All of these are associated with channels, which handle the inbound and outbound movement of data to and from the queues. Along with queues we can also have topics, which follow the pub-sub approach. REST APIs, JMS and MFT can also be leveraged with the IBM MQ package installation.

How CI integrates with IBM MQ?

The integration between CI and IBM MQ is best done using the AMQP 1.0 protocol. IBM MQ versions 7.5, 8.0, 9.0, 9.1, 9.2 and 9.3 are available in the market today, and only version 9.2 and above support this integration between CI and IBM MQ.

The Queues on IBM MQ can be connected from CI and the Topics which can be published can also be subscribed from CI using AMQP protocol.

Note: Among the possible integrations with IBM MQ, Message Queues Integration using the AMQP protocol will be explained in detail.

Integration of SAP CI(BTP IS) with IBM MQ through AMQP

Prerequisites:

  1. Any IBM MQ server with version 9.2 and above. For Demo purpose using the trial IBM MQ from https://www.ibm.biz/ibmmqtrial.
  2. SAP Cloud Connector with required roles to connect IS tenant and IBM MQ
  3. SAP BTP IS Tenant Access with required CI roles.

Step-1: Install the IBM MQ 9.2 from the downloaded setup file

Select all the features and install them along with the MQ Explorer

IBM MQ Setup File

Once installed successfully open the IBM MQ Explorer which should open as below

IBM MQ Explorer

Step-2: Create the new Queue Manager (QM1)under the left side Queue Managers Pane

Queue Manager

Step-3: Create the new Queue (Q2) under the left side Queue Managers (QM1)

New Queue in the Queue Manager

Step-4: Create the new AMQP channel under the left side Queue Manager (QM1)

While creating the AMQP channel, provide the port as 5672 and start the channel. A default AMQP channel comes with the installation; stop it and use the newly configured channel instead, so that the queues are handled explicitly.

AMQP Channel

Step-5: Configure the Virtual Mapping to IBM MQ Internal System in Cloud Connector

Make sure the cloud connector is already connected successfully to the SAP BTP subaccount.

The backend type should be “NON-SAP System” and the protocol should be “TCP” only

In the system mapping, use the same port (5672) that was given in the AMQP channel in IBM MQ, the internal host on which IBM MQ is installed, and the remaining virtual details as required.

Cloud Connector Virtual Mapping

Check the result of the reachable status on the internal host and port

Test the Virtual Mapping

Step-6: Test the connectivity from SAP Integration Suite to Cloud Connector.

The location ID (TESTCC) is the same name which was given during the connection between cloud connector and SAP BTP subaccount.

Cloud Connector Connectivity Test from IS

Step-7: Test the connectivity from SAP Integration Suite to IBM MQ through AMQP

The virtual host and virtual port which was given during system mapping in cloud connector should be used along with the same location id.

AMQP Connectivity Test from IS

Step-8: Deploy the SASL username/password of IBM MQ in IS Security Material

SASL Credentials Deployed in Security Material

Step-9: Create an IFLOW to send the message to the IBM MQ queue using AMQP channel

Provide the virtual host and port details along with the Location ID as tested earlier.

The credential name should be from the deployed security material.

To IBM MQ Through AMQP

Provide the destination type as Queue and use the same queue (Q2) created in IBM MQ

Destination and Queue Details

For testing purpose providing the sample message in the Content Modifier as below.

Content Modifier with Message Body

Step-10: Create an IFLOW to read the message from the IBM MQ queue using AMQP channel

Provide the virtual host and port details along with the Location ID as tested earlier.

The credential name should be from the deployed security material.

From IBM MQ through AMQP

For testing purposes, the payload read from the IBM MQ queue (Q2) is logged using a Groovy script, for example:
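A script along these lines can be used; this is a sketch of the common CPI logging pattern, and the attachment name is arbitrary.

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    def body = message.getBody(java.lang.String) as String
    def messageLog = messageLogFactory.getMessageLog(message)
    if (messageLog != null) {
        // Store the payload read from IBM MQ queue Q2 as an attachment in the message monitor
        messageLog.addAttachmentAsString("PayloadFromIBMMQ", body, "text/plain")
    }
    return message
}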

Queue Details to be processed

Monitoring the Messages in CI

CI Monitoring Dashboard
Payload read from IBM MQ Queue

Monitoring the Messages in IBM MQ

Before Execution:

Before Execution with 0 read and write status

After Execution:

After Execution with 1 received and 1 sent status

Conclusion:

AMQP, which is mostly used for event-broker topic pub-sub, can also be used to exchange message queue data, providing a tight integration between CI and IBM MQ.

If data in IBM MQ topics is to be stored and retrieved, similar configurations can be made, following the pub-sub approach for topics over the same AMQP protocol.

Test SAP CPI Mappings using Postman

Overview

In this blog, we will see how you can use postman, to test the SAP CPI mappings. You can use this approach for testing your groovy/xslt mapping as well if you find it useful.

SAP CPI mapping simulation lacks functionality compared to SAP PO, which has a better simulation tool. In SAP CPI there is no way to save your XML files as test instances, generate XML, copy-paste, and so on. Even in SAP PO, you have to go to the output manually and check the result; there is no way to automate the verification.

Use Case

For illustration purpose let us consider this simple Invoice xml with one field.

xsd:

<?xml version="1.0" encoding="UTF-8"?>
<schema xmlns="http://www.w3.org/2001/XMLSchema" xmlns:tns="http://www.example.org/Shipping/" targetNamespace="http://www.example.org/Shipping/">
    <element name="Invoice" type="tns:Invoice"></element>
    
    <complexType name="Invoice">
    	<sequence maxOccurs="1" minOccurs="1">
    		<element name="ProductType" type="string"></element>
    	</sequence>
    </complexType>
</schema>

xml:

<?xml version="1.0" encoding="utf-8"?>
<!-- Created with Liquid Technologies Online Tools 1.0 (https://www.liquid-technologies.com) -->
<ns1:Invoice xmlns:ns1="http://www.example.org/Shipping/">
  <ProductType>Power</ProductType>
</ns1:Invoice>

Mapping artifact: we have a simple transformation that uses a FixValues function to transform the input.

Simulate Mapping test

output xml

<?xml version="1.0" encoding="UTF-8"?>
<ns1:Invoice xmlns:ns1="http://www.example.org/Shipping/">
    <ProductType>P</ProductType>
</ns1:Invoice>

Problem

We have to run the mapping manually four times to validate all four values, and we have to check the values manually every time we run the mapping. This might sound easy, but with any incremental change we have to run these tests again manually and check the output by looking at the output XML.

For each simulation test, you have to wait a few seconds to get the output.

What if more than one field uses transformation logic and there are more than 10 fixed values to test? Not easy to do manually.

For documentation purposes, you have to capture the inputs and outputs and keep them on record. Doing this manually is cumbersome.

Solution

What I need

  1. Automate the mapping simulation test
  2. Automate cross checking output. No need to go to output xml and check it manually.
  3. Automate capturing input and output

Let us create a separate Iflow to test the mapping. Here is the simple Iflow with https sender adapter.

Create request in postman for each test case.

Write a test case to check the product type for all value types. I used ChatGPT to write the test script, as I am new to JavaScript.

// Parse the XML response
const responseXml = pm.response.text();

// Extract the ProductType value
const productType = responseXml.match(/<ProductType>(.*?)<\/ProductType>/)[1];

// Check the ProductType value
pm.test("Check ProductType Power", () => {
    pm.expect(productType).to.eql("P");
});

Run the collection

Here you go.

So now we are able to automate the mapping test via Postman and verify the output. Creating the test scripts is a one-time activity; once done, you can run them any number of times.

If you create postman workspace with visibility as team, you should be able to share it with your team.

Documentation

For documenting your tests, you can make use of the Newman htmlextra reporter for Postman to generate an HTML report of each run.
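For example, assuming Node.js is installed and the collection has been exported from Postman, the report could be generated roughly like this (the file names are placeholders):

# assumes Node.js is available and the collection was exported from Postman
npm install -g newman newman-reporter-htmlextra
newman run CPI_Mapping_Tests.postman_collection.json -r htmlextra --reporter-htmlextra-export mapping-test-report.html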

Creating a Sync integration for downloading Open Text Content with REST API from SAP PO, Using API with multipart/form-data authentication

Introduction:

The purpose of this document is to develop a synchronous interface to download content from Open Text Content Server (OTCS) using REST API. I have tried to cover the overall design with code snippets for reference.

Scope:

  • Rest OTCS authentication using Content-Type: multipart/form-data (Credentials as a header of the multipart body) for the token (otcsticket) generation.
  • Parameterized Mapping for accessing the OTCS credentials from ICO.
  • Calling OTCS authentication lookup, accessing parameters from java mapping.
  • Creating DynamicConfigurationKey for rest channel Parameter from java mapping.
  • OTCS session token management is out of scope.

Overall Design:

Sync Interface to get the Document from OpenText via PO 7.5

Solution Flow:

  1. SAP ECC calls a proxy to send the Document ID of the document in OTCS.
  2. PO Request java mapping receives Document ID.
    • Calls OTCS – Authentication API for a token (otcsticket) via REST lookup
    • Post ID and token to OTCS – Content API
  3. PO Response Java Mapping receives Document as an inputstream and maps it to the content field.
  4. Base64 content Field is sent to SAP for further processing.

Rest API consumed from Open Text Content Server (OTCS):

  • Authentication API – /otcs/cs.exe/api/v1/auth: API needs to be called with credentials in Content-Type: multipart/form-data section to generate a token, which is called otcsticket. Otcsticket needs to be present in the header for content API to be called.

The credentials must be present in the multipart/form-data body, separated by the boundary (an illustrative request body is shown after this list).

  • Content API – /otcs/cs.exe/api/v1/nodes/{ID}/content: API would return the document as a byte stream, when called with the token and ID of the document in the header.
otcsticket and ID of the document in the http header
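For reference, the multipart/form-data body sent to the authentication API looks roughly like the following; the boundary is the one from the REST lookup channel configuration shown later, and the credential values are placeholders.

POST /otcs/cs.exe/api/v1/auth HTTP/1.1
Content-Type: multipart/form-data; boundary=SapPO75FormBoundaryhahahahahahahaEND

--SapPO75FormBoundaryhahahahahahahaEND
Content-Disposition: form-data; name="username"

<OTCS user>
--SapPO75FormBoundaryhahahahahahahaEND
Content-Disposition: form-data; name="password"

<OTCS password>
--SapPO75FormBoundaryhahahahahahahaEND--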

PO Objects and code snippets:

Data Structure

Document ID for OTCS comes as DataID from SAP. The document is returned to SAP as Content.

ICO

Mapping (MM) & Operational mapping (OM)

Please take care of the above ICO parameter in

  • OM-> Parameters section and Request Mapping Binding section
  • MM -> Signature tab

Request Mapping with java mapping in the Attributes and Methods Section

public void transform(TransformationInput in, TransformationOutput out) throws StreamTransformationException {

		try {

			getTrace().addDebugMessage("***OTCS-Request-JavaMapping-Start");
			
			//Get the mapping parameter from ICO
			String paramChannel = in.getInputParameters().getString("lookupChannel"); 
			String paramUserName = in.getInputParameters().getString("username"); 
			String paramPassword = in.getInputParameters().getString("password");
			String paramBoundary = in.getInputParameters().getString("boundary");
			getTrace().addDebugMessage("***OTCS-Step1-LogPramFromICO-lookupChannel:" + paramChannel + "-username:" 
				+ paramUserName + "-password:" + paramPassword +"-boundary:" +  paramBoundary);
			
			//Creating multipart/form-data for OTCS authentication
			String LINE_FEED = "\r\n";
			String ContentDisposition = "Content-Disposition: form-data; name=\"";
			String authReqFormData ="";
			authReqFormData =  LINE_FEED + paramBoundary + LINE_FEED + ContentDisposition + "username\"" + LINE_FEED 
				+ LINE_FEED + paramUserName + LINE_FEED + paramBoundary +  LINE_FEED +ContentDisposition 
				+ "password\"" + LINE_FEED + LINE_FEED + paramPassword + LINE_FEED + paramBoundary + "--" + LINE_FEED;
			getTrace().addDebugMessage("***OTCS-Step2-multipart/form-data:" + authReqFormData);
			
			//Read message header value for Receiver 
			String paramReceiver = in.getInputHeader().getReceiverService();
			getTrace().addDebugMessage("***OTCS-Step3-ReceiverService:" + paramReceiver);
			
			//Get the OTCS rest lookup Channel Object for authentication
			Channel lookup_channel = LookupService.getChannel(paramReceiver, paramChannel);
			
			//Call rest lookup channel, with multipart/form-data payload
			SystemAccessor accessor = null;
			accessor = LookupService.getSystemAccessor(lookup_channel);
			InputStream authInputStream = new ByteArrayInputStream(authReqFormData.getBytes("UTF-8"));
			Payload authPayload = LookupService.getXmlPayload(authInputStream);
			Payload tokenOutPayload = null;
			//Call lookup
			getTrace().addDebugMessage("***OTCS-Step4-CallLookupChannel");
			tokenOutPayload = accessor.call(authPayload);
			//Parse for Lookup response for token
			InputStream authOutputStream = tokenOutPayload.getContent();
			DocumentBuilderFactory authfactory = DocumentBuilderFactory.newInstance();
			DocumentBuilder authbuilder = authfactory.newDocumentBuilder();
			Document authdocument = authbuilder.parse(authOutputStream);
			NodeList nlticket = authdocument.getElementsByTagName("ticket");
			String tokenTicket = "Empty";
			Node node = nlticket.item(0);
			if (node != null){
				node = node.getFirstChild();
				if (node != null){
					tokenTicket = node.getNodeValue();
				}
			}
			getTrace().addDebugMessage("***OTCS-Step5-TokenFromLookup:" + tokenTicket);
						
			//Parse input stream and get DataID from SAP
			DocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();
			DocumentBuilder dBuilder = dbFactory.newDocumentBuilder();
			Document doc = dBuilder.parse(in.getInputPayload().getInputStream());
			String DataID = doc.getElementsByTagName("DataID").item(0).getTextContent();
			getTrace().addDebugMessage("***OTCS-Step6-DataIDFromSAP: " + DataID);
			
			//Create HTTP Header for rest call via setting DynamicConfiguration keys, that can be used in reciver channel
			DynamicConfiguration conf = in.getDynamicConfiguration();
			DynamicConfigurationKey keytokenTicket = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/REST","HeadertokenTicket");
			DynamicConfigurationKey keyDataID = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/REST","HeaderDataID");
			conf.put(keytokenTicket, tokenTicket);
			conf.put(keyDataID, DataID);

			String DummyPayload =  "DummyPayload";
			// Instantiating output stream to write at Target message
			OutputStream os = out.getOutputPayload().getOutputStream();
			// writing idoc to output stream
			os.write(DummyPayload.getBytes("UTF-8"));
			os.flush();
			os.close();
			getTrace().addDebugMessage("***OTCS-Request-JavaMapping-End");
		}
		catch (Exception e){
			getTrace().addDebugMessage(e.getMessage().toString());
			throw new StreamTransformationException(e.getMessage());
		}

	}

Response Mapping with java mapping in the Attributes and Methods Section

public void transform(TransformationInput in, TransformationOutput out) throws StreamTransformationException {
		try 
		{
			getTrace().addDebugMessage("***OTCS-Respose-JavaMapping-Start");
			InputStream inputstream = in.getInputPayload().getInputStream();
			OutputStream outputstream = out.getOutputPayload().getOutputStream();
			//Copy Input Payload into Output xml
			byte[] b = new byte[inputstream.available()];
			inputstream.read(b);
			//Form Output xml
			String outputStart = "<?xml version=\"1.0\" encoding=\"UTF-8\"?><ns0:MT_DocContent_Res xmlns:ns0=\"urn://XXXXXXXXXXX.com/OTCS/DocDownload\"><Content>";
			String outputEnd = "</Content></ns0:MT_DocContent_Res>";
			outputstream.write(outputStart.getBytes("UTF-8"));
			outputstream.write(b);
			outputstream.write(outputEnd.getBytes("UTF-8"));
			outputstream.flush();
			outputstream.close();
			getTrace().addDebugMessage("***OTCS-Respose-JavaMapping-End");
		}

		catch (Exception e)
		{
			getTrace().addDebugMessage(e.getMessage().toString());
			throw new StreamTransformationException(e.getMessage());

		}
	}

Channels

We have three channels in the flow: the proxy sender channel from SAP, a REST lookup channel for the token, and a REST receiver channel for fetching the document from OTCS.

Proxy – CC_OTCS_GetDoc_Proxy_Sender

Rest lookup channel – CC_OTCS_Rest_LookUp

  • URL- http://{Server}/otcs/cs.exe/api/v1/auth
  • Rest Operation- POST
  • Data Format- UTF-8
  • HTTP Headers, as below

Content-Type multipart/form-data; boundary=SapPO75FormBoundaryhahahahahahahaEND

Rest document fetch channel– CC_OTCS_Rest_Receiver

  • URL- http://{Server}/otcs/cs.exe/api/v1/nodes/{ID}/content
  • Variable Substitution-
  • Rest Operation- GET
  • Data Format- UTF-8
  • HTTP Headers, as below

otcsticket {otcsticket}

ID {ID}

Integration of SAP CPI(BTP IS) with ChatGPT

Introduction

What is ChatGPT?

ChatGPT is a language model developed by OpenAI. It is designed to understand natural language, generate human-like responses to a wide range of questions, and carry out various language-related tasks. It has been trained on a large corpus of text data and can understand a broad range of topics. Its purpose is to assist users in their quest for knowledge and provide them with informative and useful responses.

What is CPI(BTP-IS)?

CPI(BTP-IS) is a set of services and tools provided by SAP on its cloud-based Business Technology Platform (BTP) to enable integration between different systems, applications, and data sources. The key benefit of CPI(BTP IS) is that it enables organizations to quickly and easily integrate their systems, data, and applications without the need for extensive coding or custom development. This helps to streamline business processes, reduce costs, and improve operational efficiency.

How ChatGPT can be integrated?

ChatGPT can be integrated using the APIs provided by OpenAI. The API can be used to send text inputs to ChatGPT and receive responses in real time. Chatbot platform integration and custom integration using libraries and SDKs are also possible.

How CPI integrates with ChatGPT?

CPI interacts with ChatGPT through HTTP requests, using API keys for authentication. OpenAI exposes multiple APIs depending on the use case, such as Models, Completions, Edits, Images, Create Image Variations, Embeddings, Files, Fine-tunes and Moderations.

Note: All the available ChatGPT APIs can be integrated with SAP CPI, but for this demonstration we will use the Completions API.

Integration of SAP CPI(BTP IS) with ChatGPT

Prerequisites:

  1. SAP BTP IS Tenant Access
  2. ChatGPT openai platform Access
  3. Postman Tool for testing

Step-1: Create a new secret key on ChatGPT

After logging in to your account at https://platform.openai.com/account/api-keys, go to API Keys and click on Create new secret key.

Image source from my account on platform.openai.com

Step-2: Download the security certificate from respective ChatGPT API

For the demo we will use the API https://api.openai.com/v1/completions from the available ChatGPT APIs.

Open the API url in any browser and click on the lock(HTTPS) icon from the address bar and select show certificate.

Export the root certificate and save to your local desktop.

Image source from my account on platform.openai.com

Step-3: Upload the root security certificate in the SAP BTP IS(CPI)=>KeyStore

Login to BTP account and redirect to your Integration Suite home page. From the left side pane select the monitor artifacts options and open the Keystore page.

Click on the Add dropdown option in the top-right pane, select the Certificate option, and upload the security certificate downloaded in Step-2.

Image source from my trail account on SAP BTP IS

Step-4: Create a new Integration Flow under the desired package with appropriate naming convention.

From the left side pane design artifacts option select or create a package and create a new IFLOW.

We need to create a scenario with an HTTPS sender adapter and an HTTP receiver adapter, with an intermediate call to the ChatGPT API.

Step-5: Configure a sender HTTPS adapter as we will test it from Postman tool.

As per the requirement, we can use any sender adapter that is able to provide the expected input text.

Image source from my trail account on SAP BTP IS

Provide the desired Address path which will be added to the deployed tenant URL.

Step-6: Configure the JSON to XML converter as the input text from Postman will be provided as JSON.

As per the requirement, any input format can be sent from the sender system and handled accordingly.

Image source from my trail account on SAP BTP IS

Remove the Namespace Mapping if selected.

In our case only a JSON-to-XML converter is used, because the input from Postman is JSON and we will work with XML for the further processing of exchange properties.

Step-7: Configure the Content Modifier to set the Message Header, Exchange Property and Message Body.

The ChatGPT HTTP API will be called with Authorization and ContentType values in the Message Header

So accordingly provide the Header constant in the Content Modifier with Content-Type as application/json

The Authorization header is "Bearer", followed by a space and the API key generated in Step-1.

Image source from my trail account on SAP BTP IS

The sample input from the source will be in the following JSON format:

{
  "prompt": "Input Text here"
}

In order to capture the input prompt value and prepare the complete request to the API, an exchange property named input is configured, referring to the XPath of the corresponding field in the JSON-converted XML.

Image source from my trail account on SAP BTP IS

The original API HTTP call will be expecting the actual source input with its respective parameters. Those parameters and source input will be formulated in the Message Body.

Image source from my trail account on SAP BTP IS

Each ChatGPT API call is processed based on the model name provided. In our case we use the model "text-davinci-003", the most capable GPT-3 model: it can do any task the other GPT-3 models can do, often with higher quality, longer output and better instruction-following, it supports inserting completions within text, and it is trained on internet content available up to June 2021.
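Putting this together, the message body assembled in the Content Modifier will resemble the following JSON. The values are illustrative; only model and prompt are strictly required by the Completions API, and parameters such as max_tokens and temperature can be tuned as needed.

{
  "model": "text-davinci-003",
  "prompt": "Input text captured from the exchange property",
  "max_tokens": 256,
  "temperature": 0.7
}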

Step-8: Configure a Request-reply with HTTP adapter receiver connection to ChatGPT API.

Provide the mandatory field values like Address of the ChatGPT API, Method as POST, Authentication as Client Certificate and the Request/Response headers. The Request headers can be given as * or Authorization|Content-Type etc.

Image source from my trail account on SAP BTP IS

Step-9: Handling the API JSON response.

When a ChatGPT API is invoked with a JSON request, it returns a JSON response, which can be converted to XML to produce the desired field output.

Configure the JSON to XML converter to select the respective XML field values to be populated in the output.

Image source from my trail account on SAP BTP IS
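For orientation, a trimmed Completions API response has roughly the following shape (field values are illustrative). The generated answer sits in choices[0].text, which is why the text field is selected in the next step.

{
  "id": "cmpl-xxxxxxxx",
  "object": "text_completion",
  "model": "text-davinci-003",
  "choices": [
    {
      "text": "Generated answer text",
      "index": 0,
      "finish_reason": "stop"
    }
  ]
}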

Step-10: Configure the Content Modifier for converted XML payload.

Create an output property referring to the XML text field, which is the only value populated in the final output.

Image source from my trail account on SAP BTP IS

Modify the Message body also with the selected property value.

Image source from my trail account on SAP BTP IS

Final Integration Flow will look like:

Image source from my trail account on SAP BTP IS

Once the IFLOW is saved and deployed , go to the Managed Integration Content and get the endpoint which will be used in the source system to trigger a message to CPI.

Image source from my trail account on SAP BTP IS

In order to view traces in the CPI message processing monitor, set the log level to Trace instead of the default option Info.

Image source from my trail account on SAP BTP IS

Testing in POSTMAN:

Image source from local installed Postman

Provide the CPI tenant endpoint URL for the created IFLOW along with the BTP account credentials in the Postman Authorization section.

For the sample input provided we have received the response accordingly from the chatGPT API.

Monitoring the IFLOW from CPI tenant with the trace enabled.

Image source from my trail account on SAP BTP IS
Image source from my trail account on SAP BTP IS
SAP CPI – End to End VAT integration using OAuth 2.0 with client credentials grant to generate Access token authentication

When we are doing cloud-to-cloud or cloud-to-on-premise integration, there are multiple types of authentication for accessing the APIs.

  • User Credentials:

User Id and Password

  • OAuth 2.0 Credentials:

Client secure url, Client ID , Client Secret and Scope

  • OAuth 2.0 Authorization Code:

Authorization URL, Token URL, Client ID, Client Secret, User Name and Password

  • Client Certificate:

client certificate

In this blog post I am going to explain the end-to-end VAT return outbound interface integration between SAP and the HMRC UK government portal, using OAuth 2.0 with the client credentials grant to generate an access token.

Integration Flow Chart

Step-1 : OAuth 2.0 Access Token Configuration

The Following diagram illustrates the process to get access token

  • Logon to your SAP Cloud Platform Integration web application
  • Navigate to the Monitor option then we can see overview.
  • Then go to Manage Security > Security Material

Here choose Create > OAuth2 Authorization Code.

Then maintain the Authorization URL, Token Service URL, Client ID, Client Secret, User Name, and the scope of your interface (for the VAT return interface the scope is write:vat).

Click Deploy > Authorize; you are redirected to the callback URL. Enter the credentials, and once they are accepted the screen shown below appears; click on Grant Authority.

The OAuth 2.0 access token configuration is now successfully deployed; it will be accessed later in the VAT return interface IFlow.

Step-2 End-to-End VAT return Interface IFlow

  • Navigate to Design > Package > Artifacts Tab
  • Choose Add > Integration Flow

Step-3

From the sender Drag the connecting arrow to connect to start action, from the Adapter Type box select the SOAP/HTTPS.

SOAP: for WSDL-based integrations, use the SOAP adapter.

HTTPS: for JSON-based integrations, use the HTTPS adapter; it also works for XML.

  • In the Connection tab, enter the Address or URL details.
  • Enter /HMRC/VAT/returns. Optionally, you can enter any value of your choice, but ensure that you use the "/" symbol before specifying the endpoint name.

Step-4 Groovy Script

The groovy script contains the functionality to fetch an access token from the OAuth2.0 Authorization Code credential which we have configured in the Step-1(Security Material).

import com.sap.gateway.ip.core.customdev.util.Message;
import com.sap.it.api.securestore.SecureStoreService;
import com.sap.it.api.securestore.AccessTokenAndUser;
import com.sap.it.api.securestore.exception.SecureStoreException;
import com.sap.it.api.ITApiFactory;
def Message processData(Message message) {
     
    SecureStoreService secureStoreService = ITApiFactory.getService(SecureStoreService.class, null);
 
    AccessTokenAndUser accessTokenAndUser = secureStoreService.getAccesTokenForOauth2AuthorizationCodeCredential("OAuth2.0");
    String token = accessTokenAndUser.getAccessToken();
        
    message.setHeader("Authorization", "Bearer "+token);
    
     
    
   return message;
}
  • By calling the method getAccesTokenForOauth2AuthorizationCodeCredential("OAuth2.0"), you fetch the access token of the OAuth2 Authorization Code credential with the name "OAuth2.0".
  • Once we get the token from this method, we pass the same access token at the header level using the Groovy syntax below.

message.setHeader("Authorization", "Bearer " + token);

Step-5 Content Modifier

By using content modifier, we are passing the Accept and Content-type parameters in the http header request as below.

Step-6 Request-Reply

Connect the Request Reply to Receiver by dragging the arrow icon on Request Reply to the Receiver

Select the Connection tab. In the Address field, enter the Target endpoint/Url(https://hostname/org/vrn/return)

Step-7

  • Click Save to persist the changes to integration Flow.
  • Click Deploy to deploy the integration Flow.
  • Navigate to the Monitor View.
  • Under the Manage Integration Content section, choose Start to access all the started artifacts that we have deployed.
  • Select the integration flow > Endpoint tab then you can notice REST API URL for the integration flow.
  • This URL can be used to invoke the integration flow as a REST API from any REST client like postman.

Step-8 Testing The Integration Flow Using POSTMAN

Step-9 Trace and Monitoring in SAP CPI

Navigate to the Monitor > Manage Integration Content > Select the Iflow > Monitor Message Processing

  • Now we can see the messages status as completed
  • If you want to trace Iflow logs step-to-step then click on the trace option
  • Then we can see step-to-step process logs with content.
SAP CPI – Encryption and Decryption using PGP keys

When we are doing HR- or bank-related integrations, a main concern is data security, so one way to address it is to encrypt the data using PGP keys.

In this tutorial I am going to explore

  1. How to generate PGP Keys using Kleopatra Software
  2. How to Encrypt data using PGP Public key in SAP CPI
  3. How to Decrypt the data using PGP Private Key in SAP CPI

Generating PGP Keys using Kleopatra Software

Open the Kleopatra application > File > New Key Pair

Provide the Key Pair Name > Advanced Settings > Select Validity

Enter the Passphrase

Click on Finish

The key pair is successfully generated.

Now we need to export the PGP Public and Private keys.

Exporting PGP Public Key

Save as FileName .pgp format

Exporting PGP Private Key

Finally, we have generated the PGP public and private keys.

We will use these keys in the SAP CPI IFlow to encrypt and decrypt the content.

PGP Encryption

In this case we need to get PGP public key from the Non-SAP/Third party application team.

CPI Flow Chart

Step-1

Importing the PGP Public Key in Security Material

Step -2 Navigate to Design > Select Package > Artifact tab > Add Iflow

From the sender Drag the connecting arrow to connect to start action, from the Adapter Type box select the HTTPS.

  • In Connection tab, enter the Address or URL details
  • Enter /PGPEncryption. Optionally, you can enter any value of your choice, but ensure that you use the "/" symbol before specifying the endpoint name.

Groovy script for the payload logging

import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;
def Message processData(Message message) {
    def body = message.getBody(java.lang.String) as String;
    def messageLog = messageLogFactory.getMessageLog(message);
    if(messageLog != null){
        messageLog.setStringProperty("Logging#1", "Printing Payload As Attachment")
        messageLog.addAttachmentAsString("ResponsePayload:", body, "text/plain");
     }
    return message;
}

Step-3 PGP Encryptor

Here we are using the PGP Encryptor pallet to encrypt the incoming data

Drag and drop the PGP Encryptor function from the Security tab into the IFlow canvas. In the Processing tab we can use the dropdown values to adjust the algorithm, key length, compression, and so on, but we MUST specify the UID (user ID) of the public/private key pair to be used for encryption.

Save and deploy the Iflow

Step-4 Testing the Integration Flow Using Postman

PGP Decryption

In this case we generate the PGP key pair (public and private), and the public key is shared with the third-party application team so they can encrypt the data.

CPI Flow Chart

Step-1

Importing the PGP Private Key in Security Material

Step-2

Here we are using the PGP Decryptor palette function to decrypt the incoming content.

In the Processing tab, specify the UID (user ID) of the public/private key pair to be used for decryption.

Step-3 Testing the Integration flow using Postman


The post SAP CPI – Encryption and Decryption using PGP keys appeared first on ERP Q&A.

SAP Password reset tool using Azure Logic App, SAP PO/Integration Suite and ABAP https://www.erpqna.com/sap-password-reset-tool-using-azure-logic-app-sap-po-integration-suite-and-abap/?utm_source=rss&utm_medium=rss&utm_campaign=sap-password-reset-tool-using-azure-logic-app-sap-po-integration-suite-and-abap Wed, 02 Mar 2022 09:05:56 +0000 https://www.erpqna.com/?p=60319 Introduction Recently while working on Azure Logic App, I felt we can make use of Office 365 email connector to automate a few manual processes. I thought, why not create a password reset tool? So, I designed a Logic App that picks up email from a specific folder(outlook) and passes on the information to SAP […]

Introduction

Recently, while working with Azure Logic Apps, I felt we could make use of the Office 365 email connector to automate a few manual processes.

I thought, why not create a password reset tool?

So, I designed a Logic App that picks up emails from a specific Outlook folder and passes the information to SAP ECC6 via SAP PO; an ABAP program then resets/unlocks the user's password, and the result is returned to the sender by the Logic App.

Implementation

Develop a Logic App in Azure that is connected to an Outlook account (in the real world this must be a service account) and looks for new emails.

Outlook connector configuration
  • The Password Reset folder is polled every 10 seconds, so any new email that comes in is picked up almost immediately.

Make sure the email body is in a specific format (end users need to be trained on the email body format), otherwise parsing in the Logic App will fail and the expected format will be returned to the sender.

Default email format (JSON)

{
  "User": {
    "Client": "230",
    "UNAME": "MOHAMM4A",
    "email": "",
    "action": "unlock"
  }
}
Email body parsing

Whenever the email trigger fires, the email has a unique Message-Id; using this Message-Id, the reply is sent back to the same sender.

As soon as parsing completes, a proper JSON request is created and sent to the SAP PO REST adapter endpoint using the HTTP connector in the Logic App (a plain-Java sketch of this call is shown below).

HTTP connector
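
For clarity, the sketch below reproduces roughly the same POST that the HTTP connector performs, in plain Java. The SAP PO host, endpoint path, credentials and the email address in the body are placeholders, not the values of this landscape.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class PasswordResetRequest {
    public static void main(String[] args) throws Exception {
        // JSON in the same shape as the default email body format above
        String json = "{ \"User\": { \"Client\": \"230\", \"UNAME\": \"MOHAMM4A\","
                + " \"email\": \"requester@example.com\", \"action\": \"unlock\" } }";

        // Hypothetical REST sender adapter endpoint on the SAP PO host
        String endpoint = "https://sap-po-host:50001/RESTAdapter/PasswordReset";
        String basicAuth = Base64.getEncoder()
                .encodeToString("po-user:po-password".getBytes(StandardCharsets.UTF_8));

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(endpoint))
                .header("Authorization", "Basic " + basicAuth)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json, StandardCharsets.UTF_8))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The response text is what eventually gets mailed back to the sender
        System.out.println(response.body());
    }
}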

SAP PO or Integration Suite configuration:

In our DEV SAP ECC6, we have two clients (100 and 230), so the end user needs to specify the proper client in the email body. Once the payload is received in SAP PO, it is converted to XML by the REST adapter.

1. Develop ID objects in SAP PO with one REST sender adapter and two SOAP adapters (XI protocol) for the two clients.

SAP PO ID objects

2. In the ICO, I have added conditions in the receiver determination so that, based on the client, the corresponding business system is invoked for the password reset.

ICO – Receiver determination

3. Create ESR objects so that the structure appears in SPROXY for ABAP coding.

ESR objects

4. Mapping in ESR is straightforward (one-to-one mapping).

Message Mapping

We are done with the SAP PO configuration.

If you are using SAP Integration Suite, you can skip the Azure Logic App entirely. Not every organization uses Azure, so in that case all the operations are performed in SAP.

SAP Integration Suite IFlow

In my case, I have used the XI adapter. The Cloud Connector was already in place, so I could directly reach my ECC system.

I reused the same SAP PO generated Service Interface in ECC.

It is also possible to connect via SOAMANAGER.

IFlow can be found at my Git Repository.

Password reset IFlow – GitHub

The sender mail adapter is connected to my email account via IMAP and polls the folder CPI at an interval of 10 seconds.

sender mail adapter

Once there is an email in the folder, it is picked up and processed, and its status is changed from unread to read.

To send the sender's email ID along with the unlock/reset request payload, I have extracted the adapter header parameter (FROM) using a Groovy script and passed it to ECC by forming the payload in a content modifier.

Finally, the response from ECC must be returned to the sender via the receiver mail adapter.

Receiver email configuration

SAP ABAP development:-

1. Generate proxy for the service provider in t-code SPROXY.

Service provider proxy

2. Once we generate the proxy, an ABAP class is generated.

Auto-generated ABAP class

3. We can either code in the generated method or create a separate class where all the operations are performed. I have created a separate class (SE24) and implemented all the validations and the password reset/unlock operations there.

4. The most important question is: how do we get the user details from SAP? For this we can use the standard BAPI BAPI_USER_GET_DETAIL, passing the user ID (UNAME from the email) and getting back the user's address details and SNC (Secure Network Communication) data.

BAPI_USER_GET_DETAIL

ls_address-e_mail holds the email address of the user in SAP.

The returned email address (from the BAPI) is validated against the sender's email address (coming from the Logic App or SAP Integration Suite).

If the email ID in SAP does not match the sender's email ID, a reply is sent back to the sender.

Otherwise, based on the action (UNLOCK/RESET), the program either resets or unlocks the user's password.

5. The user details can be seen in transaction code SU01.

6. I have added a few validations within the ABAP code before resetting/unlocking the user.

There are situations when the BASIS team locks all users during system maintenance; in such situations users shouldn't be allowed to unlock themselves (if the value of USR02-UFLAG is 32 or 66, the user is locked by the administrator).

A person who has left the organization should not be allowed to reset/unlock (USR02-CLASS holds the user group).

The email sender needs to specify the correct user ID in the email body.

7. After validating and getting details from SAP, we can go ahead and reset/unlock users based on their choice.

8. Finally, the user can be unlocked using the standard BAPI BAPI_USER_UNLOCK.

BAPI_USER_UNLOCK – To unlock user

9. If the choice is to reset, we can use BAPI: BAPI_USER_CHANGE.

I have used the function module GENERAL_GET_RANDOM_STRING to generate a random 8-character password and concatenated "@1" to it to make it stronger.

By passing the new string and the username into BAPI_USER_CHANGE, the password is reset, and the response is returned back to SAP PO –> Logic App –> email.

We are done with the ABAP development.

Note: The validations are based on my own understanding after discussing with the security team; you can add as many validations as you want. Also, if you have any suggestions regarding validations, please write them in the comment section.

Now, it's time to test our tool.

Test 1: A user tries to reset their password.

A new password is generated and received as an email reply.

Test 2: The user tries to unlock.

The user was not locked in SAP, so an appropriate response was returned to the sender.

Test 3: A few more validations.

Invalid action and invalid email validations

Test 4: In case the email body is not correct, the default email template will be returned as an email response.

Invalid email body

Monitoring:-

Azure Logic App

Logic App
Invalid JSON
HTTP connector

SAP PO:

SAP PO Message Monitor

SAP ECC6 Logs:

For audit purposes, I have added application log entries in ABAP for each password reset request.

Logs in ECC can be checked in the SLG1 transaction code.

SLG1 log

SAP Integration Suite:

SAP IS Message Log
Email picked and responded by SAP Integration Suite

The post SAP Password reset tool using Azure Logic App, SAP PO/Integration Suite and ABAP appeared first on ERP Q&A.

Salesforce integration with SAP PO Rest Adapter using SFDC Bulk API Concept https://www.erpqna.com/salesforce-integration-with-sap-po-rest-adapter-using-sfdc-bulk-api-concept/?utm_source=rss&utm_medium=rss&utm_campaign=salesforce-integration-with-sap-po-rest-adapter-using-sfdc-bulk-api-concept Tue, 18 Jan 2022 09:26:07 +0000 https://www.erpqna.com/?p=58989 This is blog how to connect with SFDC via REST adapter using SFDC bulk API concept. (without BPM) When to use Bulk API? Bulk API is based on REST and is optimized for loading or deleting large sets of data. We can query, insert, update, upsert, or delete many records asynchronously by submitting batches. Salesforce […]

This blog shows how to connect with SFDC via the REST adapter using the SFDC Bulk API concept (without BPM).

When to use Bulk API?

Bulk API is based on REST and is optimized for loading or deleting large sets of data. We can query, insert, update, upsert, or delete many records asynchronously by submitting batches. Salesforce processes batches in the background. Bulk API is designed to make it simple to process data from a few thousand to millions of records.

Objective:

In Salesforce, there are multiple standard SFDC objects, for example Contact and Account. The objective of this blog is to create a generic solution to integrate with the SFDC Bulk API 1.0 through SAP PO. This generic solution can be used to call different SFDC objects; we only need to change the structure corresponding to the SFDC object. We can use this approach to call any SFDC object (standard or custom) using the Bulk API concept.

Record processing through Bulk API Calls: Steps are defined below:

  1. Create job
  2. Add batch to job
  3. Close job to start processing
  4. Retrieve Batch Result

First, we will understand the concept using Postman and see how to send the required requests with the Postman tool. Please note that in this example I am going to use:

SFDC custom object: LOM_Tender__c (It’s a custom SFDC Object)

externalIdFieldName: Reference__c

Operation: upsert

1) Create a job: Create a job by sending a POST request to this URI. The request body identifies the type of object processed in all associated batches.

URI : /services/async/APIversion/job

Example XML request body:

<?xml version="1.0" encoding="utf-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
    <operation>upsert</operation>
    <object>LOM_Tender__c</object>
    <externalIdFieldName>Reference__c</externalIdFieldName>
    <contentType>XML</contentType>
</jobInfo>

In the request XML, the field <object> is the SFDC object name.

externalIdFieldName is nothing but the primary (external ID) field of the SFDC object.

HTTP Header:

X-SFDC-Session: an access token (session ID) which needs to be retrieved first using the Login API. There are multiple blogs available on how to get the session ID using a SOAP call to SFDC.

Content-Type: application/xml; charset=UTF-8

From the response, a job ID needs to be retrieved; it is passed as part of the URL in subsequent calls. In this case, the job ID is "75077000000A8qZAAS".

<?xml version="1.0" encoding="UTF-8"?>
<jobInfo
   xmlns="http://www.force.com/2009/06/asyncapi/dataload">
    <id>75077000000A8qZAAS</id>
    <operation>upsert</operation>
    <object>LOM_Tender__c</object>
    <createdById>005G0000002kHpQIAU</createdById>
    <createdDate>2021-12-30T05:05:31.000Z</createdDate>
    <systemModstamp>2021-12-30T05:05:31.000Z</systemModstamp>
    <state>Open</state>
    <externalIdFieldName>Reference__c</externalIdFieldName>
    <concurrencyMode>Parallel</concurrencyMode>
    <contentType>XML</contentType>
    <numberBatchesQueued>0</numberBatchesQueued>
    <numberBatchesInProgress>0</numberBatchesInProgress>
    <numberBatchesCompleted>0</numberBatchesCompleted>
    <numberBatchesFailed>0</numberBatchesFailed>
    <numberBatchesTotal>0</numberBatchesTotal>
    <numberRecordsProcessed>0</numberRecordsProcessed>
    <numberRetries>0</numberRetries>
    <apiVersion>47.0</apiVersion>
    <numberRecordsFailed>0</numberRecordsFailed>
    <totalProcessingTime>0</totalProcessingTime>
    <apiActiveProcessingTime>0</apiActiveProcessingTime>
    <apexProcessingTime>0</apexProcessingTime>
</jobInfo>
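
The same Create Job call can also be scripted outside Postman, which makes the flow easier to see before moving it into SAP PO. The sketch below is an illustration only: the Salesforce instance host, API version and session ID are placeholders, and the job ID is read from the <id> element of the jobInfo response shown above.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CreateBulkJob {
    public static void main(String[] args) throws Exception {
        String sessionId = "REPLACE_WITH_SESSION_ID_FROM_LOGIN_API"; // placeholder
        String jobXml =
              "<?xml version=\"1.0\" encoding=\"utf-8\"?>"
            + "<jobInfo xmlns=\"http://www.force.com/2009/06/asyncapi/dataload\">"
            + "<operation>upsert</operation>"
            + "<object>LOM_Tender__c</object>"
            + "<externalIdFieldName>Reference__c</externalIdFieldName>"
            + "<contentType>XML</contentType></jobInfo>";

        // POST /services/async/APIversion/job on the instance host (placeholders)
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://instance.salesforce.com/services/async/47.0/job"))
                .header("X-SFDC-Session", sessionId)
                .header("Content-Type", "application/xml; charset=UTF-8")
                .POST(HttpRequest.BodyPublishers.ofString(jobXml))
                .build();

        String body = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString()).body();

        // Naive extraction of the job ID from the <id> element of the jobInfo response
        String jobId = body.substring(body.indexOf("<id>") + 4, body.indexOf("</id>"));
        System.out.println("Job ID: " + jobId);
    }
}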

2) Add batch to job:

Add a new batch to a job by sending a POST request to this URI. The request body contains the list of records for processing. We need to pass the job ID in the URL, which was received in the previous call.

URI: /services/async/APIversion/job/jobid/batch

In the body, the actual data is transferred to SFDC.

Header Parameters are the same as the previous request.

Request XML: In this step, we upload the actual business data which needs to be processed. We can upload multiple records by repeating this step; the maximum number of records per batch is 10,000 with the Bulk API (a sketch of splitting larger data sets into batches of this size follows the example below).

<?xml version="1.0" encoding="UTF-8"?>
<ns1:sObjects xmlns:ns1="http://www.force.com/2009/06/asyncapi/dataload">
    <ns1:sObject>
        <ns1:Store_Count__c>527</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 2736.55</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-22T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>Cash                </ns1:Card_Type__c>
        <ns1:Reference__c>31</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2202</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
    <ns1:sObject>
        <ns1:Store_Count__c>48</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 1685.03</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-22T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>Master Card         </ns1:Card_Type__c>
        <ns1:Reference__c>332</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2102</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
</ns1:sObjects>
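
Because a single batch accepts at most 10,000 records, larger data sets have to be split across several Add Batch calls. The helper below is only a sketch of that chunking, under the assumption that each record is already available as a serialized <sObject> fragment; it is not part of the actual interface described here.

import java.util.ArrayList;
import java.util.List;

public class BatchSplitter {

    private static final int MAX_RECORDS_PER_BATCH = 10000;

    // Wraps groups of serialized sObject fragments into complete sObjects batch payloads
    static List<String> toBatchPayloads(List<String> sObjectXmlFragments) {
        List<String> payloads = new ArrayList<>();
        for (int from = 0; from < sObjectXmlFragments.size(); from += MAX_RECORDS_PER_BATCH) {
            int to = Math.min(from + MAX_RECORDS_PER_BATCH, sObjectXmlFragments.size());
            StringBuilder batch = new StringBuilder(
                "<ns1:sObjects xmlns:ns1=\"http://www.force.com/2009/06/asyncapi/dataload\">");
            for (String fragment : sObjectXmlFragments.subList(from, to)) {
                batch.append(fragment);
            }
            batch.append("</ns1:sObjects>");
            payloads.add(batch.toString());
        }
        return payloads;
    }

    public static void main(String[] args) {
        List<String> fragments = new ArrayList<>();
        for (int i = 0; i < 25000; i++) {
            fragments.add("<ns1:sObject><ns1:Reference__c>" + i + "</ns1:Reference__c></ns1:sObject>");
        }
        System.out.println(toBatchPayloads(fragments).size()); // 3 batches for 25,000 records
    }
}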

Here the payload differs based on the SFDC object.

In the response, we receive the job ID and a batch ID.

3) Close Job: Close the job by sending a POST request to this URI. Once the records are uploaded, we must close the job so that record processing starts. This API request also informs Salesforce that no more batches will be submitted. To close the job, the details below are required.

URI : /services/async/APIversion/job/jobId

Header Parameters are the same as the previous request.

Request Payload:

<?xml version="1.0" encoding="UTF-8"?>
<ns0:jobInfo xmlns:ns0="http://www.force.com/2009/06/asyncapi/dataload">
    <ns0:state>Closed</ns0:state>
</ns0:jobInfo>

Response Payload:

<?xml version="1.0" encoding="UTF-8"?>
<jobInfo
   xmlns="http://www.force.com/2009/06/asyncapi/dataload">
    <id>75077000000A8qZAAS</id>
    <operation>upsert</operation>
    <object>LOM_Tender__c</object>
    <createdById>005G0000002kHpQIAU</createdById>
    <createdDate>2021-12-30T05:05:31.000Z</createdDate>
    <systemModstamp>2021-12-30T05:05:31.000Z</systemModstamp>
    <state>Closed</state>
    <externalIdFieldName>Reference__c</externalIdFieldName>
    <concurrencyMode>Parallel</concurrencyMode>
    <contentType>XML</contentType>
    <numberBatchesQueued>0</numberBatchesQueued>
    <numberBatchesInProgress>0</numberBatchesInProgress>
    <numberBatchesCompleted>1</numberBatchesCompleted>
    <numberBatchesFailed>0</numberBatchesFailed>
    <numberBatchesTotal>1</numberBatchesTotal>
    <numberRecordsProcessed>2</numberRecordsProcessed>
    <numberRetries>0</numberRetries>
    <apiVersion>47.0</apiVersion>
    <numberRecordsFailed>0</numberRecordsFailed>
    <totalProcessingTime>157</totalProcessingTime>
    <apiActiveProcessingTime>80</apiActiveProcessingTime>
    <apexProcessingTime>0</apexProcessingTime>
</jobInfo>

4) Retrieve Batch Results: When a batch is completed, you must retrieve the batch result to see the status of individual records.

HTTP method: GET

URL: https://instance.salesforce.com/services/async/53.0/job/jobId/batch/batchId/result

The job ID and batch ID are part of the URL.

jobId is the job ID that was returned when you created the job. batchId is the batch ID that was returned when you added a batch to the job. In this example, the batch ID is 75177000000BPU8AAO.

In the response, if the XML field <success> is true, the record was successfully processed on the SFDC side; in case of an error, the value is false.
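
A minimal Java sketch of this GET call is shown below; the instance host and session ID are placeholders, while the job ID and batch ID are the example values from the earlier responses. It simply prints the <success> flag of each result in the order in which the records were submitted.

import java.io.ByteArrayInputStream;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class RetrieveBatchResult {
    public static void main(String[] args) throws Exception {
        String jobId = "75077000000A8qZAAS";   // job ID from the Create Job response above
        String batchId = "75177000000BPU8AAO"; // batch ID from the Add Batch response above
        String sessionId = "REPLACE_WITH_SESSION_ID_FROM_LOGIN_API"; // placeholder

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://instance.salesforce.com/services/async/53.0/job/"
                        + jobId + "/batch/" + batchId + "/result"))
                .header("X-SFDC-Session", sessionId)
                .GET()
                .build();

        String xml = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString()).body();

        // Results come back in the same order as the records in the batch request
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        NodeList successFlags = doc.getElementsByTagName("success");
        for (int i = 0; i < successFlags.getLength(); i++) {
            System.out.println("Record " + (i + 1) + " success = "
                    + successFlags.item(i).getTextContent());
        }
    }
}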

Now we will see how to design this concept in SAP PO using a REST adapter.

High-level diagram: This is the high-level design that will be implemented in SAP PO. In this requirement, SFDC sends the response back to the sender so that it knows the status of each record.

In SAP PO, we implement this scenario using four different ICOs which are called one after another using the async-sync bridge (via the modules RequestResponseBean and ResponseOnewayBean). Let's move on to the interface design in SAP PO.

1. First ICO: System A (Sender) to SFDC :

This first ICO will do the below tasks:

  • First, it gets the data from the sender.
  • Look up the Login API to get the session ID, which needs to be passed in subsequent calls to SFDC as part of the HTTP header.
  • Create the target SFDC structure.
  • Create a job, again via a lookup to SFDC (Create Job API call).
  • Send the business data received from the sender to SFDC (Add Batch to Job API call).
  • Receive data from the sender: First we receive the actual business data from the sender via any method (JDBC/SOAP/REST); the data can be fetched via any transport protocol.
  • Retrieve session ID: Once the data is received in SAP PO, we first call the Login API via a SOAP lookup. There are already multiple blogs available on how to call the Login API via lookup.

Operation mapping: Here we have one message mapping followed by a Java mapping.

Message mapping: Here we create the target SFDC structure based on the SFDC object.

On the root element sObjects, we have a SOAP lookup.

SOAP lookup java code to retrieve Session ID:

AbstractTrace trace = container.getTrace();
String sessionId = "";
String serverUrl = "";
try {
	//instance the channel to invoke the service.
	Channel channel = LookupService.getChannel("SFDC","CC_SFDC_SOAP_Login_Rcv");
	SystemAccessor accessor = null;
	accessor = LookupService.getSystemAccessor(channel);
	// The Request message in XML. THIS IS THE LOOKUP SERVICE
	
	String loginxml = "<login xmlns=\"urn:enterprise.soap.sforce.com\"> <username>"
	+ UserName
	+ "</username> <password>"
	+ Pass
	+ "</password> </login>";
	
	InputStream inputStream = new ByteArrayInputStream(loginxml.getBytes());
	Payload payload = LookupService.getXmlPayload(inputStream);
	Payload SOAPOutPayload = null;
	
	//The response will be a Payload. Parse this to get the response field out.
	SOAPOutPayload = accessor.call(payload);

	/* Parse the SOAPPayload to get the SOAP Response back */ 
	InputStream inp = SOAPOutPayload.getContent();

	/* Create DOM structure from input XML */  
	DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
	DocumentBuilder builder = factory.newDocumentBuilder();
	Document document = builder.parse(inp);
	//Node node=null;
	NodeList list = document.getElementsByTagName("serverUrl");
	NodeList list2 = document.getElementsByTagName("sessionId");

	//NodeList list = document.getChildNodes();
	int len=list.getLength();
	//String sessionId="";
	for(int i=0;i<len;i++)
	{
		Node     node = list.item(i);
		Node node2=list2.item(i);
		if (node != null && node2 != null)
		{

			node = node.getFirstChild();
			node2= node2.getFirstChild();
			if (node != null && node2 != null)
			{
				serverUrl = node.getNodeValue();
				sessionId = node2.getNodeValue();
			}
		}
	}
}

catch(Exception e)
{e.printStackTrace();}
return (sessionId) ;

Once the session ID is received, set it as a header attribute using Dynamic Configuration:

UDF code to set the attribute "sessionID":

String NameSpace = "http://sap.com/xi/XI/System/REST" ;
DynamicConfiguration conf = (DynamicConfiguration) container.getTransformationParameters().get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
DynamicConfigurationKey key  = DynamicConfigurationKey.create( NameSpace, "sessionID");
conf.put(key,sessionID); 
return sessionID;

Now the session ID is set as a dynamic configuration attribute.

  • Create Target Structure:

With simple message mapping, we can pass the source data to the target. However, if you compare this target structure with the one shown above in the 2nd API call (Add Batch to a Job), we created one extra segment named "AdditionalSegment" that is actually not part of the SFDC object.

Question: Why is "AdditionalSegment" required?

Answer: In this requirement, we are supposed to send the status of each record back to the sender system, i.e. whether that particular record was successfully processed in SFDC or not.

However, when we make the last call to SFDC (Retrieve Batch Results API), no primary field is returned by SFDC based on which SAP PO could update the status on the sender side.

The primary field on the sender side is STG_TENDER_SEQ. This is mapped to Reference__c on the SFDC side, which is the externalIdFieldName (primary field of the SFDC object).

As per the above screenshot, the status of each record (<success>) is present in the API response. The field is a flag with the possible values true and false, telling us whether a record was processed successfully or not. However, no primary field or external ID field is returned with each record, so we cannot tell directly which result belongs to which primary field value.

As per Salesforce documentation:

Results are returned in the same order as the records in the batch request. It is important to track the record number so that SAP PO can identify the associated successful/failed record in the batch request. If the Success field is false, the row wasn't processed successfully; otherwise, the record was processed successfully.

Reference link :

https://developer.salesforce.com/docs/atlas.en-us.236.0.api_asynch.meta/api_asynch/asynch_api_batches_failed_records.htm

So to solve this, we create a field named "PrimaryField" in which we keep track of the primary field values separated by a delimiter.

The value of "PrimaryField" is transferred to the next ICO using the GetPayloadValueBean and PutPayloadValueBean modules.

Once the value is transferred, we remove this extra segment using a custom module before passing the payload to SFDC.

Before Mapping Payload: (comes from Sender)

After Mapping Payload:

<?xml version="1.0" encoding="UTF-8"?>
<ns1:sObjects xmlns:ns1="http://www.force.com/2009/06/asyncapi/dataload">
    <ns1:sObject>
        <ns1:Store_Count__c>7</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 20</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-22T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>myHusky Rewards     </ns1:Card_Type__c>
        <ns1:Reference__c>41</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2202</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
    <ns1:sObject>
        <ns1:Store_Count__c>731</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 3614.77</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-26T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>Cash                </ns1:Card_Type__c>
        <ns1:Reference__c>141</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2202</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
    <ns1:sObject>
        <ns1:Store_Count__c>57</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 2022.9</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-26T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>Master Card         </ns1:Card_Type__c>
        <ns1:Reference__c>142</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2202</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
    <ns1:sObject>
        <ns1:Store_Count__c>108</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 4249.31</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-26T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>Visa                </ns1:Card_Type__c>
        <ns1:Reference__c>143</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2202</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
    <ns1:sObject>
        <ns1:Store_Count__c>9</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 514.52</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-26T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>American Express    </ns1:Card_Type__c>
        <ns1:Reference__c>144</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2202</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
    <ns1:AdditionalSegment>
        <ns1:PrimaryField>41,141,142,143,144</ns1:PrimaryField>
    </ns1:AdditionalSegment>
</ns1:sObjects>

As shown in the after-mapping payload above, PrimaryField contains the primary field (Reference__c) values separated by commas.
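
The post does not show the mapping logic that fills PrimaryField, but conceptually it only needs to collect all Reference__c values from the source and join them with commas. A hypothetical UDF sketch (an assumption, not the author's actual mapping) operating on all values of a context could look like this:

// Hypothetical "all values of a context" UDF: var1 holds every Reference__c value of the source message
StringBuilder keys = new StringBuilder();
for (int i = 0; i < var1.length; i++) {
    if (i > 0) {
        keys.append(",");
    }
    keys.append(var1[i]);
}
result.addValue(keys.toString()); // e.g. 41,141,142,143,144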

  • Create a job via Lookup to SFDC and capture JobID. (Create Job API call)

In this step, we call Create Job API to SFDC.

Below is the Java mapping which is used to call the Create Job API:

The Java mapping reads parameters from the Integration Directory (ID) part as below.

It calls the lookup channel which makes the SFDC call. Once we receive the response from SFDC (Create Job API response), it extracts the job ID and puts this value into the ASMA attribute jobID to pass it to subsequent calls.

Java Class: SFDCBulkAPI_CreateJobLookup

import java.io.*;
import javax.xml.parsers.*;
import org.w3c.dom.*;
import com.sap.aii.mapping.api.*;
import com.sap.aii.mapping.lookup.Channel;
import com.sap.aii.mapping.lookup.LookupService;
import com.sap.aii.mapping.lookup.Payload;
import com.sap.aii.mapping.lookup.SystemAccessor;

public class SFDCBulkAPI_CreateJobLookup_Java extends AbstractTransformation  {

	public void transform(TransformationInput InputObj, TransformationOutput OutputObj)  
	{
		try{
			
		// Reading Session ID
		DynamicConfiguration conf= InputObj.getDynamicConfiguration();
		String NameSpace = "http://sap.com/xi/XI/System/REST" ;
		DynamicConfigurationKey key_sessionID  = DynamicConfigurationKey.create( NameSpace, "sessionID");
		String sessionID = conf.get(key_sessionID);
		
		DynamicConfigurationKey key_jobID  = DynamicConfigurationKey.create( NameSpace, "jobID");
		
		InputStream in = InputObj.getInputPayload().getInputStream();
		
		//Reading ICO Parameters
		String operation = InputObj.getInputParameters().getString("operation");
		String object = InputObj.getInputParameters().getString("object");
		String externalIdFieldName = InputObj.getInputParameters().getString("externalIdFieldName");
		
		OutputStream out = OutputObj.getOutputPayload().getOutputStream();
		execute(in , out, operation, object, externalIdFieldName, conf, key_jobID, sessionID);
		}
		catch(Exception e)
		{
			System.out.println(e);
		}
	}
	void execute(InputStream in , OutputStream out, String operation, String object, String externalIdFieldName, DynamicConfiguration conf, DynamicConfigurationKey key_jobID,  String sessionID)
	{
		try{
		
		// Create target structure
		String requestPayload = "<?xml version=\"1.0\" encoding=\"utf-8\"?><jobInfo xmlns=\"http://www.force.com/2009/06/asyncapi/dataload\"><operation>" 
		+ operation + "</operation><object>" + object + "</object><externalIdFieldName>" + externalIdFieldName + "</externalIdFieldName>"
		+ "<contentType>XML</contentType><sessionID>" + sessionID + "</sessionID></jobInfo>";
		
		String jobID = "";
		  //perform the rest look up
		  Channel channel = LookupService.getChannel("SFDC","CC_SFDC_REST_BulkAPI_CreateJob_Rcv");
		  SystemAccessor accessor = null;
		  accessor = LookupService.getSystemAccessor(channel);
		  
		  InputStream inputStream = new ByteArrayInputStream(requestPayload.getBytes());
		  Payload payload = LookupService.getXmlPayload(inputStream);
		  Payload SOAPOutPayload = null;
		 SOAPOutPayload = accessor.call(payload);
		  
		 // Retrieving Job ID using DOM parser
		  InputStream inp = SOAPOutPayload.getContent();
		 DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
		  DocumentBuilder builder = factory.newDocumentBuilder();
		  Document document = builder.parse(inp);
		  NodeList nl_jobID = document.getElementsByTagName("id");
		  Node node = nl_jobID.item(0);
		if (node != null)
		  {
		  node = node.getFirstChild();
		  if (node != null)
		  {
			 jobID = node.getNodeValue();
		  }
		 
		 // Setting Dynamic attribute with Job ID
		conf.put(key_jobID, jobID);
		  }
		 copypayload(in, out);
		}
		catch (Exception e) 
		{
		  e.printStackTrace();
		}
		
	}
	// Copy Payload
	void copypayload(InputStream in , OutputStream out )
	{
		try{
		int size = in.available();
		byte buffer[] = new byte[size];
		in.read(buffer);
		out.write(buffer);
		}
		catch(Exception e)
		{	
		}
	}
}

Please note that in the above Java mapping, the payload structure which is created has one extra field named "sessionID", as shown below. We will see in the next step why it is created.

Output structure from Java mapping :

<?xml version="1.0" encoding="utf-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
    <operation>upsert</operation>
    <object>LOM_Tender__c</object>
    <externalIdFieldName>Reference__c</externalIdFieldName>
    <contentType>XML</contentType>
<sessionID>00D770000008aLl!ARkAQBvdVhGVxVT_G5ZK6yyfs1404jmSzfh8HMyo5MOtP</sessionID>
</jobInfo>

After this Java mapping, there is a custom module named SetSFDCSessionIDHeader.

Question: Why is the custom module SetSFDCSessionIDHeader required?

Answer: This module is written to get the sessionID value from the payload (created above in the Java mapping) and set it as an HTTP header while doing the lookup to SFDC.

While doing the lookup, I didn't find any way to pass the ASMA attribute (sessionID) as an HTTP header to the lookup channel. To solve this, the sessionID field created in the Java mapping is passed to the custom module. In the custom module, the sessionID field value is used to create the ASMA attribute, which is then used in the lookup channel to set it as an HTTP header.

Module : SetSFDCSessionIDHeader

import java.io.*;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import javax.ejb.Local;
import javax.ejb.LocalHome;
import javax.ejb.Remote;
import javax.ejb.RemoteHome;
import javax.ejb.Stateless;
import com.sap.aii.af.lib.mp.module.Module;
import com.sap.aii.af.lib.mp.module.ModuleContext;
import com.sap.aii.af.lib.mp.module.ModuleData;
import com.sap.aii.af.lib.mp.module.ModuleException;
import com.sap.engine.interfaces.messaging.api.Message;
import com.sap.engine.interfaces.messaging.api.MessageKey;
import com.sap.engine.interfaces.messaging.api.MessagePropertyKey;
import com.sap.engine.interfaces.messaging.api.Payload;
import com.sap.engine.interfaces.messaging.api.PublicAPIAccessFactory;
import com.sap.engine.interfaces.messaging.api.XMLPayload;
import com.sap.engine.interfaces.messaging.api.auditlog.AuditAccess;
import com.sap.engine.interfaces.messaging.api.auditlog.AuditLogStatus;

@Stateless(name="SetSFDCSessionIDHeaderModule")
@LocalHome(value=com.sap.aii.af.lib.mp.module.ModuleLocalHome.class)
@Local(com.sap.aii.af.lib.mp.module.ModuleLocal.class)

public class SetSFDCSessionIDHeader implements Module {

private AuditAccess audit;
@SuppressWarnings("deprecation")
MessagePropertyKey messagepropertykey;

public ModuleData process(ModuleContext moduleContext, ModuleData inputModuleData) throws ModuleException
{
try{
audit = PublicAPIAccessFactory.getPublicAPIAccess().getAuditAccess();
String CLASS_NAME = getClass().getSimpleName();
Message msg = (Message) inputModuleData.getPrincipalData();
MessageKey key = new MessageKey(msg.getMessageId(), msg.getMessageDirection());
audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, CLASS_NAME + ": SetSFDCSessionIDHeader Module Called");

messagepropertykey = new MessagePropertyKey("sessionID" , "http://sap.com/xi/XI/System/REST");

// Reading payload values created using Java mapping
Payload payload = msg.getDocument();
byte [] bytePayload = payload.getContent();
   String stringPayload = new String(bytePayload);
   InputStream  in = new ByteArrayInputStream(stringPayload.getBytes());
   DocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();
   DocumentBuilder documentbuilder = dbFactory.newDocumentBuilder();
   Document doc = documentbuilder.parse(in);
      
   NodeList nl_operation = doc.getElementsByTagName("operation");
   NodeList nl_object = doc.getElementsByTagName("object");
   NodeList nl_externalIdFieldName = doc.getElementsByTagName("externalIdFieldName");
   NodeList nl_sessionID = doc.getElementsByTagName("sessionID");
   
   String operation = nl_operation.item(0).getFirstChild().getNodeValue();
   String object = nl_object.item(0).getFirstChild().getNodeValue();
   String externalIdFieldName = nl_externalIdFieldName.item(0).getFirstChild().getNodeValue();
   String sessionID = nl_sessionID.item(0).getFirstChild().getNodeValue();
   
   //Setting SessionID ASMA attribute, would be used in Lookup channel to pass this value in HTTP Header
   msg.setMessageProperty(messagepropertykey, sessionID);
   audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, CLASS_NAME + ": Setting ASMA sessionID:" +sessionID );
   
   //creating Payload required for Create Job API
   String outputPayload = "<?xml version=\"1.0\" encoding=\"utf-8\"?><jobInfo xmlns=\"http://www.force.com/2009/06/asyncapi/dataload\"><operation>" + operation + "</operation><object>" + object + "</object><externalIdFieldName>" + externalIdFieldName + "</externalIdFieldName><contentType>XML</contentType></jobInfo>";
   audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, CLASS_NAME + ": Calling end Point:" +outputPayload );
   payload.setContent(outputPayload.getBytes());
   msg.setDocument((XMLPayload) payload);
   
   inputModuleData.setPrincipalData(msg);
   audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, "Module executed successfully");
   return inputModuleData;
}
catch (Exception e) {
throw new ModuleException(e.getClass() + ": " + e.getMessage());
}
}
}
Lookup Channel CC_SFDC_REST_BulkAPI_CreateJob_Rcv Configuration
Lookup Channel Log-Successful call to Create Job API

So the Create Job call to SFDC is successful, and in return we receive the job ID, which is passed as an ASMA attribute (via the Java class SFDCBulkAPI_CreateJobLookup_Java) to make our next call on the receiver channel.

Message Attribute after Mapping: It has JobID and sessionID values.

Message Attribute after Mapping
  • Send Output Payload to SFDC (via Add batch to job API call) :

In the above step, an output payload is created which needs to be passed to SFDC. Along with the payload, the sessionID needs to be passed as an HTTP header, and the job ID is part of the URL.

Receiver channel-Add batch to job Call

Module Parameter: In the Receiver channel, there is a sequence of module parameters.

AF_Modules/GetPayloadValueBean: In the output payload, there is one extra segment named "AdditionalSegment" which has the XML field "PrimaryField". It contains the sequence of primary field values.

Using the GetPayloadValueBean module, a parameter called PrimaryField is created which holds the value of the XML field "PrimaryField". The module parameter value is nothing but the full XML path to the corresponding XML field:

"get:/ns1:sObjects/ns1:AdditionalSegment/ns1:PrimaryField"

We will use PutPayloadValueBean to pass this parameter value on to our next flow.

AF_Modules/DynamicConfigurationBean: The DynamicConfigurationBean module is used to pass the value of sessionID to our next flow.

RemoveAdditionalSegmentModule: The output payload created in the above step has one extra segment named "AdditionalSegment" which is not required at all by SFDC (Add Batch to Job call). This extra segment was created just to hold the primary field values and pass them to the next ICO. Since the primary field value is already held by the parameter created by GetPayloadValueBean, we are good to remove this additional segment now.

So there is one custom module written to remove additional segment:

RemoveAdditionalSegmentModule :

package com.sap.pi;
import java.io.*;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import javax.ejb.Local;
import javax.ejb.LocalHome;
import javax.ejb.Remote;
import javax.ejb.RemoteHome;
import javax.ejb.Stateless;
import com.sap.aii.af.lib.mp.module.Module;
import com.sap.aii.af.lib.mp.module.ModuleContext;
import com.sap.aii.af.lib.mp.module.ModuleData;
import com.sap.aii.af.lib.mp.module.ModuleException;
import com.sap.engine.interfaces.messaging.api.Message;
import com.sap.engine.interfaces.messaging.api.MessageKey;
import com.sap.engine.interfaces.messaging.api.MessagePropertyKey;
import com.sap.engine.interfaces.messaging.api.Payload;
import com.sap.engine.interfaces.messaging.api.PublicAPIAccessFactory;
import com.sap.engine.interfaces.messaging.api.XMLPayload;
import com.sap.engine.interfaces.messaging.api.auditlog.AuditAccess;
import com.sap.engine.interfaces.messaging.api.auditlog.AuditLogStatus;

@Stateless(name="RemoveAdditionalSegmentModule")
@LocalHome(value=com.sap.aii.af.lib.mp.module.ModuleLocalHome.class)
@Local(com.sap.aii.af.lib.mp.module.ModuleLocal.class)

public class RemoveAdditionalSegment implements Module{

private AuditAccess audit;
@SuppressWarnings("deprecation")
MessagePropertyKey messagepropertykey;
public ModuleData process(ModuleContext moduleContext, ModuleData inputModuleData) throws ModuleException
{
try{
audit = PublicAPIAccessFactory.getPublicAPIAccess().getAuditAccess();
String CLASS_NAME = getClass().getSimpleName();
Message msg = (Message) inputModuleData.getPrincipalData();
MessageKey key = new MessageKey(msg.getMessageId(), msg.getMessageDirection());
audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, CLASS_NAME + ": RemoveAdditionalSegmentModule Module Called");

//Reading Module Parameters "SegmentName" which needs to be removed from payload
String SegmentName = moduleContext.getContextData("SegmentName");

Payload payload = msg.getDocument();
InputStream in = (InputStream)payload.getInputStream();
DocumentBuilderFactory documentbuilderfactory = DocumentBuilderFactory.newInstance();
documentbuilderfactory.setNamespaceAware(true);
DocumentBuilder documentbuilder = documentbuilderfactory.newDocumentBuilder();
Document document  = documentbuilder.parse(in);
document.normalize();
Element element  = document.getDocumentElement();
Element additional_element = (Element)document.getElementsByTagName(SegmentName).item(0);
audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, "Removing additional element: "+SegmentName );
//Removing Additional segment
element.removeChild(additional_element);
document.normalize();  

//After removing extra segment, passing remaining payload.
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
   TransformerFactory.newInstance().newTransformer().transform(new DOMSource(document), new StreamResult(byteArrayOutputStream));
   byte[] bytes = byteArrayOutputStream.toByteArray();
   payload.setContent(bytes);
   msg.setDocument((XMLPayload) payload);
   inputModuleData.setPrincipalData(msg);
   audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, "RemoveAdditionalSegmentModule executed successfully");
   return inputModuleData;
}
catch (Exception e) {
throw new ModuleException(e.getClass() + ": " + e.getMessage());
}
}
}

AF_Modules/RequestResponseBean: This module is used to create the async/sync bridge.

AF_Modules/PutPayloadValueBean: In the previous step, the GetPayloadValueBean module created a parameter, PrimaryField, to hold the value of the primary field.

Now PutPayloadValueBean is used to put that parameter value into the response structure which is passed to the next ICO.

The parameter value is nothing but the XML path put:/ns0:batchInfo/ns0:PrimaryField, which is where this value is placed in the output/response structure.

AF_Modules/ResponseOnewayBean: This module is used to pass the SFDC response (Add Batch to Job API call) to the next flow. In the parameter values, the details of the next ICO are given.

ResponseOnewayBean Parameters
Message Monitoring

2) Second ICO: SFDC to SFDC: (Close Job API call)

In our first ICO, we basically covered the first two SFDC calls, i.e. the Create Job and Add Batch to Job API calls.

In this ICO, we cover the next call, i.e. Close Job.

Close Job_ICO
Incoming Payload (Response payload received from previous ICO)

As we see above, this payload is nothing but the response of the Add Batch to Job API call along with one extra field, PrimaryField, which was added by the PutPayloadValueBean module in the previous ICO's receiver channel.

Message Attribute: Session ID is passed from the previous ICO using DynamicConfigurationBean.

Here in Operation Mapping, we have simple message mapping which creates the structure of the Close Job API call.

Operation Mapping

There is only one UDF, which sets the jobID dynamic configuration attribute (it is not passed as an attribute from the previous flow); the job ID is received in the input payload. This value needs to be passed to the receiver channel as part of the URL to close the job.

UDF:

String NameSpace = "http://sap.com/xi/XI/System/REST" ;
DynamicConfiguration conf = (DynamicConfiguration) container.getTransformationParameters().get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
DynamicConfigurationKey key  = DynamicConfigurationKey.create( NameSpace, "jobID");
conf.put(key,var1); 
return var1;
Output Payload created by above Message mapping

Here we pass the state as "Closed", as required by the SFDC Close Job API call.

Apart from that, in the extra segment "AdditionalSegment" there are two extra values, i.e. PrimaryField and BatchID. Both of these values are required in our next (third) flow.

Message Attribute after mapping: It contains JobID & SessionID field values.

Receiver Channel Close job:

Receiver Channel_CloseJob

The jobID and sessionID values are present in the HTTP header section of the channel.

The jobID needs to be passed as part of the URL, and the sessionID as an HTTP header.

Module Tab:

AF_Modules/GetPayloadValueBean: In the output payload, there is one extra segment named "AdditionalSegment" which has the XML fields "PrimaryField" and "BatchID".

This module is used twice here to create two parameters that hold the values of PrimaryField and BatchID. Again, the parameter value is the full XML path to the corresponding XML field:

get:/ns0:jobInfo/ns0:AdditionalSegment/ns0:PrimayField

get:/ns0:jobInfo/ns0:AdditionalSegment/ns0:BatchID

We will use PutPayloadValueBean to pass these parameter values on to our next flow.

AF_Modules/DynamicConfigurationBean: The DynamicConfigurationBean module is used to pass the value of sessionID to our next flow.

RemoveAdditionalSegmentModule: The output payload created in the above step has one extra segment named "AdditionalSegment" which is not required at all by SFDC (Close Job API call). This extra segment was created just to hold the PrimaryField and BatchID values and pass them to the next ICO. Since these values are already held by the parameters created by GetPayloadValueBean, we are good to remove this additional segment now.

AF_Modules/RequestResponseBean: This module is used to create the async/sync bridge.

AF_Modules/PutPayloadValueBean: In the previous step, the GetPayloadValueBean module created two parameters, PrimaryField and BatchID.

Now PutPayloadValueBean is used twice, to put these parameter values into the response structure which is passed to the next ICO.

The parameter values are nothing but the XML paths where these values are placed in the output/response structure:

put:/ns0:jobInfo/ns0:BatchID

put:/ns0:jobInfo/ns0:PrimayField

AF_Modules/ResponseOnewayBean: This module is used to pass the SFDC response (Close Job API call response) to the next flow. In the parameter values, the details of the next ICO are given.

Message Monitoring

3) Third ICO: SFDC to SFDC: (Retrieve Batch Results)

So far, a job has been created and a batch has been added to it, and the job was closed in the previous step. Now, in this step, we retrieve the batch results to find out the status of each record, i.e. whether it was successfully updated in SFDC or not.

Incoming Payload (Response payload received from previous ICO)

This payload is the response of the Close Job API call along with two extra fields, PrimaryField and BatchID, which were added by the PutPayloadValueBean module in the previous ICO's receiver channel.

Now we need to call the Retrieve Batch Results API on the SFDC side, which is actually an HTTP GET call, so there is no request payload to be passed to SFDC; the job ID and batch ID go into the URL.

At the operation mapping level, there is a simple one-to-one mapping along with a small UDF which introduces a delay of around one minute. Why is this delay needed? Because when we close the job, record processing starts and may take some time on the SFDC side. So the job is closed on the previous ICO's receiver channel, and here in the operation mapping we add some delay before we retrieve the batch result.

UDF to put the delay:

// DelayMinutes is the UDF input parameter holding the delay in minutes;
// TimeUnit requires java.util.concurrent.TimeUnit in the UDF's import list
long l_DelayTimeMinutes = Long.parseLong(DelayMinutes);
try {
    TimeUnit.MINUTES.sleep(l_DelayTimeMinutes);
}
catch (InterruptedException e) {
    e.printStackTrace();
}
return "";

Receiver channel to call retrieve Result

Here the jobid element refers to the <id> field from the payload. It is used as part of the URL in the GET call.

The BatchId element refers to the BatchID field from the payload. It is also used as part of the URL in the GET call.

sessionID is used as an HTTP header parameter.

ModuleTab

Here again, AF_Modules/GetPayloadValueBean is used to pass the XML field value "PrimaryField" from the current ICO to the next ICO.

This module creates a parameter that holds the value of PrimaryField.

We use PutPayloadValueBean to pass this parameter value to the next flow; it puts the value into the response structure which is passed to the next ICO.

AF_Modules/ResponseOnewayBean: This module is used to pass the SFDC response (Retrieve Batch Results API response) to the next flow. In the parameter values, the details of the next ICO are given.

Message Monitoring: Message processing took around 1 minute due to the delay added at the mapping level.

Message Monitoring

4) Fourth ICO: SFDC to Sender: (update SFDC result back to Sender)

In the previous step, we made the SFDC call to retrieve the batch results via the HTTP GET method. In this ICO, we process those batch results.

Incoming payload: This incoming payload is the response of the Retrieve Batch Results API. As we sent a total of five records to SFDC, five <result> tags are received. Along with the API response, <PrimaryField> is added by the PutPayloadValueBean module.

In the response, if the XML field <success> is true, the record was successfully processed on the SFDC side; in case of an error, the value is false.

Incoming Payload

As discussed previously, per the Salesforce documentation, results are returned in the same order as the records in the batch request. This was the reason for tracking the record numbers in <PrimaryField>.

Here at the mapping level, there is a simple UDF to put the primary field value into each <result> segment in the same order as returned from SFDC.

UDF :

String primayfield = var1[0];
String pf[] = primayfield.split(",");
for (int i = 0; i < pf.length; i++) {
    result.addValue(pf[i]);
}

Output payload from mapping: Below is the output payload, which has the status (<success>) and primary field (KeyField) of each record sent to SFDC, so we know whether each primary field value was successfully updated on the SFDC side. Now this payload is ready to be sent back to the sender.

<ns0:results xmlns:ns0="http://www.force.com/2009/06/asyncapi/dataload">
	<ns0:result>
		<ns0:id>a1Q8A00000CEMpUUAX</ns0:id>
		<ns0:success>true</ns0:success>
		<ns0:created>true</ns0:created>
		<ns0:KeyField>41</ns0:KeyField>
	</ns0:result>
	<ns0:result>
		<ns0:id>a1Q8A00000CEMpVUAX</ns0:id>
		<ns0:success>true</ns0:success>
		<ns0:created>true</ns0:created>
		<ns0:KeyField>141</ns0:KeyField>
	</ns0:result>
	<ns0:result>
		<ns0:id>a1Q8A00000CEMpWUAX</ns0:id>
		<ns0:success>true</ns0:success>
		<ns0:created>true</ns0:created>
		<ns0:KeyField>142</ns0:KeyField>
	</ns0:result>
	<ns0:result>
		<ns0:id>a1Q8A00000CEMpXUAX</ns0:id>
		<ns0:success>true</ns0:success>
		<ns0:created>true</ns0:created>
		<ns0:KeyField>143</ns0:KeyField>
	</ns0:result>
	<ns0:result>
		<ns0:id>a1Q8A00000CEMpYUAX</ns0:id>
		<ns0:success>true</ns0:success>
		<ns0:created>true</ns0:created>
		<ns0:KeyField>144</ns0:KeyField>
	</ns0:result>
</ns0:results>
Message Monitoring

The post Salesforce integration with SAP PO Rest Adapter using SFDC Bulk API Concept appeared first on ERP Q&A.
