Integration of SAP CI(BTP IS) with IBM MQ through AMQP
Introduction

What is IBM MQ?

IBM MQ is a family of message-oriented middleware products that IBM launched in December 1993. It was originally called MQSeries, and was renamed WebSphere MQ in 2002 to join the suite of WebSphere products. In April 2014, it was renamed IBM MQ. IBM MQ supports the exchange of information between applications, systems, services and files by sending and receiving message data via messaging queues. This simplifies the creation and maintenance of business applications.

What is SAP CI(BTP-IS)?

Cloud Integration(BTP-IS) is a set of services and tools provided by SAP on its cloud-based Business Technology Platform (BTP) to enable integration between different systems, applications, and data sources. The key benefit of CI(BTP IS) is that it enables organizations to quickly and easily integrate their systems, data, and applications without the need for extensive coding or custom development. This helps to streamline business processes, reduce costs, and improve operational efficiency.

How can IBM MQ be integrated?

IBM MQ provides messaging and queuing capabilities across multiple modes of operation: point-to-point and publish/subscribe. IBM MQ has Queue Managers (QM) in which different types of queues are created. A QM can be connected directly, using a client channel definition table, or through an intermediate queue manager. All of these are associated with channels, which handle the inbound and outbound movement of data from the queues. Along with queues we can also have topics, which work with the pub/sub approach. REST APIs, JMS, and MFT can also be leveraged with the IBM MQ package installation.

How does CI integrate with IBM MQ?

The integration between CI and IBM MQ is best done using the AMQP 1.0 protocol. IBM MQ installations at versions 7.5, 8.0, 9.0, 9.1, 9.2, and 9.3 are available in the market today, of which only versions 9.2 and above support the integration between CI and IBM MQ.

The queues on IBM MQ can be connected to from CI, and published topics can also be subscribed to from CI using the AMQP protocol.

Note: Among the possible integrations with IBM MQ, Message Queues Integration using the AMQP protocol will be explained in detail.

Integration of SAP CI(BTP IS) with IBM MQ through AMQP

Prerequisites:

  1. Any IBM MQ server with version 9.2 or above. For demo purposes, the trial IBM MQ from https://www.ibm.biz/ibmmqtrial is used.
  2. SAP Cloud Connector with the required roles to connect the IS tenant and IBM MQ.
  3. SAP BTP IS tenant access with the required CI roles.

Step-1: Install the IBM MQ 9.2 from the downloaded setup file

Select all the features and install them along with the MQ Explorer

IBM MQ Setup File

Once installed successfully, open IBM MQ Explorer, which should open as below.

IBM MQ Explorer

Step-2: Create the new Queue Manager (QM1) under the left-side Queue Managers pane

Queue Manager

Step-3: Create the new Queue (Q2) under the Queue Manager (QM1) in the left-side pane

New Queue in the Queue Manager

Step-4: Create the new AMQP channel under the left side Queue Manager (QM1)

While creating the AMQP channel, provide the port as 5672 and start the channel. There will be a default AMQP channel that comes along with the installation; stop it and use the newly configured AMQP channel instead, so that the queues are handled explicitly.

AMQP Channel
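
For reference, the same setup can be scripted via runmqsc instead of MQ Explorer; a minimal sketch, assuming queue manager QM1 and an illustrative channel name MY.AMQP (SYSTEM.DEF.AMQP being the default AMQP channel shipped with the installation):

runmqsc QM1
DEFINE CHANNEL(MY.AMQP) CHLTYPE(AMQP) PORT(5672)
STOP CHANNEL(SYSTEM.DEF.AMQP)
START CHANNEL(MY.AMQP)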

Step-5: Configure the Virtual Mapping to IBM MQ Internal System in Cloud Connector

Make sure the cloud connector is already connected successfully to the SAP BTP subaccount.

The backend type should be “NON-SAP System” and the protocol should be “TCP” only

In the system mapping, use the same port (5672) that was given in the AMQP channel in IBM MQ, the internal host on which IBM MQ is installed, and the remaining virtual details as required.

Cloud Connector Virtual Mapping

Check the result of the reachable status on the internal host and port

Test the Virtual Mapping

Step-6: Test the connectivity from SAP Integration Suite to Cloud Connector.

The location ID (TESTCC) is the same name that was given when connecting the cloud connector to the SAP BTP subaccount.

Cloud Connector Connectivity Test from IS

Step-7: Test the connectivity from SAP Integration Suite to IBM MQ through AMQP

The virtual host and virtual port that were given during system mapping in the cloud connector should be used, along with the same location ID.

AMQP Connectivity Test from IS

Step-8: Deploy the SASL username/password of IBM MQ in IS Security Material

SASL Credentials Deployed in Security Material

Step-9: Create an IFLOW to send the message to the IBM MQ queue using AMQP channel

Provide the virtual host and port details along with the Location ID as tested earlier.

The credential name should be from the deployed security material.

To IBM MQ Through AMQP

Provide the destination type as Queue and use the same queue (Q2) created in IBM MQ

Destination and Queue Details

For testing purposes, a sample message is provided in the Content Modifier as below.

Content Modifier with Message Body
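
As an illustration only (the article does not show the exact payload), the test body could be a small XML document such as:

<Message>
    <Text>Test message from SAP CI to IBM MQ</Text>
</Message>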

Step-10: Create an IFLOW to read the message from the IBM MQ queue using AMQP channel

Provide the virtual host and port details along with the Location ID as tested earlier.

The credential name should be from the deployed security material.

From IBM MQ through AMQP

For testing purposes, the payload read from the IBM MQ queue (Q2) is stored using a Groovy script, as sketched below.
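
A minimal sketch of such a script, reusing the standard CPI pattern of attaching the body to the message processing log (the attachment name is illustrative):

import com.sap.gateway.ip.core.customdev.util.Message;

def Message processData(Message message) {
    def body = message.getBody(java.lang.String) as String;
    def messageLog = messageLogFactory.getMessageLog(message);
    if (messageLog != null) {
        // Store the payload read from queue Q2 as an attachment in the monitoring log
        messageLog.addAttachmentAsString("PayloadFromIBMMQ", body, "text/plain");
    }
    return message;
}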

Queue Details to be processed

Monitoring the Messages in CI

CI Monitoring Dashboard
Payload read from IBM MQ Queue

Monitoring the Messages in IBM MQ

Before Execution:

Before Execution with 0 read and write status

After Execution:

After Execution with 1 received and 1 sent status

Conclusion:

AMQP, which is mostly used for event-broker topic pub/sub, can also be used for exchanging message-queue data, providing a tight integration between CI and IBM MQ.

If the data in IBM MQ topics is to be stored and retrieved, similar configurations can be done considering the pub/sub approach of topics, using the same AMQP protocol.

Creating a Sync integration for downloading Open Text Content with REST API from SAP PO, using an API with multipart/form-data authentication
Introduction:

The purpose of this document is to develop a synchronous interface to download content from Open Text Content Server (OTCS) using REST API. I have tried to cover the overall design with code snippets for reference.

Scope:

  • REST OTCS authentication using Content-Type: multipart/form-data (credentials inside the multipart body) for the token (otcsticket) generation.
  • Parameterized mapping for accessing the OTCS credentials from the ICO.
  • Calling the OTCS authentication lookup, accessing parameters from Java mapping.
  • Creating DynamicConfigurationKey entries for REST channel parameters from Java mapping.
  • OTCS session token management is out of scope.

Overall Design:

Sync Interface to get the Document from OpenText via PO 7.5

Solution Flow:

  1. SAP ECC calls a proxy to send the Document ID of the document in OTCS.
  2. PO Request java mapping receives Document ID.
    • Calls OTCS – Authentication API for a token (otcsticket) via REST lookup
    • Post ID and token to OTCS – Content API
  3. PO Response Java Mapping receives Document as an inputstream and maps it to the content field.
  4. Base64 content Field is sent to SAP for further processing.

Rest API consumed from Open Text Content Server (OTCS):

  • Authentication API – /otcs/cs.exe/api/v1/auth: this API needs to be called with credentials in a Content-Type: multipart/form-data body to generate a token, called otcsticket. The otcsticket needs to be present in the header for the content API to be called.

In Content-Type: multipart/form-data, credentials need to be present, separated by a boundary.
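
For illustration, the raw authentication body might look like the sketch below, using the boundary configured later in the lookup channel; this assumes the ICO boundary parameter already carries the leading dashes, and the placeholder values in angle brackets are not real credentials:

--SapPO75FormBoundaryhahahahahahahaEND
Content-Disposition: form-data; name="username"

<OTCS user>
--SapPO75FormBoundaryhahahahahahahaEND
Content-Disposition: form-data; name="password"

<OTCS password>
--SapPO75FormBoundaryhahahahahahahaEND--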

  • Content API – /otcs/cs.exe/api/v1/nodes/{ID}/content: API would return the document as a byte stream, when called with the token and ID of the document in the header.
otcsticket and ID of the document in the http header

PO Objects and code snippets:

Data Structure

Document ID for OTCS comes as DataID from SAP. The document is returned to SAP as Content.

ICO

Mapping (MM) & Operational mapping (OM)

Please take care of the above ICO parameters in:

  • OM-> Parameters section and Request Mapping Binding section
  • MM -> Signature tab

Request Mapping with java mapping in the Attributes and Methods Section

public void transform(TransformationInput in, TransformationOutput out) throws StreamTransformationException {

		try {

			getTrace().addDebugMessage("***OTCS-Request-JavaMapping-Start");
			
			//Get the mapping parameter from ICO
			String paramChannel = in.getInputParameters().getString("lookupChannel"); 
			String paramUserName = in.getInputParameters().getString("username"); 
			String paramPassword = in.getInputParameters().getString("password");
			String paramBoundary = in.getInputParameters().getString("boundary");
			getTrace().addDebugMessage("***OTCS-Step1-LogPramFromICO-lookupChannel:" + paramChannel + "-username:" 
				+ paramUserName + "-password:" + paramPassword +"-boundary:" +  paramBoundary);
			
			//Creating multipart/form-data for OTCS authentication
			String LINE_FEED = "\r\n";
			String ContentDisposition = "Content-Disposition: form-data; name=\"";
			String authReqFormData ="";
			authReqFormData = LINE_FEED + paramBoundary + LINE_FEED + ContentDisposition + "username\"" + LINE_FEED 
				+ LINE_FEED + paramUserName + LINE_FEED + paramBoundary + LINE_FEED + ContentDisposition 
				+ "password\"" + LINE_FEED + LINE_FEED + paramPassword + LINE_FEED + paramBoundary + "--" + LINE_FEED; //the closing boundary must end with "--"
			getTrace().addDebugMessage("***OTCS-Step2-multipart/form-data:" + authReqFormData);
			
			//Read message header value for Receiver 
			String paramReceiver = in.getInputHeader().getReceiverService();
			getTrace().addDebugMessage("***OTCS-Step3-ReceiverService:" + paramReceiver);
			
			//Get the OTCS rest lookup Channel Object for authentication
			Channel lookup_channel = LookupService.getChannel(paramReceiver, paramChannel);
			
			//Call rest lookup channel, with multipart/form-data payload
			SystemAccessor accessor = null;
			accessor = LookupService.getSystemAccessor(lookup_channel);
			InputStream authInputStream = new ByteArrayInputStream(authReqFormData.getBytes("UTF-8"));
			Payload authPayload = LookupService.getXmlPayload(authInputStream);
			Payload tokenOutPayload = null;
			//Call lookup
			getTrace().addDebugMessage("***OTCS-Step4-CallLookupChannel");
			tokenOutPayload = accessor.call(authPayload);
			//Parse for Lookup response for token
			InputStream authOutputStream = tokenOutPayload.getContent();
			DocumentBuilderFactory authfactory = DocumentBuilderFactory.newInstance();
			DocumentBuilder authbuilder = authfactory.newDocumentBuilder();
			Document authdocument = authbuilder.parse(authOutputStream);
			NodeList nlticket = authdocument.getElementsByTagName("ticket");
			String tokenTicket = "Empty";
			Node node = nlticket.item(0);
			if (node != null){
				node = node.getFirstChild();
				if (node != null){
					tokenTicket = node.getNodeValue();
				}
			}
			getTrace().addDebugMessage("***OTCS-Step5-TokenFromLookup:" + tokenTicket);
						
			//Parse input stream and get DataID from SAP
			DocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();
			DocumentBuilder dBuilder = dbFactory.newDocumentBuilder();
			Document doc = dBuilder.parse(in.getInputPayload().getInputStream());
			String DataID = doc.getElementsByTagName("DataID").item(0).getTextContent();
			getTrace().addDebugMessage("***OTCS-Step6-DataIDFromSAP: " + DataID);
			
			//Create HTTP headers for the REST call by setting DynamicConfiguration keys that can be used in the receiver channel
			DynamicConfiguration conf = in.getDynamicConfiguration();
			DynamicConfigurationKey keytokenTicket = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/REST","HeadertokenTicket");
			DynamicConfigurationKey keyDataID = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/REST","HeaderDataID");
			conf.put(keytokenTicket, tokenTicket);
			conf.put(keyDataID, DataID);

			String DummyPayload =  "DummyPayload";
			// Instantiating output stream to write at Target message
			OutputStream os = out.getOutputPayload().getOutputStream();
			// writing the dummy payload to the output stream
			os.write(DummyPayload.getBytes("UTF-8"));
			os.flush();
			os.close();
			getTrace().addDebugMessage("***OTCS-Request-JavaMapping-End");
		}
		catch (Exception e){
			getTrace().addDebugMessage(e.getMessage());
			throw new StreamTransformationException(e.getMessage());
		}

	}

Response Mapping with java mapping in the Attributes and Methods Section

public void transform(TransformationInput in, TransformationOutput out) throws StreamTransformationException {
		try 
		{
			getTrace().addDebugMessage("***OTCS-Respose-JavaMapping-Start");
			InputStream inputstream = in.getInputPayload().getInputStream();
			OutputStream outputstream = out.getOutputPayload().getOutputStream();
			//Copy the full input payload (document bytes) into memory; available() is not reliable for sizing the stream
			java.io.ByteArrayOutputStream buffer = new java.io.ByteArrayOutputStream();
			byte[] chunk = new byte[8192];
			int len;
			while ((len = inputstream.read(chunk)) != -1) {
				buffer.write(chunk, 0, len);
			}
			byte[] b = buffer.toByteArray();
			//Form Output xml
			String outputStart = "<?xml version=\"1.0\" encoding=\"UTF-8\"?><ns0:MT_DocContent_Res xmlns:ns0=\"urn://XXXXXXXXXXX.com/OTCS/DocDownload\"><Content>";
			String outputEnd = "</Content></ns0:MT_DocContent_Res>";
			outputstream.write(outputStart.getBytes("UTF-8"));
			outputstream.write(b);
			outputstream.write(outputEnd.getBytes("UTF-8"));
			outputstream.flush();
			outputstream.close();
			getTrace().addDebugMessage("***OTCS-Respose-JavaMapping-End");
		}

		catch (Exception e)
		{
			getTrace().addDebugMessage(e.getMessage());
			throw new StreamTransformationException(e.getMessage());

		}
	}

Channels

We have three channels in the flow: a proxy channel from SAP, a REST channel for the token lookup, and a REST channel for fetching the document from OTCS.

Proxy – CC_OTCS_GetDoc_Proxy_Sender

Rest lookup channel – CC_OTCS_Rest_LookUp

  • URL- http://{Server}/otcs/cs.exe/api/v1/auth
  • Rest Operation- POST
  • Data Format- UTF-8
  • HTTP Headers, as below

Content-Type multipart/form-data; boundary=SapPO75FormBoundaryhahahahahahahaEND

Rest document fetch channel– CC_OTCS_Rest_Receiver

  • URL- http://{Server}/otcs/cs.exe/api/v1/nodes/{ID}/content
  • Variable Substitution-
  • Rest Operation- GET
  • Data Format- UTF-8
  • HTTP Headers, as below

otcsticket {otcsticket}

ID {ID}

Integration of SAP CPI(BTP IS) with ChatGPT

Introduction

What is ChatGPT?

ChatGPT is a language model developed by OpenAI. It is designed to understand natural language, generate human-like responses to a wide range of questions, and carry out various language-related tasks. It has been trained on a large corpus of text data and can understand a broad range of topics. Its purpose is to assist users in their quest for knowledge and provide them with informative and useful responses.

What is CPI(BTP-IS)?

CPI(BTP-IS) is a set of services and tools provided by SAP on its cloud-based Business Technology Platform (BTP) to enable integration between different systems, applications, and data sources. The key benefit of CPI(BTP IS) is that it enables organizations to quickly and easily integrate their systems, data, and applications without the need for extensive coding or custom development. This helps to streamline business processes, reduce costs, and improve operational efficiency.

How can ChatGPT be integrated?

ChatGPT can be integrated using APIs (Application Programming Interfaces) provided by OpenAI. The API can be used to send text inputs to ChatGPT and receive responses in real time. Chatbot platform integration and custom integration using libraries and SDKs are also possible.

How does CPI integrate with ChatGPT?

CPI interacts with ChatGPT through HTTP requests, using API keys for authentication. ChatGPT has multiple APIs depending on the usage, such as the Models, Completions, Edits, Images, Create Image Variations, Embeddings, Files, Fine-tunes, and Moderations APIs.

Note: All the available ChatGPT APIs can be integrated with SAP CPI, but for demonstration we will use the Completions API.

Integration of SAP CPI(BTP IS) with ChatGPT

Prerequisites:

  1. SAP BTP IS Tenant Access
  2. ChatGPT openai platform Access
  3. Postman Tool for testing

Step-1: Create a new secret key on ChatGPT

After logging in to the account on https://platform.openai.com/account/api-keys, go to API Keys and click on Create new secret key.

Image source from my account on platform.openai.com

Step-2: Download the security certificate from respective ChatGPT API

For the demo we will use the API https://api.openai.com/v1/completions from the available ChatGPT APIs.

Open the API URL in any browser, click on the lock (HTTPS) icon in the address bar, and select Show Certificate.

Export the root certificate and save it to your local desktop.

Image source from my account on platform.openai.com

Step-3: Upload the root security certificate in the SAP BTP IS(CPI)=>KeyStore

Log in to the BTP account and go to your Integration Suite home page. From the left-side pane, select the Monitor artifacts option and open the Keystore page.

Click on the Add dropdown option in the top-right pane, select the Certificate option, and upload the security certificate already downloaded in Step-2.

Image source from my trial account on SAP BTP IS

Step-4: Create a new Integration Flow under the desired package with an appropriate naming convention.

From the Design artifacts option in the left-side pane, select or create a package and create a new IFLOW.

We need to create a scenario with an HTTPS sender adapter and an HTTP receiver adapter, with an intermediate call to the ChatGPT API.

Step-5: Configure a sender HTTPS adapter, as we will test it from the Postman tool.

As per the requirement, we can use any sender adapter that can provide the expected input text.

Image source from my trial account on SAP BTP IS

Provide the desired Address path which will be added to the deployed tenant URL.

Step-6: Configure the JSON to XML converter, as the input text from Postman will be provided as JSON.

As per the requirement, input in any format can be sent from the sender system and converted accordingly.

Image source from my trial account on SAP BTP IS

Remove the Namespace Mapping if selected.

In our case, only JSON to XML is used, as the input from Postman will be JSON, and for further processing of exchange properties we will use XML.

Step-7: Configure the Content Modifier to set the Message Header, Exchange Property and Message Body.

The ChatGPT HTTP API will be called with Authorization and Content-Type values in the message header.

So, accordingly, provide the header constants in the Content Modifier, with Content-Type as application/json.

The Authorization value will be Bearer, a space, and then the API key generated in Step-1.

Image source from my trial account on SAP BTP IS

The sample input from source will be in the below JSON format

{
    "prompt": "Input Text here"
}

In order to capture the input prompt value and prepare the complete request to the API, an exchange property named input will be configured, referring to the XPath of the JSON-converted XML field value.

Image source from my trial account on SAP BTP IS

The original API HTTP call will be expecting the actual source input with its respective parameters. Those parameters and source input will be formulated in the Message Body.

Image source from my trial account on SAP BTP IS

Each ChatGPT API call will be processed based on the model name provided. In our case we will provide the model text-davinci-003, which is the most capable GPT-3 model. It can do any task the other models can do, often with higher quality, longer output, and better instruction-following, and it supports inserting completions within text. It was trained on internet content available up to June 2021.
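
A minimal sketch of the resulting message body for the Completions call, assuming the exchange property input holds the prompt text; the max_tokens and temperature values are illustrative, not prescribed by the article:

{
    "model": "text-davinci-003",
    "prompt": "${property.input}",
    "max_tokens": 256,
    "temperature": 0.7
}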

Step-8: Configure a Request-Reply with an HTTP receiver adapter connection to the ChatGPT API.

Provide the mandatory field values like Address of the ChatGPT API, Method as POST, Authentication as Client Certificate and the Request/Response headers. The Request headers can be given as * or Authorization|Content-Type etc.

Image source from my trial account on SAP BTP IS

Step-9: Handling the API JSON response.

When any ChatGPT API is invoked with a JSON request, there will be a JSON response, which can be converted to XML to produce the desired field output.

Configure the JSON to XML converter to select the respective XML field values to be populated in the output.
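
For orientation, a (shortened, illustrative) Completions response has roughly this shape; after the JSON-to-XML conversion, the generated text can then be selected with an XPath such as //choices/text:

{
    "id": "cmpl-...",
    "object": "text_completion",
    "model": "text-davinci-003",
    "choices": [
        {
            "text": "Generated answer text",
            "index": 0,
            "finish_reason": "stop"
        }
    ]
}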

Image source from my trial account on SAP BTP IS

Step-10: Configure the Content Modifier for converted XML payload.

Create an output property referring to the XML text field, which alone will be populated as the final output.

Image source from my trial account on SAP BTP IS

Modify the Message body also with the selected property value.

Image source from my trial account on SAP BTP IS

The final integration flow will look like this:

Image source from my trial account on SAP BTP IS

Once the IFLOW is saved and deployed, go to the Managed Integration Content and get the endpoint, which will be used in the source system to trigger a message to CPI.

Image source from my trial account on SAP BTP IS

In order to view the traces in the CPI message processing monitor, select the log level Trace instead of the default option, Info.

Image source from my trial account on SAP BTP IS

Testing in POSTMAN:

Image source from local installed Postman

Provide the CPI tenant endpoint URL for the created IFLOW along with the BTP account credentials in the Postman Authorization section.

For the sample input provided, we have received the corresponding response from the ChatGPT API.

Monitoring the IFLOW from CPI tenant with the trace enabled.

Image source from my trial account on SAP BTP IS
Image source from my trial account on SAP BTP IS
SAP CPI – End-to-end VAT integration using OAuth 2.0 with client credentials grant to generate access token authentication
When we are doing integration between cloud-to-cloud or cloud-to-on-premise systems, we have multiple types of authentication to access the APIs:

  • User Credentials:

User Id and Password

  • OAuth 2.0 Credentials:

Client secure URL, Client ID, Client Secret and Scope

  • OAuth 2.0 Authorization Code:

Authorization URL, Token URL, Client ID, Client Secret, User Name and Password

  • Client Certificate:

Client certificate

In this blog post I am going to explain the end-to-end VAT return outbound interface integration between SAP and the HMRC UK government portal, using OAuth 2.0 with the client credentials grant to generate an access token.

Integration Flow Chart

Step-1 : OAuth 2.0 Access Token Configuration

The following diagram illustrates the process to get the access token.

  • Logon to your SAP Cloud Platform Integration web application.
  • Navigate to the Monitor option; there we can see the overview.
  • Then go to Manage Security > Security Material.

Here choose Create > OAuth2 Authorization Code.

Then we need to maintain the Authorization URL, Token Service URL, Client ID, Client Secret, User Name, and the scope of your interface (for the VAT return interface the scope is write:vat).

Click on Deploy > Authorize; this will redirect to the Callback URL. Enter the credentials; once entered, it will show as below. Then click on Grant Authority.

The OAuth 2.0 access token configuration is now successfully deployed; it will be accessed later in the VAT return interface IFlow.

Step-2 End-to-End VAT return Interface IFlow

  • Navigate to Design > Package > Artifacts Tab
  • Choose Add > Integration Flow

Step-3

From the sender, drag the connecting arrow to the Start action, and from the Adapter Type box select SOAP/HTTPS.

SOAP: for WSDL-based integrations we can go for the SOAP adapter.

HTTPS: for JSON-based integrations we can go for the HTTPS adapter; it also works for XML.

  • In Connection tab, enter the Address or URL details
  • Enter /HMRC/VAT/returns. Optionally, you can enter any value of your choice, but ensure that you use the “/” symbol before specifying the endpoint name

Step-4 Groovy Script

The Groovy script contains the functionality to fetch an access token from the OAuth2.0 Authorization Code credential which we configured in Step-1 (Security Material).

import com.sap.gateway.ip.core.customdev.util.Message;
import com.sap.it.api.securestore.SecureStoreService;
import com.sap.it.api.securestore.AccessTokenAndUser;
import com.sap.it.api.securestore.exception.SecureStoreException;
import com.sap.it.api.ITApiFactory;

def Message processData(Message message) {

    SecureStoreService secureStoreService = ITApiFactory.getService(SecureStoreService.class, null);

    // Fetch the access token of the OAuth2 Authorization Code credential named "OAuth2.0"
    AccessTokenAndUser accessTokenAndUser = secureStoreService.getAccesTokenForOauth2AuthorizationCodeCredential("OAuth2.0");
    String token = accessTokenAndUser.getAccessToken();

    // Pass the token to the receiver call in the HTTP Authorization header
    message.setHeader("Authorization", "Bearer " + token);

    return message;
}
  • By calling the method getAccesTokenForOauth2AuthorizationCodeCredential("OAuth2.0"), you fetch the access token of the OAuth2 Authorization Code credential with the name "OAuth2.0".
  • Once we get the token from this method, we need to pass the same access token at the header level using the Groovy syntax below.

message.setHeader("Authorization", "Bearer " + token);

Step-5 Content Modifier

Using the Content Modifier, we pass the Accept and Content-Type parameters in the HTTP request header, as below.
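
As a sketch, and assuming the standard HMRC API media-type conventions (please verify the exact values against the HMRC API documentation), the two header entries could look like:

Accept: application/vnd.hmrc.1.0+json
Content-Type: application/json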

Step-6 Request-Reply

Connect the Request Reply to Receiver by dragging the arrow icon on Request Reply to the Receiver

Select the Connection tab. In the Address field, enter the target endpoint/URL (https://hostname/org/vrn/return).

Step-7

  • Click Save to persist the changes to the integration flow.
  • Click Deploy to deploy the integration flow.
  • Navigate to the Monitor view.
  • Under the Manage Integration Content section, choose Start to access all the started artifacts that we have deployed.
  • Select the integration flow > Endpoint tab; there you can see the REST API URL for the integration flow.
  • This URL can be used to invoke the integration flow as a REST API from any REST client, like Postman.

Step-8 Testing The Integration Flow Using POSTMAN

Step-9 Trace and Monitoring in SAP CPI

Navigate to the Monitor > Manage Integration Content > Select the Iflow > Monitor Message Processing

  • Now we can see the message status as Completed.
  • If you want to trace the IFlow logs step by step, click on the Trace option.
  • Then we can see the step-by-step process logs with content.
SAP CPI – Encryption and Decryption using PGP keys
When we are doing HR or bank-related integrations, the main factor is keeping the data secure, so one way is to encrypt the data using PGP keys.

In this tutorial I am going to explore

  1. How to generate PGP Keys using Kleopatra Software
  2. How to Encrypt data using PGP Public key in SAP CPI
  3. How to Decrypt the data using PGP Private Key in SAP CPI

Generating PGP Keys using Kleopatra Software

Open the Kleopatra application > File > New Key Pair

Provide the Key Pair Name > Advanced Settings > Select Validity

Enter the Passphrase

Click on Finish

The key pair is successfully generated.

Now we need to export the PGP Public and Private keys.

Exporting PGP Public Key

Save as FileName in .pgp format

Exporting PGP Private Key

Finally, we have generated the PGP public and private keys

We are going to use these keys in the SAP CPI IFlow to encrypt and decrypt the content.

PGP Encryption

In this case we need to get the PGP public key from the non-SAP/third-party application team.

CPI Flow Chart

Step-1

Importing the PGP Public Key in Security Material

Step -2 Navigate to Design > Select Package > Artifact tab > Add Iflow

From the sender, drag the connecting arrow to the Start action, and from the Adapter Type box select HTTPS.

  • In Connection tab, enter the Address or URL details
  • Enter /PGPEncryption. Optionally, you can enter any value of your choice, but ensure that you use the “/” symbol before specifying the endpoint name

Groovy script for the payload logging

import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;
def Message processData(Message message) {
    def body = message.getBody(java.lang.String) as String;
    def messageLog = messageLogFactory.getMessageLog(message);
    if(messageLog != null){
        messageLog.setStringProperty("Logging#1", "Printing Payload As Attachment")
        messageLog.addAttachmentAsString("ResponsePayload:", body, "text/plain");
     }
    return message;
}

Step-3 PGP Encryptor

Here we are using the PGP Encryptor step from the palette to encrypt the incoming data.

Drag and drop the PGP Encryptor function from the Security tab into the IFlow space. In the Processing tab, we can use the dropdown values to adjust the algorithm, key length, compression, etc., but we MUST specify the UID (User ID) of the public/private key pair to be used for encryption.

Save and deploy the Iflow

Step-4 Testing the Integration Flow Using Postman

PGP Decryption

In this case we need to generate the PGP keys (public and private); the public key will be shared with the third-party application team to encrypt the data.

CPI Flow Chart

Step-1

Importing the PGP Private Key in Security Material

Step-2

Here we are using the PGP Decryptor step from the palette to decrypt the incoming content.

In the Processing tab, specify the UID (User ID) of the public/private key pair to be used for decryption.

Step-3 Testing the Integration flow using Postman

SAP Password reset tool using Azure Logic App, SAP PO/Integration Suite and ABAP
Introduction

Recently while working on Azure Logic App, I felt we can make use of Office 365 email connector to automate a few manual processes.

I thought, why not create a password reset tool?

So, I designed a Logic App that picks up email from a specific folder (Outlook) and passes the information to SAP ECC6 via SAP PO; finally, an ABAP program resets/unlocks the user password, and the messages are returned to the sender by the Logic App.

Implementation

Develop a Logic App in Azure that is connected with an Outlook account (in the real world, the email must be a service account) and looks for new emails.

Outlook connector configuration
  • Emails are looked at under the Password Reset folder every 10 seconds, so any new email that comes in will be immediately picked up.

Make sure the email body is in a specific format (end users need to be trained on the email body format); otherwise, parsing at the Logic App will fail and the correct format will be returned to the sender.

Default email format (JSON)

{
    "User": {
        "Client": "230",
        "UNAME": "MOHAMM4A",
        "email": "",
        "action": "unlock"
    }
}
Email body parsing

Whenever an email trigger invokes, it will have a unique Message-Id. Using the Message-Id, the reply is sent back to the same email sender.

As soon as the parsing completes, a proper JSON structure request will be created, and it will hit the SAP PO REST adapter endpoint using the HTTP connector in the Logic App.

HTTP connector

SAP PO or Integration Suite configuration:

In our DEV SAP ECC6 we have 2 clients (100 and 230), so the end user needs to specify the proper client in the email body. Once the payload is received in SAP PO, it will be converted to XML by the REST adapter.

1. Develop ID objects in SAP PO with one REST sender adapter and 2 SOAP adapters(XI protocol) for two clients.

SAP PO ID objects

2. In the ICO, I have added conditions in the receiver determination so that based on the client its business system can be invoked for a password reset.

ICO – Receiver determination

3. Create ESR objects, thus the structure appears in SPROXY for ABAP coding.

ESR objects

4. Mapping in ESR is straightforward (one-to-one mapping).

Message Mapping

We are done with the SAP PO configuration.

If anyone is using SAP Integration suite, we can completely ignore Azure Logic App. It is a known thing that not everyone will be using Azure in their organization, hence we will be performing all the operations in SAP.

SAP Integration Suite IFlow

In my case I have used the XI adapter. The cloud connector was already in place, so I directly hit my ECC system.

I reused the same SAP PO generated Service Interface in ECC.

It is also possible to connect via SOAMANAGER.

IFlow can be found at my Git Repository.

Password reset IFlow – GitHub

Sender MAIL adapter is connected to my email account via IMAP and polls from the folder CPI at an interval of 10 seconds.

sender mail adapter

Once we have an email in the folder, it will be picked up and processed, and the email will be changed from unread to read.

To send the sender’s email ID along with the request payload for unlocking/resetting, I’ve extracted the adapter header parameter (From) using a Groovy script, as sketched below, and passed it to ECC by forming a payload in the Content Modifier.
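
A minimal sketch of such a script, assuming the mail sender adapter exposes the sender address as a message header named From (the property name senderEmail is illustrative):

import com.sap.gateway.ip.core.customdev.util.Message;

def Message processData(Message message) {
    // Read the "From" header set by the mail sender adapter (assumed header name)
    def from = message.getHeaders().get("From");
    // Expose it as an exchange property so the Content Modifier can place it in the payload
    message.setProperty("senderEmail", from);
    return message;
}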

Finally, the response from ECC must be returned to the sender via receiver email adapter.

Receiver email configuration

SAP ABAP development:

1. Generate proxy for the service provider in t-code SPROXY.

Service Provider Proxy

2. Once we generate a proxy, an ABAP class will be generated.

Auto-generated ABAP class

3. We can either code in the same method or create a separate class where all the operations can be performed. I have created a separate class (SE24) and did all the validations and the password reset/unlock operations.

4. The most important thing is: how do we get the user details from SAP? To do this we can make use of the standard BAPI BAPI_USER_GET_DETAIL, where we pass the user ID (UNAME from the email) as an exporting parameter and get back the user address details and SNC (Secure Network Communication) data.

BAPI_USER_GET_DETAIL

ls_address-e_mail will be holding the email address of the user in SAP.

The returned email (from the BAPI) will be validated against the sender’s email address (coming from the Logic App or SAP Integration Suite).

If the email ID in SAP doesn’t match the email sender’s email ID, then a reply will be sent back to the sender.

Otherwise, based on the action (UNLOCK/RESET), the program will either reset or unlock the user’s password.

5. The user details can be seen in transaction code SU01.

6. I have added a few validations within the ABAP code before resetting/unlocking the user.

There would be situations when the BASIS team locks all the users during system maintenance; during such situations, users shouldn’t be allowed to unlock. (If the status of usr02-uflag is 32 or 66, the user is locked by the administrator.)

A person who leaves the organization should not be allowed to reset/unlock. (usr02-class holds the user class.)

The email sender needs to specify the correct user ID in the email body.

7. After validating and getting details from SAP, we can go ahead and reset/unlock users based on their choice.

8. Finally, using standard BAPI: BAPI_USER_UNLOCK user can be unlocked.

BAPI_USER_UNLOCK – To unlock user

9. If the choice is to reset, we can use BAPI: BAPI_USER_CHANGE.

I have used the FM GENERAL_GET_RANDOM_STRING to generate a random 8-character password and concatenated “@1” to it to make it strong.

By passing the new string and username into the FM BAPI_USER_CHANGE, the password will be reset, and the response will be returned back to SAP PO –> Logic App –> email.

We are done with the ABAP development.

Note: Validations are based on my own understanding post discussing with the security team, you can add as many validations you want. Also, If you have any suggestions regarding validations please write in the comment section.

Now, it’s time to test our tool.

Test 1: A user tries to reset his password

A new password is generated and received as an email reply.

Test 2: If the user tries to unlock.

User was not locked in SAP, so a proper response was returned back to the sender

Test 3: Few more validations.

Invalid action and invalid email validations

Test 4: In case the email body is not correct, the default email template will be returned as an email response.

Invalid email body

Monitoring:

Azure Logic App

Logic App
Invalid JSON
HTTP connector

SAP PO:

SAP PO Message Monitor

SAP ECC6 Logs:

For audit log purposes, I have added logs in ABAP SLG1 for each password reset request.

Logs in ECC can be checked in the SLG1 transaction code.

SLG1 log

SAP Integration Suite:

SAP IS Message Log
Email picked and responded by SAP Integration Suite
Salesforce integration with SAP PO REST Adapter using SFDC Bulk API Concept
This blog shows how to connect with SFDC via the REST adapter using the SFDC Bulk API concept (without BPM).

When to use Bulk API?

Bulk API is based on REST and is optimized for loading or deleting large sets of data. We can query, insert, update, upsert, or delete many records asynchronously by submitting batches. Salesforce processes batches in the background. Bulk API is designed to make it simple to process data from a few thousand to millions of records.

Objective:

In Salesforce there are multiple standard SFDC objects, for example Contact and Account. The objective of this blog is to create a generic solution to integrate with SFDC Bulk API 1.0 through SAP PO. This generic solution can be used to call different SFDC objects; we only need to change the structure corresponding to the SFDC object. We can use this approach to call any SFDC object (standard/custom) using the Bulk API concept.

Record processing through Bulk API Calls: Steps are defined below:

  1. Create job
  2. Add batch to job
  3. Close job to start processing
  4. Retrieve Batch Result

Firstly, we will understand the concept using Postman. Let’s see how to send the required requests using the Postman tool. Please note that in this example I am going to use:

SFDC custom object: LOM_Tender__c (It’s a custom SFDC Object)

externalIdFieldName: Reference__c

Operation: upsert

1) Create a job: Create a job by sending a POST request to this URI. The request body identifies the type of object processed in all associated batches.

URI : /services/async/APIversion/job

Example XML request body:

<?xml version="1.0" encoding="utf-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
    <operation>upsert</operation>
    <object>LOM_Tender__c</object>
    <externalIdFieldName>Reference__c</externalIdFieldName>
    <contentType>XML</contentType>
</jobInfo>

In the request XML, the field <object> is the SFDC object name.

externalIdFieldName is nothing but the primary field in the SFDC object.

HTTP Header:

X-SFDC-Session: the access token, which needs to be retrieved first using the Login API. There are multiple blogs available on how to get the session ID using a SOAP call to SFDC.

Content-Type: application/xml; charset=UTF-8

In the response, a job ID is returned that will be passed in the URL of subsequent calls. In this case the job ID is “75077000000A8qZAAS”.

<?xml version="1.0" encoding="UTF-8"?>
<jobInfo
   xmlns="http://www.force.com/2009/06/asyncapi/dataload">
    <id>75077000000A8qZAAS</id>
    <operation>upsert</operation>
    <object>LOM_Tender__c</object>
    <createdById>005G0000002kHpQIAU</createdById>
    <createdDate>2021-12-30T05:05:31.000Z</createdDate>
    <systemModstamp>2021-12-30T05:05:31.000Z</systemModstamp>
    <state>Open</state>
    <externalIdFieldName>Reference__c</externalIdFieldName>
    <concurrencyMode>Parallel</concurrencyMode>
    <contentType>XML</contentType>
    <numberBatchesQueued>0</numberBatchesQueued>
    <numberBatchesInProgress>0</numberBatchesInProgress>
    <numberBatchesCompleted>0</numberBatchesCompleted>
    <numberBatchesFailed>0</numberBatchesFailed>
    <numberBatchesTotal>0</numberBatchesTotal>
    <numberRecordsProcessed>0</numberRecordsProcessed>
    <numberRetries>0</numberRetries>
    <apiVersion>47.0</apiVersion>
    <numberRecordsFailed>0</numberRecordsFailed>
    <totalProcessingTime>0</totalProcessingTime>
    <apiActiveProcessingTime>0</apiActiveProcessingTime>
    <apexProcessingTime>0</apexProcessingTime>
</jobInfo>

2) Add batch to job:

Add a new batch to a job by sending a POST request to this URI. The request body contains a list of records for processing. We need to pass the job ID, received in the previous call, in the URL.

URI: /services/async/APIversion/job/jobid/batch

In the body, the actual data is transferred to SFDC.

Header Parameters are the same as the previous request.

Request XML: In this step we upload the actual business data that needs to be processed. We can upload multiple records by repeating this step. The maximum number of records per batch is 10,000 with the Bulk API concept.

<?xml version="1.0" encoding="UTF-8"?>
<ns1:sObjects xmlns:ns1="http://www.force.com/2009/06/asyncapi/dataload">
    <ns1:sObject>
        <ns1:Store_Count__c>527</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 2736.55</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-22T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>Cash                </ns1:Card_Type__c>
        <ns1:Reference__c>31</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2202</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
    <ns1:sObject>
        <ns1:Store_Count__c>48</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 1685.03</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-22T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>Master Card         </ns1:Card_Type__c>
        <ns1:Reference__c>332</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2102</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
</ns1:sObjects>

Here the payload would differ based on the SFDC object name.

In the response, we will receive a job ID and a batch ID.

3) Close Job: Close a job by sending a POST request to this URI. Once the records are uploaded, we must close the job so that record processing is started. This API request also informs Salesforce that no more batches will be submitted. To close the job, the below details are required.

URI : /services/async/APIversion/job/jobId

Header Parameters are the same as the previous request.

Request Payload:

<?xml version="1.0" encoding="UTF-8"?>
<ns0:jobInfo xmlns:ns0="http://www.force.com/2009/06/asyncapi/dataload">
    <ns0:state>Closed</ns0:state>
</ns0:jobInfo>

Response Payload:

<?xml version="1.0" encoding="UTF-8"?>
<jobInfo
   xmlns="http://www.force.com/2009/06/asyncapi/dataload">
    <id>75077000000A8qZAAS</id>
    <operation>upsert</operation>
    <object>LOM_Tender__c</object>
    <createdById>005G0000002kHpQIAU</createdById>
    <createdDate>2021-12-30T05:05:31.000Z</createdDate>
    <systemModstamp>2021-12-30T05:05:31.000Z</systemModstamp>
    <state>Closed</state>
    <externalIdFieldName>Reference__c</externalIdFieldName>
    <concurrencyMode>Parallel</concurrencyMode>
    <contentType>XML</contentType>
    <numberBatchesQueued>0</numberBatchesQueued>
    <numberBatchesInProgress>0</numberBatchesInProgress>
    <numberBatchesCompleted>1</numberBatchesCompleted>
    <numberBatchesFailed>0</numberBatchesFailed>
    <numberBatchesTotal>1</numberBatchesTotal>
    <numberRecordsProcessed>2</numberRecordsProcessed>
    <numberRetries>0</numberRetries>
    <apiVersion>47.0</apiVersion>
    <numberRecordsFailed>0</numberRecordsFailed>
    <totalProcessingTime>157</totalProcessingTime>
    <apiActiveProcessingTime>80</apiActiveProcessingTime>
    <apexProcessingTime>0</apexProcessingTime>
</jobInfo>

4) Retrieve Batch Results: When a batch is completed, you must retrieve the batch result to see the status of individual records.

HTTP Method: Get

URL: https://instance.salesforce.com/services/async/53.0/job/jobId/batch/batchId/result

The job ID and batch ID are part of the URL.

jobId is the job ID that was returned when you created the job; batchId is the batch ID that was returned when you added a batch to the job. Here in this example, the batch ID is 75177000000BPU8AAO.

In the response, if the XML field <success> is true, the record was successfully processed on the SFDC side. In case of an error, the value will be false.
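
For orientation, a batch result in the Bulk API 1.0 XML format looks roughly like the sketch below (IDs and error details are illustrative); note that the results carry no external ID, only their position in the batch:

<?xml version="1.0" encoding="UTF-8"?>
<results xmlns="http://www.force.com/2009/06/asyncapi/dataload">
    <result>
        <id>a0B770000012abcEAA</id>
        <success>true</success>
        <created>true</created>
    </result>
    <result>
        <success>false</success>
        <created>false</created>
        <errors>
            <message>Required fields are missing</message>
            <statusCode>REQUIRED_FIELD_MISSING</statusCode>
        </errors>
    </result>
</results>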

Now we will see how to design this concept in SAP PO using a REST adapter.

High-level Diagram: This is a high-level design that will be implemented in SAP PO. In this requirement, SFDC sends back the response to Sender to let it know the status of each record.

In SAP PO, we will implement this scenario using 4 different ICOs, which will be called one after another using an async-sync bridge (using the RequestResponseBean and ResponseOnewayBean modules). Let’s move on to the interface design in SAP PO.

1. First ICO: System A (Sender) to SFDC:

This first ICO will do the below tasks:

  • Firstly, it will get data from Sender.
  • Lookup to login API to get Session ID which needs to be transferred in subsequent calls to SFDC as part of HTTP header.
  • Create Target SFDC Structure
  • Create a job again via Lookup to SFDC. (Create Job API call)
  • Will send business data to SFDC which is received from the Sender side. (Add batch to job API call)
  • Receive Data from Sender: Firstly we receive data from Sender which is actual business data via any method (JDBC/SOAP/REST). Data can be fetched via any transport protocol.
  • Retrieve Session ID: Once it is received in SAP PO, we will first call Login API via SOAP lookup. There are multiple blogs available already on how to call Login API via lookup.

Operation Mapping: here in the operation mapping we have one message mapping and then a Java mapping to be executed.

Message Mapping: Here we create a Target SFDC structure based on the SFDC object.

On the root element sObjects, we have a SOAP lookup.

SOAP lookup java code to retrieve Session ID:
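
// Note: this is a UDF body. UserName and Pass are assumed to be input parameters
// of the UDF signature, and Channel/LookupService/SystemAccessor/Payload come from
// the com.sap.aii.mapping.lookup API available to mappings.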

AbstractTrace trace = container.getTrace();
String sessionId = "";
String serverUrl = "";
try {
	//instance the channel to invoke the service.
	Channel channel = LookupService.getChannel("SFDC","CC_SFDC_SOAP_Login_Rcv");
	SystemAccessor accessor = null;
	accessor = LookupService.getSystemAccessor(channel);
	// The Request message in XML. THIS IS THE LOOKUP SERVICE
	
	String loginxml = "<login xmlns=\"urn:enterprise.soap.sforce.com\"> <username>"
	+ UserName
	+ "</username> <password>"
	+ Pass
	+ "</password> </login>";
	
	InputStream inputStream = new ByteArrayInputStream(loginxml.getBytes());
	Payload payload = LookupService.getXmlPayload(inputStream);
	Payload SOAPOutPayload = null;
	
	//The response will be a Payload. Parse this to get the response field out.
	SOAPOutPayload = accessor.call(payload);

	/* Parse the SOAPPayload to get the SOAP Response back */ 
	InputStream inp = SOAPOutPayload.getContent();

	/* Create DOM structure from input XML */  
	DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
	DocumentBuilder builder = factory.newDocumentBuilder();
	Document document = builder.parse(inp);
	//Node node=null;
	NodeList list = document.getElementsByTagName("serverUrl");
	NodeList list2 = document.getElementsByTagName("sessionId");

	//NodeList list = document.getChildNodes();
	int len=list.getLength();
	//String sessionId="";
	for(int i=0;i<len;i++)
	{
		Node     node = list.item(i);
		Node node2=list2.item(i);
		if (node != null && node2 != null)
		{

			node = node.getFirstChild();
			node2= node2.getFirstChild();
			if (node != null && node2 != null)
			{
				serverUrl = node.getNodeValue();
				sessionId = node2.getNodeValue();
			}
		}
	}
}

catch(Exception e)
{e.printStackTrace();}
return (sessionId) ;

Once the session ID is received, set it to one of the header attributes using Dynamic Configuration:

UDF Code to set attribute “sessionID”:

String NameSpace = "http://sap.com/xi/XI/System/REST" ;
DynamicConfiguration conf = (DynamicConfiguration) container.getTransformationParameters().get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
DynamicConfigurationKey key  = DynamicConfigurationKey.create( NameSpace, "sessionID");
conf.put(key,sessionID); 
return sessionID;

Now the session ID is set to a Dynamic Configuration attribute.

  • Create Target Structure:

With a simple message mapping we can pass the source data to the target. However, in the target structure, if you compare it with the structure shown above in the 2nd API call (Add batch to job), we created one extra segment named “AdditionalSegment” that is actually not part of the SFDC object.

Question: Why is “AdditionalSegment” required?

Answer: In this requirement we are supposed to send the status of each record to the sender system, i.e. whether that particular record was successfully processed in SFDC or not.

However, when we make the last call to SFDC (the Retrieve Batch Results API), there is no primary field returned by SFDC based on which SAP PO could update the status on the sender side.

The primary field on the sender side is STG_TENDER_SEQ. This is mapped to Reference__c on the SFDC side, which is the externalIdFieldName (the primary field in the SFDC object).

As per the above screenshot, the status of each record (<success>) is present in the API response. The success field is a flag with the possible values true and false; it tells us whether a record was successfully processed or not. However, there is no primary field or external ID field returned with each record to know which result belongs to which primary-field record.

As per Salesforce documentation:

Results are returned in the same order as the records in the batch request. It’s important to track the record number so that SAP PO can identify the associated successful/failed record in the batch request. If the success field is false, the row wasn’t processed successfully; otherwise, the record was processed successfully.

Reference link :

https://developer.salesforce.com/docs/atlas.en-us.236.0.api_asynch.meta/api_asynch/asynch_api_batches_failed_records.htm

So to solve this, we can create a field with the name “PrimaryField” in which we keep track of the primary field values, separated by a delimiter.

Value of “PrimaryField“ would be transferred to the next ICO using GetPayloadValueBean & PutPayloadValueBean modules.

Once the value is transferred, we will remove this extra segment before passing it to SFDC using the Custom module.

Before Mapping Payload: (comes from Sender)

After Mapping Payload:

<?xml version="1.0" encoding="UTF-8"?>
<ns1:sObjects xmlns:ns1="http://www.force.com/2009/06/asyncapi/dataload">
    <ns1:sObject>
        <ns1:Store_Count__c>7</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 20</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-22T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>myHusky Rewards     </ns1:Card_Type__c>
        <ns1:Reference__c>41</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2202</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
    <ns1:sObject>
        <ns1:Store_Count__c>731</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 3614.77</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-26T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>Cash                </ns1:Card_Type__c>
        <ns1:Reference__c>141</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2202</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
    <ns1:sObject>
        <ns1:Store_Count__c>57</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 2022.9</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-26T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>Master Card         </ns1:Card_Type__c>
        <ns1:Reference__c>142</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2202</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
    <ns1:sObject>
        <ns1:Store_Count__c>108</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 4249.31</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-26T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>Visa                </ns1:Card_Type__c>
        <ns1:Reference__c>143</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2202</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
    <ns1:sObject>
        <ns1:Store_Count__c>9</ns1:Store_Count__c>
        <ns1:Store_Trans_Amount__c> 514.52</ns1:Store_Trans_Amount__c>
        <ns1:Date__c>2013-06-26T00:00:00.000Z</ns1:Date__c>
        <ns1:Card_Type__c>American Express    </ns1:Card_Type__c>
        <ns1:Reference__c>144</ns1:Reference__c>
        <ns1:Retail_Location__r>
            <ns1:sObject>
                <ns1:Location_Number__c>2202</ns1:Location_Number__c>
            </ns1:sObject>
        </ns1:Retail_Location__r>
    </ns1:sObject>
    <ns1:AdditionalSegment>
        <ns1:PrimaryField>41,141,142,143,144</ns1:PrimaryField>
    </ns1:AdditionalSegment>
</ns1:sObjects>

As the after-mapping payload above shows, PrimaryField contains the primary field (Reference__c) values separated by commas.
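The blog does not show the mapping side that fills this field, but a UDF along these lines could build the list (a minimal sketch; buildPrimaryField is a hypothetical name for a queue/context UDF that receives all Reference__c values):

public void buildPrimaryField(String[] references, ResultList result, Container container)
		throws StreamTransformationException {
	// Concatenate all incoming Reference__c values into one
	// comma-separated string for AdditionalSegment/PrimaryField
	StringBuilder sb = new StringBuilder();
	for (int i = 0; i < references.length; i++) {
		if (i > 0) {
			sb.append(",");
		}
		sb.append(references[i]);
	}
	result.addValue(sb.toString());
}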

  • Create a job via Lookup to SFDC and capture JobID. (Create Job API call)

In this step, we call Create Job API to SFDC.

Below is java mapping which is used to call Create Job API:

The Java mapping reads parameters from the Integration Directory (ICO parameters) as below.

It calls a lookup channel which makes the SFDC call. Once the response from SFDC (Create Job API response) is received, it extracts the JobID and puts the value into the ASMA attribute jobID to pass it on to subsequent calls.

Java Class: SFDCBulkAPI_CreateJobLookup_Java

import java.io.*;
import javax.xml.parsers.*;
import org.w3c.dom.*;
import com.sap.aii.mapping.api.*;
import com.sap.aii.mapping.lookup.Channel;
import com.sap.aii.mapping.lookup.LookupService;
import com.sap.aii.mapping.lookup.Payload;
import com.sap.aii.mapping.lookup.SystemAccessor;

public class SFDCBulkAPI_CreateJobLookup_Java extends AbstractTransformation  {

	public void transform(TransformationInput InputObj, TransformationOutput OutputObj)  
	{
		try{
			
		// Reading Session ID
		DynamicConfiguration conf= InputObj.getDynamicConfiguration();
		String NameSpace = "http://sap.com/xi/XI/System/REST" ;
		DynamicConfigurationKey key_sessionID  = DynamicConfigurationKey.create( NameSpace, "sessionID");
		String sessionID = conf.get(key_sessionID);
		
		DynamicConfigurationKey key_jobID  = DynamicConfigurationKey.create( NameSpace, "jobID");
		
		InputStream in = InputObj.getInputPayload().getInputStream();
		
		//Reading ICO Parameters
		String operation = InputObj.getInputParameters().getString("operation");
		String object = InputObj.getInputParameters().getString("object");
		String externalIdFieldName = InputObj.getInputParameters().getString("externalIdFieldName");
		
		OutputStream out = OutputObj.getOutputPayload().getOutputStream();
		execute(in , out, operation, object, externalIdFieldName, conf, key_jobID, sessionID);
		}
		catch(Exception e)
		{
			System.out.println(e);
		}
	}
	void execute(InputStream in , OutputStream out, String operation, String object, String externalIdFieldName, DynamicConfiguration conf, DynamicConfigurationKey key_jobID,  String sessionID)
	{
		try{
		
		// Create target request payload
		String requestPayload = "<?xml version=\"1.0\" encoding=\"utf-8\"?><jobInfo xmlns=\"http://www.force.com/2009/06/asyncapi/dataload\"><operation>" 
		+ operation + "</operation><object>" + object + "</object><externalIdFieldName>" + externalIdFieldName + "</externalIdFieldName>"
		+ "<contentType>XML</contentType><sessionID>" + sessionID + "</sessionID></jobInfo>";
		
		String jobID = "";
		  //perform the rest look up
		  Channel channel = LookupService.getChannel("SFDC","CC_SFDC_REST_BulkAPI_CreateJob_Rcv");
		  SystemAccessor accessor = null;
		  accessor = LookupService.getSystemAccessor(channel);
		  
		  InputStream inputStream = new ByteArrayInputStream(requestPayload.getBytes());
		  Payload payload = LookupService.getXmlPayload(inputStream);
		  Payload SOAPOutPayload = null;
		 SOAPOutPayload = accessor.call(payload);
		  
		 // Retrieve Job ID using DOM parser
		  InputStream inp = SOAPOutPayload.getContent();
		 DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
		  DocumentBuilder builder = factory.newDocumentBuilder();
		  Document document = builder.parse(inp);
		  NodeList nl_jobID = document.getElementsByTagName("id");
		  Node node = nl_jobID.item(0);
		if (node != null)
		  {
		  node = node.getFirstChild();
		  if (node != null)
		  {
			 jobID = node.getNodeValue();
		  }
		 
		 // Setting Dynamic attribute with Job ID
		conf.put(key_jobID, jobID);
		  }
		 copypayload(in, out);
		}
		catch (Exception e) 
		{
		  e.printStackTrace();
		}
		
	}
	// Copy the input payload unchanged to the output
	void copypayload(InputStream in , OutputStream out )
	{
		try{
			// Read in chunks; in.available() does not guarantee the full stream length
			byte[] buffer = new byte[8192];
			int read;
			while ((read = in.read(buffer)) != -1)
			{
				out.write(buffer, 0, read);
			}
		}
		catch(Exception e)
		{
			e.printStackTrace();
		}
	}
}

Please note that in the above Java mapping, the payload structure which is created has one extra field named "sessionID", as shown below. We will see in the next step why it is created.

Output structure from the Java mapping:

<?xml version="1.0" encoding="utf-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
    <operation>upsert</operation>
    <object>LOM_Tender__c</object>
    <externalIdFieldName>Reference__c</externalIdFieldName>
    <contentType>XML</contentType>
    <sessionID>00D770000008aLl!ARkAQBvdVhGVxVT_G5ZK6yyfs1404jmSzfh8HMyo5MOtP</sessionID>
</jobInfo>

After this Java mapping, there is one custom module named SetSFDCSessionIDHeader.

Question: Why is the custom module SetSFDCSessionIDHeader required?

Answer: This module gets the sessionId value from the payload (created above in the Java mapping) and sets it as an HTTP header while doing the lookup to SFDC.

While doing the lookup, I didn't find any way to pass the ASMA attribute (sessionID) as an HTTP header to the lookup channel. To solve this, the sessionID field created in the Java mapping is passed to the custom module. There, its value is used to create the ASMA attribute, which the lookup channel then uses to set the HTTP header.

Module : SetSFDCSessionIDHeader

import java.io.*;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import javax.ejb.Local;
import javax.ejb.LocalHome;
import javax.ejb.Remote;
import javax.ejb.RemoteHome;
import javax.ejb.Stateless;
import com.sap.aii.af.lib.mp.module.Module;
import com.sap.aii.af.lib.mp.module.ModuleContext;
import com.sap.aii.af.lib.mp.module.ModuleData;
import com.sap.aii.af.lib.mp.module.ModuleException;
import com.sap.engine.interfaces.messaging.api.Message;
import com.sap.engine.interfaces.messaging.api.MessageKey;
import com.sap.engine.interfaces.messaging.api.MessagePropertyKey;
import com.sap.engine.interfaces.messaging.api.Payload;
import com.sap.engine.interfaces.messaging.api.PublicAPIAccessFactory;
import com.sap.engine.interfaces.messaging.api.XMLPayload;
import com.sap.engine.interfaces.messaging.api.auditlog.AuditAccess;
import com.sap.engine.interfaces.messaging.api.auditlog.AuditLogStatus;

@Stateless(name="SetSFDCSessionIDHeaderModule")
@LocalHome(value=com.sap.aii.af.lib.mp.module.ModuleLocalHome.class)
@Local(com.sap.aii.af.lib.mp.module.ModuleLocal.class)

public class SetSFDCSessionIDHeader implements Module {

	private AuditAccess audit;
	@SuppressWarnings("deprecation")
	MessagePropertyKey messagepropertykey;

	public ModuleData process(ModuleContext moduleContext, ModuleData inputModuleData) throws ModuleException
	{
		try{
			audit = PublicAPIAccessFactory.getPublicAPIAccess().getAuditAccess();
			String CLASS_NAME = getClass().getSimpleName();
			Message msg = (Message) inputModuleData.getPrincipalData();
			MessageKey key = new MessageKey(msg.getMessageId(), msg.getMessageDirection());
			audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, CLASS_NAME + ": SetSFDCSessionIDHeader Module Called");

			messagepropertykey = new MessagePropertyKey("sessionID", "http://sap.com/xi/XI/System/REST");

			// Read the payload values created by the Java mapping
			Payload payload = msg.getDocument();
			byte[] bytePayload = payload.getContent();
			String stringPayload = new String(bytePayload);
			InputStream in = new ByteArrayInputStream(stringPayload.getBytes());
			DocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();
			DocumentBuilder documentbuilder = dbFactory.newDocumentBuilder();
			Document doc = documentbuilder.parse(in);

			NodeList nl_operation = doc.getElementsByTagName("operation");
			NodeList nl_object = doc.getElementsByTagName("object");
			NodeList nl_externalIdFieldName = doc.getElementsByTagName("externalIdFieldName");
			NodeList nl_sessionID = doc.getElementsByTagName("sessionID");

			String operation = nl_operation.item(0).getFirstChild().getNodeValue();
			String object = nl_object.item(0).getFirstChild().getNodeValue();
			String externalIdFieldName = nl_externalIdFieldName.item(0).getFirstChild().getNodeValue();
			String sessionID = nl_sessionID.item(0).getFirstChild().getNodeValue();

			// Set the sessionID ASMA attribute; the lookup channel uses it to build the HTTP header
			msg.setMessageProperty(messagepropertykey, sessionID);
			audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, CLASS_NAME + ": Setting ASMA sessionID:" + sessionID);

			// Recreate the payload required for the Create Job API (without the sessionID field)
			String outputPayload = "<?xml version=\"1.0\" encoding=\"utf-8\"?><jobInfo xmlns=\"http://www.force.com/2009/06/asyncapi/dataload\"><operation>" + operation + "</operation><object>" + object + "</object><externalIdFieldName>" + externalIdFieldName + "</externalIdFieldName><contentType>XML</contentType></jobInfo>";
			audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, CLASS_NAME + ": Calling end Point:" + outputPayload);
			payload.setContent(outputPayload.getBytes());
			msg.setDocument((XMLPayload) payload);

			inputModuleData.setPrincipalData(msg);
			audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, "Module executed successfully");
			return inputModuleData;
		}
		catch (Exception e) {
			throw new ModuleException(e.getClass() + ": " + e.getMessage());
		}
	}
}
Lookup Channel CC_SFDC_REST_BulkAPI_CreateJob_Rcv Configuration
Lookup Channel Log-Successful call to Create Job API

So the Create Job call to SFDC is successful and, in return, we receive the JobID, which is passed as an ASMA attribute (via the Java class SFDCBulkAPI_CreateJobLookup_Java) to make our next call on the receiver channel.

Message Attribute after Mapping: It has JobID and sessionID values.

Message Attribute after Mapping
  • Send Output Payload to SFDC (via Add batch to job API call) :

In the above step, an output payload was created which needs to be passed to SFDC. Along with the payload, the SessionID needs to be passed as an HTTP header, and the JobID becomes part of the URL.

Receiver channel-Add batch to job Call

Module Parameter: In the Receiver channel, there is a sequence of module parameters.

AF_Modules/GetPayloadValueBean: In the output payload, there is one extra segment named "AdditionalSegment" which contains the XML field "PrimaryField". It holds the sequence of primary field values.

Using the GetPayloadValueBean module, a parameter called PrimaryField is created which holds the value of the XML field "PrimaryField". The parameter value is simply the full XML path to the corresponding XML field:

“get:/ns1:sObjects/ns1:AdditionalSegment/ns1:PrimaryField”

We will later use PutPayloadValueBean to pass this parameter value on to the next flow.

AF_Modules/DynamicConfigurationBean: Dynamic Configuration Bean module is used to pass the value of SessionID to our next flow.

RemoveAdditionalSegmentModule: The output payload created in the above step has one extra segment named "AdditionalSegment" which is not required by SFDC at all (Add Batch to Job call). This extra segment was created only to hold the primary field values and pass them to the next ICO. Since the primary field value is already held by the parameter created via GetPayloadValueBean, we can safely remove this additional segment now.

So there is one custom module written to remove additional segment:

RemoveAdditionalSegmentModule :

package com.sap.pi;
import java.io.*;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import javax.ejb.Local;
import javax.ejb.LocalHome;
import javax.ejb.Remote;
import javax.ejb.RemoteHome;
import javax.ejb.Stateless;
import com.sap.aii.af.lib.mp.module.Module;
import com.sap.aii.af.lib.mp.module.ModuleContext;
import com.sap.aii.af.lib.mp.module.ModuleData;
import com.sap.aii.af.lib.mp.module.ModuleException;
import com.sap.engine.interfaces.messaging.api.Message;
import com.sap.engine.interfaces.messaging.api.MessageKey;
import com.sap.engine.interfaces.messaging.api.MessagePropertyKey;
import com.sap.engine.interfaces.messaging.api.Payload;
import com.sap.engine.interfaces.messaging.api.PublicAPIAccessFactory;
import com.sap.engine.interfaces.messaging.api.XMLPayload;
import com.sap.engine.interfaces.messaging.api.auditlog.AuditAccess;
import com.sap.engine.interfaces.messaging.api.auditlog.AuditLogStatus;

@Stateless(name="RemoveAdditionalSegmentModule")
@LocalHome(value=com.sap.aii.af.lib.mp.module.ModuleLocalHome.class)
@Local(com.sap.aii.af.lib.mp.module.ModuleLocal.class)

public class RemoveAdditionalSegment implements Module {

	private AuditAccess audit;
	@SuppressWarnings("deprecation")
	MessagePropertyKey messagepropertykey;

	public ModuleData process(ModuleContext moduleContext, ModuleData inputModuleData) throws ModuleException
	{
		try{
			audit = PublicAPIAccessFactory.getPublicAPIAccess().getAuditAccess();
			String CLASS_NAME = getClass().getSimpleName();
			Message msg = (Message) inputModuleData.getPrincipalData();
			MessageKey key = new MessageKey(msg.getMessageId(), msg.getMessageDirection());
			audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, CLASS_NAME + ": RemoveAdditionalSegmentModule Module Called");

			// Read module parameter "SegmentName": the segment to be removed from the payload
			String SegmentName = moduleContext.getContextData("SegmentName");

			Payload payload = msg.getDocument();
			InputStream in = (InputStream) payload.getInputStream();
			DocumentBuilderFactory documentbuilderfactory = DocumentBuilderFactory.newInstance();
			documentbuilderfactory.setNamespaceAware(true);
			DocumentBuilder documentbuilder = documentbuilderfactory.newDocumentBuilder();
			Document document = documentbuilder.parse(in);
			document.normalize();
			Element element = document.getDocumentElement();
			Element additional_element = (Element) document.getElementsByTagName(SegmentName).item(0);
			audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, "Removing additional element: " + SegmentName);
			// Remove the additional segment
			element.removeChild(additional_element);
			document.normalize();

			// After removing the extra segment, pass on the remaining payload
			ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
			TransformerFactory.newInstance().newTransformer().transform(new DOMSource(document), new StreamResult(byteArrayOutputStream));
			byte[] bytes = byteArrayOutputStream.toByteArray();
			payload.setContent(bytes);
			msg.setDocument((XMLPayload) payload);
			inputModuleData.setPrincipalData(msg);
			audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, "RemoveAdditionalSegmentModule executed successfully");
			return inputModuleData;
		}
		catch (Exception e) {
			throw new ModuleException(e.getClass() + ": " + e.getMessage());
		}
	}
}

AF_Modules/RequestResponseBean: This module is used to build the async/sync bridge.

AF_Modules/PutPayloadValueBean: In the previous step, the GetPayloadValueBean module created the parameter PrimaryField to hold the value of the primary field.

Now PutPayloadValueBean is used to put that parameter value into the response structure which is passed on to the next ICO.

The parameter value is simply the XML path put:/ns0:batchInfo/ns0:PrimaryField, i.e. where the value is placed in the output/response structure.

AF_Modules/ResponseOnewayBean: This module is used to pass the response of SFDC (Add Batch to Job API call) on to the next flow. The parameter values contain the details of the next ICO. An illustrative module chain for this receiver channel is sketched below.
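
For orientation, the processing sequence of this receiver channel would then look roughly as follows. The module keys and the position of the adapter bean are illustrative assumptions; the parameter paths and SegmentName come from the steps above, and the ResponseOnewayBean entries use its standard parameter names (receiverParty, receiverService, interface, interfaceNamespace):

1. AF_Modules/GetPayloadValueBean – PrimaryField = get:/ns1:sObjects/ns1:AdditionalSegment/ns1:PrimaryField
2. AF_Modules/DynamicConfigurationBean – carries the sessionID attribute over to the next ICO
3. RemoveAdditionalSegmentModule – SegmentName = ns1:AdditionalSegment
4. AF_Modules/RequestResponseBean – turns the one-way call into a request/response (async/sync bridge)
5. (REST adapter module of the channel)
6. AF_Modules/PutPayloadValueBean – PrimaryField = put:/ns0:batchInfo/ns0:PrimaryField
7. AF_Modules/ResponseOnewayBean – receiverParty/receiverService/interface/interfaceNamespace of the next ICO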

ResponseOnewayBean Parameters
Message Monitoring

2) Second ICO: SFDC to SFDC: (Close Job API call)

In our first ICO, we covered the first two SFDC calls, i.e. Create Job & Add Batch to Job.

In this ICO, we cover the next call, i.e. Close Job.

Close Job_ICO
Incoming Payload (Response payload received from previous ICO)

As we see above, this payload is simply the response of the Add Batch to Job API call, along with one extra field, PrimaryField, which was added by the PutPayloadValueBean module in the previous ICO's receiver channel.

Message Attribute: Session ID is passed from the previous ICO using DynamicConfigurationBean.

Here in Operation Mapping, we have a simple message mapping which creates the structure of the Close Job API call.

Operation Mapping

There is only one UDF, which sets the ASMA attribute jobID (because it is not passed from the previous flow). The JobID is received in the input payload, and its value needs to be passed to the receiver channel as part of the URL to close the job.

UDF:

String NameSpace = "http://sap.com/xi/XI/System/REST" ;
DynamicConfiguration conf = (DynamicConfiguration) container.getTransformationParameters().get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
DynamicConfigurationKey key  = DynamicConfigurationKey.create( NameSpace, "jobID");
conf.put(key,var1); 
return var1;
Output Payload created by above Message mapping

Here we pass State as "Closed", as required by the SFDC Close Job API call.

Apart from that, the extra segment "AdditionalSegment" carries two extra values, i.e. PrimaryField and BatchID. Both values are required in our third flow. For reference, the mapped payload would look roughly like the example below.
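
A minimal illustration (the BatchID value is a placeholder; the Salesforce Bulk API itself only requires the state element):

<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
    <state>Closed</state>
    <AdditionalSegment>
        <PrimaryField>41,141,142,143,144</PrimaryField>
        <BatchID>751xx0000000001AAA</BatchID>
    </AdditionalSegment>
</jobInfo>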

Message Attribute after mapping: It contains JobID & SessionID field values.

Receiver Channel Close job:

Receiver Channel_CloseJob

jobID & sessionID values are present in HTTP Header.

The JobID needs to be passed as part of the URL, and the sessionID needs to be passed as an HTTP header.

Module Tab:

AF_Modules/GetPayloadValueBean: In the output payload, there is one extra segment named "AdditionalSegment" which contains the XML fields "PrimaryField" & "BatchID".

This module is used twice here to create two parameters that hold the values of PrimaryField & BatchID. The parameter value is the full XML path to the corresponding XML field:

get:/ns0:jobInfo/ns0:AdditionalSegment/ns0:PrimaryField

get:/ns0:jobInfo/ns0:AdditionalSegment/ns0:BatchID

We will use PutPayloadValueBean to pass these parameter values on to the next flow.

AF_Modules/DynamicConfigurationBean: Dynamic Configuration Bean module is used to pass the value of SessionID to our next flow.

RemoveAdditionalSegmentModule: The output payload created in the above step has one extra segment named "AdditionalSegment" which is not required by SFDC at all (Close Job API call). This extra segment was created only to hold the PrimaryField & BatchID values and pass them to the next ICO. Since both values are already held by the parameters created via GetPayloadValueBean, we can safely remove this additional segment now.

AF_Modules/RequestResponseBean: This module is used to build the async/sync bridge.

AF_Modules/PutPayloadValueBean: In the previous step, the GetPayloadValueBean module created the two parameters PrimaryField & BatchID.

Now PutPayloadValueBean is used twice to put these parameter values into the response structure which is passed on to the next ICO.

The parameter values are simply the XML paths where the values are placed in the output/response structure:

put:/ns0:jobInfo/ns0:BatchID

put:/ns0:jobInfo/ns0:PrimaryField

AF_Modules/ResponseOnewayBean: This module is used to pass the response of SFDC (Close Job API call) on to the next flow. The parameter values contain the details of the next ICO.

Message Monitoring

3) Third ICO: SFDC to SFDC: (Retrieve Batch Results)

So far, a job has been created, a batch has been added to the job, and the job was closed in the previous step. Now, in this step, we retrieve the batch results to learn the status of each record, i.e. whether it was successfully updated in SFDC or not.

Incoming Payload (Response payload received from previous ICO)

This payload is the response of the Close Job API call, along with two extra fields, PrimaryField & BatchID, which were added by the PutPayloadValueBean module in the previous ICO's receiver channel.

Now we need to call the Retrieve Batch Results API on the SFDC side, which is an HTTP GET call, so there is no request payload to worry about.

At the operation mapping level, there is a simple one-to-one mapping along with a small UDF which adds a delay of around 1 minute. Why is this delay needed? Whenever we close the job, record processing starts, and it may take some time until the records are processed on the SFDC side. So on the previous ICO's receiver channel the job is closed, and here in the operation mapping we add some delay before retrieving the batch result.

UDF to add the delay:

// DelayMinutes is a UDF input parameter; java.util.concurrent.TimeUnit
// must be added to the import list of the UDF
long l_DelayTimeMinutes = Long.parseLong(DelayMinutes);
try {
	TimeUnit.MINUTES.sleep(l_DelayTimeMinutes);
}
catch (InterruptedException e) {
	e.printStackTrace();
}
return "";

Receiver channel to call retrieve Result

Here the jobid element refers to the <id> field from the payload. It is used in the GET call as part of the URL.

The BatchId element refers to the BatchID field from the payload. It is also used in the GET call as part of the URL.

sessionID is used as an HTTP header parameter. The resulting call is sketched below.
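
Per the Salesforce Bulk API documentation, the retrieve call has this shape (instance and API version depend on your org and channel configuration):

GET https://<instance>.salesforce.com/services/async/<version>/job/<jobId>/batch/<batchId>/result
X-SFDC-Session: <sessionID>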

ModuleTab

Here again, AF_Modules/GetPayloadValueBean is used to pass the XML field value "PrimaryField" from the current ICO to the next ICO.

This module creates a parameter that holds the value of PrimaryField.

We use PutPayloadValueBean to pass this parameter value to the next flow. It puts the parameter value into the response structure which is passed to the next ICO.

AF_Modules/ResponseOnewayBean: This module is used to pass the response of SFDC (Retrieve Batch Results API call) on to the next flow. The parameter values contain the details of the next ICO.

Message Monitoring: Message processing took around 1 minute due to the delay added at mapping level.

Message Monitoring

4) Fourth ICO: SFDC to Sender: (update SFDC result back to the sender)

In the previous step, we made an SFDC call to retrieve the batch results via the HTTP GET method. In this ICO, we receive those results.

Incoming Payload: This incoming payload is the response of the Retrieve Batch Results API. As we sent a total of five records to SFDC, five <result> tags are received. Along with the API response, <PrimaryField> was added by the PutPayloadValueBean module.

In the response, if the XML field <success> is true, the record was successfully processed on the SFDC side. In case of an error, the value is false.

Incoming Payload

As discussed previously, per the SFDC note, results are returned in the same order as the records in the batch request. This is why we tracked the record keys in <PrimaryField>.

At the mapping level, there is a simple UDF that puts the primary field value into each <result> segment, in the same order as returned from SFDC.

UDF :

// var1[0] holds the comma-separated primary field values
String primaryfield = var1[0];
String[] pf = primaryfield.split(",");
for (int i = 0; i < pf.length; i++)
{
	result.addValue(pf[i]);
}

Output Payload from mapping: Below is the output payload, which contains the status (<success>) and the primary field (KeyField) of each record sent to SFDC, so we know whether that primary field was successfully updated on the SFDC side. This payload is now ready to be sent back to the sender.

<ns0:results xmlns:ns0="http://www.force.com/2009/06/asyncapi/dataload">
	<ns0:result>
		<ns0:id>a1Q8A00000CEMpUUAX</ns0:id>
		<ns0:success>true</ns0:success>
		<ns0:created>true</ns0:created>
		<ns0:KeyField>41</ns0:KeyField>
	</ns0:result>
	<ns0:result>
		<ns0:id>a1Q8A00000CEMpVUAX</ns0:id>
		<ns0:success>true</ns0:success>
		<ns0:created>true</ns0:created>
		<ns0:KeyField>141</ns0:KeyField>
	</ns0:result>
	<ns0:result>
		<ns0:id>a1Q8A00000CEMpWUAX</ns0:id>
		<ns0:success>true</ns0:success>
		<ns0:created>true</ns0:created>
		<ns0:KeyField>142</ns0:KeyField>
	</ns0:result>
	<ns0:result>
		<ns0:id>a1Q8A00000CEMpXUAX</ns0:id>
		<ns0:success>true</ns0:success>
		<ns0:created>true</ns0:created>
		<ns0:KeyField>143</ns0:KeyField>
	</ns0:result>
	<ns0:result>
		<ns0:id>a1Q8A00000CEMpYUAX</ns0:id>
		<ns0:success>true</ns0:success>
		<ns0:created>true</ns0:created>
		<ns0:KeyField>144</ns0:KeyField>
	</ns0:result>
</ns0:results>
Message Monitoring

The post Salesforce integration with SAP PO Rest Adapter using SFDC Bulk API Concept appeared first on ERP Q&A.

]]>
New features for Integration Flow tooling – PI Configuration Search and Where-Used https://www.erpqna.com/new-features-for-integration-flow-tooling-pi-configuration-search-and-where-used/?utm_source=rss&utm_medium=rss&utm_campaign=new-features-for-integration-flow-tooling-pi-configuration-search-and-where-used Tue, 14 Sep 2021 09:48:05 +0000 https://www.erpqna.com/?p=54032 SAP Process Integration is a part of the SAP NetWeaver platform. It is called SAP NetWeaver Exchange Infrastructure XI in NetWeaver 7.0 ehp2 and older versions. SAP NetWeaver Process Integration is a part of the NetWeaver software component and is used for exchange of information in company’s internal system or with external parties.

The post New features for Integration Flow tooling – PI Configuration Search and Where-Used appeared first on ERP Q&A.

]]>
If you regularly work with Integration Flows and related objects in the NWDS SAP Integration Designer, then sooner or later you will come across the search functionality. The Search PI Objects view offers a search which relies on a local client-based index. The index is built each time after the login if the property "Enable content indexing for object references in search and system landscape" is enabled. The search shows findings in object keys as well as findings in referenced objects.

If the property is disabled or the index can't be created, you will get only a reduced result set which is based on object keys only:

Different search results dependent on content indexing flag

Unfortunately, the indexing of content is slow. I guess your coffee consumption after login to the Integration Flow tooling increased in the past months when you worked from home with a slower network – log in, and wait, and wait, and wait… until the content indexing run is ready.

But a local client-based index has another major drawback – it is never updated (… unless you press "Update Search Index" and go for another cup of coffee). That means, if your colleague made a change, created a new object, deleted an object, or transported an object in the meantime, your search does not reflect those changes and you cannot rely on the shown search results.

PI Configuration Search

These reasons brought us to rework the search functionality and we are happy to introduce a new PI Configuration Search which is available in the NWDS SAP Integration Designer starting with 7.50 SP22. The major differentiator of the new search is that it is executed on the server and so always reflects the latest content.

PI Configuration Search is integrated into standard Eclipse search dialog. Use Search -> PI Configuration… menu or the search menu in the toolbar to get there.

PI Configuration Search menu

In contrast to the old search, the PI Configuration Search is object-type specific and attribute-based. As the first step of the search, select a value from the Object Type combo box. Your last selection is memorized by the dialog. You can select any object type which is supported by the Integration Flow tooling.

Select any object type supported by Integration Designer

The content of the Attributes table is updated depending on the selected object type and shows all key fields and, in some cases, also other useful attributes (e.g. agency and schema for Value Mappings) by default.

Standard search using key attributes

In case the standard selection doesn't fit your needs, you can select a different attribute from the combo box available in each cell of the Attribute column. Besides key attributes, you can search for some object-type specific attributes, the description, and administrative information like the responsible person, the date of the last change, and others.

Available search attributes per object type

You can add more attribute lines by pressing the '+' button below the table, or remove a selected line with the '-' button.

Add or remove attribute lines

Different attributes are combined with AND in the resulting search query. Same attributes are combined with OR. You can formulate complex queries containing several attributes. Search pattern preview helps you to understand how the attributes are combined with each other.

Preview of the resulting search query

The Operation column in the Attributes table allows you to select a different operation if the default operation doesn't fit your needs. The list of available operations depends on the attribute type. If you want to search using a wildcard (supported wildcards are '*' meaning any string and '?' meaning any character), use the operation like or not like.

Several operations available

Finally, enter the value which you want to search for. Be aware that the search is case-sensitive. That means that a search for "xipattern" and a search for "XiPattern" will lead to different result sets! This fact is probably a bit unusual nowadays, but it is a toll we have to pay since the basis framework underneath is getting a bit older.

For most attributes you can type a value directly into the cell, or use an offered F4 help, or select the value from a combo box.

F4-help showing date chooser
Example: Fixed values F4

Once you hit the "Search" button, the search query is executed on the server. This guarantees that the search operates on up-to-date data and always returns current results. The results are shown in the PI Configuration Search Result view, which opens automatically when the search is executed (alternatively, you can open it via the menu Window -> Show View -> PI Configuration Search Result).

The view shows for which object type the query was executed and the search query pattern. In the right lower corner, you can see how many objects were found for the given query.

PI Configuration Search Result view

Sometimes the number of lines in the result and the number of unique objects can deviate. This happens when an attribute has multiple values. One prominent example is the module names in communication channels. Usually, multiple modules are assigned to the module chain. Therefore, in some cases multiple lines are shown for the same object, as in this example:

One object can be shown multiple times in the result

Per default, the search result view shows, like the search dialog, only the key attributes. However, you can always adjust the set of visible columns using the "Show/hide columns" button in the upper right corner. It offers the same set of columns as the search dialog. You can decide which columns to display and in which order. The last filter selection per object type is memorized and will also be used for subsequent search queries. For the above example with module names, the "Show/hide columns" filter is defined as follows:

Change the set of visible columns

You can open an object from the result view by double-clicking a line, pressing Enter while a line is selected, or using the 'Open' entry in the right-click context menu.

Where-Used List

In the context menu you will also find another new functionality which was made available during the same development – it is the “Where-Used List”, also known as cross-reference search.

Where-Used List functionality added to Integration Designer

Even if the old Search PI Objects view combined search and some kind of where-used functionality in one view, it was so far not possible to trigger a "where-used" query explicitly for an object. With SP22 we make cross-reference search, which you probably know from the old Swing tools, also available in the NWDS SAP Integration Designer.

You can trigger the “Where-Used List” from context menu for a selected object in the PI Explorer, from PI Configuration Search Result view or from Where-Used List view.

Trigger Where-Used List from PI Explorer

The results of cross-reference search are shown in the Where-Used List view. The view opens automatically when the functionality is triggered. Or you can alternatively open it via Window -> Show View -> Where-Used List.

Results of the Where-Used List query

The view shows all objects which reference the given object. Objects which are generated during deployment, like Integrated Configurations or embedded communication channels, are shown underneath the corresponding Integration Flow node and marked with an overlay icon.

You can open the objects using the right-click context menu or via double click (alternatively, press Enter on the keyboard). Or you can trigger further where-used list operations if you need to find cross-references over multiple objects.

The view keeps the history of the last five queries so you can access these results quickly, and it allows you to navigate back and forth between the queries in the history:

Toolbar of the Where-Used List view

Sometimes “Link with editor” functionality of this view can be interesting for you. When the button is toggled (means “Link with editor” is on), the view will react on the changes to the open editors and will update the content of the view and show the cross-references for currently selected object on the fly.

‘Where-Used in Configuration’ for NWDS ESR

You are probably asking now: what about the ESR objects – can I find out if an operation mapping is used in an Integration Flow? The good news is – yes, this is now possible as well!

The existing NWDS ESR Where-Used functionality so far supported only cross-references between ESR objects. Now we go one step further and offer "Where-Used in Configuration" for ESR objects like operation mappings, service interfaces, IDocs, RFCs and Adapter Metadata – objects which can be directly used by the configuration. The functionality is integrated into the ESR Search Result view, the Enterprise Service Browser tree and the ESR Where-Used result view, and helps you find out whether a design object is used by any configuration object.

Where-Used in Configuration accessible from Enterprise Service Browser
… or from ESR Search Result view

The result of this query is shown in the Where-Used List view, as described above for configuration objects:

Results of the Where-Used in Configuration

The post New features for Integration Flow tooling – PI Configuration Search and Where-Used appeared first on ERP Q&A.

]]>
Facilitate JWT with Auth0 and OpenAPIs on SAP PO https://www.erpqna.com/facilitate-jwt-with-auth0-and-openapis-on-sap-po/?utm_source=rss&utm_medium=rss&utm_campaign=facilitate-jwt-with-auth0-and-openapis-on-sap-po Wed, 07 Jul 2021 10:00:07 +0000 https://www.erpqna.com/?p=50351 Overview Identity providers like Auth0 allow companies to “outsource” the hard work of hosting an own identity provider solution by leveraging open standards like Oauth2 and JWT (JSON Web tokens). By that you could establish a trust between your API and Auth0 as authentication service. An API client would then authenticate itself in Auth0 and […]

The post Facilitate JWT with Auth0 and OpenAPIs on SAP PO appeared first on ERP Q&A.

]]>
Overview

Identity providers like Auth0 allow companies to “outsource” the hard work of hosting an own identity provider solution by leveraging open standards like Oauth2 and JWT (JSON Web tokens).

By that you could establish a trust between your API and Auth0 as authentication service. An API client would then authenticate itself in Auth0 and obtain a JWT (JSON Web) token.

The client would then invoke your service with the token holding all its claims and your service would verify that token and grant/deny access to the API according to the claims.

This blog shows how to implement such a scenario with a JWT token provider like Auth0 to safeguard Open APIs hosted on SAP PO with the KaTe RESTful adapter.

As identity provider we'll use Auth0 for this example, but note that this works with any identity provider that supports the JWT standard (e.g. WSO2 Identity Server, Keycloak or Azure AD).

This architecture can serve as a viable alternative for publishing on-premise SAP APIs with the OpenAPI standard, independent of any product stack, as it works with any API management and/or identity solution.

Setup Auth0

First we set up our solution on Auth0.

Auth0 let’s you define a so called domain (subdomain of Auth0 or if you have own DNS names a custom domain name).

Within that domain you can define an API artifact that should be secured (here our API hosted on SAP PO) and a client application that authenticates against Auth0 and invokes the service on SAP PO.

Create API

In the Auth0 UI, we first need to create an API artifact that represents the API we want to secure (here the Petstore API). We use the RS256 (SHA-256) signing algorithm and "https://my-po-host:443/MyAPI" as the audience identifier.


Under the Permissions tab we also create one scope called call:petstore for accessing the API, which we'll use at a later stage to narrow permissions at API and operation level.

Then we create an application artifact in Auth0 that references this API. The application itself is a representation of a client application that wants to invoke our API (here for Machine to Machine communication).

We assign the API to the application with its scope call:petstore checked.

This enables us at a later stage to obtain the JWT tokens including the scope call:petstore that our API will check to allow or to deny API access.

The next page gives us all the credential information that our client needs in order to fetch a token via an OAuth2 client credentials grant:

  • client Id
  • client secret
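
To see what a client does with these values, here is a minimal Java 11 sketch of the client credentials grant against the Auth0 token endpoint (YOUR_DOMAIN, YOUR_CLIENT_ID and YOUR_CLIENT_SECRET are placeholders; the audience is the one configured above):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class FetchAuth0Token {
    public static void main(String[] args) throws Exception {
        // Client credentials grant: exchange client id/secret for a JWT access token
        String body = "{\"grant_type\":\"client_credentials\","
                + "\"client_id\":\"YOUR_CLIENT_ID\","
                + "\"client_secret\":\"YOUR_CLIENT_SECRET\","
                + "\"audience\":\"https://my-po-host:443/MyAPI\"}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://YOUR_DOMAIN/oauth/token"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The JSON response contains access_token, token_type and expires_in
        System.out.println(response.body());
    }
}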

As we would like to test with our Swagger UI running on top of SAP PO instead of a full-blown client application, we include its hostname/IP in the CORS/Allowed Web Origins settings (Swagger UI is loaded from SAP PO and requests access from there to Auth0 via XMLHttpRequest from the web UI).

Configure a RESTful sender channel

We change the authentication option to “OAuth2 JWT token”.

This allows us to configure

  • The JWT Web key set URL of Auth0 ("Where can I verify tokens?")
  • Issuer Realm URL (“Who issued the token”?)
  • Audience (“Who is allowed for this token usage”?)

The Web key set URL in Auth0 is composed of your domain name and the path “/.well-known/jwks.json”.

The domain name is set up initially during the setup of your Auth0 account.

https://YOUR_DOMAIN/.well-known/jwks.json

In our example https://dev-x-9tty72.eu.auth0.com/.well-known/jwks.json

The Issuer Realm URL is simply the domain name itself as a URL (here: https://dev-x-9tty72.eu.auth0.com/); the audience is the audience that we want to allow in our API and that we set in our Auth0 API configuration (https://my-po-host:443/MyAPI).

For now we leave all other finer-grained validation checkboxes empty, as we just want to verify that the JWT token is from the right origin in Auth0 and intended for the right audience.

Security of OpenAPI definition

In order to test the API directly (simulating a client), we also need to instruct our Swagger UI via the OpenAPI definition to show us this option.

In order to let the adapter include the security information in the OpenAPI definition, we additionally check the checkbox "Add Security to OpenAPI description". This adds the necessary security mechanisms for invoking this API to the OpenAPI definition published from SAP PO.

The channel configuration shows an option to configure the OAuth2 grant type & token endpoint a client should use (which will then show up in the security section of the API definition).

Auth0 provides us this information under the “quick start” tab of the application

Note that Auth0 additionally uses an audience field, specific to Auth0, that is needed to fetch the token in a client credentials grant.

We use this information to configure the channel publishing options with an additional audience parameter.

Open the Swagger UI and authenticate

Now we open our published OpenAPI definition on PO again, which shows us the "Authorize" symbol in the upper right.

If we click the link for our JSON OpenAPI definition, we see the security section included in the definition.
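
The exact output depends on the adapter version, but in OpenAPI 3 such a security section generally looks like this (shown with the token URL of our example domain; the description text is illustrative):

"components": {
  "securitySchemes": {
    "oauth2": {
      "type": "oauth2",
      "flows": {
        "clientCredentials": {
          "tokenUrl": "https://dev-x-9tty72.eu.auth0.com/oauth/token",
          "scopes": {
            "call:petstore": "access to the petstore API"
          }
        }
      }
    }
  }
}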

If we click the Authorize button, it asks us to submit our client credentials to authenticate against Auth0, as any client would be required to. Swagger UI then caches the generated access tokens for our test calls.

Now we invoke the service GET /pet/{petId}

We see it worked!

The Swagger UI used the fetched token and added the Authorization header with the token from Auth0 when invoking the service (see the curl line).

At invocation time the sender channel verifies the token's signature against the public key obtained from the JSON Web key set URL of Auth0 and then validates the issuer and audience passed.

Depending on the output it will allow or deny access.

Now let’s have a look at the token itself and the details of validation:

If we copy and paste it into the JWT debugger on https://jwt.io/ we can analyze all of its contents.

A JWT token is nothing more than a Base64-encoded string with 3 sections:

  • header
  • payload
  • signature of its contents

The header tells us the signing algorithm and the public key id (kid). The payload shows the issuer realm (field iss), the audience (field aud), the scope (call:petstore) and a few other things.
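
You can also peek into the sections without jwt.io; a minimal sketch using only the JDK (the token is passed as a command line argument):

import java.util.Base64;

public class DecodeJwt {
    public static void main(String[] args) {
        String token = args[0];

        // A JWT is three Base64url-encoded sections separated by dots:
        // header.payload.signature
        String[] parts = token.split("\\.");
        Base64.Decoder decoder = Base64.getUrlDecoder();

        System.out.println("Header:  " + new String(decoder.decode(parts[0])));
        System.out.println("Payload: " + new String(decoder.decode(parts[1])));
        // parts[2] is the signature; it is binary and only meaningful when
        // verified against the public key from the JSON Web key set URL
    }
}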

At runtime the adapter uses this information to look up (and cache) the public key and verify the token signature first. If the signature is valid, it validates the issuer field and the audience and then allows or denies access.

Then depending on your settings in your channel it will further validate

  • scope on global level (for the whole API)
  • scope on API operation level

Restrict an API to a scope

Now let’s do something more sophisticated and let our adapter validate the scope of the tokens.

Without any scope validation settings in our channel, the token would allow us to invoke any operation of this API as long as it is valid.

As we saw, the token already contains a scope we defined in Auth0, call:petstore, that we could use.

Now we want to restrict the API:

  • to verify the scope “call:petstore” for any API operation
  • (2nd step) verify to write a pet (e.g. POST /pet/{petId}) only with a special scope

We set now call:petstore as global scope in our channel(global means applicable for the whole API):

If we invoke the API again it will be successful again as the token already contains the scope “call:petstore”.

Restricting an API operation on a scope

Now let’s additionally define which scope is needed to write to pets.

For that we define a scope save:pets to restrict the operation POST /pet/{petId} as a local scope. A caller then needs the save:pets scope to invoke this operation and is otherwise denied.

If we invoke now any other API operation, e.g. GET /pet/{petId} again, it will still work as the global scope “call:petstore” is applicable.

If we invoke POST /pet/{petId}, it doesn't work, as our token only holds the call:petstore scope but not the new save:pets scope. The API returns 403 "token did not match expected scope".

Now let’s go back to Auth0 and add a “save:pets” scope permission to the API.

And let’s allow the application as client also to use it.

Then let’s hit logout from our Swagger UI Authorize button as otherwise our old token with only one scope is still held in the SwaggerUI browser cache as long as its not expired.

Now we log in again with the same credentials. This forces a new logon and thereby a new token that contains the updated settings (both scopes that we just added).

If we invoke the POST /pet/{petId} operation again, it is now successful, as the token now contains the added scope.

The sender channel validates the POST /pet/{petId} invocation as valid with the scope save:pets and lets it pass.

If we look at our raw token again, we can also see the added scope save:pets that allowed us this operation:
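
Decoded, the relevant claims of the new token look like this (values shortened; in Auth0 tokens the scope claim is a single space-separated string):

{
  "iss": "https://dev-x-9tty72.eu.auth0.com/",
  "aud": "https://my-po-host:443/MyAPI",
  "scope": "call:petstore save:pets"
}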


The post Facilitate JWT with Auth0 and OpenAPIs on SAP PO appeared first on ERP Q&A.

]]>
SAP PI/PO Communication Channel Tool https://www.erpqna.com/sap-pi-po-communication-channel-tool/?utm_source=rss&utm_medium=rss&utm_campaign=sap-pi-po-communication-channel-tool Wed, 28 Apr 2021 12:00:04 +0000 https://www.erpqna.com/?p=47109 Purpose In this blog post I will show you how can you work this a new communication channel tool in order to start or stop a channel in SAP PO from a SAP ABAP Program. My system My System: SAP S/4HANA 1909 and SAP PO 7.5 SAP Note relevant 2691666 – How to extract PI […]

The post SAP PI/PO Communication Channel Tool appeared first on ERP Q&A.

]]>
Purpose

In this blog post I will show you how you can work with a new communication channel tool to start or stop a channel in SAP PO from an ABAP program.

My system

My System: SAP S/4HANA 1909 and SAP PO 7.5

SAP Note relevant

2691666 – How to extract PI Communication Channel list
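
Under the hood, the program below (like the SAP note above) calls the ChannelAdminServlet of the Adapter Framework. The requests have the following shape; besides STATUS, the action parameter also accepts START and STOP, and party/service/channel accept wildcards:

http://<host>:<port>/AdapterFramework/ChannelAdminServlet?party=*&service=*&channel=*&action=STATUS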

Steps

1. Create a new program in SAP transaction SE38, as shown here in the development system.

Enter the name of the program, set the attribute Type to "Executable program", and click SAVE.

SAP Process Orchestration, SAP Process Integration

In the following window you can select a package and a transport request, or just select Local Object.

2. Paste the following code and activate it.

*&---------------------------------------------------------------------*
*& Report  ZPO_TOOL_CHANNEL.
*&
*&---------------------------------------------------------------------*
*&
*&
*&---------------------------------------------------------------------*
*&---------------------------------------------------------------------*
*&INFORMATION
*&WRICEF                 :   001
*&Program                :   ZPO_TOOL_CHANNEL
*&Title                  :   ZPO_TOOL_CHANNEL
*&Author                 :   Freddy Valderrama
*&Date                   :   26.04.2021 10:58:37
*&Description            :   Program for administration of the communication channels of SAP PO. You can START or STOP a channel by double-clicking the channel name.
*&                           By double-clicking one of the first four lines you can stop all channels, start all channels,
*&                           download the current status, or update the channels with a previously saved status.
*&Prerequisites          :   The communication channel must have its control data set to 'External' in the communication channel monitor.
*&*************************************************************************


REPORT  ZPO_TOOL_CHANNEL.

INCLUDE <icon>.
INCLUDE <symbol>.

DATA: v_url  TYPE string,
      client TYPE REF TO if_http_client.
DATA: response_code TYPE  sysubrc,
      response_text TYPE  string.
DATA: fields_tab    TYPE tihttpnvp,
      status_code   TYPE string,
      status_reason TYPE string,
      number        TYPE i.
DATA: w_result   TYPE string,
      result_tab TYPE TABLE OF string,
      result_wa  LIKE LINE OF result_tab.
DATA : lv_file_string TYPE xstring,
       te_t_xml_data  TYPE TABLE OF smum_xmltb,
       te_return      TYPE TABLE OF bapiret2,
       ls_return      TYPE bapiret2.
DATA: BEGIN OF ls_w_zgraymarketinfo ,
        party           TYPE string,
        service         TYPE string,
        channelname     TYPE string,
        adaptertype     TYPE string,
        channelstate    TYPE string,
        activationstate TYPE string,
        color           TYPE lvc_t_scol,
      END OF ls_w_zgraymarketinfo.
DATA: BEGIN OF ls_w_zgraymarketinfos ,
        party           TYPE string,
        service         TYPE string,
        channelname     TYPE string,
        adaptertype     TYPE string,
        channelstate    TYPE string,
        activationstate TYPE string,
      END OF ls_w_zgraymarketinfos.
DATA: te_t_zgraymarketinfo  LIKE STANDARD TABLE OF ls_w_zgraymarketinfo,
      te_t_zgraymarketinfo2 LIKE STANDARD TABLE OF ls_w_zgraymarketinfo,
      te_t_zgraymarketinfos LIKE STANDARD TABLE OF ls_w_zgraymarketinfos,
      ls_s_zgraymarketinfo  LIKE ls_w_zgraymarketinfo.
DATA :xml_table    TYPE TABLE OF smum_xmltb,
      ls_xml_table TYPE smum_xmltb,
      return       TYPE TABLE OF  bapiret2.
DATA gr_table TYPE REF TO cl_salv_table.
DATA functions TYPE REF TO cl_salv_functions_list.
DATA: l_text TYPE string,
      l_icon TYPE string.
DATA g_dsp TYPE REF TO cl_salv_display_settings.
DATA gr_columns  TYPE REF TO cl_salv_columns_table.
DATA column  TYPE REF TO cl_salv_column.
DATA lo_sorts TYPE REF TO cl_salv_sorts.
DATA: ls_color               TYPE lvc_s_scol.
DATA: lr_layout TYPE REF TO cl_salv_layout,
      ls_key    TYPE        salv_s_layout_key,
      l_repid   TYPE        sy-repid.
DATA: lr_columns TYPE REF TO cl_salv_columns.
DATA: lr_display TYPE REF TO cl_salv_display_settings.
DATA: gr_layout TYPE REF TO cl_salv_layout.
DATA: key TYPE salv_s_layout_key.
DATA: lo_header  TYPE REF TO cl_salv_form_layout_grid,
      lo_h_label TYPE REF TO cl_salv_form_label,
      lo_h_flow  TYPE REF TO cl_salv_form_layout_flow.
DATA lv_lines TYPE i.
DATA s_slddest TYPE sld_s_accessdata_display.

DATA: lv_start TYPE string VALUE 'START'.
DATA: lv_stop TYPE string VALUE 'STOP'.
DATA: lv_download TYPE string VALUE 'DOWNLOAD'.
DATA: lv_upload TYPE string VALUE 'UPLOAD AND ***UPDATE***'.

DEFINE titulo.
  &2 = &1->get_column( &3 ).
  &2->set_short_text( &4 ).
  &2->set_medium_text( &5 ).
  &2->set_long_text( &6 ).
  &2->set_output_length( &7 ).
END-OF-DEFINITION.

DEFINE set_color.
  ls_color-fname     = &1.
  ls_color-color-col = &2.
  ls_color-color-int = &3.
  ls_color-color-inv = &4.
  APPEND ls_color TO ls_w_zgraymarketinfo-color.
END-OF-DEFINITION.

CLASS lcl_datos DEFINITION.
  PUBLIC SECTION.
    METHODS:

      on_user_command FOR EVENT added_function OF cl_salv_events
        IMPORTING e_salv_function,

      double_click
                    FOR EVENT double_click OF cl_salv_events_table
        IMPORTING row.

ENDCLASS.

DATA lr_events TYPE REF TO cl_salv_events_table.
DATA  gr_events      TYPE REF TO lcl_datos.

CLASS lcl_datos IMPLEMENTATION.
  METHOD double_click.
    DATA lv_strig0 TYPE string.
    DATA lv_strig TYPE string.
    DATA lv_new_state TYPE string.
    DATA: ls_is_stable          TYPE lvc_s_stbl.
    DATA: gd_nom_archivo TYPE string,
          ld_filename    TYPE string,
          ld_path        TYPE string,
          ld_fullpath    TYPE string,
          ld_result      TYPE i.
    DATA: BEGIN OF gs_line_salida,
            line TYPE string,
          END OF gs_line_salida.
    DATA gt_line_salida  LIKE TABLE OF gs_line_salida.
    DATA ti_file    TYPE filetable.
    DATA wa_file    TYPE LINE OF filetable.
    CONSTANTS: lv_stop(7) VALUE 'STOP'.
    CONSTANTS: lv_start(7) VALUE 'START'.


    CHECK row IS NOT INITIAL.
    READ TABLE te_t_zgraymarketinfo INTO ls_w_zgraymarketinfo INDEX row.
    CASE ls_w_zgraymarketinfo-service.
      WHEN lv_start.
        lv_strig = ' ?'.
        lv_strig0 = 'Alert: Do you want to "' && lv_start &&  '" all channels'.
        lv_new_state = lv_start.
        PERFORM ask_change_mass USING lv_strig0 lv_strig lv_new_state row.
      WHEN lv_stop.
        lv_strig = ' ?'.
        lv_strig0 = 'Alert: Do you want to "' && lv_stop &&  '" all channels'.
        lv_new_state = lv_stop.
        PERFORM ask_change_mass USING lv_strig0 lv_strig lv_new_state row.
      WHEN lv_download.
        gd_nom_archivo = 'Backup_Tool_Channel_' && sy-datum && '-' && sy-uzeit && '.txt'.
* Display save dialog window
        CALL METHOD cl_gui_frontend_services=>file_save_dialog
          EXPORTING
            window_title      = 'Download File with all Channels'
            default_extension = 'txt'
            file_filter       = '(*.txt)'
            default_file_name = gd_nom_archivo
            initial_directory = 'c:\'
          CHANGING
            filename          = ld_filename
            path              = ld_path
            fullpath          = ld_fullpath
            user_action       = ld_result.

        gd_nom_archivo = ld_fullpath.

        CLEAR te_t_zgraymarketinfos[].
        LOOP AT te_t_zgraymarketinfo INTO ls_w_zgraymarketinfo.
          MOVE-CORRESPONDING ls_w_zgraymarketinfo TO ls_w_zgraymarketinfos.
          APPEND ls_w_zgraymarketinfos TO te_t_zgraymarketinfos.
        ENDLOOP.

        CALL METHOD cl_gui_frontend_services=>gui_download
          EXPORTING
            filename              = gd_nom_archivo
            write_field_separator = 'X'
          CHANGING
            data_tab              = te_t_zgraymarketinfos.

        MESSAGE i398(00) WITH 'File saved:' ld_filename.

      WHEN lv_upload.
        CALL METHOD cl_gui_frontend_services=>file_open_dialog
          EXPORTING
            window_title     = 'Select File'
            default_filename = '*.txt'
            multiselection   = ''
          CHANGING
            file_table       = ti_file
            rc               = ld_result.

        READ TABLE ti_file INTO  wa_file INDEX 1.
        CHECK sy-subrc IS INITIAL.
        gd_nom_archivo = wa_file-filename.
        CLEAR te_t_zgraymarketinfos[].

        CALL METHOD cl_gui_frontend_services=>gui_upload
          EXPORTING
            filename                = gd_nom_archivo
            has_field_separator     = 'X'
          CHANGING
            data_tab                = te_t_zgraymarketinfos
          EXCEPTIONS
            file_open_error         = 1
            file_read_error         = 2
            no_batch                = 3
            gui_refuse_filetransfer = 4
            invalid_type            = 5
            no_authority            = 6
            unknown_error           = 7
            bad_data_format         = 8
            header_not_allowed      = 9
            separator_not_allowed   = 10
            header_too_long         = 11
            unknown_dp_error        = 12
            access_denied           = 13
            dp_out_of_memory        = 14
            disk_full               = 15
            dp_timeout              = 16
            OTHERS                  = 17.
        IF te_t_zgraymarketinfos[] IS INITIAL.
          MESSAGE i398(00) WITH 'Error reading file or file is empty.'.
        ELSE.
          lv_strig = ''.
          lv_strig0 = 'Alert: Do you want to UPDATE all channels?'.
          PERFORM ask_change_mass_update USING lv_strig0 lv_strig lv_new_state row.
        ENDIF.
      WHEN OTHERS.
        IF ls_w_zgraymarketinfo-activationstate EQ 'STARTED'.
          lv_strig = ls_w_zgraymarketinfo-channelname && ' ?'.
          lv_strig0 = 'Do you want to "' && lv_stop &&  '" the channel:'.
          lv_new_state = lv_stop.
        ELSE.
          lv_strig = ls_w_zgraymarketinfo-channelname && ' ?'.
          lv_strig0 = 'Do you want to "' && lv_start &&  '" the channel:'.
          lv_new_state = lv_start.
        ENDIF.

        PERFORM ask_change USING lv_strig0 lv_strig lv_new_state row.
    ENDCASE.

    ls_is_stable-row = 'X'.
    ls_is_stable-col = 'X'.

    CALL METHOD gr_table->refresh
      EXPORTING
        s_stable = ls_is_stable.

  ENDMETHOD.                    "double_click
  METHOD on_user_command.

  ENDMETHOD.
ENDCLASS.                    "lcl_datos IMPLEMENTATION

PARAMETERS: host TYPE string. " e.g. DEFAULT 'SAPPOD'
PARAMETERS: port TYPE string. " e.g. DEFAULT '50000'
PARAMETERS: user TYPE string DEFAULT sy-uname.
PARAMETERS: password TYPE string LOWER CASE OBLIGATORY.

AT SELECTION-SCREEN OUTPUT.
  LOOP AT SCREEN.
    IF screen-name = 'PASSWORD'.
      screen-invisible = '1'.
      MODIFY SCREEN.
    ENDIF.
  ENDLOOP.
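  " Default host and port from the primary SLD destination when left empty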
  IF host IS INITIAL.
    CALL FUNCTION 'SLDAPI_GET_PRIMARY_DEST'
      IMPORTING
        dest                 = s_slddest
      EXCEPTIONS
        no_primary_available = 1.
    host = s_slddest-host.
    port = s_slddest-port.
  ENDIF.

START-OF-SELECTION.


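* Query the status of all channels via the Adapter Framework's
* ChannelAdminServlet (wildcards for party, service and channel)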
  v_url = 'http://' && host &&':' && port && '/AdapterFramework/ChannelAdminServlet?party=*&service=*&channel=*&action=STATUS'.
  PERFORM call_service.
  PERFORM move_data.


END-OF-SELECTION.

  IF te_t_zgraymarketinfo[] IS NOT INITIAL.
    PERFORM call_alv.
  ELSE.
    MESSAGE i398(00) WITH 'No communication channels found.'.
  ENDIF.

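*&---------------------------------------------------------------------*
*& Form CALL_SERVICE
*&---------------------------------------------------------------------*
*& Sends an HTTP POST to the URL in v_url and parses the XML response
*& into xml_table; stops with a message on HTTP or parsing errors.
*&---------------------------------------------------------------------*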
FORM call_service.

  CLEAR client.
  CALL METHOD cl_http_client=>create
    EXPORTING
      host               = host
      service            = port
    IMPORTING
      client             = client
    EXCEPTIONS
      argument_not_found = 1
      plugin_not_active  = 2
      internal_error     = 3
      OTHERS             = 4.

*set header fields
  CALL METHOD client->request->set_header_field
    EXPORTING
      name  = '~request_method'
      value = 'POST'.

  CALL METHOD client->request->set_header_field
    EXPORTING
      name  = 'Content-Type'
      value = 'application/xml'.
  ""*; charset=utf-8′ .

*Set request protocol
  CALL METHOD client->request->set_header_field
    EXPORTING
      name  = '~server_protocol'
      value = 'HTTP/1.0'.

*Update url
  CALL METHOD client->request->set_header_field
    EXPORTING
      name  = '~request_uri'
      value = v_url.

*Disable logon popup
  client->propertytype_logon_popup = client->co_disabled.
  CALL METHOD client->authenticate
    EXPORTING
      username = user
      password = password.


*Set the request URI from the target URL
  cl_http_utility=>set_request_uri( request = client->request  uri = v_url ).

*Send http request to server
  CALL METHOD client->send
    EXCEPTIONS
      http_communication_failure = 1
      http_invalid_state         = 2
      http_processing_failed     = 3
      OTHERS                     = 4.

  IF sy-subrc <> 0.
    CALL METHOD client->get_last_error
      IMPORTING
        code    = response_code
        message = response_text.

    MESSAGE i000(sr) WITH response_text.

    EXIT.
  ENDIF.

*Get http response from server
  CALL METHOD client->receive
    EXCEPTIONS
      http_communication_failure = 1
      http_invalid_state         = 2
      http_processing_failed     = 3
      OTHERS                     = 4.
  IF sy-subrc <> 0.

    CALL METHOD client->get_last_error
      IMPORTING
        code    = response_code
        message = response_text.

    status_code = client->response->get_header_field( '~status_code' ).
    status_reason = client->response->get_header_field( '~status_reason' ).
    CONCATENATE response_text '(' status_code status_reason ')'
    INTO status_reason SEPARATED BY space.

    MESSAGE i000(sr) WITH status_reason.

    EXIT.
  ENDIF.

  CLEAR: w_result.
  w_result = client->response->get_cdata( ).
  REFRESH result_tab.
  SPLIT w_result AT cl_abap_char_utilities=>newline INTO TABLE result_tab.

  CLEAR lv_file_string.
  CALL FUNCTION 'SCMS_STRING_TO_XSTRING'
    EXPORTING
      text   = w_result
    IMPORTING
      buffer = lv_file_string
    EXCEPTIONS
      failed = 1
      OTHERS = 2.

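* Parse the XML response into generic element/value pairs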
  CLEAR: xml_table[], te_return[].
  CALL FUNCTION 'SMUM_XML_PARSE'
    EXPORTING
      xml_input = lv_file_string
    TABLES
      xml_table = xml_table
      return    = te_return.

  IF te_return[] IS NOT INITIAL.
    READ TABLE te_return INTO ls_return WITH KEY type = 'E'.
    IF sy-subrc IS INITIAL.
      MESSAGE i398(00) WITH 'ERROR: check parameters or credentials.'.
      STOP.
    ENDIF.
  ENDIF.

ENDFORM.

*&---------------------------------------------------------------------*
*& Form MOVE_DATA
*&---------------------------------------------------------------------*
*& Maps the parsed XML element/value pairs into the channel work area,
*& sets the row colors by state and builds the final, sorted channel
*& list including the 'DOUBLE CLICK TO' action rows.
*&---------------------------------------------------------------------*
FORM move_data .

  LOOP AT xml_table INTO ls_xml_table.
    ASSIGN COMPONENT ls_xml_table-cname OF STRUCTURE ls_w_zgraymarketinfo TO FIELD-SYMBOL(<fs_val>).
    IF sy-subrc IS INITIAL AND <fs_val> IS ASSIGNED.
      <fs_val> = ls_xml_table-cvalue.
    ENDIF.
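    " A 'Control' element marks the end of one channel record: set the
    " state colors and append the completed row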
    IF ls_xml_table-cname EQ 'Control'.
      IF ls_w_zgraymarketinfo-activationstate EQ 'STARTED'.
        set_color 'ACTIVATIONSTATE'   5 0 0.
      ELSE.
        set_color 'ACTIVATIONSTATE'   6 0 0.
      ENDIF.
      IF ls_w_zgraymarketinfo-channelstate EQ 'OK'.
        set_color 'CHANNELSTATE'   5 0 0.
      ELSEIF ls_w_zgraymarketinfo-channelstate EQ 'ERROR'.
        set_color 'CHANNELSTATE'   6 0 0.
      ELSE.
        set_color 'CHANNELSTATE'   7 0 0.
      ENDIF.
      APPEND ls_w_zgraymarketinfo TO te_t_zgraymarketinfo2.
      CLEAR ls_w_zgraymarketinfo.
    ENDIF.
  ENDLOOP.

  PERFORM add_double_click_to.

  SORT te_t_zgraymarketinfo2 BY party service channelname.
  LOOP AT te_t_zgraymarketinfo2 INTO ls_w_zgraymarketinfo.
    APPEND ls_w_zgraymarketinfo TO te_t_zgraymarketinfo.
  ENDLOOP.
ENDFORM.
*&---------------------------------------------------------------------*
*& Form CALL_ALV
*&---------------------------------------------------------------------*
*& Displays the channel list in a SALV grid with colored state columns,
*& a list header and the double-click event handler.
*&---------------------------------------------------------------------*
FORM call_alv .

  TRY.
      CALL METHOD cl_salv_table=>factory
        IMPORTING
          r_salv_table = gr_table
        CHANGING
          t_table      = te_t_zgraymarketinfo[].
    CATCH cx_salv_msg .
  ENDTRY.

  functions = gr_table->get_functions( ).
  functions->set_all( abap_true ).

  g_dsp = gr_table->get_display_settings( ).
  g_dsp->set_striped_pattern( abap_true ).

  gr_columns = gr_table->get_columns( ).
  gr_columns->set_color_column( 'COLOR' ).

  titulo gr_columns column 'PARTY'           'PARTY' 'PARTY' 'PARTY' 20.
  titulo gr_columns column 'SERVICE'         'SERVICE' 'SERVICE' 'SERVICE' 30.
  titulo gr_columns column 'CHANNELNAME'     'CHANNEL' 'CHANNEL' 'CHANNEL' 40.
  titulo gr_columns column 'ADAPTERTYPE'     'TYPE' 'TYPE' 'TYPE' 10.
  titulo gr_columns column 'ACTIVATIONSTATE' 'START/STOP' 'STARTED/STOPPED' 'STARTED/STOPPED' 20.
  titulo gr_columns column 'CHANNELSTATE'    'OK/ERR/INA' 'OK/ERROR/INACTIVE' 'OK/ERROR/INACTIVE' 20.

  key-report = sy-repid.
  gr_layout = gr_table->get_layout( ).
  gr_layout->set_key( key ).

  gr_layout->set_save_restriction( if_salv_c_layout=>restrict_none ).


  CREATE OBJECT lo_header.
* Information in Bold
  lo_h_label = lo_header->create_label( row = 1 column = 1 ).
  lo_h_label->set_text('Communication Channel Tool V 1.0').

* Information in tabular format
  lo_h_flow = lo_header->create_flow( row = 2 column = 1 ).
  lo_h_flow->create_text( text = 'CC Total:' ).
  lo_h_flow = lo_header->create_flow( row = 2 column = 2 ).

  lv_lines = lines( te_t_zgraymarketinfo ).
  lo_h_flow->create_text( text = lv_lines ).

  lo_h_flow = lo_header->create_flow( row = 3 column = 1 ).
  lo_h_flow->create_text( text = 'Date of List Generation' ).
  lo_h_flow = lo_header->create_flow( row = 3 column = 2 ).
  lo_h_flow->create_text( text = sy-datum ).

* Set the top of list using the header for Online
  gr_table->set_top_of_list( lo_header ).
* Set the top of list using the header for Print
  gr_table->set_top_of_list_print( lo_header ).


  lr_events = gr_table->get_event( ).
  CREATE OBJECT gr_events.
**  double click
  SET HANDLER gr_events->double_click FOR lr_events.

  gr_table->display( ).
ENDFORM.
*&---------------------------------------------------------------------*
*& Form ASK_CHANGE
*&---------------------------------------------------------------------*
*& Asks for confirmation, then starts or stops a single channel via the
*& ChannelAdminServlet and refreshes the row state and color.
*&---------------------------------------------------------------------*
FORM ask_change USING lv_strig lv_strig2 lv_new_state row.
  DATA lv_response.
  CLEAR lv_response.
  CALL FUNCTION 'POPUP_TO_CONFIRM_STEP'
    EXPORTING
      defaultoption = 'N'
      textline1     = lv_strig
      textline2     = lv_strig2
      titel         = 'Update Information?'
      start_column  = 35
      start_row     = 6
    IMPORTING
      answer        = lv_response.

  IF lv_response = 'J'. "Yes
    v_url = 'http://' && host &&':' && port && '/AdapterFramework/ChannelAdminServlet?party=' && ls_w_zgraymarketinfo-party
                      && '&service=' && ls_w_zgraymarketinfo-service && '&channel=' && ls_w_zgraymarketinfo-channelname && '&action=' && lv_new_state.
    PERFORM call_service.
    READ TABLE xml_table INTO ls_xml_table WITH KEY cname = 'ActivationState'.
    IF lv_new_state EQ 'STOP'.
      IF ls_xml_table-cvalue EQ 'STOPPED'.
        MESSAGE i398(00) WITH 'Channel:' ls_w_zgraymarketinfo-channelname 'was updated successfully.'.
      ELSE.
        MESSAGE i398(00) WITH 'Channel:' ls_w_zgraymarketinfo-channelname 'was not updated.'.
      ENDIF.
    ELSE.
      IF ls_xml_table-cvalue EQ 'STARTED'.
        MESSAGE i398(00) WITH 'Channel:' ls_w_zgraymarketinfo-channelname 'was updated successfully.'.
      ELSE.
        MESSAGE i398(00) WITH 'Channel:' ls_w_zgraymarketinfo-channelname 'was not updated.'.
      ENDIF.
    ENDIF.

    ls_w_zgraymarketinfo-activationstate = ls_xml_table-cvalue.
    DELETE ls_w_zgraymarketinfo-color WHERE fname EQ 'ACTIVATIONSTATE'.
    IF ls_w_zgraymarketinfo-activationstate EQ 'STARTED'.
      set_color 'ACTIVATIONSTATE'   5 0 0.
    ELSE.
      set_color 'ACTIVATIONSTATE'   6 0 0.
    ENDIF.
    MODIFY te_t_zgraymarketinfo FROM ls_w_zgraymarketinfo INDEX row.
  ENDIF.


ENDFORM.

*&---------------------------------------------------------------------*
*& Form ASK_CHANGE_MASS
*&---------------------------------------------------------------------*
*& Asks for confirmation, then starts or stops ALL channels and reports
*& how many were updated successfully and how many failed.
*&---------------------------------------------------------------------*
FORM ask_change_mass USING lv_strig lv_strig2 lv_new_state row.
  DATA lv_response.
  DATA lv_updated TYPE i.
  DATA lv_error TYPE i.
  CLEAR: lv_response, lv_updated, lv_error.
  CALL FUNCTION 'POPUP_TO_CONFIRM_STEP'
    EXPORTING
      defaultoption = 'N'
      textline1     = lv_strig
      textline2     = lv_strig2
      titel         = 'Update Information?'
      start_column  = 35
      start_row     = 6
    IMPORTING
      answer        = lv_response.

  IF lv_response = 'J'. "Yes
    LOOP AT te_t_zgraymarketinfo INTO ls_w_zgraymarketinfo WHERE party NE 'DOUBLE CLICK TO'.
      row = sy-tabix.
      v_url = 'http://' && host &&':' && port && '/AdapterFramework/ChannelAdminServlet?party=' && ls_w_zgraymarketinfo-party
                        && '&service=' && ls_w_zgraymarketinfo-service && '&channel=' && ls_w_zgraymarketinfo-channelname && '&action=' && lv_new_state.
      PERFORM call_service.
      READ TABLE xml_table INTO ls_xml_table WITH KEY cname = 'ActivationState'.
      IF lv_new_state EQ 'STOP'.
        IF ls_xml_table-cvalue EQ 'STOPPED'.
          ADD 1 TO lv_updated.
        ELSE.
          ADD 1 TO lv_error.
        ENDIF.
      ELSE.
        IF ls_xml_table-cvalue EQ 'STARTED'.
          ADD 1 TO lv_updated.
        ELSE.
          ADD 1 TO lv_error.
        ENDIF.
      ENDIF.

      ls_w_zgraymarketinfo-activationstate = ls_xml_table-cvalue.
      DELETE ls_w_zgraymarketinfo-color WHERE fname EQ 'ACTIVATIONSTATE'.
      IF ls_w_zgraymarketinfo-activationstate EQ 'STARTED'.
        set_color 'ACTIVATIONSTATE'   5 0 0.
      ELSE.
        set_color 'ACTIVATIONSTATE'   6 0 0.
      ENDIF.
      MODIFY te_t_zgraymarketinfo FROM ls_w_zgraymarketinfo INDEX row.
    ENDLOOP.
    MESSAGE i398(00) WITH lv_updated 'channel(s) updated successfully,' lv_error 'finished with errors.'.
  ENDIF.


ENDFORM.

*&---------------------------------------------------------------------*
*& Form ASK_CHANGE_MASS_UPDATE
*&---------------------------------------------------------------------*
*& Restores the channel states from an uploaded backup file by calling
*& the servlet once per channel with the saved activation state.
*&---------------------------------------------------------------------*
FORM ask_change_mass_update USING lv_strig lv_strig2 lv_new_state row.
  DATA lv_response.
  DATA lv_updated TYPE i.
  DATA lv_error TYPE i.
  CLEAR lv_response.
  CALL FUNCTION 'POPUP_TO_CONFIRM_STEP'
    EXPORTING
      defaultoption = 'N'
      textline1     = lv_strig
      textline2     = lv_strig2
      titel         = 'Update Information?'
      start_column  = 35
      start_row     = 6
    IMPORTING
      answer        = lv_response.

  IF lv_response = 'J'. "Yes
    CLEAR: te_t_zgraymarketinfo[], lv_updated, lv_error.
    DELETE te_t_zgraymarketinfos  WHERE party EQ 'DOUBLE CLICK TO'.
    PERFORM add_double_click_to.
    LOOP AT te_t_zgraymarketinfos INTO ls_w_zgraymarketinfos.
      CLEAR ls_w_zgraymarketinfo.
      MOVE-CORRESPONDING ls_w_zgraymarketinfos TO ls_w_zgraymarketinfo.
      IF ls_w_zgraymarketinfo-activationstate EQ 'STARTED'.
        lv_new_state = lv_start.
      ELSE.
        lv_new_state = lv_stop.
      ENDIF.

      v_url = 'http://' && host &&':' && port && '/AdapterFramework/ChannelAdminServlet?party=' && ls_w_zgraymarketinfo-party
                        && '&service=' && ls_w_zgraymarketinfo-service && '&channel=' && ls_w_zgraymarketinfo-channelname && '&action=' && lv_new_state.
      PERFORM call_service.
      READ TABLE xml_table INTO ls_xml_table WITH KEY cname = 'ActivationState'.
      IF lv_new_state EQ 'STOP'.
        IF ls_xml_table-cvalue EQ 'STOPPED'.
          ADD 1 TO lv_updated.
        ELSE.
          ADD 1 TO lv_error.
        ENDIF.
      ELSE.
        IF ls_xml_table-cvalue EQ 'STARTED'.
          ADD 1 TO lv_updated.
        ELSE.
          ADD 1 TO lv_error.
        ENDIF.
      ENDIF.

      ls_w_zgraymarketinfo-activationstate = ls_xml_table-cvalue.
      DELETE ls_w_zgraymarketinfo-color WHERE fname EQ 'ACTIVATIONSTATE'.
      IF ls_w_zgraymarketinfo-activationstate EQ 'STARTED'.
        set_color 'ACTIVATIONSTATE'   5 0 0.
      ELSE.
        set_color 'ACTIVATIONSTATE'   6 0 0.
      ENDIF.

      APPEND ls_w_zgraymarketinfo TO te_t_zgraymarketinfo.
    ENDLOOP.
    MESSAGE i398(00) WITH lv_updated 'channel(s) updated successfully,' lv_error 'finished with errors.'.
  ENDIF.


ENDFORM.
*&---------------------------------------------------------------------*
*& Form ADD_DOUBLE_CLICK_TO
*&---------------------------------------------------------------------*
*& Prepends four colored pseudo rows (START / STOP / DOWNLOAD / UPLOAD
*& all channels) that act as buttons via double-click.
*&---------------------------------------------------------------------*
FORM add_double_click_to .
  CLEAR ls_w_zgraymarketinfo.
  set_color 'PARTY'           5 0 0.
  set_color 'SERVICE'         5 0 0.
  set_color 'CHANNELNAME'     5 0 0.
  set_color 'ADAPTERTYPE'     5 0 0.
  set_color 'ACTIVATIONSTATE' 5 0 0.
  set_color 'CHANNELSTATE'    5 0 0.
  ls_w_zgraymarketinfo-party = 'DOUBLE CLICK TO'.
  ls_w_zgraymarketinfo-service = lv_start.
  ls_w_zgraymarketinfo-channelname = 'ALL CHANNELS'.
  APPEND ls_w_zgraymarketinfo TO te_t_zgraymarketinfo.

  CLEAR ls_w_zgraymarketinfo.
  set_color 'PARTY'           6 0 0.
  set_color 'SERVICE'         6 0 0.
  set_color 'CHANNELNAME'     6 0 0.
  set_color 'ADAPTERTYPE'     6 0 0.
  set_color 'ACTIVATIONSTATE' 6 0 0.
  set_color 'CHANNELSTATE'    6 0 0.
  ls_w_zgraymarketinfo-party = 'DOUBLE CLICK TO'.
  ls_w_zgraymarketinfo-service = lv_stop.
  ls_w_zgraymarketinfo-channelname = 'ALL CHANNELS'.
  APPEND ls_w_zgraymarketinfo TO te_t_zgraymarketinfo.

  CLEAR ls_w_zgraymarketinfo.
  set_color 'PARTY'           4 1 0.
  set_color 'SERVICE'         4 1 0.
  set_color 'CHANNELNAME'     4 1 0.
  set_color 'ADAPTERTYPE'     4 1 0.
  set_color 'ACTIVATIONSTATE' 4 1 0.
  set_color 'CHANNELSTATE'    4 1 0.
  ls_w_zgraymarketinfo-party = 'DOUBLE CLICK TO'.
  ls_w_zgraymarketinfo-service = lv_download.
  ls_w_zgraymarketinfo-channelname = 'ALL CHANNELS'.
  APPEND ls_w_zgraymarketinfo TO te_t_zgraymarketinfo.

  CLEAR ls_w_zgraymarketinfo.
  set_color 'PARTY'           3 1 0.
  set_color 'SERVICE'         3 1 0.
  set_color 'CHANNELNAME'     3 1 0.
  set_color 'ADAPTERTYPE'     3 1 0.
  set_color 'ACTIVATIONSTATE' 3 1 0.
  set_color 'CHANNELSTATE'    3 1 0.
  ls_w_zgraymarketinfo-party = 'DOUBLE CLICK TO'.
  ls_w_zgraymarketinfo-service = lv_upload.
  ls_w_zgraymarketinfo-channelname = 'ALL CHANNELS'.
  APPEND ls_w_zgraymarketinfo TO te_t_zgraymarketinfo.
ENDFORM.

3. You can now run the program.

The program proposes the SAP PO host name, the port and the user automatically (host and port come from the SLD configuration, the user from your logon); you only have to enter your own password.

If everything is OK, you will see the list with the complete details of the communication channels.

You can see the total number of communication channels (CC Total) in SAP PO and their status.

  1. If you double-click on a single channel, the system asks you to approve or reject the change of its status (the underlying servlet call is sketched right after this list).
  2. If you double-click on the first green line, the system asks you to approve or reject starting all communication channels.
  3. If you double-click on the second red line, the system asks you to approve or reject stopping all communication channels.
  4. If you double-click on the third blue line, the current status of all channels is downloaded, so you can save it as a backup and restore this same status in the future.
  5. If you double-click on the yellow line, you upload a previously downloaded backup and the saved statuses are applied again.
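
For reference, every one of these actions boils down to a single authenticated HTTP call to the ChannelAdminServlet, exactly as built in the CALL_SERVICE form above. The standalone report below is a minimal sketch of that call, not part of the tool itself: the report name and the MY_SERVICE / MY_CHANNEL identifiers are placeholders you must replace, and it assumes, like the tool, that the servlet accepts basic authentication.

REPORT zcc_servlet_demo.

PARAMETERS: p_host TYPE string OBLIGATORY,
            p_port TYPE string OBLIGATORY,
            p_user TYPE string DEFAULT sy-uname,
            p_pass TYPE string LOWER CASE OBLIGATORY.

DATA: lv_url    TYPE string,
      lv_answer TYPE string,
      lo_client TYPE REF TO if_http_client.

START-OF-SELECTION.
* Stop one channel; MY_SERVICE / MY_CHANNEL are placeholders
  lv_url = 'http://' && p_host && ':' && p_port &&
           '/AdapterFramework/ChannelAdminServlet' &&
           '?party=*&service=MY_SERVICE&channel=MY_CHANNEL&action=STOP'.

  CALL METHOD cl_http_client=>create_by_url
    EXPORTING
      url    = lv_url
    IMPORTING
      client = lo_client
    EXCEPTIONS
      OTHERS = 1.
  CHECK sy-subrc IS INITIAL.

* Same authentication approach as the tool: no logon popup, basic auth
  lo_client->propertytype_logon_popup = lo_client->co_disabled.
  CALL METHOD lo_client->authenticate
    EXPORTING
      username = p_user
      password = p_pass.

  CALL METHOD lo_client->send
    EXCEPTIONS
      OTHERS = 1.
  CHECK sy-subrc IS INITIAL.

  CALL METHOD lo_client->receive
    EXCEPTIONS
      OTHERS = 1.
  CHECK sy-subrc IS INITIAL.

* The XML answer contains the new ActivationState of the channel
  lv_answer = lo_client->response->get_cdata( ).
  WRITE / lv_answer.

With action=STATUS instead of STOP, the same URL can usually also be opened directly in a browser, which is an easy way to verify connectivity and credentials before running the tool.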

Prerequisites

The communication channel must have its control data set to "External" in the communication channel monitor:

Configuration and Monitoring Home / Monitoring / Communication Channel Monitor


The post SAP PI/PO Communication Channel Tool appeared first on ERP Q&A.
