SAP Integration Suite Archives - ERP Q&A
https://www.erpqna.com/category/sap-integration-suite/ (last updated Mon, 10 Mar 2025)

Converting IDoc to a Fixed-Length Flat File Using XSLT Mapping in SAP CPI

The post Converting IDoc to a Fixed-Length Flat File Using XSLT Mapping in SAP CPI appeared first on ERP Q&A.

Step-by-Step Process

1. Receiving IDoc via IDoc Sender Adapter

The first step is to receive the IDoc in SAP CPI using the IDoc Sender Adapter.

    Key Configuration Steps:

    • Use the IDoc Sender Adapter to connect SAP CPI with SAP S/4HANA or ECC.
    • Ensure the correct IDoc type (e.g., DELVRY03, ORDERS05) is selected.
    • Establish connectivity using the appropriate authentication method.

    2. Mapping IDoc Data to a Fixed-Length Structure

    Before applying XSLT, ensure the IDoc fields are mapped correctly to match the flat file format.

      Important Considerations for Fixed-Length Files:

      • Each field in the flat file must have a specific length (e.g., 10, 20, 30 characters).
      • Truncate or pad fields with spaces if they do not meet the required length.
      • Ensure the sequence of fields is aligned with the file structure.
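The truncate-or-pad rule above can be sketched outside CPI like this (an illustrative Python sketch; the field names and widths are assumptions, not taken from the IDoc):

```python
def to_fixed_width(value: str, length: int) -> str:
    """Truncate, or right-pad with spaces, so the field is exactly `length` characters."""
    return value[:length].ljust(length)

# Hypothetical record layout: material (10), description (20), quantity (8)
record = (
    to_fixed_width("MAT-001", 10)
    + to_fixed_width("Steel bolt M8", 20)
    + to_fixed_width("150", 8)
)
```

Every record then has a constant length, which is what fixed-length parsers on the receiving side rely on.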

      3. Applying XSLT Mapping for Fixed-Length Output

      To ensure the flat file follows a fixed-length format, we use XSLT mapping. The following XSLT code helps achieve this by:

      ✔ Removing XML tags while preserving field values.
      ✔ Padding fields with spaces to match the required length.
      ✔ Adding new lines to ensure proper record structure.

        XSLT Code for Fixed-Length Formatting

        <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
          <xsl:output method="text" indent="no"/>

          <!-- Remove XML tags and keep only text content -->
          <xsl:template match="*">
            <xsl:apply-templates select="node()"/>
          </xsl:template>

          <!-- Preserve text nodes -->
          <xsl:template match="text()">
            <xsl:value-of select="."/>
          </xsl:template>

          <!-- Helper: truncate, or right-pad with spaces, to a fixed length.
               Call it from field-level templates wherever a fixed width is required. -->
          <xsl:template name="pad">
            <xsl:param name="value"/>
            <xsl:param name="length"/>
            <xsl:value-of select="substring(concat($value, '                                        '), 1, $length)"/>
          </xsl:template>

          <!-- Add line breaks after each segment -->
          <xsl:template match="Header | SubHeader | HeaderText | LineData | LineText">
            <xsl:apply-templates/>
            <xsl:text>&#xA;</xsl:text>  <!-- New line character -->
          </xsl:template>

          <!-- Root template -->
          <xsl:template match="/">
            <xsl:apply-templates/>
          </xsl:template>
        </xsl:stylesheet>

        How XSLT Helps:

        ✔ Removes XML tags while preserving text content.
        ✔ Adds line breaks to format the flat file correctly.
        ✔ Ensures proper structuring for downstream processing.

        4. Sending the Flat File to an SFTP Server

        After the transformation, the flat file needs to be sent to an SFTP server.

          Configuring the SFTP Receiver Adapter:

          • Set up an SFTP Receiver Adapter in SAP CPI.
          • Provide the correct host, port, and authentication details.
          • Define the file naming convention to store the output properly.
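As an illustration of the last point, a timestamped naming convention avoids overwrites on the SFTP server (Python sketch; the prefix and pattern are hypothetical — in CPI you would express this in the adapter's file name field):

```python
from datetime import datetime, timezone

def output_filename(idoc_number: str) -> str:
    # Hypothetical convention: fixed prefix + IDoc number + UTC timestamp.
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
    return f"IDOC_{idoc_number}_{stamp}.txt"
```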

          Conclusion

          By leveraging SAP CPI, we can efficiently convert IDocs into flat file formats using XSLT mapping and an SFTP adapter. This approach eliminates manual intervention and ensures seamless data transmission between systems.

          Key Takeaways:

          • IDoc Sender Adapter captures IDocs from SAP S/4HANA or ECC.
          • Message Mapping helps format data as per flat file structure.
          • XSLT Mapping removes XML tags and applies necessary formatting.
          • SFTP Adapter delivers the final flat file to the target system.

          With this approach, businesses can streamline IDoc-to-flat file conversions and enhance their integration capabilities within SAP CPI.


           SAP S/4HANA direct connectivity with Event Mesh in Integration Suite (Tue, 30 Jul 2024)
          This post covers the details of how you can configure this integration.

          Prerequisites:

          You have SAP Integration Suite, and you have activated Event Mesh as a capability.

          Creation of a Queue on EMIS

          Creation of a topic subscription

          Next, create a topic subscription.

          For S4 integration, it is important to note that the topic name should correspond to the topic space defined in S4. Kindly note the asterisk (*) in the topic name which acts as a wildcard for the topic hierarchy.
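The wildcard behaviour described above can be illustrated with a toy matcher (Python sketch; real Event Mesh wildcard semantics are richer — this only models a trailing `*` covering the rest of the hierarchy):

```python
def topic_matches(subscription: str, topic: str) -> bool:
    """Toy matcher: a trailing '*' accepts any remainder of the topic hierarchy."""
    if subscription.endswith("*"):
        return topic.startswith(subscription[:-1])
    return subscription == topic

# A subscription like "sap/S4/*" then accepts every event under that topic space.
```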

          Creation of a service key for EM messaging client

          The service key is needed to get the credentials for S4 to connect to EMIS.

           Kindly create the service key after successful creation of the service instance.

          Creation of SM59 destination on S/4

          S/4HANA needs details such as hostname and port to connect to EMIS. These details are specified in the sm59 destination.

          Kindly execute the following transaction:

          /nsm59

          Choose ‘Create’

          Provide a destination name and choose connection type as ‘HTTP connection to external server’

          Save.

          Now, refer to the service key created using SAP Integration Suite, Event Mesh – ‘message client’ instance.

          You need to refer to this service key to provide the details in the sm59 destination.

          The hostname can be obtained from the ‘uri’ section under ‘messaging’

           The port should be '443' and the path prefix '/protocols/amqp10ws' for AMQP (the first release of EMIS supports only the AMQP protocol for messaging).

          In the logon and security tab, set ‘SSL’ to ‘Active’ as shown below –

          Save.
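Pulling the SM59 connection details out of the service key can be sketched like this (Python; the service-key JSON shape shown is a simplified assumption based on the 'uri' under 'messaging' described above):

```python
import json
from urllib.parse import urlparse

# Assumed, simplified shape of the Event Mesh service key
service_key = json.loads("""
{
  "messaging": [
    { "protocol": ["amqp10ws"],
      "uri": "wss://emis-host.example.hana.ondemand.com/protocols/amqp10ws" }
  ]
}
""")

entry = service_key["messaging"][0]
parsed = urlparse(entry["uri"])
host = parsed.hostname          # goes into the SM59 target host
port = parsed.port or 443       # SM59 service number
path_prefix = parsed.path       # '/protocols/amqp10ws'
```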

           Creation of OAuth client for S4

           S/4HANA authenticates with EMIS using OAuth 2.0.

          You need to execute the below transaction:

          /nOA2C_CONFIG

          Choose ‘Create’ and provide the necessary details.

           The OAuth 2.0 Client Profile as shown above needs to be 'IWXBE/MGW_MQTT', even for AMQP-based messaging.

           You can provide any configuration name; the OAuth 2.0 client ID can be found in the service key created above, under the 'oa2' section in 'messaging'.

          Choose OK.

          Provide the client secret, authorization and token endpoint from the service key.

           The authorization endpoint is not directly provided in the service key, but it is the same as the token endpoint, except that it ends with '/authorize', as shown below.

          The Access Settings are mentioned below:

          Client Authentication: Form Fields

          Resource Access Authentication: Header Field

          Selected Grant Type: Client Credentials
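The 'Form Fields' client authentication setting means the client credentials travel in the POST body of the token request. A sketch of what such a request body contains (Python; the endpoint and credential values are placeholders):

```python
from urllib.parse import urlencode

def build_token_request_body(client_id: str, client_secret: str) -> str:
    """Client Credentials grant with client authentication as form fields."""
    return urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    })

body = build_token_request_body("sb-demo-client", "demo-secret")
# POST this to the token endpoint from the service key with
# Content-Type: application/x-www-form-urlencoded
```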

          Creation of a channel using form fields on S/4

          Kindly execute the below transaction to configure the channel:

          /n/IWXBE/CONFIG

          Create a new channel (without service key).

          At this point, it is not possible to configure this using the ‘Create Channel via Service Key’ option.

          This is currently a gap and will be addressed in an upcoming release.

          It is important to note that the above gap is only applicable for Event Mesh in Integration Suite (EMIS).

          For Event Mesh (default), the recommended way is to use ‘Create Channel via Service Key’ option on S4.

          Choose ‘Default’ or ‘SAP Event Mesh’.

          Now the following details need to be provided –

          You can provide any name to the channel.

          The destination needs to be the name of the sm59 destination as created above.

          The topic space needs to correspond to the topic subscription on the Queue.

          The Oauth 2.0 Configuration needs to be the name of the Oauth2.0 client configuration as created above.

          Save.

          Activation of Channel & Connectivity Check

          Click on ‘Activate’ followed by ‘Check Connection’ to ensure that connection to Event Mesh is successful.

          Sending a Test Event using Event monitor

          Kindly execute this transaction,

          /n/IWXBE/EEE_SUPPORT

          Choose ‘Produce Test Events’ followed by the channel.

          Execute the transaction.

           You can now navigate back to Event Mesh in Integration Suite, head over to the Queues section, and find the message queued (unless there is an active consumer)!

          Congratulations – You have succeeded in integrating SAP S/4HANA with Event Mesh in Integration Suite!

           SAP Integration Suite – Generative AI based Integration Flow Generation (Sat, 27 Jul 2024)
          Introduction

           SAP Cloud Integration version 6.54.**, one of the capabilities of SAP Integration Suite, comes with an enhancement to its Generative AI (GenAI) features: GenAI based Integration Flow Generation. This feature will be available only in SAP Integration Suite Cloud Integration Premium Editions, in specific regions/data centres: JP10, EU10, AP10, US10, EU10-003, EU10-002, and US10-002. The rollout is forecast for calendar weeks 29-31 of 2024 (tentative timelines, subject to change per the phased rollout plan).

          Enabling GenAI Based Integration Flow Generation feature

           As tenant administrator of your premium edition tenant, you need to enable this feature on the Settings page of SAP Integration Suite. To enable it, you must agree to the relevant terms and conditions.

          Below are the sample screenshots of the same

          Settings Tab for Generative AI
          Click on Edit
          Switch ON the flag
          Terms and Conditions

           Clicking on the Save button will display the user who accepted the terms and conditions. Click on Save, and then click on the Cancel button to leave edit mode.

          Generating Integration Flows

          Once this feature is switched ON by the tenant administrator, you – as integration flow developer persona – can use this feature to generate integration flows.

           The GenAI based integration flow generation makes use of the Unified Cloud Landscape (UCL) concept. For that, you need to configure your account's System Landscape configuration with an SAP Integration Suite formation and add the required systems to that formation.

          Note: SAP Integration Suite formation/system type is getting rolled out to relevant data centres. This blog note will be updated once rollout is complete.

           If you have a successful SAP Integration Suite formation created with some systems (e.g. S4), the GenAI based integration flow generation feature will browse the systems and APIs and list them in the Sender and Receiver sections of the GenAI based integration flow creation UI dialog, as shown in the screenshot below.

           Below are sample screenshots of the System Landscape configuration, with systems of system type SAP Integration Suite (the sub-account where Integration Suite is subscribed) and formations of type Integration with the SAP Integration Suite formation type, along with the other systems participating in the integration.

          Some sample systems added

          SAP Integration Suite sample formation

           If you have not enabled the System Landscape configuration as above, the system and API listings will not appear for the Sender and Receiver systems, and these fields will be empty in the GenAI based integration flow UI dialog.

           Now, when you click on the Add -> Integration Flow menu in your package view, you will be given the option of generating the integration flow with assistance from AI or creating it manually. Below are the screenshots of generating integration flows using generative AI.

          Click on Package Edit

          Edit the package

          Click on Add -> Integration Flow menu

          Select Integration Flow

          Choose GenAI based Integration flow generation option
          Provide your scenario description and click on Send button
          Observe the AI response and correct the description in case of suggestions
           After correcting the description, click on the Send button again. If the scenario description is correct, AI will list the sender and receiver systems from the configured System Landscape information
          Clicking on Select button will list other systems from System Landscape configuration
          AI will suggest integration scenario name, you can change it
          Click on Generate button to generate integration flow

           Upon clicking the Generate button, an integration flow will be generated as shown below.

          Generated Integration Flow template
          Observe the Timer flow step generated as per the schedule in scenario description in GenAI dialog
          Sender system address configuration has been pre-filled as per the system discovery from System Landscape (UCL) information along with Receiver system

           The generated integration flow acts as a template; you need to further configure it and/or extend it with additional integration steps to match your end-to-end integration scenario requirement.

           Note: As mentioned above, in case of issues in UCL, the system and API listings will not appear for the Sender and Receiver systems, and these fields will be empty in the GenAI based integration flow UI dialog, as shown in the sample screenshot below. You can still continue with the integration flow generation.

          GenAI Integration Flow generation dialog when UCL systems & API discovery fails, sender and receiver fields will be empty.

           This feature is currently provided on a free and fair usage basis. You will therefore be allowed to generate a limited number of integration flows per tenant per calendar month for the next 6 months. The exact number of integration flows that can be generated with GenAI is difficult to state, because it depends on the scenario description you provide, the responses from the GenAI backend, and the back-and-forth communication, all of which consume GenAI transactions. If you exhaust these transactions, you will see the information message below.

           GenAI Transaction limits exceeded

          Summary

           The SAP Integration Suite – Cloud Integration GenAI based integration flow generation feature will help you bootstrap and accelerate integration development.

           As a first step towards generative AI in integration, we have introduced this feature, which is currently able to interpret the scenario description and generate an integration flow with sender and receiver systems only. Going forward, we will enhance this offering to include mediation steps (e.g. converters, mappings, etc.) based on the integration scenario description, to generate a more enriched integration flow.

           Create DataType and Message Type artifact in Cloud Integration capability of SAP Integration Suite (Wed, 10 Jul 2024)
          Introduction

           SAP Cloud Integration version 6.54.xx comes with a new feature where one can create Data Type and Message Type reusable design-time artifacts in the Cloud Integration capability of SAP Integration Suite.

          This feature is available only in SAP Integration Suite standard and above service plans.

           The SAP Cloud Integration version 6.54.xx software update is planned for mid-July 2024 (date and time subject to change).

          Create DataType:

           1. Open the Integration Suite tenant and navigate to Design –> Integrations and APIs

          2. Create an Integration Package or open an existing one.

          3. Navigate to the Artifacts tab and click on Edit in the top right corner

            4. Click on Add drop down and select Data Type from the list

            5. Add Data Type dialog is displayed with Create (radio button) selected by default.

             6. Enter the values for the fields Name, ID, Target Namespace, and Description, select the category – Simple Type (selected by default) or Complex Type – for the data type you want to create, and click on Add or Add and Open in Editor

            7. On Click of Add, the Data Type artifact with the provided name gets created and is listed in the Artifacts list page

            8. On Click of Add and Open in Editor, the Data Type artifact gets created with the provided name and the artifact gets opened in the Editor in display mode.

             9. The Editor contains three tabs: Overview, Structure, and XSD. The Structure tab is shown by default when the artifact is opened. It displays the structure of the data type in a tree table with the following columns:

               • Name: Contains the name of the node (element or attribute). For the root node, the name is the same as the name of the data type and cannot be edited.
               • Category: Shows whether the root element has subnodes. For the root node it is either Simple Type or Complex Type; for subnodes it is either Element or Attribute. You cannot change values in this column.
               • Type: Displays the type with which the node is defined. Here you select a built-in data type, or a reference to an existing data type, for an element or attribute. You must specify a type for attributes.
               • Occurrence: Determines how often elements occur. For attributes, you can determine whether the attribute is optional or required.
               • Restrictions: Displays the facets (if any) defined in case the node is defined by a built-in primitive type or a user-defined Simple Type data type.

               10. Switch to edit mode to define/build the structure of the data type. On selecting the first row (root node), the Add drop-down in the table header is enabled, and the details of the row are displayed in the right-hand section of the editor.

               11. Simple Type Data Type:

               • No child nodes can be added.
               • The root node is defined by the string built-in primitive data type.
               • Click on the root node; the Properties sheet containing the details of the selected node is displayed on the right side of the editor. In edit mode, you can edit the Type and define the restrictions applicable to the selected Type.

               12. Complex Type Data Type:

               To add child nodes:

               • Click on the root node; the Add drop-down in the table header is enabled.

               Add –> Element to add a child element node

               Add –> Attribute to add an attribute node

               Add –> Rows to add multiple elements/attributes

               • Click on the newly added node and define the details in the Properties sheet.

               13. Once the structure is defined, click on Save to save the artifact as a draft, or Save as Version to save it as a versioned artifact.

               14. The XSD tab displays a read-only view of the XSD schema of the Data Type artifact
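For illustration, the XSD rendered for a small complex type might look like the following (the names, facet, and namespace are assumptions for this sketch, not values from the article):

```xml
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/demo">
  <!-- Complex type with one length-restricted element and one required attribute -->
  <xs:complexType name="DT_Address">
    <xs:sequence>
      <xs:element name="City">
        <xs:simpleType>
          <xs:restriction base="xs:string">
            <xs:maxLength value="30"/>
          </xs:restriction>
        </xs:simpleType>
      </xs:element>
    </xs:sequence>
    <xs:attribute name="countryCode" type="xs:string" use="required"/>
  </xs:complexType>
</xs:schema>
```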

                Create MessageType:

                 1. Open the Integration Suite tenant and navigate to Design –> Integrations and APIs

                2. Create an Integration Package or open an existing one.

                3. Navigate to the Artifacts tab and click on Edit in the top right corner

                4. Click on Add drop down and select Message Type from the list

                  5. Add Message Type dialog is opened

                   6. Enter the values for the fields Name, ID, XML Namespace, Data Type to be Used, and Description, and click on Add or Add and Open in Editor

                  7. On Click of Add, Message Type artifact gets created and is listed in the Artifacts list page

                     8. On Click of Add and Open in Editor, the Message Type artifact gets created and opened in the Data Type Editor, with the Structure tab loaded by default in non-edit mode. The root node name is the same as the Message Type name, the Category is Element, and the Type is the Data Type used (if selected in the Add Message Type dialog)

                    9. Overview tab in Edit mode is as shown below :

                    10. XSD tab

                     11. The data type used to create a Message Type can be changed in the Overview tab or in the Structure tab. Switch to edit mode and select the root node in the Structure tab. The Properties sheet is displayed on the right side of the page, with the Datatype Used field editable.

                     12. No other nodes (child nodes) are editable in the Message Type artifact.

                       Integration of SAP S/4HANA with SAP Integration Suite, advanced event mesh (AEM) using AIF (Mon, 08 Jul 2024)
                      How To Integrate SAP S/4HANA BOR Framework with SAP Integration Suite, advanced event mesh (AEM) using SAP Application Interface Framework (AIF)

                      Events are becoming increasingly popular for integration. Consequently, I’ve started to connect an SAP S/4HANA system and an SAP ERP system with SAP Integration Suite, advanced event mesh (AEM) using SAP Application Interface Framework (AIF).

                       I chose AIF because of its myriad capabilities, such as an integrated runtime and monitoring. Other significant advantages are its availability as an add-on in SAP ERP and as an inherent part of the SAP S/4HANA foundation.

                       As an example, I chose the change event of the business partner within an SAP ERP 731 system.

                      The following result is sent to AEM:

                       {
                           "specversion": "1.0",
                           "type": "aif.businesspartner.change",
                           "source": "ZUX/200",
                           "datacontenttype": "application/json",
                           "id": "sOcLMMBC7k{BmRa}xTN6e0",
                           "time": "2024-06-19T07:30:24Z",
                           "data": {
                               "BusinessPartner": "A10",
                               "LastName": "Sisko",
                               "FirstName": "Benjamin",
                               "Country": "US"
                           }
                       }
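The envelope above follows the CloudEvents 1.0 attribute set; assembling it can be sketched as follows (Python; the helper name and sample values are illustrative, not part of the AIF implementation):

```python
import json
from datetime import datetime, timezone

def build_cloud_event(event_type: str, source: str, event_id: str, data: dict) -> str:
    """Assemble a CloudEvents 1.0 JSON envelope like the one sent to AEM."""
    return json.dumps({
        "specversion": "1.0",
        "type": event_type,
        "source": source,
        "datacontenttype": "application/json",
        "id": event_id,
        "time": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
        "data": data,
    })

event = build_cloud_event(
    "aif.businesspartner.change", "ZUX/200", "sOcLMMBC7k",
    {"BusinessPartner": "A10", "LastName": "Sisko"},
)
```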

                      Process shown as diagram:

                      Step 1: Enhancing the BOR framework

                      First, I linked an AIF-specific function module to the business partner change event in transaction Maintain Event Type Linkages (SWE2). This function module hands over the event data to the AIF XML runtime in a specific AIF/BOR event structure that reflects the import parameter of the receiver function module.

                       FUNCTION zaif_event_integration.
                       *"--------------------------------------------------------------------
                       *"*"Local Interface:
                       *"  IMPORTING
                       *"     VALUE(OBJTYPE) LIKE  SWETYPECOU-OBJTYPE
                       *"     VALUE(OBJKEY) LIKE  SWEINSTCOU-OBJKEY
                       *"     VALUE(EVENT) LIKE  SWEINSTCOU-EVENT
                       *"     VALUE(RECTYPE) LIKE  SWETYPECOU-RECTYPE
                       *"  TABLES
                       *"      EVENT_CONTAINER STRUCTURE  SWCONT
                       *"--------------------------------------------------------------------

                         DATA ls_event_raw TYPE zaif_event_raw.

                         " Copy the BOR event data into the AIF/BOR event structure
                         ls_event_raw-objtype = objtype.
                         ls_event_raw-objkey  = objkey.
                         ls_event_raw-event   = event.
                         ls_event_raw-rectype = rectype.

                         TRY.
                             " Hand the event data over to the AIF XML runtime
                             CALL METHOD /aif/cl_enabler_xml=>transfer_to_aif
                               EXPORTING
                                 is_any_structure = ls_event_raw.
                           CATCH cx_root INTO DATA(lx_root).
                             MESSAGE lx_root->get_text( ) TYPE 'E'.
                         ENDTRY.

                       ENDFUNCTION.

                      Step 2: RFC Destination Type G

                       I created an RFC destination of type G for the communication with the Advanced Event Mesh in transaction RFC Destinations (SM59). For authentication, I chose "Basic Authentication".

                      Step 3: Developing the AIF interface

                      Interface customizing

                       Next, I defined an AIF interface in transaction AIF Customizing (/AIF/CUST) –> Define Interfaces. Overall, this interface collects additional information (first name, last name, and country) from the business partner and transmits it to AEM in CloudEvents format. The raw structure is the previously defined AIF/BOR event structure.

                      The target structure, or SAP structure, contains all needed information for the AEM.

                      The CONTROLLER sub-structure holds all information for the processing within the AIF and the communication with AEM:

                       The DATA substructure holds the CloudEvent represented as a DDIC structure:

                      Engine Settings

                       As the interface is handled as an XML interface, I set the engines accordingly in AIF Customizing (/AIF/CUST) –> Additional Interface Properties –> Specify Interface Engines to the XML runtime:

                      Structure Mapping – Payload Component

                       The header fields DATACONTENTTYPE, SPECVERSION, and TYPE are filled via fix values. Structure mapping is done in AIF Customizing (/AIF/CUST) –> Define Structure Mapping.

                      The header fields ID, TIME and SOURCE are filled via value mapping function modules in the field mapping:

                      The coding I used for the ID creation:

                      cl_uuid_factory=>create_system_uuid( )->create_uuid_c22( ).
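The 22-character ID produced by create_uuid_c22 is a compressed UUID. A rough Python analogue (illustrative only; SAP's base64-like alphabet differs, e.g. the sample ID above contains '{' and '}'):

```python
import base64
import uuid

def uuid_c22() -> str:
    """22-character UUID: 16 random bytes, base64-encoded, padding stripped."""
    return base64.urlsafe_b64encode(uuid.uuid4().bytes).decode().rstrip("=")
```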

                      For the time determination:

                       DATA: time_stamp TYPE timestamp,
                             dat        TYPE d,
                             tim        TYPE t,
                             tz         TYPE ttzz-tzone,
                             dst        TYPE c LENGTH 1.

                       GET TIME STAMP FIELD time_stamp.
                       tz = 'UTC'.

                       CONVERT TIME STAMP time_stamp TIME ZONE tz INTO DATE dat TIME tim DAYLIGHT SAVING TIME dst.

                       " Format as ISO 8601 UTC, e.g. 2024-06-19T07:30:24Z
                       value_out = |{ dat DATE = ISO }T{ tim TIME = ISO }Z|.

                      For the SOURCE:

                      sy-sysid && '/' && sy-mandt

                      The fields of the data structure are filled using value mappings:

                      The value mappings for FIRST_NAME and LAST_NAME read data directly from the business partner table BUT000 using a db select:

                       The COUNTRY field is read via a value mapping function module that uses BP function modules:

                      DATA: lv_partner TYPE bu_partner.
                      DATA: lt_address  TYPE STANDARD TABLE OF bus020_ext.
                      
                      lv_partner = value_in.
                      
                      CALL FUNCTION 'BUA_ADDRESS_GET_ALL'
                          EXPORTING
                              i_partner        = lv_partner
                          TABLES
                              t_address        = lt_address
                          EXCEPTIONS
                              no_address_found = 1
                              wrong_parameters = 2
                              internal_error   = 3
                              date_invalid     = 4
                              not_valid        = 5
                              partner_blocked  = 6
                              OTHERS           = 7.
                      
                      IF sy-subrc <> 0.
                          clear value_out.   
                      ELSE.
                          TRY.
                      
                              value_out = lt_address[ 1 ]-country.
                      
                          CATCH cx_sy_itab_line_not_found.
                              clear value_out.   
                          ENDTRY.
                      ENDIF.

                      Structure Mapping – Controller Component

                      Within the controller component, the fields NAME_MAPPING_ID, RFC_DEST and URI are filled with fixed values.

                      The field TOPIC is determined based on OBJTYPE and COUNTRY of the payload.

                      Defined Multi Values:

                      OBJTYPE   COUNTRY   TOPIC

                      BUS1006   DE        aif.businesspartner.change.de
                      BUS1006   EN        aif.businesspartner.change.en
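The same multi-value lookup can be pictured as a dictionary keyed by OBJTYPE and COUNTRY. This is an illustrative sketch only; AIF resolves the topic via the Multi Value definition, and the dictionary below merely mirrors the two entries above:

```python
# Illustrative only: AIF resolves the topic via the Multi Value mapping;
# this dictionary mirrors the two entries defined above.
TOPIC_MAP = {
    ("BUS1006", "DE"): "aif.businesspartner.change.de",
    ("BUS1006", "EN"): "aif.businesspartner.change.en",
}

def determine_topic(objtype, country):
    """Return the AEM topic for an (OBJTYPE, COUNTRY) pair, or None if undefined."""
    return TOPIC_MAP.get((objtype, country))
```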

                      Action

                      Next, I assigned an AEM integration action to the structure mapping:

                      The function module takes the populated SAP structure, uses the CONTROLLER information to establish the HTTP connection and URI, converts the payload into JSON, and transmits it to AEM. Finally, the result is returned for status evaluation and logging within AIF.

                      The code below works for NW 7.31 and up. It is written generically, so it can be used for every event and payload, as long as the SAP structure and mapping follow the same schema.

                      To get the function module activated, the name mapping table must also be created (see Step 7).

                      FUNCTION zaif_action_call_aem .
                          *"----------------------------------------------------------------------
                          *"*"Local Interface:
                          *"  IMPORTING
                          *"     REFERENCE(TESTRUN) TYPE  C
                          *"     REFERENCE(SENDING_SYSTEM) TYPE  /AIF/AIF_BUSINESS_SYSTEM_KEY
                          *"       OPTIONAL
                          *"  TABLES
                          *"      RETURN_TAB STRUCTURE  BAPIRET2
                          *"  CHANGING
                          *"     REFERENCE(DATA)
                          *"     REFERENCE(CURR_LINE)
                          *"     REFERENCE(SUCCESS) TYPE  /AIF/SUCCESSFLAG
                          *"     REFERENCE(OLD_MESSAGES) TYPE  /AIF/BAL_T_MSG
                          *"----------------------------------------------------------------------
                      
                      
                      DATA: lv_json TYPE string.
                      DATA: lv_status_code TYPE i.
                      DATA: lv_status_reason TYPE string.
                      DATA: lr_http_client  TYPE REF TO if_http_client.
                      DATA: ls_bapiret2 TYPE bapiret2.
                      DATA: ls_name_mapping TYPE /ui2/cl_json=>name_mapping.
                      DATA: lt_name_mappings TYPE /ui2/cl_json=>name_mappings.
                      FIELD-SYMBOLS: <ls_data> TYPE any.
                      FIELD-SYMBOLS: <ls_controller> TYPE zaif_aem_controller.
                      DATA: lv_uri TYPE string.
                      DATA: ls_aif_aem_name_ma TYPE zaif_aem_name_ma.
                      DATA: lt_aif_aem_name_ma TYPE TABLE OF zaif_aem_name_ma.
                      
                      ASSIGN COMPONENT 'DATA' OF STRUCTURE data TO <ls_data>.
                      ASSIGN COMPONENT 'CONTROLLER' OF STRUCTURE data TO <ls_controller>.
                      
                      
                      SELECT * FROM zaif_aem_name_ma INTO TABLE lt_aif_aem_name_ma WHERE name_mapping_id = <ls_controller>-name_mapping_id.
                      
                      LOOP AT lt_aif_aem_name_ma INTO ls_aif_aem_name_ma.
                          MOVE-CORRESPONDING ls_aif_aem_name_ma TO ls_name_mapping.
                          INSERT ls_name_mapping INTO TABLE lt_name_mappings.
                      ENDLOOP.
                      
                      /aif/cl_json=>serialize(              " for NW 7.55 and up, /ui2/cl_json must be used
                          EXPORTING
                              data             = <ls_data>
                              name_mappings    = lt_name_mappings
                          RECEIVING
                              r_json           = lv_json ).
                      
                      cl_http_client=>create_by_destination(
                          EXPORTING
                              destination              = <ls_controller>-rfc_dest
                          IMPORTING
                              client = lr_http_client    " HTTP Client Abstraction
                          EXCEPTIONS
                              argument_not_found       = 1                " Connection Parameter (Destination) Not Available
                              destination_not_found    = 2                " Destination not found
                              destination_no_authority = 3                " No Authorization to Use HTTP Destination
                              plugin_not_active        = 4                " HTTP/HTTPS communication not available
                              internal_error           = 5                " Internal error (e.g. name too long)
                              OTHERS                   = 6
                      ).
                      
                      IF sy-subrc <> 0.
                          ls_bapiret2-id = sy-msgid.ls_bapiret2-number = sy-msgno.ls_bapiret2-type = sy-msgty.
                          ls_bapiret2-message_v1 = sy-msgv1.ls_bapiret2-message_v2 = sy-msgv2.ls_bapiret2-message_v3 = sy-msgv3.ls_bapiret2-message_v4 = sy-msgv4.
                          APPEND ls_bapiret2 TO return_tab.RETURN.
                      ENDIF.
                      
                      lr_http_client->request->set_header_field( name = 'Content-Type'              ##NO_TEXT
                                                              value = 'application/json' ).
                      lr_http_client->request->set_method( 'POST' ).
                      
                      lr_http_client->request->set_cdata(
                          data = lv_json ).
                      
                      lv_uri = <ls_controller>-uri.
                      
                      cl_http_utility=>set_request_uri(
                          EXPORTING
                              request = lr_http_client->request
                              uri     = lv_uri
                      ).
                      
                      lr_http_client->send(
                          EXCEPTIONS
                              http_communication_failure = 1
                              http_invalid_state         = 2
                              http_processing_failed     = 3
                              http_invalid_timeout       = 4
                              OTHERS                     = 5 ).
                      
                      IF sy-subrc <> 0.
                          ls_bapiret2-id = sy-msgid.ls_bapiret2-number = sy-msgno.ls_bapiret2-type = sy-msgty.
                          ls_bapiret2-message_v1 = sy-msgv1.ls_bapiret2-message_v2 = sy-msgv2.ls_bapiret2-message_v3 = sy-msgv3.ls_bapiret2-message_v4 = sy-msgv4.
                          APPEND ls_bapiret2 TO return_tab.
                          RETURN.
                      ENDIF.
                      
                      lr_http_client->receive(
                          EXCEPTIONS
                              http_communication_failure = 1
                              http_invalid_state         = 2
                              http_processing_failed     = 3
                              OTHERS                     = 5 ).
                      
                      IF sy-subrc <> 0.
                          ls_bapiret2-id = sy-msgid.ls_bapiret2-number = sy-msgno.ls_bapiret2-type = sy-msgty.
                          ls_bapiret2-message_v1 = sy-msgv1.ls_bapiret2-message_v2 = sy-msgv2.ls_bapiret2-message_v3 = sy-msgv3.ls_bapiret2-message_v4 = sy-msgv4.
                          APPEND ls_bapiret2 TO return_tab.
                          RETURN.
                      ENDIF.
                      
                      lr_http_client->response->get_status( IMPORTING code = lv_status_code reason = lv_status_reason ).
                      
                      " close the connection
                      lr_http_client->close(
                          EXCEPTIONS
                              http_invalid_state = 1                " Invalid state
                              OTHERS             = 2
                      ).
                      
                      IF sy-subrc <> 0.
                          ls_bapiret2-id = sy-msgid.ls_bapiret2-number = sy-msgno.ls_bapiret2-type = sy-msgty.
                          ls_bapiret2-message_v1 = sy-msgv1.ls_bapiret2-message_v2 = sy-msgv2.ls_bapiret2-message_v3 = sy-msgv3.ls_bapiret2-message_v4 = sy-msgv4.
                          APPEND ls_bapiret2 TO return_tab.
                          RETURN.
                      ENDIF.
                      
                      
                      IF lv_status_code = 200 .
                          ls_bapiret2-id = 'ZAIF_AEM_MESSAGES'.
                          ls_bapiret2-number = '004'. 
                          ls_bapiret2-type = 'S'.
                          ls_bapiret2-message_v1 = lv_status_code.
                          CONDENSE ls_bapiret2-message_v1.
                          APPEND ls_bapiret2 TO return_tab.
                      ELSE.
                          ls_bapiret2-id = 'ZAIF_AEM_MESSAGES'.
                          ls_bapiret2-number = '003'.
                          ls_bapiret2-type = 'E'.
                          ls_bapiret2-message_v1 = lv_status_code.
                          ls_bapiret2-message_v2 = lv_status_reason.
                          CONDENSE ls_bapiret2-message_v1.
                          APPEND ls_bapiret2 TO return_tab.
                      ENDIF.
                      
                      ENDFUNCTION.

                      Step 4: Serialization

                      Based on the object key, I also defined a serialization. I did this in transaction /AIF/CUST -> Interface Development -> Additional Interface Properties -> Define Serialization Settings.

                      Serialization Object

                      I created a new serialization object of type “Internal Timestamp”.

                      As basis, I used the key field “OBJKEY”.

                      Step 5: Error Handling

                      To improve error handling, I added key fields and an interface display name in /AIF/CUST -> Error Handling -> Define Namespace-Specific Features and -> Define Interface-Specific Features.

                      Key Field

                      I defined all fields of the raw structure as key fields, but only the object key is visible. The index table was created accordingly.

                      Interface Display Name

                      As interface display name, I defined the event type:

                      Step 6: Interface Determination

                      As the raw structure can be reused for multiple BOR events, I created an interface determination in /AIF/CUST -> System Configuration -> Interface Determination -> Define Interface Determination for XML Interfaces based on the object key and event.

                      The following are the specific settings:

                      Step 7 – Name Mapping Table

                      To convert the DDIC names to JSON names, I created the table ZAIF_AEM_NAME_MA.

                      For my specific use case, I filled it with the following entries:

                      NAME_MAPPING_ID   ABAP              JSON

                      BPCHANGE          DATA              data
                      BPCHANGE          DATACONTENTTYPE   datacontenttype
                      BPCHANGE          ID                id
                      BPCHANGE          NAME_FIRST        FirstName
                      BPCHANGE          NAME_LAST         LastName
                      BPCHANGE          PARTNER           BusinessPartner
                      BPCHANGE          SPECVERSION       specversion
                      BPCHANGE          TIME              time
                      BPCHANGE          TYPE              type
                      BPCHANGE          COUNTRY           country
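Conceptually, the name mapping replaces each ABAP component name with its JSON name during serialization. A rough Python sketch of that renaming (the dictionary mirrors the table entries; `serialize_with_mapping` is a hypothetical stand-in for /aif/cl_json=>serialize with name_mappings):

```python
import json

# Mirrors the ZAIF_AEM_NAME_MA entries for mapping ID BPCHANGE (ABAP name -> JSON name).
NAME_MAPPINGS = {
    "DATA": "data",
    "DATACONTENTTYPE": "datacontenttype",
    "ID": "id",
    "NAME_FIRST": "FirstName",
    "NAME_LAST": "LastName",
    "PARTNER": "BusinessPartner",
    "SPECVERSION": "specversion",
    "TIME": "time",
    "TYPE": "type",
    "COUNTRY": "country",
}

def serialize_with_mapping(abap_struct):
    """Hypothetical stand-in for /aif/cl_json=>serialize with name_mappings:
    rename each ABAP component name before serializing to JSON."""
    return json.dumps({NAME_MAPPINGS.get(k, k.lower()): v for k, v in abap_struct.items()})
```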

                      Monitoring

                      The messages that were created can be seen in transaction Monitoring and Error Handling (/AIF/ERR).

                      They can also be seen in the SAP Fiori Message Monitoring App:


                      The post Integration of SAP S/4HANA with SAP Integration Suite, advanced event mesh (AEM) using AIF appeared first on ERP Q&A.

                      ]]>
                      SAP PO to Integration Suite | Migration Assessment Tool https://www.erpqna.com/sap-po-to-integration-suite-migration-assessment-tool/?utm_source=rss&utm_medium=rss&utm_campaign=sap-po-to-integration-suite-migration-assessment-tool Fri, 05 Jul 2024 10:44:43 +0000 https://www.erpqna.com/?p=86192 SAP Integration Suite is a new age solution designed to streamline the integration of SAP and non-SAP systems. Integration Suite extends its reach to integrate Cloud and On-premises applications, data sources and APIs. SAP is encouraging customers to transition older integration solutions (like PI/PO) to a more unified and cloud centric solution which is the […]

                      The post SAP PO to Integration Suite | Migration Assessment Tool appeared first on ERP Q&A.

                      ]]>
                      SAP Integration Suite is a new age solution designed to streamline the integration of SAP and non-SAP systems. Integration Suite extends its reach to integrate Cloud and On-premises applications, data sources and APIs.

                      SAP is encouraging customers to transition older integration solutions (like PI/PO) to a more unified and cloud centric solution which is the Integration Suite. This means that now we must consider doing migration assessments from PI/PO to Integration Suite.

                      Fortunately, SAP provides a ready to use Migration Assessment tool with the Integration Suite subscription.

                      High level process flow of how we can achieve the Migration Assessment

                      Pre-requisite: Application subscription to Integration Suite in your BTP subaccount.

                      Preparation: Because the assessment of an on-premises SAP PO system is run from a tool enabled by Integration Suite, a cloud-hosted service, you must make sure that a valid and secure connection is established; this is where the SAP Cloud Connector service is used.

                      SAP Cloud Connector is a service hosted on-premises that is connected to the BTP subaccount and further leveraged by cloud/SaaS applications. The diagram below provides a general overview:

                      On-Premise setup and configurations:

                      Once installed, you can perform the configuration by logging in to the Cloud Connector at https://localhost:8443, initially with the name “Administrator” and password “manage”; the password must be changed on initial logon.

                      Provide the BTP subaccount details where you have your Integration Suite subscription and save.

                      Check for successful connection with BTP subaccount on the top bar of the Cloud Connector logon page.

                      Now, we will have to create a new connection with the backend SAP PI/PO system.

                      In the Cloud Connector, on the left side bar, click ‘Cloud To On-Premise’ and click ‘Add System’.

                      From the drop-down options select ‘SAP Process Integration’

                      Next, provide the protocol (HTTP/HTTPS)

                      Next, provide PI/PO host and port information, note that this must be accessible from Cloud connector host.

                      Next, provide a virtual host and port, which will be mapped to the actual ones

                      Next, choose the option to use Virtual host and port

                      Next, check Internal host and finish the configuration

                      If the check result shows a ‘Reachable’ comment, we have a successful connection from the Cloud Connector to SAP PO. Otherwise, troubleshooting may be required.

                      In addition, we will have to expose the following PO services for the migration assessment (service names are listed below):

                      /CommunicationChannelInService
                      /IntegratedConfigurationInService
                      /SenderAgreementInService
                      /AlertRuleInService
                      /IntegratedConfiguration750InService
                      /ValueMappingInService
                      /ConfigurationScenarioInService
                      /BPMFacadeBeanImplService
                      /ReceiverAgreementInService
                      /rep/read/ext
                      /dir/read/ext
                      /rep/support/SimpleQuery

                      Add each of the services with the ‘Path And All Sub-Paths’ option selected.

                      With this, the Cloud Connector configuration is complete.

                      Configurations and Setup on cloud:

                      To check the connectivity from Integration Suite, log in to Integration Suite, click ‘monitor’ from the left pane and select ‘integrations’

                      Go to the ‘Cloud Connector’ tab, provide location ID that was defined during Cloud Connector setup and click on ‘Send’

                      If you get a message like below, we have a successful connection between Cloud Connector and the Integration suite which also means we have successfully connected to On-premise PO system.

                      Now that we have the connectivity set up, we can proceed with the assessment.

                      Please make sure that you have a valid SAP PO user setup with the following roles/authorizations and that you have the user credentials.

                      SAP_XI_API_DISPLAY_J2EE

                      SAP_XI_API_DEVELOP_J2EE

                      SAP_XI_MONITOR_J2EE

                      Let us now explore how we can perform the assessment on PI/PO environment.

                      Data extraction:

                      On the Integration Suite, select ‘Access Migration Scenario’ -> ‘create requests’

                      On the left-hand panel, select ‘Settings’ to add the on-premise PO system.

                      Fill in the fields, using the credentials we created in PO and the location ID of the Cloud Connector.

                      Now we can select the system and run a test; look out for the ‘successfully connected’ message.

                      With that done, click ‘Request’ -> ‘Data Extraction’.

                      Create a data extraction request, providing a request name of your choice and the system we created in the previous step.

                      As soon as you click ‘Create’, the extraction starts; look out for the ‘In Progress’ message.

                      This will take a while; wait for the ‘Completed’ message and monitor the logs in the meantime.

                      Evaluation:

                      Once the extraction is completed, we have to go to ‘Scenario Evaluation’.

                      Click on ‘create’

                      The following fields will be filled based on the data extraction we ran in the previous step, especially the ‘Data Extraction Request’ name.

                      Click ‘Create’ and wait for the ‘Evaluation Completed’ status.

                      Download the artefacts

                      Assessment result interpretation:

                      The artefacts contain an Excel sheet and a PDF document.

                      In the excel we will get two sheets. ‘Evaluation by integration scenario’ and ‘Full evaluation results’.

                      Let us look at ‘Evaluation by integration scenario’ first. This will give a list of all the Integration scenarios that are available in the PO system

                      Notice the three attributes – ‘weight’, ‘T shirt size’ and the ‘Assessment category’.

                      The ‘Assessment Category’ signifies how easy or difficult it is to migrate the integration scenario to Integration Suite. See the following definitions for the assessment categories:

                      • Ready to migrate: can be migrated automatically using the migration tool; some post-migration effort may still be required.
                      • Adjustments required: can be migrated partially by the tool; the remaining adjustments will have to be performed before activating the scenario.
                      • Evaluation required: the tool could not evaluate the migration scenario, and a manual assessment is required.

                      ‘Weight’ and ‘T shirt size’ are indicators of the efforts required for the migration.

                      T-shirt sizing is derived based on the rules below:

                      Looking at the ‘Full Evaluation Results’, we will see a set of ‘rules’ against each ‘integration scenario’

                      Each rule will indicate the interfaces engaged and each of these rules will have respective weightage

                      The sum of all the weightage for rules will provide the weightage of the Integration Scenario.

                      Another important thing to note is that even if only one of the rules is categorized as ‘Evaluation required’, the whole integration scenario will be marked as ‘Evaluation required’ and should be dealt with accordingly; the same applies to ‘Adjustments required’ cases.
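The aggregation logic described above can be summarized in a short sketch (function and category names are taken from the report; the code itself is hypothetical): the scenario weight is the sum of the rule weights, and the most severe rule category determines the scenario category.

```python
# Hedged sketch of the aggregation described above; the helper names are hypothetical.
SEVERITY = {"Ready to migrate": 0, "Adjustments required": 1, "Evaluation required": 2}

def aggregate(rules):
    """rules: list of (weight, category) tuples for one integration scenario.
    Returns (total weight, overall category): weights are summed and the
    most severe rule category becomes the scenario category."""
    total_weight = sum(weight for weight, _ in rules)
    worst = max((category for _, category in rules), key=SEVERITY.get)
    return total_weight, worst
```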

                      The pdf from the assessment reports provides a lot of details too.

                      The summary provides the efforts estimated and a view of the assessment category

                      a pie chart view of the assessment categories

                      The report will also provide Sender and Receiver adapters count that will be migrated and that may not be supported.

                      Estimations by effort classification

                      Tabular view of the efforts required for each T shirt size – against the assessment categories

                      In the report, there are also recommendations for the migration approach, e.g. reference templates from Integration Suite can be used.


                      The post SAP PO to Integration Suite | Migration Assessment Tool appeared first on ERP Q&A.

                      ]]>
                      SAP Cloud Integration Looping Process Call https://www.erpqna.com/sap-cloud-integration-looping-process-call/?utm_source=rss&utm_medium=rss&utm_campaign=sap-cloud-integration-looping-process-call Tue, 18 Jun 2024 11:15:32 +0000 https://www.erpqna.com/?p=85654 Introduction Looping Process Call refers to a way of repeating processes in an SAP program using loops, calling one or more processes within a specific cycle. This can be useful, especially when working with large data sets or when automating a series of tasks. we are working with large datasets, fetching them all at once […]

                      The post SAP Cloud Integration Looping Process Call appeared first on ERP Q&A.

                      ]]>
                      Introduction

                      Looping Process Call refers to a way of repeating processes in SAP Cloud Integration by calling one or more processes within a loop. This can be useful especially when working with large data sets or when automating a series of tasks: when fetching large datasets all at once, RAM consumption increases.

                      With this method, we retrieve our data in fragments based on a certain condition, and after the loop ends, our data is merged as a whole. From a performance perspective, we alleviate the strain on memory. It reduces processing time and enhances performance. It simplifies our overall data analysis.

                      Now, we will design a scenario in cloud integration. We will fetch large data from OData and loop it through a specific condition (looping process call). Then, we will observe the results together.

                      Prerequisite: BTP Cockpit access and Integration Suite

                      Figure 1. Integration Overview

                      Step 1. First, we will create a CPI link to be able to make calls to the service.

                      Figure 2. Sender HTTPS Adapter

                      Adapter Type: HTTPS
                      Address: Specific

                      Step 2. We specify that the Looping Process Call will run according to the expression in the “Condition Expression” field. By stating “.hasMoreRecords contains 'true'”, we indicate that the loop will continue to run as long as there are more records.

                      When this condition returns false, the loop will end.

                      Figure 3. Loop Process Call

                      Step 3. OData connection information.

                      Figure 4.Odata Adapter Connection Information

                      Step 4. We use the “select” clause to choose which fields we want to retrieve from the Orders entity.

                      Our method is GET.

                      We need to mark “Process in Pages”. If we don’t mark it, the system will send all the data at once after entering the loop once.

                      Figure 5.Odata Adapter Processing Information

                      Step 5. After passing through the filter, the data will no longer include “Orders” but will start with “Order.” This is because we need the information of “Orders/Order” due to sending the data in fragments. After completing the process of sending fragmented data, we will merge it in the “Message Body” of the Content Modifier.

                      Figure 6.Filter

                      Step 6. ${property.payloadStack}${in.body}: we use this expression to keep appending each incoming data fragment to the message body.

                      Figure 7. Content Modifier-Exchange Properties-Append Body

                      Step 7. We add the “Orders” tag, which we ignored with the filter, to this content modifier. Once the loop is completely finished, we add the merged data as a property.

                      Figure 8.Content Modifier-Message Body

                      Step 8. To indicate that the last data will come in XML format, we add the “Content-Type” header.

                      Figure 9.Content Type

                      Step 9. We fill in the necessary information for the email adapter.

                      Figure 10. Mail Adapter Connection Information

                      Step 10. We determine who the email will come from and who it will go to.

                      Figure 11. Mail Adapter Processing Information

                      Step 11. Save and deploy. Once the integration flow is deployed, we call the CPI endpoint using the GET method in Postman.

                      Step 12. We are making a call to the CPI service using the CPI username and password.

                      Figure 12.Postman

                      Step 13. It entered the loop a total of 6 times, but on the 6th request, since there was no data left inside, it combined the data sent in fragments, exited the loop, and continued to ‘End’.

                      Figure 13.Monitoring

                      When we look at our first loop, it indicates that in the first request, it fetched the first 200 records from the entire data set and provided the information that the next loop would start with OrderID 10448 using the expression “$skiptoken=10447”.

                      In each loop, as it adds data, it indicates that there were 400 records in the 2nd request, and when it enters the 3rd loop, it won’t fetch the same initial 400 records again. Similarly, it shows that in the next loop, the data will start with OrderID 10648.

                      The important point to note is that it continues to loop as long as the condition we set is met, meaning it enters the loop as long as it evaluates to true.

                      When we check the final step, we understand that this condition returns false, indicating that it has fetched all the data inside.

                      Due to the condition, since the Loop process has ended, we receive information that the last data has the OrderID number 11047.
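The loop behavior described above (fetch a page, append it to the accumulated body, continue while hasMoreRecords is true) can be sketched as follows; `fetch_page` is a hypothetical stand-in for the OData call with “Process in Pages” enabled:

```python
def fetch_all(fetch_page):
    """fetch_page(skiptoken) -> (records, next_skiptoken or None).
    Mimics the Looping Process Call: append each page to the accumulated
    result and loop while more records are available."""
    merged, skiptoken = [], None
    while True:
        records, skiptoken = fetch_page(skiptoken)
        merged.extend(records)   # like ${property.payloadStack}${in.body}
        if skiptoken is None:    # hasMoreRecords is no longer 'true'
            break
    return merged
```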

                      Finally, I wanted to add an email adapter. It notifies all the information via email.


                      The post SAP Cloud Integration Looping Process Call appeared first on ERP Q&A.

                      ]]>
                      Automatically update SSL Certificates before they expire in SAP CPI https://www.erpqna.com/automatically-update-ssl-certificates-before-they-expire-in-sap-cpi/?utm_source=rss&utm_medium=rss&utm_campaign=automatically-update-ssl-certificates-before-they-expire-in-sap-cpi Fri, 31 May 2024 11:27:28 +0000 https://www.erpqna.com/?p=85157 I will explain how to automatically install SSL certificates on CPI using SAP’s APIs.You can follow the steps below to set a timer and have it loaded automatically, without having to manually check whether it has expired or not. Automatically update system certificates before they expire with SAP CPI and Groovy (openssl command) Instead of […]

                      The post Automatically update SSL Certificates before they expire in SAP CPI appeared first on ERP Q&A.

                      ]]>
                      I will explain how to automatically install SSL certificates on CPI using SAP’s APIs. You can follow the steps below to set a timer and have certificates loaded automatically, without having to manually check whether they have expired.

                      Automatically update system certificates before they expire with SAP CPI and Groovy (openssl command)

                      Instead of manually updating the certificate, we can automatically install the certificate before it expires with this API created by SAP.

                      You can use CPI APIs to update a certificate in Keystore.

                      In this scenario, we will perform a PUT operation to the /CertificateResources path of the CPI API below.

                      Method   Resource Path
                      PUT      /CertificateResources(‘{Hexalias}’)/$value

                      We must convert the name (alias) of the certificate we want to update in CPI KeyStore to hexadecimal.

                      In this scenario, we will update the certificate for facebook

                      hexadecimal value for facebook: 66616365626F6F6B

                      Note: For hexadecimal format, you can use text to hexadecimal converter online.
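Alternatively, the conversion is a one-liner; a quick Python sketch (illustrative only, not part of CPI):

```python
def alias_to_hex(alias: str) -> str:
    """Convert a keystore alias to the uppercase hexadecimal form
    expected in the CertificateResources resource path."""
    return alias.encode("utf-8").hex().upper()

print(alias_to_hex("facebook"))  # 66616365626F6F6B
```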

                      The URL to be sent in the put operation:

https://<host address>/api/v1/CertificateResources(‘66616365626F6F6B’)/$value?fingerprintVerified=true&returnKeystoreEntries=false&update=true

                      When you test the service with the hexadecimal value in Postman, you can manually import and update the certificate.

                      The current content of the certificate is written to the Request Body:

                      For Example:

-----BEGIN CERTIFICATE-----
AHcAdv+IPwq2.....
.....
.....
1tIQYIeaHKDHPA==
-----END CERTIFICATE-----

                      Header, Params and Body fields are defined as in the service document.

                      Header:

                      Request Body:
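As a sanity check, the pieces of the request can be assembled as in the following Python sketch. The host value is a placeholder and the Content-Type header is an assumption; take the exact values from the service document:

```python
def build_cert_update_request(host: str, alias: str, pem_body: str) -> dict:
    """Assemble the pieces of the PUT call shown above.

    The query options mirror the manual Postman test. The Content-Type
    header is an assumption; check the service document for the value
    expected by your tenant.
    """
    hex_alias = alias.encode("utf-8").hex().upper()
    url = (
        f"https://{host}/api/v1/CertificateResources('{hex_alias}')/$value"
        "?fingerprintVerified=true&returnKeystoreEntries=false&update=true"
    )
    return {
        "method": "PUT",
        "url": url,
        "headers": {"Content-Type": "text/plain"},  # assumption, see note above
        "body": pem_body,
    }
```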

We will now perform the same operation we did manually in Postman inside CPI, using SAP’s API.

                      The steps to be taken in CPI for this process are as follows.

                      First, we get FetchToken to log into CPI.

                      We write the following information for Get Token.

                      A user authorized in CPI is defined and the CPI link is written in the Address field.

In the next step, we will check the SSL/TLS certificate of the server named "graph.facebook.com" with Groovy and obtain the current certificate.

                      Code detail is as follows:

import java.security.cert.X509Certificate
import java.util.Base64
import javax.net.ssl.SSLPeerUnverifiedException
import javax.net.ssl.SSLSocket
import javax.net.ssl.SSLSocketFactory
import com.sap.gateway.ip.core.customdev.util.Message

def processData(Message message) {
    def factory = SSLSocketFactory.getDefault() as SSLSocketFactory
    def socket = factory.createSocket("graph.facebook.com", 443) as SSLSocket
    try {
        // getSession() triggers the TLS handshake and returns the session
        def session = socket.getSession()
        X509Certificate cert = session.peerCertificates[0] as X509Certificate

        def sDNName = cert.issuerDN.name // Issuer's DN name
        def sDEREncoded = Base64.getEncoder().encodeToString(cert.encoded) // DER bytes as Base64

        // Expose the values as exchange properties for the next steps
        message.setProperty("sDNName", sDNName)
        message.setProperty("sDEREncoded", sDEREncoded)

        return message
    } catch (SSLPeerUnverifiedException e) {
        throw new Exception("graph.facebook.com did not present a valid cert.", e)
    } finally {
        socket.close()
    }
}

                      Then we add a new content modifier.

                      Header details:

                      We store the certificate we received in our body.

                      Put service details as follows:

One setting that must not be skipped here: the HTTP Session Reuse option should be set to On Exchange.

This option enables HTTP session reuse, so more than one message can be exchanged over a single HTTP session. Since there is no re-authentication from the second message onwards, subsequent calls are faster.

                      After saving and deploying the integration, we can view it from the logs.

In the following section, we write the sDEREncoded value of the certificate to the body, and thus the certificate in the keystore is updated.

Certificate dates before the operation:

                      When we run the integration, it is updated as follows:

                      You can understand that the imported certificate has changed when the date below is updated.


                      Fetch data in chunks using pagination from S/4 Hana Cloud’s OData API https://www.erpqna.com/fetch-data-in-chunks-using-pagination-from-s-4-hana-clouds-odata-api/?utm_source=rss&utm_medium=rss&utm_campaign=fetch-data-in-chunks-using-pagination-from-s-4-hana-clouds-odata-api Mon, 01 Jan 2024 09:29:07 +0000 https://www.erpqna.com/?p=80628 Introduction: This document describes how to fetch data in chunks using pagination from S/4 Hana cloud’s OData API. Let’s understand pagination first, In the context of OData (Open Data Protocol), pagination refers to the practice of dividing a large set of data into smaller, more manageable chunks or pages. This is done to improve the […]

                      The post Fetch data in chunks using pagination from S/4 Hana Cloud’s OData API appeared first on ERP Q&A.

                      Introduction: This document describes how to fetch data in chunks using pagination from S/4 Hana cloud’s OData API.

                      Let’s understand pagination first,

                      In the context of OData (Open Data Protocol), pagination refers to the practice of dividing a large set of data into smaller, more manageable chunks or pages. This is done to improve the performance of data retrieval and to reduce the amount of data transferred over the network. Pagination is a common technique in APIs and web services to handle large result sets efficiently.

                      In OData, pagination is typically achieved through the use of query parameters, specifically the $skip and $top parameters. Here’s a brief explanation of these parameters:

                      1. $skip: This parameter is used to specify the number of items that should be skipped from the beginning of the result set. It is often used in conjunction with $top to implement paging. For example, if you want to retrieve results 11 to 20, you would set $skip=10.
2. $top: This parameter is used to specify the maximum number of items to be returned in the result set. It works in conjunction with $skip to define the size of each page. For example, if you want to retrieve the first 10 items, you would set $top=10.
3. $inlinecount: This parameter returns the total count of records when you pass it the value “allpages”.

By using $skip, $top, and $inlinecount together, you can navigate through the result set in chunks, effectively implementing pagination.

For instance, to retrieve items 501-600 (the sixth page at 100 records per page), you would use $top=100 and $skip=500.
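As a small illustration, the page arithmetic can be written as a Python sketch (page_query is a hypothetical helper, not part of OData or CPI):

```python
def page_query(page: int, page_size: int) -> str:
    """Build the $top/$skip query options for a 1-indexed page number."""
    skip = (page - 1) * page_size
    return f"$top={page_size}&$skip={skip}"

# Items 501-600 are the sixth page at 100 records per page:
print(page_query(6, 100))  # $top=100&$skip=500
```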

                      Now, you have a scenario where you need to fetch the full load from the Business Partner API or any other API of S/4 Hana public cloud with 500 records per page.

                      Here, $inlinecount parameter will come in the picture.

The first GET call to the API: https://<S4hanaAPIHostName>:<Port>/API_BUSINESS_PARTNER/A_BusinessPartner?$top=500&$skip=0&$inlinecount=allpages

In response, you will get 500 records if there are 500 or more; otherwise the full load will come in the first page itself.

Along with these records, you will also get the total number of records in the “count” element.

If count > 500, then calculate the number of API calls you need to make based on the total count.

After every API call, increase the value of $skip by 500.

To do all this, you need to write a program in a language supported by your application or middleware system.
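The loop just described can be sketched in Python. Here, fetch_page is a hypothetical callable standing in for the real OData call; it is assumed to return the records plus the total count from $inlinecount=allpages:

```python
def fetch_all(fetch_page, page_size=500):
    """Generic pagination loop.

    fetch_page(skip, top) is a stand-in for the real OData call; it must
    return (records, total_count), mirroring a response requested with
    $inlinecount=allpages.
    """
    records = []
    skip = 0
    batch, total = fetch_page(skip, page_size)
    records.extend(batch)
    while len(records) < total:
        skip += page_size
        batch, _ = fetch_page(skip, page_size)
        if not batch:  # defensive stop if the backend under-delivers
            break
        records.extend(batch)
    return records
```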

                      Let’s call this API in CPI with Pagination

In CPI, the OData adapter can manage all of these parameters; we only need to manage the number of times the API is called, using the “Looping Process Call” artifact.

As you can see in the picture below, fetching the data from S/4 Hana, transforming it into the Salesforce SOAP API format, and transferring it to the Salesforce system all happen in “Local Integration Process 1”.

The Integration Process itself is used only to call “Local Integration Process 1” in a loop.

In the OData adapter, just tick “Process in Pages” and enter the number of records to fetch in a single call in “Page Size”.

Each time the API call fetches data, the property “${property.<RECEIVER>.<Channel>.hasMoreRecords}” is set to true if there are still records to fetch from S/4 Hana, or to false if no records are left.

We need to use this property in the Looping Process Call as the condition for stopping the loop.

                      Receiver: S4Hana

                      Channel: ODataS4

Condition in Looping Process Call: ${property.S4Hana.ODataS4.hasMoreRecords} contains ‘true’
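Outside CPI, the stop condition can be modelled with a short Python sketch (purely illustrative; the property key below just mirrors the pattern above and is not a real CPI API):

```python
def looping_process_call(local_process, condition_key, max_iterations=999):
    """Mimic CPI's Looping Process Call: invoke the local integration
    process repeatedly while the exchange property named by condition_key
    still contains 'true', up to max_iterations."""
    iterations = 0
    for _ in range(max_iterations):
        properties = local_process()
        iterations += 1
        # Mirrors the "contains 'true'" loop condition
        if "true" not in properties.get(condition_key, "false"):
            break
    return iterations
```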

Conclusion: After reading this document, you can easily use an OData API with pagination and apply the technique in CPI.


                      SAP Event Mesh: S4 HANA On-Prem and SAP CI(BTP IS) through WebSocket and Webhooks https://www.erpqna.com/sap-event-mesh-s4-hana-on-prem-and-sap-cibtp-is-through-websocket-and-webhooks/?utm_source=rss&utm_medium=rss&utm_campaign=sap-event-mesh-s4-hana-on-prem-and-sap-cibtp-is-through-websocket-and-webhooks Wed, 11 Oct 2023 09:39:07 +0000 https://www.erpqna.com/?p=78869 Introduction What is Event Mesh? SAP Event Mesh allows applications to communicate through asynchronous events. Experience greater agility and scalability when you create responsive applications that work independently and participate in event-driven business processes across your business ecosystem. In this decoupled integration the producers of the events will not know the consumers of the events. […]

                      The post SAP Event Mesh: S4 HANA On-Prem and SAP CI(BTP IS) through WebSocket and Webhooks appeared first on ERP Q&A.

                      Introduction

                      What is Event Mesh?

                      SAP Event Mesh allows applications to communicate through asynchronous events. Experience greater agility and scalability when you create responsive applications that work independently and participate in event-driven business processes across your business ecosystem. In this decoupled integration the producers of the events will not know the consumers of the events.

                      Publish business events from SAP and non-SAP sources across hybrid landscapes from the digital core to extension applications through event-driven architecture. Consume business events from SAP and non-SAP sources throughout SAP’s event-driven ecosystem including SAP Extension Suite, SAP Integration Suite, and selected inbound enabled SAP backends. Achieve reliable data transmission for extension and integration scenarios through decoupled communication.

                      What is Websocket?

WebSocket is a stateful protocol for client-server communication. It enables an application to receive data without requesting it from the server, providing two-way communication. The connection between client and server stays alive until it is terminated by either party (client or server). Data is sent and received much faster than over HTTP: an HTTP request carries about 2,000 bytes of overhead, whereas a WebSocket frame costs only about 2 bytes. WebSocket supports multiple data types. In SAP CI, the AMQP adapter supports the WebSocket transport protocol.

                      What is Webhook?

Webhooks are stateless: each event notification is independent and carries all the necessary information about the event. Webhooks are for server-to-server communication and are commonly used to perform smaller requests and tasks on top of standard API calls. They follow a POST/push mechanism. In an event-driven architecture, when the producer publishes events to the respective topics, the queues subscribed to those topics push the event data to the consumer via the webhook, instead of the consumer pulling the events.

                      What is Queue?

A queue is used to store a message until it is received by the subscribed consumer. A queue cannot process messages; it is the first receiver and simply stores the message/event. If events are published to a queue, only one subscriber of that queue receives each message. Queues are managed and persistent: when there are no consumers on a queue, messages are stored in the queue.

                      What is Topic?

Topics are neither managed nor persistent. A topic is created on the fly when we start publishing messages to it, and destroyed when no consumers are listening to it. If the consumer of a topic is not running when a message is published, that event will not be received. One event producer can publish to one topic, one topic can be connected to multiple queues, and multiple queues can be subscribed to one topic.
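The queue and topic semantics described above can be illustrated with a toy in-memory model (purely didactic; SAP Event Mesh's actual broker is far more capable):

```python
class Broker:
    """Toy model of the semantics above: topics fan out only to
    subscribed queues, while queues persist messages until a consumer
    picks them up."""

    def __init__(self):
        self.queues = {}         # queue name -> list of pending messages
        self.subscriptions = {}  # topic name -> set of queue names

    def subscribe(self, queue, topic):
        self.queues.setdefault(queue, [])
        self.subscriptions.setdefault(topic, set()).add(queue)

    def publish(self, topic, message):
        # With no subscribed queue, the event is simply lost (topic semantics)
        for queue in self.subscriptions.get(topic, ()):
            self.queues[queue].append(message)

    def consume(self, queue):
        # One consumer takes the message; it is removed from the queue
        pending = self.queues.get(queue, [])
        return pending.pop(0) if pending else None
```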

                      Prerequisites:

                      1. BTP cockpit access with Event Mesh subscription.
                      2. S4 HANA access in tcodes STRUST and SPRO
                      3. SAP BTP IS Tenant Access with required roles/roles collection having Enterprise Messaging*.

                      Step-1: Add Service Plans in the Entitlements of subaccount for Event mesh along with creation of instance and subscription.

                      Select the standard and default service plans for the Event Mesh in the Entitlement

                      “Standard” for Event Mesh subscription and “default” for instance creation

                      In the Instances and Subscription, create the “standard” event mesh subscription and “default” instance creation on top of the cloud foundry space and runtime environment.

While creating the service instance, the Instance Name and the EMNAME in the JSON file should be the same, and the Namespace should follow the required format, e.g. a/b/c.

                      Step-2: Creation of Service Key for eminstance created

The clientid, clientsecret, tokenendpoint, protocol, and uri should be noted for further configuration.

                      Step-3: Creation of new Queue in Instance of Event Mesh

                      Step-4: Test publishing and consuming messages on a queue. [Optional]

                      Select the queuename and message client name for publishing and consuming the messages.

                      Step-5: Creation of OAuth2 client credentials in Security Material of Integration Suite

                      Use the TokenEndPoint URL and ClientID and Client Secret from the Event Mesh service Key created.

                      Step-6: Produce the events from SAP CI into Event Mesh

A new IFlow needs to be created with AMQP as receiver and either an HTTPS sender (for triggering from Postman) or a timer-based start event with some message set in a Content Modifier.

For demo purposes we will use HTTPS as sender and an AMQP receiver with the WebSocket message protocol. The Host is the URI from the service key, the Port is 443, the Path is the protocol value from the service key, and the Credential Name comes from the security material.

                      We can publish on topics or the queue from the AMQP(Websocket) of CI.

For a topic, use topic:topicname (this topic name should be subscribed on the queue created in Event Mesh); for a queue, use queue:queuename.
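The topic:/queue: address convention can be captured in a tiny parser sketch (illustrative Python, not a CPI API):

```python
def parse_amqp_address(address: str):
    """Split a CPI AMQP destination such as 'topic:a/b/c' or
    'queue:MyQueue' into its kind and name."""
    kind, _, name = address.partition(":")
    if kind not in ("topic", "queue") or not name:
        raise ValueError(f"unexpected address: {address!r}")
    return kind, name
```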

Test from Postman with the deployed IFLOW endpoint, using the client ID and client secret of the Process Integration Runtime service key. This is only to show how the events are published; it is not a real-time scenario.

                      Check the CI Message Monitor for successful message to Event Mesh

                      Check in the Event Mesh Queue with messages count increased from 0 to 1 along with queue size.

                      Step-7: Consume the events from Event Mesh into CI

                      Consuming the events from Event Mesh into CI can be done either with websocket(AMQP) or webhook.

                      In this case will create a websocket(AMQP) connection by configuring a new IFLOW with HOST, Path from Event Mesh service key, Port:443 and Credential Name from Security Material.

The queue name is given as queue:queuename. When consuming events from CI, it is always best to consume from a queue rather than a topic: topics are not persisted, while queues are, which gives us guaranteed delivery. As the message is consumed in JSON format, we can convert it into XML as required.

Once deployed, the message in the queue will be consumed by the CI IFLOW.

                      Step-8: Create Topic and upload the certificate to produce the Events from S4 HANA onto Event Mesh

                      The overall design for S4 HANA to publish and CI to consume can be done in two ways

                      A. Websocket Communication

                      B. Webhook Communication

WebSocket communication is already explained in the previous steps. We will now look at webhook communication.

                      On top of the created Queue in Event Mesh we need to subscribe to a Topic

                      Download the certificate from the Token Endpoint along with clientid and clientsecret generated from the Event Mesh Service Key

                      Upload the certificate in S4 HANA STRUST in SSL Client SSL Client(Standard) and add to certificate list

                      Step-9: Create a AMQP Channel in SPRO tcode

Go to SAP Reference IMG=>ABAP Platform=>Enterprise Event Enablement=>Administration=>Channel Connection Settings=>Manage Channel and Parameters

                      Use the service key option to create the AMQP channel

                      Check the connection and activate the channel

                      Step-10: Create outbound binding on the created AMQP channel

                      Select the created channel and click on outbound binding and provide the topic name.

                      Step-11: Create webhook in the SAP Event Mesh

In order to create the webhook, we need an HTTPS URL that has been built and deployed. In our case we will create a new IFLOW with an HTTPS sender in CI itself.

Create a webhook with the queue name, the deployed CI IFLOW endpoint, and authentication using the client ID and client secret of the CI Process Integration Runtime.

                      Testing:

Testing an event from S4 HANA to Event Mesh, which is then pushed to CI through the webhook.

Go to SAP Reference IMG=>ABAP Platform=>Enterprise Event Enablement=>Test Tools=>Produce Test Events

                      Select the channel name created and trigger the Event.

Check CI Monitoring to verify that the event triggered from S4 HANA to Event Mesh was pushed to CI through the webhook.

Question 1: What happens when an event triggered from S4 HANA (the producer) is published on one topic in Event Mesh, and there are both a WebSocket queue consumer to CI and a webhook queue consumer to CI with the same topic/queue details? (Ideally this will not happen in real life, but we will explore the case here for testing.)

Answer: Since the queue consumer is in continuous consumption mode, it takes the message in more quickly than the webhook, which must push the message to the configured HTTPS endpoint.

                      Evidence***

The deployed consumer IFLOW is in continuous consumption mode. Also, the consumer should subscribe to the queue instead of the topic for persistence.

Question 2: What happens when there are errors (mapping/runtime) while the consumer is receiving messages from the Event Mesh queue? Will the message be deleted from the queue immediately, or will it persist until successful consumption?

Answer: The message remains in the queue until the consumer receives and processes it successfully. During that time, delivery is continuously retried.
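This redelivery behaviour can be modelled with a small sketch (a toy Python model of the observed behaviour, not Event Mesh code):

```python
def deliver_with_redelivery(queue, consumer, max_attempts=10):
    """Toy model of the observed behaviour: a message stays at the head
    of the queue and is redelivered until the consumer processes it
    without raising. Returns the number of attempts needed, or None if
    max_attempts was exhausted."""
    message = queue[0]
    for attempt in range(1, max_attempts + 1):
        try:
            consumer(message)
            queue.pop(0)   # acknowledged, so removed from the queue
            return attempt
        except Exception:
            continue       # message remains queued; delivery is retried
    return None
```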

                      Evidence***

When the consumer IFLOW was intentionally made to fail with a content exception, the message in the queue was still persisted, and the IFLOW was automatically retried almost 8,770 times in less than a minute.

