RISE with SAP: Multi-layer Defense in Depth Architecture of SAP S/4HANA Cloud, Private Edition

Introduction

SAP S/4HANA Cloud, Private Edition is at the core of the "RISE with SAP" offering and holds customers' mission-critical data and business processes. SAP Enterprise Cloud Services (ECS) provides a managed private environment with a multi-layer defense-in-depth architecture, handling infrastructure and technical managed services. This includes an end-to-end SLA for the full solution stack and a proven security architecture that minimizes risk for customers. Multi-layer security requires security to be addressed at the people, process, and technology levels. In this blog we discuss the high-level, multi-layer "defense in depth" architecture offered to customers. For the sake of simplicity, only a high-level, abstracted view is presented.

Approach to Multi-Layer Defense in Depth Architecture:

SAP S/4HANA Cloud, Private Edition is a single-tenant managed private environment: SAP creates a separate account (AWS), subscription (Azure), or project (GCP) for each customer. The application and database virtual instances are dedicated solely to a single customer. Security by Design and Security by Default are deeply embedded into the multi-layer architecture.

Defense in Depth – SAP S/4HANA Cloud, Private Edition

Data Security

The SAP S/4HANA Cloud, Private Edition supports the following data security features:

  • Separate virtual instances for the database and application servers of each customer.
  • Data encryption at rest: SAP HANA data encryption uses the AES-256-CBC algorithm (256-bit key length). The various encryption root keys (data volume, log volume, backup, application) are stored in the instance secure store in the file system (SSFS) within the HANA database instance. The SAP Crypto libraries used are FIPS 140-2 certified. The contents of the SSFS are protected by the SSFS master key.
  • Unique encryption root keys and the master key are generated during installation and during HANA version updates. Master keys can also be changed at regular intervals upon request. The segregation of duties (SoD) principle is applied to key management. (A sketch for checking the encryption status from a client follows this list.)
  • Data at rest is encrypted throughout: database volumes, backups, redo logs, and storage (server-side encryption) at the hyperscaler storage layer.
  • All HTTP traffic is protected with TLS 1.2 transport layer encryption using AES-256-GCM.
  • SAP HANA has many built-in security features for role-based access control, authorizations, UI masking, and anonymization.
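As a quick, read-only way to verify the encryption posture from a client, one can query the SAP HANA monitoring view M_ENCRYPTION_OVERVIEW (a user with the corresponding monitoring privilege is required). The sketch below is an illustration, not part of the managed service itself; it uses the @sap/hana-client Node.js driver, and the host, port, and credentials are placeholders.

// Minimal sketch: query the HANA encryption overview to confirm that persistence,
// log and backup encryption are active. Connection details are placeholders.
const hana = require('@sap/hana-client');

const conn = hana.createConnection();

conn.connect({
  serverNode: 'hana-host.example.com:30015',  // placeholder host:port
  uid: 'MONITORING_USER',                     // placeholder user with monitoring privileges
  pwd: 'REPLACE_ME',
  encrypt: true,                              // use TLS for the client connection as well
}, (connectErr) => {
  if (connectErr) return console.error('connect failed:', connectErr.message);

  conn.exec('SELECT * FROM M_ENCRYPTION_OVERVIEW', (execErr, rows) => {
    if (execErr) return console.error('query failed:', execErr.message);
    console.log(rows);   // rows describe the encryption status per scope
    conn.disconnect();
  });
});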
Landscape of SAP S/4HANA Cloud, Private Cloud

Application Security

  • A Web Application Firewall is integrated with the Application Gateway (Azure) or Application Load Balancer (AWS) to secure inbound traffic from the Internet.
  • End-to-end encryption of data in transit (a client-side sketch for enforcing a TLS 1.2 minimum appears below).
  • Secure connectors and agents are available to integrate the SAP S/4HANA system with other SAP SaaS applications. These agents are provisioned on request, once the customer has acquired the respective cloud solutions.
  • Reverse proxy (SAP Web Dispatcher) – no direct access to the backend system.
  • Secure cloud integrations via SAP Cloud Connector.
  • All outbound connections are based on restricted access control lists configured in the security components used within the cloud. All outgoing connections support TLS 1.2-based in-transit encryption.
  • Support for identity authentication via SAML, Kerberos/SPNEGO, and X.509 certificates.
  • Support for Multi-Factor Authentication.
Data in Transit Encryption
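To make the in-transit encryption requirement tangible from a consumer's point of view, here is a minimal Node.js sketch that calls an HTTPS endpoint while refusing anything below TLS 1.2. The hostname and path are placeholders, not actual RISE endpoints; this is an illustration rather than part of the managed service.

// Minimal sketch: call an HTTPS endpoint and reject TLS 1.0/1.1 handshakes.
// Hostname and path are placeholders.
const https = require('https');

const req = https.request({
  hostname: 'my-s4hana.example.com',   // placeholder host
  port: 443,
  path: '/sap/public/ping',            // placeholder path
  method: 'GET',
  minVersion: 'TLSv1.2',               // enforce TLS 1.2 as the minimum protocol
}, (res) => {
  console.log('status:', res.statusCode);
  console.log('negotiated protocol:', res.socket.getProtocol());
});

req.on('error', (err) => console.error('request failed:', err.message));
req.end();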

Network Security

  • A set of accounts (AWS), subscriptions (Azure), or projects (GCP) is created in the IaaS provider environment for each customer to deploy dedicated virtual SAP instances. Customer-specific Virtual Private Clouds (VPCs) or Virtual Networks (VNets) are created within each subscription/account/project to address system and data isolation requirements. Within each VPC/VNet, multiple subnets (using private CIDR block IP addresses) are created to segregate the environments.
  • Each subnet is configured with a Security Group (AWS), Network Security Group (Azure), or firewall rules (GCP) with a specific set of rules to control network traffic (a sketch of such a rule follows this list).
  • Security policies defined at the higher levels of the hierarchy are pushed down to each subscription/project/account.
  • Data replication traffic from the primary site to the DR site always goes via private connectivity (peering).
  • Customer access to the VPC or VNet is only via private, dedicated connectivity. It is possible to configure the managed environment so that no network access is allowed from the Internet.
  • SAP isolates its admin network from the customer VPC/VNet using admin firewalls. Network traffic between the customer VNet/VPC and the SAP admin network always goes via encrypted VPN tunnels, and all administrative data exchanges are encrypted to TLS 1.2 standards.
  • All administrative access requests flow through an access manager workflow approval process and are validated by a designated authority.
  • All actions, including the granting or denying of admin access as well as the actions performed by administrators, are logged and audited.
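For illustration only (in RISE the actual rules are managed by SAP ECS), the sketch below adds a single restrictive ingress rule to an AWS Security Group using the AWS SDK for JavaScript v2. The group ID, region, and CIDR range are placeholder assumptions.

// Minimal sketch: allow HTTPS (443) only from a private CIDR block on one security group.
// Group ID and CIDR are placeholders; credentials/region come from the usual AWS SDK config.
const aws = require('aws-sdk');
const ec2 = new aws.EC2({ region: 'eu-central-1' });

const params = {
  GroupId: 'sg-0123456789abcdef0',          // placeholder security group
  IpPermissions: [{
    IpProtocol: 'tcp',
    FromPort: 443,
    ToPort: 443,
    IpRanges: [{ CidrIp: '10.10.0.0/16', Description: 'private subnet only' }],
  }],
};

ec2.authorizeSecurityGroupIngress(params).promise()
  .then(() => console.log('ingress rule added'))
  .catch((err) => console.error('failed to add rule:', err.message));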
Segregated Single Tenanted Landscape

Operational Security

SAP Enterprise Cloud Services (ECS) performs a number of tasks to secure the customer environment. These include security patch management and hardening of the operating system, application, and database virtual instances. Security incident and event management is in place to collect, aggregate, and correlate events and to apply security use cases that raise automatic alerts when security incidents occur. The team performs 24×7 infrastructure monitoring, database monitoring, security incident management, secure admin access, regular backups, and security scanning and remediation to secure the environment for customers.

SAP S/4HANA Cloud, Private Edition – Security Overview

Audit and Compliance

SAP's security controls are audited and validated through various certifications and attestations:

  • ISO Certificates
    • ISO9001 Quality Management System
    • ISO27001 Information Security Management System
    • ISO27017 Implementation of cloud specific information security controls
    • ISO27018 Protection of personal data in the cloud
    • ISO22301 Business Continuity

SOC 1 and SOC 2 Type 2 audits are performed to validate both the design and the operating effectiveness of the security controls. The SOC 2 Type 2 report can be requested directly from the SAP Trust Center, subject to an NDA. SOC 1 Type 2 reports are available via the SAP Trust Center to existing customers who have a production instance and a valid NDA.


SAP Integration Suite – Generative AI based Integration Flow Generation

Introduction

SAP Cloud Integration version 6.54.**, one of the capabilities of SAP Integration Suite, comes with an enhancement to its Generative AI (GenAI) features: GenAI-based integration flow generation. This feature is available only in the SAP Integration Suite Cloud Integration premium editions, in specific regions/data centres: JP10, EU10, AP10, US10, EU10-003, EU10-002, and US10-002. The rollout is forecast for calendar weeks 29–31 of 2024 (tentative timelines, subject to change as per the phased rollout plan).

Enabling GenAI Based Integration Flow Generation feature

As the tenant administrator of your premium edition tenant, you need to enable this feature on the Settings page of SAP Integration Suite. By enabling this feature, you agree to the relevant terms and conditions.

Below are sample screenshots of these steps:

Settings Tab for Generative AI
Click on Edit
Switch ON the flag
Terms and Conditions

Clicking the Save button records and displays the user who has accepted the terms and conditions. Click Save and then Cancel to leave edit mode.

Generating Integration Flows

Once the tenant administrator has switched this feature on, you – in the integration flow developer persona – can use it to generate integration flows.

GenAI-based integration flow generation makes use of the Unified Cloud Landscape (UCL) concept. For that, you need to configure your account's System Landscape with an SAP Integration Suite formation and add the required systems to that formation.

Note: The SAP Integration Suite formation/system type is being rolled out to the relevant data centres. This blog will be updated once the rollout is complete.

If you have successfully created an SAP Integration Suite formation with some systems (e.g. S/4), the GenAI-based integration flow generation feature browses those systems and their APIs and lists them in the Sender and Receiver sections of the GenAI-based integration flow creation dialog, as shown in the screenshot below.

Below are sample screenshots of the System Landscape configuration: systems with system type SAP Integration Suite (the subaccount where Integration Suite is subscribed) and a formation of type Integration with SAP Integration Suite, along with the other systems participating in the integration.

Some sample systems added

SAP Integration Suite sample formation

If you have not set up the System Landscape configuration as above, the systems and APIs will not be listed for the Sender and Receiver systems, and these fields will be empty in the GenAI-based integration flow dialog.

Now, when you click the Add -> Integration Flow menu in your package view, you are given the option of generating the integration flow with assistance from AI or creating it manually. Below are screenshots of generating integration flows using generative AI.

Click on Package Edit

Edit the package

Click on Add -> Integration Flow menu

Select Integration Flow

Choose GenAI based Integration flow generation option
Provide your scenario description and click the Send button
Observe the AI response and adjust the description if suggestions are made
After correcting the description, click the Send button again. If the scenario description is correct, the AI lists the sender and receiver systems from the configured System Landscape information
Clicking the Select button lists other systems from the System Landscape configuration
The AI suggests an integration scenario name, which you can change
Click the Generate button to generate the integration flow

Clicking the Generate button generates an integration flow, as shown below.

Generated Integration Flow template
Note the Timer flow step generated according to the schedule given in the scenario description in the GenAI dialog
The sender system address configuration is pre-filled from the System Landscape (UCL) system discovery, along with the receiver system

The generated integration flow acts as a template; you need to further configure it and/or add more integration steps to match your end-to-end integration scenario requirements.

Note: As mentioned above, if there are issues with UCL, the systems and APIs will not be listed for the Sender and Receiver systems, and these fields will be empty in the GenAI-based integration flow dialog, as shown in the sample screenshot below. You can still continue with integration flow generation.

GenAI Integration Flow generation dialog when UCL systems & API discovery fails, sender and receiver fields will be empty.

This feature is currently provided on a free and fair usage basis. You are therefore allowed to generate a limited number of integration flows per tenant per calendar month for the next 6 months. The exact number of integration flows that can be generated with GenAI is difficult to state, because it depends on the scenario description you provide, the responses from the GenAI backend, and the back-and-forth communication, all of which consume GenAI transactions. If you exhaust these transactions, you will see the information message below.

GenAI Transaction limits exceeded

Summary

The SAP Integration Suite – Cloud Integration GenAI-based integration flow generation feature helps you bootstrap and accelerate integration development.

As a first step towards generative AI in integration, we have introduced this feature, which is currently able to interpret the scenario description and generate an integration flow with sender and receiver systems only. Going forward, we will enhance this offering to include mediation steps (e.g. converters, mappings) derived from the integration scenario description, to generate a more enriched integration flow.


Create DataType and Message Type artifact in Cloud Integration capability of SAP Integration Suite

Introduction

SAP Cloud Integration version 6.54.xx comes with a new feature where one can create Data Type and Message Type as reusable design-time artifacts in the Cloud Integration capability of SAP Integration Suite.

This feature is available only in SAP Integration Suite standard and above service plans.

The SAP Cloud Integration version 6.54.xx software update is planned for mid-July 2024 (date and time subject to change).

Create DataType:

1. Open the Integration Suite tenant and navigate to Design –> Integrations and APIs.

2. Create an Integration Package or open an existing one.

3. Navigate to the Artifacts tab and click on Edit in the top right corner

4. Click on the Add dropdown and select Data Type from the list.

5. The Add Data Type dialog is displayed with Create (radio button) selected by default.

6. Enter values for the fields Name, ID, Target Namespace, and Description, select the category – Simple Type (selected by default) or Complex Type – for the data type you want to create, and click Add or Add and Open in Editor.

7. On clicking Add, the Data Type artifact with the provided name is created and listed on the Artifacts list page.

8. On clicking Add and Open in Editor, the Data Type artifact is created with the provided name and opened in the editor in display mode.

9. The editor contains three tabs: Overview, Structure and XSD. The Structure tab is shown by default when the artifact is opened. It displays the structure of the data type in a tree table with the following columns:

  • Name: Contains the name of the node (element or attribute). For the root node the name is the same as the name of the data type and cannot be edited.
  • Category: Shows whether the node has subnodes. For the root node it is either Simple Type or Complex Type; for subnodes it can be either Element or Attribute. You cannot change values in this column.
  • Type: Displays the type with which the node is defined. Here you select a built-in data type or a reference to an existing data type for an element or attribute. You must specify a type for attributes.
  • Occurrence: Determines how often elements occur. For attributes, you can determine whether the attribute is optional or required.
  • Restrictions: Displays the facets (if any) defined in case the node is defined by a built-in primitive type or a user-defined Simple Type data type.

10. Switch to edit mode to define and build the structure of the data type. On selecting the first row (root node), the Add dropdown in the table header is enabled and the details of the row are displayed in the right-hand section of the editor.

11. Simple Type data type:

  • No child nodes can be added.
  • The root node is defined by the string built-in primitive data type.
  • Click on the root node and the properties sheet containing the details of the selected node is displayed on the right side of the editor. In edit mode, you can edit the Type and define the restrictions applicable for the selected type.

12. Complex Type data type:

To add child nodes:

  • Click on the root node and the Add dropdown in the table header is enabled.

    Add –> Element to add a child element node
    Add –> Attribute to add an attribute node
    Add –> Rows to add multiple elements/attributes

  • Click on the newly added node and define its details in the properties sheet.

13. Once the structure is defined, click Save to save the artifact as a draft, or Save as Version to save it as a versioned artifact.

14. The XSD tab displays a read-only view of the XSD schema of the Data Type artifact.

Create MessageType:

1. Open the Integration Suite tenant and navigate to Design –> Integrations and APIs.

2. Create an Integration Package or open an existing one.

3. Navigate to the Artifacts tab and click on Edit in the top right corner.

4. Click on the Add dropdown and select Message Type from the list.

5. The Add Message Type dialog opens.

6. Enter values for the fields Name, ID, XML Namespace, Data Type to be Used, and Description, and click Add or Add and Open in Editor.

7. On clicking Add, the Message Type artifact is created and listed on the Artifacts list page.

8. On clicking Add and Open in Editor, the Message Type artifact is created and opened in the Data Type editor with the Structure tab loaded by default in non-edit mode. The root node name is the same as the Message Type name, the Category is Element, and the Type is the data type used (if selected in the Add Message Type dialog).

9. The Overview tab in edit mode is as shown below:

10. XSD tab

11. The data type used to create a Message Type can be changed in the Overview tab or in the Structure tab. Switch to edit mode and select the root node in the Structure tab. The properties sheet is displayed on the right side of the page with the Data Type Used field editable.

12. No other nodes (child nodes) are editable in the Message Type artifact.


SAP Integration Suite, advanced event mesh: Using SAP SuccessFactors solutions as an Event Source

SAP SuccessFactors solutions are cloud-based HCM software applications that support core HR and payroll, talent management, HR analytics and workforce planning, and employee experience management. SuccessFactors solutions are used by more than 235 million users in more than 200 countries and territories around the world.

              SAP SuccessFactors Intelligent Services Events

SuccessFactors already comes with Intelligent Services events, HTTP-based events that help simplify HR workflows. As a result, a number of SAP-built events are already available in SAP SuccessFactors; they can be adjusted to specific use cases and needs and therefore used in event-driven business scenarios around SuccessFactors.

              Intelligent Services Events include for example:

              • Employee Hire – a new worker is created with a specified start date
              • Change in Manager – published after a job information change for an employee that has been assigned to a new manager
              • Change in Employee Location – a worker has moved to a new location

              A list of all available Intelligent Services Events can be found in the SAP SuccessFactors solutions documentation here.

              SAP Event Mesh as an event broker for SuccessFactors

A few years ago my colleague Sai Harish Balantrapu wrote an excellent blog on using SAP Event Mesh as an event broker for SuccessFactors.

We will take the approach that blog describes and adjust it for use with our new offering, SAP Integration Suite, advanced event mesh. In the end the approach remains the same, with just a few adjustments.

Much of the ground we cover here was already described in the original blog. I wondered whether it would make sense to describe only the differences, but in the end decided to give the full picture to make it as easy as possible to follow.

Again, kudos to Sai Harish for all the groundwork!

              SAP Integration Suite, advanced Event Mesh

SAP Integration Suite, advanced event mesh (AEM) is a fully managed event streaming and management service that enables enterprise-wide and enterprise-grade event-driven architecture. Advanced Event Mesh is a distributed mesh of event brokers that can be deployed across environments, both in the cloud and on premise. It offers a full-purpose set of eventing services covering all relevant use cases: AEM supports event streaming, event management, and event monitoring. Brokers scale as required and come in T-shirt sizes to fit different needs.

              High Level Overview of our Approach

On the SuccessFactors side we will create an integration in the Integration Center. The destination for this integration will be REST, and we will choose JSON format for the event. We will add selected fields to the event. Then we will create the destination settings, using basic authentication and REST. The information for the destination has to be looked up in Advanced Event Mesh, so keep it open in parallel.

There is a very important step that, most likely and depending on your individual settings, has to be taken first: changing the ports we use. The standard SAP SuccessFactors port settings don't match the standard Advanced Event Mesh port settings. So either we open up the AEM standard ports in SuccessFactors, or we adjust the ports on the Advanced Event Mesh side. Here we will simply adjust the AEM port settings, since this is very straightforward.

              Preparation on the Advanced Event Mesh side

              Go to the Cluster Manager and select your Event Broker

              Click on Manage

              Click on Advanced Options

              Scroll down to Port Configuration

              Expand Public Endpoint

Check the value for Secured REST Host. The AEM standard setting here is 9443, which is typically blocked by SAP SuccessFactors. By default, the Secured Web Messaging Host is set to port 443 in AEM.

              If you would like to adjust the ports on the AEM side (remember, you could open up the port on the SuccessFactors side as well), click on Edit.

              Change the Secured Web Messaging Host port to a different value (e.g. 7443)

              Then change the Secured REST Host to use port 443.

It might take some time for these settings to take effect.

              Steps on the SuccessFactors side

              Step 1

              Logon to the SuccessFactors Home Page

              Then search for Integration in the search field

              Select Integration Center

              Step 2

              Click on the tile My Integrations

              Step 3

              Click the Create Button to create a new integration between SuccessFactors and Advanced Event Mesh

              Step 4

              Select More Integration Types

              Step 5

              On the next screen, select:

              Trigger Type → Intelligent Services
              Destination Type → REST
              Source Type → SuccessFactors
              Format → JSON

              Click the Create Button

              Step 6

              A list of all available Intelligent Service Events is displayed. Let us look at the Employee Hire event. Therefore, select Employee Hire

              Step 7

              The Employee Hire event information is displayed in the right-side panel and you can see the fields including a data preview.

              Click the Select button

              Step 8

              Enter an Integration Name and a Description and click Next

              Step 9

              On the next screen, click the + button and select Insert Sibling Element

              Step 10

              Select the newly added Element and enter the following in the Label field: context

              Step 11

              Select the Element “Context” and click the + button and select Add Child Element

              Step 12

              Select the newly added Element and enter the following.

              Label: userId
              Description: userid of the new hire.

Click the mapping button to do a mapping of userId.

              Step 13

              Click Entity Tree View and Select User ID

              Step 14

              Click Change Association to “User ID”

              Step 15

              Select the context Element, click the + button and choose Add Child Element

              Step 16

              Select the Element and adjust as follows:

              Label: managerId
              Description: Manager of the new Hire
              Default Value: “Enter your user id”

              Step 17

              Click on Next and then on the Response Fields screen click on Next again until you have made it to the Filter screen

              Step 18

              Expand Advanced Filters, then enter the following:

              • Field: context/userId
              • Operation: is equal to
              • Value: <a user in your system>.

              The filter value enables you to test the integration.

              Click on Next.

              Step 19

              In the destination settings click on REST Server Settings and enter the following pieces of information from Advanced Event Mesh:

              Connection Name → <Any Name>
              REST API URL → <From AEM Secured REST Host>+ your topic (e.g. /successfactors)
              User Name → <From AEM Username>
              Password → <From AEM Password>

              To get this information, in Advanced Event Mesh go to your broker. Then select Connect.

              IMPORTANT: you need to add a topic to write to. So add for example /successfactors as a topic at the end of the REST API URL

              Click Next.
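Before saving, it can be useful to verify the AEM REST endpoint and credentials independently of SuccessFactors. The sketch below posts a small test payload to the same topic using Node.js; the host, port, topic, and credentials are placeholders taken from your broker's Connect tab, and the URL shape should be checked against your AEM REST messaging configuration - this test is not part of the original walkthrough.

// Minimal sketch: publish a test message to an AEM topic over the REST messaging endpoint.
// Host, port, topic and credentials are placeholders - copy them from the broker's Connect tab.
const https = require('https');

const payload = JSON.stringify({ context: { userId: 'test-user', managerId: 'test-manager' } });

const req = https.request({
  hostname: 'mr-xxxxxx.messaging.solace.cloud',  // placeholder AEM host
  port: 443,                                     // Secured REST Host port (after the port swap above)
  path: '/successfactors',                       // topic appended to the REST API URL
  method: 'POST',
  auth: 'solace-cloud-client:REPLACE_ME',        // basic auth user:password
  headers: { 'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(payload) },
}, (res) => console.log('publish status:', res.statusCode));

req.on('error', (err) => console.error('publish failed:', err.message));
req.write(payload);
req.end();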

              Step 20

              Save the Integration by clicking the Save button

              Step 21

              Click on “Run Now” to test the event generation.

              Steps on Advanced Event Mesh side

              Employee Hired events are now configured on the SAP SuccessFactors side and will be written to a topic. In order to receive these events in Advanced Event Mesh we will create a queue and a queue subscription to our topic.

Step 22

              Go to the Cluster Manager in Advanced Event Mesh, select your broker and click on Manage

Step 23

              Click on Queues

Step 24

              A new window opens up. Click on the +Queue button

Step 25

              Enter a name for the queue, e.g. SuccessFactors and click Create

Step 26

              On the next screen, click Apply

Step 27

              Click on the queue you have just created. Then click on Subscriptions.

Step 28

              Click on the button +Subscriptions

Step 29

              Enter the topic you had used earlier on the SuccessFactors side as part of the REST URL. Most likely successfactors

              Click Create

Step 30

Go back to the Queues screen. Your queue is now subscribed to your topic.

              Summary and Test

You have set up an event end to end, all the way from SAP SuccessFactors to SAP Integration Suite, advanced event mesh, where it ends up in your queue based on a queue subscription. As a next step you could now consume the event from the queue, for example as in the sketch below.
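To illustrate that next step, here is a minimal consumer sketch using the rhea AMQP 1.0 client for Node.js (AEM brokers also expose an AMQP endpoint). The host, port, queue name, and credentials are placeholders to be taken from your broker's Connect tab; this sketch is an assumption-based illustration, not part of the original walkthrough.

// Minimal sketch: consume messages from the AEM queue over AMQP 1.0 using the rhea client.
// Host, port, queue name and credentials are placeholders from the broker's Connect tab.
const container = require('rhea');

container.on('message', (context) => {
  console.log('received event:', context.message.body);
});

const connection = container.connect({
  host: 'mr-xxxxxx.messaging.solace.cloud',  // placeholder AEM host
  port: 5671,                                // secured AMQP port (check the broker's port configuration)
  transport: 'tls',
  username: 'solace-cloud-client',           // placeholder credentials
  password: 'REPLACE_ME',
  reconnect: true,
});

connection.open_receiver('SuccessFactors');  // the queue created above (check how your broker maps AMQP addresses)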

              This diagram again shows the steps you have taken in SuccessFactors.

              Now you can test your setup.

              Go back to SuccessFactors and select your integration. Click the Run Now button to test the integration.

              Once the integration has run successfully you can see this in the status of Last Run Time.

              Once that has happened, go back to your queue in Advanced Event Mesh and check on whether the event has ended up in your queue.

              You should now have a basic setup for event exposure from SAP SuccessFactors to SAP Integration Suite, advanced event mesh up and running.


AWS Serverless Lambda Functions to integrate S3 Buckets with SAP S/4 HANA using OData APIs

              Introduction

This blog shows how to integrate AWS S3 buckets, AWS Lambda functions, and SAP S/4HANA OData APIs.

              For those unfamiliar with AWS S3 and Lambda functions, here are descriptions from the AWS websites:

AWS (Amazon Web Services) Lambda is a serverless, event-driven service that allows you to execute any type of application logic dynamically without the need for dedicated servers. Lambda functions can be triggered from most AWS services, and you only pay for what you use.

AWS (Amazon Web Services) Simple Storage Service (Amazon S3) is one of the leading cloud-based object storage solutions and can be used for all data sizes, from databases to data lakes and IoT. Its scalability, reliability, simplicity, and flexibility, combined with full API functionality, are state of the art.

              Architecture

Here is an architecture diagram for the prototype. Normally the SAP S/4HANA system would be behind a firewall and there would be another layer to safely expose the services. To keep the prototype simple, we focus the scenario on AWS S3, AWS Lambda, and the SAP S/4HANA OData API call.

              AWS S3 Bucket triggering Lambda function to SAP OData API call

              Background

I was having an architecture discussion last week with Jesse, a good friend of mine with whom I used to work at SAP Labs in Palo Alto. One of the things I have really enjoyed over the years is talking to him about new technologies and how to leverage them in the context of SAP. We have implemented many cutting-edge SAP integration projects and witnessed the evolution of SAP integration, starting from the lowest level of C, C++, Java, and Microsoft COM/.NET components in the RFC SDK, through Business Connector, BizTalk integration, XI, PI, PO, and CPI, to the latest SAP BTP Cloud Integration with the SAP API Business Hub and SAP API Portal.

              My friend mentioned an interesting scenario where he built serverless lambda functions on AWS that could be triggered on a scheduled basis to run the logic which in his case was to check for the prices of a certain item on a website and trigger an alert if the prices reached a certain threshold – similar to the Kayak application which checks for prices across Expedia, Travelocity, etc. No need for dedicated server….No need to build or maintain a server… Just the logic code run on demand… What a powerful and amazing concept!

I immediately started thinking of all the possibilities and how this could be used for an SAP-focused integration scenario. Let's drop a sales order file into an AWS S3 bucket and have it immediately trigger an AWS Lambda function written in Node.js that invokes the SAP sales order OData RESTful service, with the response dropped into another AWS S3 bucket. You can picture this as the evolution of the traditional integration scenario whereby a file is dropped on an SFTP server, a middleware server regularly polls this folder for new files, and the backend SAP system is then called through a BAPI or custom RFC, or via the modern OData approach. We get away from the antiquated SFTP servers and use the more versatile, flexible, and powerful S3 bucket technology. We get away from the older SAP BAPIs (BAPI_SALESORDER_CREATEFROMDAT2) and move to the latest SAP OData API. Evolution…

              Getting started with the fully activated SAP S/4 HANA 2021 appliance

              So I convinced Roland, another SAP colleague whom I used to work with at SAP Labs in Palo Alto, to get this scenario up and running. We decided to spin up a new trial SAP S/4 HANA 2021 full appliance on the AWS Cloud through the SAP Cloud Appliance Library and start the adventure.

              SAP S/4 HANA 2021 FPS02 Fully Activate Appliance on SAP CAL

              SAP Cloud Appliance Library

If you have access to an SAP S/4 HANA system that you can call from the internet (through SAP BTP Cloud Integration, SAP API Hub, or a reverse proxy that exposes your cloud or on-premise SAP S/4 HANA system), there is no need to spin up an appliance. If you do not have an SAP S/4 HANA system, then definitely spin one up and follow this blog to implement the integration scenario. It is easy to spin up a test SAP S/4 HANA fully activated appliance, and it only takes a matter of minutes. Getting an SAP system up and running for prototyping used to take weeks of planning the hardware, purchasing the SAP software, installing, patching, and configuring; now it takes minutes with the deployment capabilities of AWS, Azure, and Google.

The new images on SAP's Cloud Appliance Library are really impressive: the SAP Fiori launchpad runs correctly right away, and on the remote desktop machine even the Eclipse installation can be triggered through a few mouse clicks, which loads the SAP ADT (ABAP Development Tools). All of the SAP OData services and SAP Fiori apps are enabled by default, which is very helpful. I have deployed many of these test appliances since this capability was introduced, and it still used to take time to get set up for prototype development. Not anymore. Here you can see the SAP S/4 HANA instance and the remote desktop instance running. I shut down the SAP BusinessObjects BI Platform and SAP NetWeaver instances since they are not needed for this demo.

              SAP S/4 HANA instances running on AWS Cloud

              SAP Sales Order JSON Request and SAP OData API Service

Here is the SAP sales order request JSON that we drop into the AWS S3 bucket. Note that this JSON request works for invoking the standard SAP sales order service API_SALES_ORDER_SRV to create a sales order in the fully activated appliance. For this project, we do not want to add any complexity through mapping requirements; in most common integration scenarios, mapping may be required from other formats such as cXML, OAG XML, etc. Also, the SAP S/4 HANA system would normally not be exposed directly but through SAP BTP Cloud Integration or SAP API Hub, which adds a layer of security control over the interfaces. For now we just want to expose the API through HTTPS to the Lambda function.

              {
                 "SalesOrderType":"OR",
                 "SalesOrganization":"1010",
                 "DistributionChannel":"10",
                 "OrganizationDivision":"00",
                 "SalesGroup":"",
                 "SalesOffice":"",
                 "SalesDistrict":"",
                 "SoldToParty":"10100011",
                 "PurchaseOrderByCustomer":"File dropped into S3 Bucket",
                 "to_Item":[
                    {
                       "SalesOrderItem":"10",
                       "Material":"TG11",
                       "RequestedQuantity":"1"
                    },
                    {
                       "SalesOrderItem":"20",
                       "Material":"TG12",
                       "RequestedQuantity":"5"
                    }
                 ]
              }

              To verify that the SAP sales order service is running, run transaction code /IWFND/MAINT_SERVICE and check to see that the API_SALES_ORDER_SRV (Sales Order A2X) highlighted below is running. If it is not active, make sure to activate it through this transaction code.

              SAP Sales order service – API_SALES_ORDER_SRV

              AWS S3 Bucket for uploading the sales order request file

              Here is our ltc-inbound S3 bucket where we will upload the file to:

              S3 Bucket ltc-inbound

              When we upload the file, this automatically triggers the lambda function. We will show how to do this later in the document:

              File uploaded to the S3 Bucket
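For completeness, the upload can also be scripted instead of using the console. The sketch below pushes the request JSON into the ltc-inbound bucket with the AWS SDK for JavaScript v2; the object key and local file name are placeholders, and credentials/region come from your normal AWS configuration.

// Minimal sketch: upload the sales order request JSON to the inbound bucket,
// which in turn fires the S3 trigger on the Lambda function.
const fs = require('fs');
const aws = require('aws-sdk');

const s3 = new aws.S3({ apiVersion: '2006-03-01' });

s3.putObject({
  Bucket: 'ltc-inbound',                                 // inbound bucket from this scenario
  Key: 'sales-order-request.json',                       // placeholder object key
  Body: fs.readFileSync('./sales-order-request.json'),   // placeholder local file
  ContentType: 'application/json',
}).promise()
  .then(() => console.log('request uploaded, Lambda will be triggered'))
  .catch((err) => console.error('upload failed:', err.message));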

              AWS CloudWatch Monitoring View

              Once the Lambda function is triggered, you can see the log in AWS CloudWatch monitoring tool:

              AWS CloudWatch Lambda function execution log

If you open up the log details, you can see the logging we have performed from our Node.js code – “writing sales order 4969 to S3 bucket ltc-outbound”.

              Lambda function detailed logs on AWS CloudWatch

              AWS S3 Bucket for writing the sales order response file

              In our Lambda function, we save the sales order response file in the ltc-outbound S3 bucket. Note that we have parsed out the order number and named the file with it to make it easier to discern amongst other files.

              Sales Order Response saved to AWS S3 Bucket

              Sales Order response JSON file

              Here is the response sales order JSON:

              {
                  "d":
                  {
                      "__metadata":
                      {
                          "id": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')",
                          "uri": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')",
                          "type": "API_SALES_ORDER_SRV.A_SalesOrderType",
                          "etag": "W/\"datetimeoffset'2022-10-24T06%3A05%3A25.7050130Z'\""
                      },
                      "SalesOrder": "4969",
                      "SalesOrderType": "OR",
                      "SalesOrganization": "1010",
                      "DistributionChannel": "10",
                      "OrganizationDivision": "00",
                      "SalesGroup": "",
                      "SalesOffice": "",
                      "SalesDistrict": "",
                      "SoldToParty": "10100011",
                      "CreationDate": null,
                      "CreatedByUser": "",
                      "LastChangeDate": null,
                      "SenderBusinessSystemName": "",
                      "ExternalDocumentID": "",
                      "LastChangeDateTime": "/Date(1666591525705+0000)/",
                      "ExternalDocLastChangeDateTime": null,
                      "PurchaseOrderByCustomer": "File dropped into S3 J1",
                      "PurchaseOrderByShipToParty": "",
                      "CustomerPurchaseOrderType": "",
                      "CustomerPurchaseOrderDate": null,
                      "SalesOrderDate": "/Date(1666569600000)/",
                      "TotalNetAmount": "105.30",
                      "OverallDeliveryStatus": "",
                      "TotalBlockStatus": "",
                      "OverallOrdReltdBillgStatus": "",
                      "OverallSDDocReferenceStatus": "",
                      "TransactionCurrency": "EUR",
                      "SDDocumentReason": "",
                      "PricingDate": "/Date(1666569600000)/",
                      "PriceDetnExchangeRate": "1.00000",
                      "RequestedDeliveryDate": "/Date(1666569600000)/",
                      "ShippingCondition": "01",
                      "CompleteDeliveryIsDefined": false,
                      "ShippingType": "",
                      "HeaderBillingBlockReason": "",
                      "DeliveryBlockReason": "",
                      "DeliveryDateTypeRule": "",
                      "IncotermsClassification": "EXW",
                      "IncotermsTransferLocation": "Walldorf",
                      "IncotermsLocation1": "Walldorf",
                      "IncotermsLocation2": "",
                      "IncotermsVersion": "",
                      "CustomerPriceGroup": "",
                      "PriceListType": "",
                      "CustomerPaymentTerms": "0001",
                      "PaymentMethod": "",
                      "FixedValueDate": null,
                      "AssignmentReference": "",
                      "ReferenceSDDocument": "",
                      "ReferenceSDDocumentCategory": "",
                      "AccountingDocExternalReference": "",
                      "CustomerAccountAssignmentGroup": "01",
                      "AccountingExchangeRate": "0.00000",
                      "CustomerGroup": "01",
                      "AdditionalCustomerGroup1": "",
                      "AdditionalCustomerGroup2": "",
                      "AdditionalCustomerGroup3": "",
                      "AdditionalCustomerGroup4": "",
                      "AdditionalCustomerGroup5": "",
                      "SlsDocIsRlvtForProofOfDeliv": false,
                      "CustomerTaxClassification1": "",
                      "CustomerTaxClassification2": "",
                      "CustomerTaxClassification3": "",
                      "CustomerTaxClassification4": "",
                      "CustomerTaxClassification5": "",
                      "CustomerTaxClassification6": "",
                      "CustomerTaxClassification7": "",
                      "CustomerTaxClassification8": "",
                      "CustomerTaxClassification9": "",
                      "TaxDepartureCountry": "",
                      "VATRegistrationCountry": "",
                      "SalesOrderApprovalReason": "",
                      "SalesDocApprovalStatus": "",
                      "OverallSDProcessStatus": "",
                      "TotalCreditCheckStatus": "",
                      "OverallTotalDeliveryStatus": "",
                      "OverallSDDocumentRejectionSts": "",
                      "BillingDocumentDate": "/Date(1666569600000)/",
                      "ContractAccount": "",
                      "AdditionalValueDays": "0",
                      "CustomerPurchaseOrderSuplmnt": "",
                      "ServicesRenderedDate": null,
                      "to_Item":
                      {
                          "results":
                          []
                      },
                      "to_Partner":
                      {
                          "__deferred":
                          {
                              "uri": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')/to_Partner"
                          }
                      },
                      "to_PaymentPlanItemDetails":
                      {
                          "__deferred":
                          {
                              "uri": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')/to_PaymentPlanItemDetails"
                          }
                      },
                      "to_PricingElement":
                      {
                          "__deferred":
                          {
                              "uri": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')/to_PricingElement"
                          }
                      },
                      "to_RelatedObject":
                      {
                          "__deferred":
                          {
                              "uri": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')/to_RelatedObject"
                          }
                      },
                      "to_Text":
                      {
                          "__deferred":
                          {
                              "uri": "https://**.**.***.*:*****/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder('4969')/to_Text"
                          }
                      }
                  }
              }

              Sales Order response JSON file

              Sales Order in SAP S/4 HANA and transaction codes

Here is the list of orders from the SAP VBAK sales order header table, viewed through transaction code SE16N. Note the order number 4969 captured in our AWS CloudWatch logs above.

              Table entries in SAP table VBAK – sales order header table

              We can bring up the sales order in SAP transaction code VA03:

              Sales order transaction VA03

              Here is the sales order – note the customer reference:

              SAP Transaction VA03 – Sales order display

              Lambda function details

              So here are the details of the lambda function.

              AWS Lambda Dashboard

Click on the Create function button:

Choose the Use a blueprint option to get the sample code and choose the Get S3 Object blueprint:

              Use the Blueprint to Get S3 Object
              Blueprint Get S3 Object

Note that we also need to create a role – processOrderRole – which needs permissions to read the S3 bucket file content:

              processOrder Lambda function configuration

              Here is the S3 trigger that invokes the lambda function:

              S3 trigger
              Additional configuration for the lambda function

              Here is the auto-generated Node.js code that is triggered. This code gets the event from which you can get the S3 bucket name and the S3 object key which you can then use to read the file contents.

              Generated Lambda Node.js code

Here is our Node.js code, which takes the input file and then makes the OData RESTful call to the SAP sales order service endpoint. A challenging part we encountered was making the call to the SAP system, which has a self-signed certificate. Also, note that since we are doing an HTTP POST to create the sales order, we need to pass in an x-csrf-token. We get the token from the metadata request using a GET, which is the first call to the system before the POST call.

              Here is the lambda function node.js code. You can reuse this code:

              // Author: Roland & Jay
              // Note: This code requires the request and lodash npm modules
              // Description: This code uses the AWS S3 events and objects and calls the SAP S/4 HANA sales order
              // OData API service
              
              console.log('Loading function')
              
              const aws = require('aws-sdk')
              const request = require('request')
              const {get} = require('lodash')
              
              const s3 = new aws.S3({ apiVersion: '2006-03-01' })
              
              
              exports.handler = async (event, context) => {
                //console.log('Received event:', JSON.stringify(event, null, 2))
              
                // Get the object from the event and show its content type
                const bucket = event.Records[0].s3.bucket.name
                const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '))
                const params = {
                  Bucket: bucket,
                  Key: key,
                }
                try {
                  const obj = await s3.getObject(params).promise()
                  const { ContentType, Body } = obj
                  const body = JSON.parse(Body.toString())
                  await processOrder(body)
                  return ContentType
                } catch (err) {
                  console.log(err)
                  const message = `Error getting object ${key} from bucket ${bucket}. Make sure they exist and your bucket is in the same region as this function.`
                  console.log(message)
                  throw new Error(message)
                }
              }
              
              
              const hostname = '**.**.***.*'
              const port = *****
              const interface = 'sap/opu/odata/sap/API_SALES_ORDER_SRV'
              const auth = {
                user: '*********',
                pass: '**********',
              }
              const bucket = 'ltc-outbound'
              
              const buildCallParameters = (url, request, method = 'GET', extraHeaders = {}, jar = request.jar(), token = 'fetch', body) => {
                console.log('build', method)
                const params = {
                  url,
                  jar,
                  method,
                  rejectUnauthorized: false,
                  requestCert: true,
                  agent: false,
                  auth,
                  headers: {
                    'x-csrf-token': token,
                    ...extraHeaders,
                  },
                }
                return !body ? params : { ...params, body, json: true }
              }
              
              const httpCall = async (url, request, method = 'GET', extraHeaders = {}, jar, token, body) => {
                return new Promise((resolve, reject) => {
                  const params = buildCallParameters(url, request, method, extraHeaders, jar, token, body)
                  request(params,
                    (error, response) => {
                      if (error) {
                        return reject(error)
                      }
                      return resolve(response)
                    })
                })
              }
              
              const postDataToSAP = async function (json, metaDataUrl, postUrl) {
                const jar = request.jar()
                const tokenResp = await httpCall(metaDataUrl, request, 'GET', {}, jar)
                const token = tokenResp.headers['x-csrf-token']
                console.log('token: ', token)
                const postResp = await httpCall(
                  postUrl,
                  request,
                  'POST',
                  { 'Content-Type': 'application/json' },
                  jar,
                  token,
                  json,
                )
                return postResp
              }
              
              const processOrder = async (order) => {
                console.log('starting')
                try {
                  const { body } = await postDataToSAP(
                    order,
                    `https://${hostname}:${port}/${interface}/$metadata`,
                    `https://${hostname}:${port}/${interface}/A_SalesOrder`,
                  )
                  console.log('success: ', body)
                  const orderNum = get(body, 'd.SalesOrder', 'error')
                  console.log(`writing sales order ${orderNum} to S3 bucket ${bucket}`)
                  await putObjectToS3(bucket, `${orderNum}.json`, JSON.stringify(body))
                } catch (error) {
                  console.log('error:', error, ' error')
                }
              }
              
              
              const putObjectToS3 = (bucket, key, data) => {
                const params = {
                  Bucket: bucket,
                  Key: key,
                  Body: data
                }
                return new Promise((resolve, reject) => {
                  s3.putObject(params, (err, data) => {
                    if (err) {
                      return reject(err)
                    }
                    resolve(data)
                  })
                })
              }

This code makes backend calls using the request npm module with await, async functions, and promises. Two SAP backend calls are made: the first gets the metadata and the x-csrf-token; the second, main call creates the sales order. In order to run this code in Node.js, we need to add the npm libraries request and lodash to the Lambda project. To do this, create a directory on your file system, run the following two commands, then zip up the resulting node_modules folder and upload it to the Lambda function:

              npm install request
              
              npm install lodash
              Upload the node_modules zip file with request and lodash npm modules

Besides loading the code, you need to ensure that the Lambda function has permissions to read and write the files to the S3 buckets. The read permissions were created when creating the Lambda function above.

              Here is the role and policy to write out the file to the S3 bucket:

              processOrderRole

              We need to add a policy to the processOrderRole to allow writing to the S3 bucket:

              Attach policies
              Policy allow-sap-lambda-write-to-outbound
              Policy allow-sap-lambda-write-to-outbound-policy
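If you prefer to script this instead of clicking through the console, the sketch below attaches an inline policy to processOrderRole that allows s3:PutObject on the outbound bucket. The policy name, statement, and ARN are illustrative assumptions, not the exact policy shown in the screenshots.

// Minimal sketch: attach an inline policy to processOrderRole that permits
// writing objects into the ltc-outbound bucket. Policy name and ARN are illustrative.
const aws = require('aws-sdk');
const iam = new aws.IAM();

const policyDocument = {
  Version: '2012-10-17',
  Statement: [{
    Effect: 'Allow',
    Action: ['s3:PutObject'],
    Resource: ['arn:aws:s3:::ltc-outbound/*'],
  }],
};

iam.putRolePolicy({
  RoleName: 'processOrderRole',
  PolicyName: 'allow-sap-lambda-write-to-outbound-policy',
  PolicyDocument: JSON.stringify(policyDocument),
}).promise()
  .then(() => console.log('write policy attached to processOrderRole'))
  .catch((err) => console.error('failed to attach policy:', err.message));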
