SAP Process Integration – ERP Q&A (erpqna.com)

Exploring End-to-End Business Processes in SAP Business Suite
Mon, 05 Jan 2026

Modern business processes such as procurement, production, sales, service, finance, controlling, and human resources are highly interconnected and characterized by numerous interfaces, dependencies, and dynamic integration points. Consistent master data usage and close cross-functional collaboration are essential to meet time, cost, and quality objectives.

SAP provides an integrated solution portfolio that supports these end-to-end business processes based on modern technologies and evolving business requirements. Due to the breadth and continuous evolution of the SAP portfolio, gaining an initial overview can be challenging. The objective is therefore to offer a structured understanding of SAP process terminology, integration concepts, and corresponding solutions.

Figure: Mapping SAP Solutions – an overview of end-to-end business processes in SAP Business Suite, showing how integrated workflows across finance, procurement, sales, manufacturing, HR, and service improve efficiency, data consistency, and overall business performance.

Finance as a core function of end-to-end business processes

Finance ensures the proper recording, processing and evaluation of all financial transactions of a company. It reflects the financial reality of operations and ensures compliance with legal, regulatory and country-specific accounting standards.

Duties and responsibilities

Its core tasks include the continuous posting of business transactions, the analysis of financial data and its structured reporting. Finance thus provides the basis for transparency, control and external reporting.

Central interfaces

Finance is closely interlinked with other areas of the company:

  • Customers / Debtors: Receivables management and complaint support
  • Sales: Early detection of credit risks and clarification of payment or delivery problems
  • Procurement: Review of incoming invoices and management of accounts payable

These interfaces make finance an integrative cross-sectional function.

Typical results

  • Periodic financial statements (balance sheet, profit and loss account, cash flow statement)
  • Maintain and manage accounts receivable, vendor, and asset data
  • Financial and liquidity reporting for management and controlling

Critical success factors

  • High data quality and timely availability of financial information
  • Transparent assessment of creditworthiness and payment behavior
  • Adherence to compliance and risk management requirements
  • Consistent master data and end-to-end system integration

Conclusion

Finance is a central integration point of end-to-end business processes. Incorrect or delayed data transfers have a direct impact on transparency, controllability and compliance.

Controlling as a management function of the company

Controlling supports management by planning, calculating, analysing and communicating financial information. It creates transparency about the economic situation of the company and provides internal steering, control and decision-making support.

Duties and responsibilities

The core tasks of controlling include internal planning, budgeting, control and reporting on the financial health of the company. The focus is on the analysis of costs, revenues and profitability as well as on the derivation of key performance indicators relevant to management.

Central interfaces

Controlling works closely with operational and market-related departments:

  • Production: Cost and revenue control via cost accounting, budgeting of organizational expenses, and planning and estimating production costs
  • Marketing and Sales: Providing information on sales variances to adjust sales campaigns or correct price ranges

Typical results

  • Analysis results and management reports to assess financial soundness
  • Cost, result and variance analyses
  • Decision-making bases for improving cost accounting and financial planning

Critical success factors

  • Effective, realistic planning and budgeting
  • Accurate, consistent, and timely data
  • Appropriate control and key performance indicator systems

Conclusion

Controlling is the central control body between operational business and management. The quality and timeliness of data as well as clearly defined control systems are crucial for effective corporate management and well-founded decisions.

Human Resources as a Structuring Business Process (R2R – Recruit to Retire)

Human Resources ensures that the company has qualified, efficient and motivated employees. It is responsible for all HR-related business processes along the entire employee lifecycle and thus makes a significant contribution to operational stability and strategic development.

Duties and responsibilities

Core responsibilities include recruiting employees with the right qualifications, onboarding new employees in a structured manner, and managing performance, training, and development opportunities. In addition, HR is responsible for payroll, pension and additional benefits as well as all other cost-relevant personnel topics. Another focus is on improving employee and professional experience.

Central interfaces

Human Resources is a cross-departmental cross-sectional function:

  • All departments: Implement performance management systems, employee and benefits programs, and secure budgets for personnel and compensation changes
  • Internal and external accounting: reconciliation of payroll, HR-related costs, accruals and budget planning

Typical results

  • Employee surveys and evaluations
  • Leadership, training, and development programs
  • Proper payroll
  • Culture, Health, and Employee Retention Programs

Critical success factors

  • Clear strategic alignment of workforce and capacity planning
  • Successfully attracting and retaining talent
  • Effective, transparent performance management
  • Reliable, integrated workforce data and processes

Conclusion in the context of business process analysis

Human resources is a central end-to-end business process. Media breaks, a lack of integration with finance, or unclear responsibilities have a direct impact on costs, employee satisfaction and performance. An end-to-end HR process is therefore an essential prerequisite for stable and scalable business processes.

See the deep dive for more information.

Procurement as an operational and strategic business process (S2P – Source to Pay)

Procurement ensures that the company is supplied with goods and services in a timely, cost-efficient manner and in suitable quality. It supports both day-to-day operations and strategic value creation.

Duties and responsibilities

The core tasks of procurement include overseeing all goods and service requirements, selecting reliable suppliers, and concluding and managing supplier contracts. In addition, procurement purchases items to support internal business operations and sources the parts needed to manufacture products.

Central interfaces

Procurement is closely interlinked with commercial and operational areas:

  • Controlling: Coordination for the planning and budgeting of order items
  • Financial accounting: Close collaboration, as accounts payable is part of financial accounting
  • Departments: Notification of requirements and use of purchasing catalogs

Typical results

  • Available materials in stock
  • Valid supplier contracts
  • Standardized purchasing catalogs for employees

Critical success factors

  • Strategic Sourcing and Effective Stakeholder Management
  • Reliable, efficient and sustainable suppliers
  • Efficient, transparent purchasing processes

Conclusion

Procurement is a central end-to-end business process that directly influences costs, delivery capability and productivity. Close integration with controlling and financial accounting as well as efficient, digitally supported processes are crucial for a stable supply and sustainable value creation.

See the deep dive for more information.

Sales as a revenue-driving end-to-end business process (L2C – Lead to Cash)

Sales is responsible for the sale of goods and services and is largely responsible for revenue generation and profitability. It covers the entire process from new customer acquisition to quotation preparation and order fulfillment.

Duties and responsibilities

The core tasks of sales include the acquisition of new customers, the maintenance of existing customer relationships and the implementation of presales activities. Sales creates quotes, takes customer orders, fulfills them in collaboration with the supply chain, and plans future sales.

Central interfaces

Sales is closely involved in cross-departmental processes:

  • Marketing: Using marketing leads to create targeted offers and contracts
  • Internal and external accounting: Reporting and reconciliation of revenues from customer orders
  • Service: Support of service sales, e.g. through consulting or support services

Typical results

  • Quotes and confirmed customer orders
  • Long-term and sustainable customer relationships
  • Revenue and sales plans

Critical success factors

  • Effective sales estimates and realistic sales planning
  • Seamless integration into the supply chain
  • High customer satisfaction
  • Sustainable revenue generation and increased profitability

Conclusion

Sales is a central end-to-end business process whose performance depends directly on integration with marketing, service, logistics and finance. Media disruptions or a lack of coordination have a direct impact on customer satisfaction, sales and margins.

See the deep dive for more information.

Production as a value-adding core process (D2O – Design to Operate)

Production is responsible for manufacturing finished products using materials, machines and labour. The aim is on-time, high-quality production at the lowest possible cost – in the example of the Bike Company, the production of bicycles.

Duties and responsibilities

The core tasks of production include the conversion of materials into finished products and compliance with defined manufacturing processes. Standardised processes reduce production costs, ensure quality and meet delivery deadlines.

Central interfaces

Production is closely linked to commercial and upstream functions:

  • Controlling / internal accounting: Estimation, planning and analysis of production costs
  • Financial accounting: Reporting of production and cost data as well as optimization of product costs
  • Procurement: Ensuring material availability through timely purchase of the required components

Typical results

  • Finished bikes that meet quality standards
  • Timely delivery of products
  • Production and cost data for other divisions

Critical success factors

  • Availability of materials, machines and personnel
  • Efficient, stable production processes
  • Minimal variances between actual and planned costs

Conclusion

Production is the company’s central value-added process. Close integration with procurement, finance and controlling is crucial to controlling costs, ensuring quality and reliably guaranteeing delivery capability.

See the deep dive for more information.

Service as a customer-loyal end-to-end business process

Service ensures the maintenance, support and servicing of sold products. It makes a significant contribution to customer satisfaction, customer loyalty and long-term value creation.

Duties and responsibilities

The core tasks of the service include the execution of maintenance and services, the provision of round-the-clock support and the planning and handling of service visits. A central component is ensuring the availability of spare parts and qualified service staff.

Central interfaces

The service is closely interlinked with other company divisions:

  • Sales: Sharing service histories, handling customer and support requests, and supporting service sales
  • Procurement: Coordination to ensure that required spare parts are available and delivered within 24 hours

Typical results

  • Round-the-clock support and service delivery
  • Service visits carried out with spare parts and qualified service employees
  • Fully documented service histories

Critical success factors

  • Short response times and spare parts availability within 24 hours
  • Effective planning of service costs and deadlines
  • Use of digital technologies, e.g. Internet of Things (IoT) for premium bicycles

Conclusion

Service is a crucial post-sale end-to-end business process. Close integration with sales and procurement as well as data-based, digitally supported service processes are central to high customer satisfaction and sustainable customer relationships.

Process Integration

The following graphic illustrates process steps within the end-to-end processes Lead to Cash, Source to Pay, Design to Operate, Recruit to Retire and their main integration points.

Figure: Process Integration

End-to-End Business Processes and Integration Challenges

SAP end-to-end (E2E) business processes cover the entire flow of a business function across departments, eliminating fragmentation. Sales, supply chain, procurement, finance, and HR are integrated into a single, intelligent workflow.

Key Benefits of SAP

  • Integration: Unified platform connecting all core business functions
  • Efficiency: Automation reduces manual effort and errors
  • Data Consistency: Centralized data as a single source of truth
  • Compliance: Built-in support for regulatory requirements
  • Scalability: Flexible solutions that grow with the business
  • Real-Time Insights: Up-to-date analytics for better decisions
  • Customer Satisfaction: Faster, more reliable processes improve service quality

Int4 Suite Agents Empowers Functional Consultants To Test Integrated SAP S/4HANA Business Processes
Fri, 21 Nov 2025
Introduction

Integrated business processes are the bloodstream of SAP systems. Every Sales Order, Purchase Order, Delivery, and Invoice has to flow smoothly, not just within SAP, but across EDI partners (customers, vendors, 3PL partners), banks, warehouses, and tax portals.

Here’s the paradox: SAP S/4HANA projects have plenty of sophisticated automation tools, but they rarely help functional consultants in their manual tests. Instead, those tools get pushed into the narrow niche of automation testers. Functional consultants treat them like mythical dragons: complicated, dangerous, and likely to drag them away from their real work into procedural swamps.

The result? Slow testing cycles, dependency on integration specialists, and endless waiting for external partners to provide messages or confirmations.

Changing the story with simulation agents

The better path is not to force functional consultants into scripting or automation frameworks, but to give them simulation agents that mimic the system environment.

Instead of saying “learn a test framework and run automated scripts,” we can say: “here are agents that simulate your missing EDI partner or your unavailable third-party/legacy system, and you can test with them right now.”

This changes the game:

  • No competition with automation teams.
  • No learning curve with complex frameworks and procedural delays.
  • Consultants get something they instantly understand: realistic test conditions on demand, using actual historical production data.

How Int4 Suite Agents Work

With Int4 Suite, simulation agents provide a simple interface: the consultant performs the transaction in SAP, the agent feeds in authentic historical test data, and then automatically checks whether the newly generated EDI or non-EDI message matches what was sent in production.

Below are examples of key agents and how they fit into typical integrated OTC (order-to-cash) and P2P (procure-to-pay) processes.

Figure – meet the Int4 Suite Agents

EDI Partner Agent (based on historical data)

Role: Replays authentic production EDI messages from trading partners (ORDERS, DESADV, INVOIC).

How it works:

  • Consultant performs the transaction in SAP (e.g., creates delivery, sends invoice).
  • Agent provides historical test data from previously exchanged documents.
  • Agent automatically compares the newly generated EDI message with the production one for a similar case.
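
The comparison step above can be sketched in a few lines. This is an illustrative sketch only, not the Int4 Suite API: the simplified EDIFACT syntax, segment tags, and the choice of which segments to treat as volatile are all assumptions for demonstration.

```python
# Illustrative sketch only -- not the Int4 Suite API. It shows the comparison
# principle an EDI partner agent applies: segment-by-segment comparison of a
# newly generated EDI message against its production counterpart, while
# skipping segments that legitimately differ between runs (envelope
# timestamps, document numbers).

def parse_edifact_segments(message: str) -> list[list[str]]:
    """Split a simplified EDIFACT message into segments and data elements."""
    segments = [s for s in message.strip().split("'") if s]
    return [seg.split("+") for seg in segments]

def compare_messages(new_msg: str, prod_msg: str,
                     volatile_tags=("UNB", "BGM")) -> list[str]:
    """Return human-readable differences, ignoring volatile segments."""
    diffs = []
    new_segs = parse_edifact_segments(new_msg)
    prod_segs = parse_edifact_segments(prod_msg)
    for i, (new, prod) in enumerate(zip(new_segs, prod_segs)):
        if new[0] in volatile_tags:
            continue  # timestamps / document numbers differ between runs
        if new != prod:
            diffs.append(f"segment {i} ({new[0]}): test={new} production={prod}")
    if len(new_segs) != len(prod_segs):
        diffs.append(f"segment count: test={len(new_segs)} production={len(prod_segs)}")
    return diffs

# Example: after a pricing condition change, only the PRI segment differs;
# the envelope (UNB) and document number (BGM) differences are ignored.
prod = "UNB+UNOC:3+SENDER+RECEIVER+250101:1200'BGM+380+INV001'PRI+AAA:100.00'"
test = "UNB+UNOC:3+SENDER+RECEIVER+260105:0900'BGM+380+INV999'PRI+AAA:105.00'"
print(compare_messages(test, prod))
```

In the example, the agent flags only the changed price segment, which is exactly the signal a consultant needs after a pricing condition change.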

OTC examples:

  • Consultant in the OTC team replays historical ORDERS from the largest customer and verifies whether, after pricing condition changes, the system still calculates correctly.
  • Consultant tests goods receipt with historical DESADV data; agent compares the new EDI message against the production one.
  • Consultant issues a sales invoice (INVOIC) and agent validates it against the original production invoice, checking VAT rules.
Figure – Select Historical EDI messages from Production system which need to be rerun on Test System

P2P examples:

  • Consultant creates a purchase order; agent provides a historical ORDRSP where the supplier delivered partially, then compares the new outbound message.
  • Agent simulates a supplier INVOIC and verifies whether workflows behave the same after configuration changes.
Figure – manipulate the historical/production landscape EDI message data before sending it to the test environment

Unavailable System Agent (Non-SAP)

Role: Simulates external systems (banks, customs, WMS/TMS, tax portals) with historical production communications.

How it works:

  • Consultant runs the business process in SAP.
  • Agent injects historical test data from the external system.
  • Agent compares the new outbound message with the original production one.

OTC examples:

  • Consultant tests e-invoicing using a historically rejected invoice; agent checks whether the new output matches the original and if the new rules handle it.
  • Consultant tests shipment confirmations with historical WMS responses.

P2P examples:

  • Consultant tests bank payments; agent supplies historical payment files and checks the new output structure.
  • Consultant tests tax submissions; agent provides historical records and compares new vs. old messages.
Figure – Display the historical EDI data used on production landscape before rerunning that on the test environment

Historical Data Agent

Role: The production message librarian; it replays large volumes or special cases directly from production.

How it works:

  • Consultant triggers transactions in SAP (bulk orders, invoices, returns).
  • Agent provides the historical payloads.
  • Agent verifies test messages against the production equivalents.

OTC examples:

  • Consultant replays a “Black Friday” scenario with 10,000 ORDERS; agent validates each new EDI message against its production twin.
  • Consultant tests credit memo flows from historical complaint cases.
Figure – run many historical messages on the test environment for bulk testing purposes

P2P examples:

  • Consultant tests bulk supplier invoices; agent validates the outputs against production.
  • Consultant tests blocked spare-parts orders with historical references.

Integration Consultant Agent

Role: A technical assistant that retrieves and compares messages from middleware layers (PI/PO, CPI).

How it works:

  • Consultant executes the business process in SAP.
  • Agent fetches the historical integration payload.
  • Agent highlights differences between new and historical messages.
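
The field-level highlighting described above can be illustrated with a small payload diff. This is a sketch only, not the Int4 Suite API: the XML payload structure and field names are invented for demonstration.

```python
# Illustrative sketch only -- not the Int4 Suite API. It shows how an agent can
# highlight field-level differences between a newly generated middleware
# payload and the historical production payload, reported as element paths.

import xml.etree.ElementTree as ET

def flatten(elem, path=""):
    """Flatten an XML payload into {path: text} pairs."""
    path = f"{path}/{elem.tag}"
    fields = {}
    if elem.text and elem.text.strip():
        fields[path] = elem.text.strip()
    for child in elem:
        fields.update(flatten(child, path))
    return fields

def highlight_differences(new_xml: str, prod_xml: str) -> dict:
    """Return {path: (test_value, production_value)} for every differing field."""
    new_fields = flatten(ET.fromstring(new_xml))
    prod_fields = flatten(ET.fromstring(prod_xml))
    return {
        p: (new_fields.get(p), prod_fields.get(p))
        for p in sorted(set(new_fields) | set(prod_fields))
        if new_fields.get(p) != prod_fields.get(p)
    }

# Example: after a mapping change, only the currency field deviates
# from the historical production payload.
prod = "<Order><Header><Currency>EUR</Currency><Buyer>ACME</Buyer></Header></Order>"
test = "<Order><Header><Currency>USD</Currency><Buyer>ACME</Buyer></Header></Order>"
print(highlight_differences(test, prod))
```

Reporting differences as paths with both values lets the consultant see at a glance which mapping rule changed, without opening the middleware monitoring tools.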

OTC examples:

  • Consultant creates a sales order; agent compares the IDoc/XML message with the historical SAP Integration Suite payload.
  • Agent highlights mapping differences at field level after configuration changes.
Figure – fetch an EDI payload produced by the integration platform (SAP Integration Suite, etc.) from a newly created business document and compare it with the historical one from the production landscape, without asking an SAP integration consultant for help

P2P examples:

  • Consultant enters a supplier invoice; agent pulls historical Ariba-CPI payload and checks consistency.
  • Agent validates purchase order messages across production and test runs.

Why it matters for SAP S/4HANA projects

In S/4HANA transformations, the external world doesn’t care about your internal redesigns. Customers, suppliers, and banks still expect exactly the same messages they used to get. Outbound and inbound interfaces are fragile bridges that must remain stable.

By equipping functional consultants with Int4 Suite agents:

  • Test cycles shorten dramatically.
  • Reliance on external partners and scarce integration resources drops.
  • Confidence in end-to-end quality rises.

This isn’t about replacing automation experts or integration teams. It’s about enabling functional consultants to independently confirm that what leaves SAP (or comes into it) is still what the outside world expects.

It’s the missing puzzle piece for smooth, low-friction testing of integrated business processes in SAP transformations.

SAP Integration Suite – Generative AI based Integration Flow Generation
Sat, 27 Jul 2024
Introduction

SAP Cloud Integration version 6.54.**, one of the capabilities of SAP Integration Suite, comes with an enhancement to its Generative AI (GenAI) features: GenAI-based Integration Flow Generation. This feature will be available only in SAP Integration Suite Cloud Integration Premium editions, in specific regions/data centres: JP10, EU10, AP10, US10, EU10-003, EU10-002, and US10-002. Rollout is forecast for calendar weeks 29–31 of 2024 (tentative timeline, subject to change per the phased rollout plan).

Enabling GenAI Based Integration Flow Generation feature

As the tenant administrator of your premium edition tenant, you need to enable this feature on the Settings page of SAP Integration Suite. When enabling it, you must agree to the relevant terms and conditions.

The following sample screenshots illustrate these steps:

Settings Tab for Generative AI
Click on Edit
Switch ON the flag
Terms and Conditions

Clicking Save displays the user who accepted the terms and conditions. Click Save, then Cancel, to leave edit mode.

Generating Integration Flows

Once this feature is switched on by the tenant administrator, you, as an integration flow developer, can use it to generate integration flows.

GenAI-based integration flow generation makes use of the Unified Cloud Landscape (UCL) concept. For this, you need to configure your account’s System Landscape with an SAP Integration Suite formation and add the required systems to that formation.

Note: SAP Integration Suite formation/system type is getting rolled out to relevant data centres. This blog note will be updated once rollout is complete.

If you have successfully created an SAP Integration Suite formation with some systems (e.g. S4), the GenAI-based integration flow generation feature will browse the systems and APIs and list them in the Sender and Receiver sections of the GenAI-based integration flow creation dialog, as shown in the screenshot below.

The following sample screenshots show the System Landscape configuration: systems of type SAP Integration Suite (the subaccount where Integration Suite is subscribed), and a formation of type Integration with SAP Integration Suite, along with the other systems participating in the integration.

Some sample systems added

SAP Integration Suite sample formation

If you have not set up the System Landscape configuration as above, systems and APIs will not be listed for the Sender and Receiver systems, and these fields will be empty in the GenAI-based integration flow dialog.

Now, when you click Add -> Integration Flow in your package view, you are given the option of generating the integration flow with AI assistance or creating it manually. Below are screenshots of generating integration flows using generative AI.

Click on Package Edit

Edit the package

Click on Add -> Integration Flow menu

Select Integration Flow

Choose GenAI based Integration flow generation option
Provide your scenario description and click Send
Observe the AI response and correct the description if suggestions are made
After correcting the description, click Send again. If the scenario description is correct, the AI lists the sender and receiver systems from the configured System Landscape information
Clicking Select lists other systems from the System Landscape configuration
The AI suggests an integration scenario name, which you can change
Click Generate to generate the integration flow

Clicking Generate creates an integration flow, as shown below.

Generated Integration Flow template
Observe the Timer flow step generated according to the schedule given in the scenario description in the GenAI dialog
The sender system address configuration, along with the receiver system, has been pre-filled based on system discovery from the System Landscape (UCL) information

The generated integration flow acts as a template; you need to configure it further and/or extend it with additional integration steps to match your end-to-end integration scenario requirements.

Note: As mentioned above, if there are issues with UCL, systems and APIs will not be listed for the Sender and Receiver systems, and these fields will be empty in the GenAI-based integration flow dialog, as shown in the sample screenshot below. You can still continue with integration flow generation.

GenAI integration flow generation dialog when UCL system and API discovery fails; the sender and receiver fields are empty.

This feature is currently provided on a free and fair usage basis. You will therefore be allowed to generate a limited number of integration flows per tenant per calendar month for the next six months. The exact number of integration flows that can be generated with GenAI is difficult to state, because it depends on the scenario description you provide, the responses from the GenAI backend, and the back-and-forth communication, all of which consume GenAI transactions. If you exhaust these transactions, you will see the following message.

GenAI Transaction limits exceeded

Summary

The Cloud Integration GenAI-based integration flow generation feature of SAP Integration Suite will help you bootstrap and accelerate integration development.

As a first step towards generative AI in integration, we have introduced this feature, which is currently able to interpret the scenario description and generate an integration flow with sender and receiver systems only. Going forward, we will enhance this offering to include mediation steps (e.g. converters, mappings) based on the integration scenario description, generating a richer integration flow.

Create DataType and Message Type artifact in Cloud Integration capability of SAP Integration Suite
Wed, 10 Jul 2024
Introduction

SAP Cloud Integration version 6.54.xx comes with a new feature that lets you create Data Type and Message Type artifacts as reusable design-time artifacts in the Cloud Integration capability of SAP Integration Suite.

This feature is available only in SAP Integration Suite standard and above service plans.

The SAP Cloud Integration version 6.54.xx software update is planned for mid-July 2024 (date and time subject to change).

Create DataType:

1. Open the Integration Suite tenant and navigate to Design –> Integrations and APIs.

2. Create an Integration Package or open an existing one.

3. Navigate to the Artifacts tab and click on Edit in the top right corner

4. Click the Add drop-down and select Data Type from the list.

5. The Add Data Type dialog is displayed with the Create radio button selected by default.

6. Enter values for the fields Name, ID, Target Namespace, and Description, select the category – Simple Type (selected by default) or Complex Type – for the data type you want to create, and click Add or Add and Open in Editor.

7. On clicking Add, the Data Type artifact is created with the provided name and listed on the Artifacts list page.

8. On clicking Add and Open in Editor, the Data Type artifact is created with the provided name and opened in the editor in display mode.

9. The editor contains three tabs: Overview, Structure, and XSD. The Structure tab is shown by default when the artifact is opened. It displays the structure of the data type in a tree table with the following columns:

  • Name: the name of the node (element or attribute). For the root node, the name is the same as the name of the data type and cannot be edited.
  • Category: shows whether the node is a simple or complex construct. For the root node it is either Simple Type or Complex Type; for subnodes it is either Element or Attribute. You cannot change values in this column.
  • Type: the type with which the node is defined. Here you select a built-in data type or a reference to an existing data type for an element or attribute. You must specify a type for attributes.
  • Occurrence: determines how often elements occur. For attributes, you can determine whether the attribute is optional or required.
  • Restrictions: the facets (if any) defined when the node is based on a built-in primitive type or a user-defined simple type data type.

10. Switch to edit mode to define and build the structure of the data type. On selecting the first row (root node), the Add drop-down in the table header is enabled, and the details of the row are displayed in the right-hand section of the editor.

11. Simple Type data type:

  • No child nodes can be added.
  • The root node is defined by the string built-in primitive data type.
  • Click the root node; the properties sheet with the details of the selected node is displayed on the right side of the editor. In edit mode, you can edit the Type and define the restrictions applicable to the selected type.

12. Complex Type data type:

To add child nodes:

  • Click the root node; the Add drop-down in the table header is enabled.

Add –> Element to add a child element node

Add –> Attribute to add an attribute node

Add –> Rows to add multiple elements/attributes

  • Click the newly added node and define its details in the properties sheet.

13. Once the structure is defined, click Save to save the artifact as a draft, or Save as Version to save it as a versioned artifact.

14. The XSD tab displays a read-only view of the XSD schema of the Data Type artifact.
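As an illustration of the kind of schema the XSD tab renders, a simple type with facet restrictions compiles down to an xs:restriction on a built-in type. The type name and facets below are invented for the example:

```xml
<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified"
           xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <!-- Hypothetical simple type: a string restricted by facets -->
  <xs:simpleType name="OrderNumberType">
    <xs:restriction base="xs:string">
      <xs:maxLength value="10"/>
      <xs:pattern value="[A-Z0-9]+"/>
    </xs:restriction>
  </xs:simpleType>
</xs:schema>
```

The facets (maxLength, pattern) are exactly what the Restrictions column of the Structure tab surfaces.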

Create Message Type:

1. Open the Integration Suite tenant and navigate to Design –> Integrations and APIs.

2. Create an Integration Package or open an existing one.

3. Navigate to the Artifacts tab and click Edit in the top right corner.

4. Click the Add drop-down and select Message Type from the list.

5. The Add Message Type dialog opens.

6. Enter values for the fields Name, ID, XML Namespace, Data Type to be Used, and Description, and click Add or Add and Open in Editor.

7. On clicking Add, the Message Type artifact is created and listed on the Artifacts list page.

8. On clicking Add and Open in Editor, the Message Type artifact is created and opened in the Data Type editor with the Structure tab loaded by default in non-edit mode. The root node name is the same as the Message Type name, the category is Element, and the type is the data type used (if selected in the Add Message Type dialog).

9. The Overview tab in edit mode is as shown below:

10. XSD tab

11. The data type used to create a message type can be changed in the Overview tab or in the Structure tab. Switch to edit mode and select the root node in the Structure tab; the properties sheet is displayed on the right side of the page with the Data Type Used field editable.

12. No other (child) nodes are editable in the Message Type artifact.
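To make the relationship concrete: a message type is essentially a global root element typed by a reusable data type. A hypothetical schema (all names invented) could look like this:

```xml
<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified"
           xmlns:xs="http://www.w3.org/2001/XMLSchema"
           xmlns:dt="http://example.com/demo"
           targetNamespace="http://example.com/demo">
  <!-- Reusable structure: what a Data Type artifact contributes -->
  <xs:complexType name="OrderData">
    <xs:sequence>
      <xs:element name="Orderno" type="xs:string"/>
    </xs:sequence>
  </xs:complexType>
  <!-- The message type adds only the root element, typed by the data type -->
  <xs:element name="OrderMessage" type="dt:OrderData"/>
</xs:schema>
```

This is why only the root node's Data Type Used is editable in the Message Type artifact: the child structure belongs to the referenced data type.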


              The post Create DataType and Message Type artifact in Cloud Integration capability of SAP Integration Suite appeared first on ERP Q&A.

              ]]>
              SAP PO to Integration Suite | Migration Assessment Tool https://www.erpqna.com/sap-po-to-integration-suite-migration-assessment-tool/?utm_source=rss&utm_medium=rss&utm_campaign=sap-po-to-integration-suite-migration-assessment-tool Fri, 05 Jul 2024 10:44:43 +0000 https://www.erpqna.com/?p=86192 SAP Integration Suite is a new age solution designed to streamline the integration of SAP and non-SAP systems. Integration Suite extends its reach to integrate Cloud and On-premises applications, data sources and APIs. SAP is encouraging customers to transition older integration solutions (like PI/PO) to a more unified and cloud centric solution which is the […]

              The post SAP PO to Integration Suite | Migration Assessment Tool appeared first on ERP Q&A.

              ]]>
SAP Integration Suite is a new-age solution designed to streamline the integration of SAP and non-SAP systems. Integration Suite extends its reach to integrate cloud and on-premises applications, data sources, and APIs.

SAP is encouraging customers to transition older integration solutions (like PI/PO) to a more unified and cloud-centric solution: the Integration Suite. This means we must now consider migration assessments from PI/PO to Integration Suite.

Fortunately, SAP provides a ready-to-use Migration Assessment tool with the Integration Suite subscription.

High-level process flow of the Migration Assessment:

              Pre-requisite: Application subscription to Integration Suite in your BTP subaccount.

Preparation: The assessment of an on-premises SAP PO system is performed by a tool enabled in Integration Suite, a cloud-hosted service, so you must first establish a valid, secure connection between the two. This is where the SAP Cloud Connector service comes in.

SAP Cloud Connector is a service hosted on premises; it connects to a BTP subaccount and is then leveraged by cloud/SaaS applications. The diagram below provides a general overview:

On-premises setup and configuration:

Once installed, configure the Cloud Connector by logging in at https://localhost:8443, initially with the name "Administrator" and password "manage"; the password must be changed on first logon.

              Provide the BTP subaccount details where you have your Integration Suite subscription and save.

              Check for successful connection with BTP subaccount on the top bar of the Cloud Connector logon page.

              Now, we will have to create a new connection with the backend SAP PI/PO system.

In the Cloud Connector, in the left sidebar, click 'Cloud to On-Premise' and then click 'Add System'.

              From the drop-down options select ‘SAP Process Integration’

              Next, provide the protocol (HTTP/HTTPS)

              Next, provide PI/PO host and port information, note that this must be accessible from Cloud connector host.

              Next, provide a virtual host and port, which will be mapped to the actual ones

              Next, choose the option to use Virtual host and port

              Next, check Internal host and finish the configuration

If the check result shows 'Reachable', we have a successful connection from the Cloud Connector to SAP PO; otherwise, troubleshooting is required.
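Conceptually, the 'Reachable' check is a TCP connectivity test from the Cloud Connector host to the configured internal host and port. A minimal Python sketch of such a probe (host and port values are your own, not real PI/PO endpoints):

```python
# Rough stand-in for the Cloud Connector "Check Availability" result:
# a host:port counts as "Reachable" when a TCP connection succeeds
# within a timeout.
import socket

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    try:
        # create_connection resolves the host and completes the TCP handshake
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # refused, timed out, or unresolvable -> not reachable
        return False
```

Running `is_reachable("my-po-host", 50000)` from the Cloud Connector machine approximates what the UI check verifies before you proceed.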

In addition, we have to expose the following PO services for the Migration Assessment (service names are listed below):

/CommunicationChannelInService
/IntegratedConfigurationInService
/SenderAgreementInService
/AlertRuleInService
/IntegratedConfiguration750InService
/ValueMappingInService
/ConfigurationScenarioInService
/BPMFacadeBeanImplService
/ReceiverAgreementInService
/rep/read/ext
/dir/read/ext
/rep/support/SimpleQuery

Add each of these services with 'Path and all sub-paths' selected.

This completes our Cloud Connector configuration.

Configuration and setup on the cloud side:

To check connectivity from Integration Suite, log in to Integration Suite, click 'Monitor' in the left pane, and select 'Integrations'.

Go to the 'Cloud Connector' tab, provide the Location ID that was defined during Cloud Connector setup, and click 'Send'.

If you get a message like the one below, there is a successful connection between the Cloud Connector and Integration Suite, which also means we have successfully connected to the on-premises PO system.

Now that connectivity is set up, we can proceed with the assessment.

Please make sure that you have a valid SAP PO user with the following roles/authorizations, and that you have the user's credentials:

              SAP_XI_API_DISPLAY_J2EE

              SAP_XI_API_DEVELOP_J2EE

              SAP_XI_MONITOR_J2EE

              Let us now explore how we can perform the assessment on PI/PO environment.

              Data extraction:

In Integration Suite, select 'Access Migration Scenario' -> 'Create Requests'.

On the left-hand panel, select 'Settings' to add the on-premises PO system.

Fill in the fields; remember the credentials we created in PO and the Location ID of the Cloud Connector.

Now we can select the system and perform a test; look out for the 'successfully connected' message.

With that done:

Click 'Request' -> 'Data Extraction'.

Create a data extraction request.

Provide a request name of your choice and the system (that we created in the previous step).

As soon as you click 'Create', the extraction starts; look out for the 'In Progress' message.

This will take a while; wait for the 'Completed' message and monitor the logs in the meantime.

              Evaluation:

Once the extraction is completed, go to 'Scenario Evaluation'.

Click 'Create'.

The following fields will be pre-filled based on the data extraction we ran in the previous step, especially the 'Data Extraction Request' name.

Click 'Create' and watch for the 'Evaluation Completed' status.

Download the artefacts.

              Assessment result interpretation:

The artefacts contain an Excel sheet and a PDF document.

In the Excel file we get two sheets: 'Evaluation by integration scenario' and 'Full evaluation results'.

Let us look at 'Evaluation by integration scenario' first. It lists all the integration scenarios available in the PO system.

Notice the three attributes: 'Weight', 'T-shirt size', and 'Assessment category'.

The 'Assessment category' signifies how easy or difficult it is to migrate the integrations to Integration Suite. See the following definitions of the assessment categories:

• Ready to migrate: can be migrated automatically using the migration tool; some post-migration effort may still be required.
• Adjustments required: can be migrated partially by the tool; the remaining adjustments must be made before activating the scenario.
• Evaluation required: the tool could not evaluate the scenario, so a manual assessment is required.

'Weight' and 'T-shirt size' are indicators of the effort required for the migration.

T-shirt sizing is derived based on the rules below.

Looking at 'Full evaluation results', we see a set of 'rules' against each 'integration scenario'.

Each rule indicates the interfaces involved, and each rule has a respective weight.

The sum of the weights of all rules gives the weight of the integration scenario.

Another important thing to note: if even one rule is categorized as 'Evaluation required', the whole integration scenario is marked 'Evaluation required' and should be treated accordingly; the same applies to 'Adjustments required' cases.
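The weight and category rollup just described can be sketched in a few lines of Python; the rule weights and values in the test are invented for illustration, not taken from a real report:

```python
# Rollup logic for one integration scenario:
#  - scenario weight = sum of its rule weights
#  - scenario category = "worst" category among its rules
SEVERITY = {
    "Ready to migrate": 0,
    "Adjustments required": 1,
    "Evaluation required": 2,
}

def rollup(rules):
    """rules: list of (weight, category) tuples for one scenario."""
    total_weight = sum(weight for weight, _ in rules)
    worst_category = max((cat for _, cat in rules), key=SEVERITY.__getitem__)
    return total_weight, worst_category
```

So a scenario with nine 'Ready to migrate' rules and a single 'Evaluation required' rule still rolls up to 'Evaluation required'.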

The PDF assessment report also provides a lot of detail.

The summary provides the estimated effort and a view of the assessment categories,

a pie-chart view of the assessment categories.

The report also provides counts of sender and receiver adapters that will be migrated and of those that may not be supported.

Estimations by effort classification:

A tabular view of the effort required for each T-shirt size, against the assessment categories.

The report also provides recommendations for the migration approach, e.g. reference templates from Integration Suite can be used.


              The post SAP PO to Integration Suite | Migration Assessment Tool appeared first on ERP Q&A.

              ]]>
              Test Scenerio – ProcessDirect https://www.erpqna.com/test-scenerio-processdirect/?utm_source=rss&utm_medium=rss&utm_campaign=test-scenerio-processdirect Wed, 03 Jan 2024 09:48:37 +0000 https://www.erpqna.com/?p=80707 Scenerio: Customer Order data is coming from sender using SOAP adaptor. CPI should generate order_no acc. to the given format. After generating order_no we need to add a field company_name. After this we need to check the action is Pending/Not_Available/Delivered. If its pending then its end the flow, if its delivered then it direct to […]

              The post Test Scenerio – ProcessDirect appeared first on ERP Q&A.

              ]]>
Scenario:

Customer order data comes from the sender via the SOAP adapter. CPI should generate an order_no according to the given format. After generating the order_no, we need to add a field company_name. We then check whether the action is Pending/Not_Available/Delivered.

If it is Pending, the flow ends. If it is Delivered, the message is directed to another iFlow via ProcessDirect, where we generate a TransactionId according to the given format; once generated, a mail is sent to the customer and the company admin. If it is Not_Available, the flow ends and a mail is sent to the customer and admin saying the order is not available.

Order_no: a random alphanumeric string of length 6, concatenated with the item and quantity as below.

Example: Item: xyz, Quantity: 1 => Order_no = AB12CDxyz1

TransactionId: a random alphabetic string of length 10, concatenated with the Order_no.

Example: ABCDEFGHIJAB12CD1xyz
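The two formats above can be mirrored in plain Python to make the expected shapes easy to verify (the Groovy custom functions used later in the flow play the same role):

```python
# Generators matching the Order_no and TransactionId formats described above.
import random
import string

def generate_order_no(item: str, quantity: int) -> str:
    """6 random alphanumeric characters, then item and quantity."""
    prefix = "".join(random.choices(string.ascii_uppercase + string.digits, k=6))
    return f"{prefix}{item}{quantity}"

def generate_transaction_id(order_no: str) -> str:
    """10 random alphabetic characters, then the Order_no."""
    prefix = "".join(random.choices(string.ascii_uppercase, k=10))
    return prefix + order_no
```

For Item "xyz" and Quantity 1 this yields strings of the shape AB12CDxyz1 and ABCDEFGHIJAB12CDxyz1 respectively.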

              Steps:

1. Create an iFlow and connect it with the sender using the SOAP 1.x adapter, as this is one-way communication.

2. Add a message mapping palette element and add the source and target XSDs as per the data.

              Source XSD:

<?xml version="1.0" encoding="utf-8"?>
<!-- Created with Liquid Technologies Online Tools 1.0 (https://www.liquid-technologies.com) -->
<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="Order_root">
    <xs:complexType>
      <xs:sequence>
        <xs:element maxOccurs="unbounded" name="Order">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="Orderno" />
              <xs:element name="Cust_Name" type="xs:string" />
              <xs:element name="Cust_Add" type="xs:string" />
              <xs:element name="Item" type="xs:string" />
              <xs:element name="Action" type="xs:string" />
              <xs:element name="Quantity" type="xs:unsignedByte" />
              <xs:element name="Email" type="xs:string" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

              Target XSD:

<?xml version="1.0" encoding="utf-8"?>
<!-- Created with Liquid Technologies Online Tools 1.0 (https://www.liquid-technologies.com) -->
<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="Order_root">
    <xs:complexType>
      <xs:sequence>
        <xs:element maxOccurs="unbounded" name="Order">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="Orderno" />
              <xs:element name="Cust_Name" type="xs:string" />
              <xs:element name="Cust_Add" type="xs:string" />
              <xs:element name="Item" type="xs:string" />
              <xs:element name="Action" type="xs:string" />
              <xs:element name="Quantity" type="xs:unsignedByte" />
              <xs:element name="Company" type="xs:string" />
              <xs:element name="Email" type="xs:string" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

3. Configure the message mapping.

Custom function script (Groovy):

import com.sap.it.api.mapping.*;

// Returns a random 6-character alphanumeric string; the mapping then
// concatenates it with Item and Quantity to form the Order_no.
// arg1 is unused here.
def String customFunc(String arg1) {
    def chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
    def random = new Random();
    def sb = new StringBuilder(6);
    for (int i = 0; i < 6; i++) {
        sb.append(chars.charAt(random.nextInt(chars.length())));
    }
    return sb.toString();
}

4. Add a splitter to split the XML records and send them to the router.

Route 2 ends the flow, as it handles the Pending action.

Route 3 connects to the receiver using the ProcessDirect adapter.

Route 4 connects to a content modifier, as we need the customer mail ID from the respective data.

Route 5 is the default: if the action is not defined, the flow ends.
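The router's decision table can be sketched in plain Python; the route numbers follow the description above, and this is an illustration of the branching logic, not CPI configuration:

```python
# Mirror of the iFlow router: one branch per Action value.
def route_for(action: str) -> str:
    if action == "Pending":
        return "end"              # Route 2: flow ends
    if action == "Delivered":
        return "process_direct"   # Route 3: hand off to the second iFlow
    if action == "Not_Available":
        return "notify_only"      # Route 4: mail customer and admin
    return "default_end"          # Route 5: unknown action, end the flow
```

In the actual router, each route condition would be an XPath expression over the Action element.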

5. Configure route 4 as below.

6. Configure ProcessDirect.

7. Save and deploy this iFlow and create another iFlow for ProcessDirect.

8. Configure ProcessDirect; give the same address as in the previous flow.

9. Add a mapping palette element and the source and target XSDs.

The source XSD should be the same as the previous iFlow's target XSD.

<?xml version="1.0" encoding="utf-8"?>
<!-- Created with Liquid Technologies Online Tools 1.0 (https://www.liquid-technologies.com) -->
<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="Order_root">
    <xs:complexType>
      <xs:sequence>
        <xs:element maxOccurs="unbounded" name="Order">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="Orderno" />
              <xs:element name="Cust_Name" type="xs:string" />
              <xs:element name="Cust_Add" type="xs:string" />
              <xs:element name="Item" type="xs:string" />
              <xs:element name="Action" type="xs:string" />
              <xs:element name="Quantity" type="xs:unsignedByte" />
              <xs:element name="Company" type="xs:string" />
              <xs:element name="Email" type="xs:string" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

              Target XSD:

<?xml version="1.0" encoding="utf-8"?>
<!-- Created with Liquid Technologies Online Tools 1.0 (https://www.liquid-technologies.com) -->
<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="Order_root">
    <xs:complexType>
      <xs:sequence>
        <xs:element maxOccurs="unbounded" name="Order">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="Orderno" type="xs:string" />
              <xs:element name="Cust_Name" type="xs:string" />
              <xs:element name="Cust_Add" type="xs:string" />
              <xs:element name="Item" type="xs:string" />
              <xs:element name="Action" type="xs:string" />
              <xs:element name="Quantity" type="xs:unsignedByte" />
              <xs:element name="Company" type="xs:string" />
              <xs:element name="Transition_ID" type="xs:string" />
              <xs:element name="Email" type="xs:string" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

10. Configure the mapping.

Custom function (Groovy):

import com.sap.it.api.mapping.*;

// Builds the TransactionId: a random 10-character alphabetic string
// followed by the Order_no passed in as arg1.
def String customFunc(String arg1) {
    def chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
    def random = new Random();
    def sb = new StringBuilder(10 + arg1.length());
    for (int i = 0; i < 10; i++) {
        sb.append(chars.charAt(random.nextInt(chars.length())));
    }
    sb.append(arg1);
    return sb.toString();
}

11. Add a content modifier to store the customer email.

12. Configure the XML-to-CSV converter.

13. Configure the mail adapter.

14. Save and deploy the iFlow. Set the log level of both iFlows to Trace in monitoring.

15. Open SoapUI to trigger the data.

16. Add sample data:

<Order_root>
  <Order>
    <Orderno></Orderno>
    <Cust_Name>Sahil</Cust_Name>
    <Cust_Add>xyz</Cust_Add>
    <Item>Pen</Item>
    <Action>Pending</Action>
    <Quantity>1</Quantity>
    <Email>abc@gmail.com</Email>
  </Order>
  <Order>
    <Orderno></Orderno>
    <Cust_Name>Dushyant</Cust_Name>
    <Cust_Add>xyz</Cust_Add>
    <Item>Notebook</Item>
    <Action>Delivered</Action>
    <Quantity>4</Quantity>
    <Email>efg@gmail.com</Email>
  </Order>
  <Order>
    <Orderno></Orderno>
    <Cust_Name>Aman</Cust_Name>
    <Cust_Add>xyz</Cust_Add>
    <Item>Pencil</Item>
    <Action>Not Available</Action>
    <Quantity>2</Quantity>
    <Email>ijk@gmail.com</Email>
  </Order>
</Order_root>

              17. Check the customer mail.


              The post Test Scenerio – ProcessDirect appeared first on ERP Q&A.

              ]]>
              SAP Event Mesh: S4 HANA On-Prem and SAP CI(BTP IS) through WebSocket and Webhooks https://www.erpqna.com/sap-event-mesh-s4-hana-on-prem-and-sap-cibtp-is-through-websocket-and-webhooks/?utm_source=rss&utm_medium=rss&utm_campaign=sap-event-mesh-s4-hana-on-prem-and-sap-cibtp-is-through-websocket-and-webhooks Wed, 11 Oct 2023 09:39:07 +0000 https://www.erpqna.com/?p=78869 Introduction What is Event Mesh? SAP Event Mesh allows applications to communicate through asynchronous events. Experience greater agility and scalability when you create responsive applications that work independently and participate in event-driven business processes across your business ecosystem. In this decoupled integration the producers of the events will not know the consumers of the events. […]

              The post SAP Event Mesh: S4 HANA On-Prem and SAP CI(BTP IS) through WebSocket and Webhooks appeared first on ERP Q&A.

              ]]>
              Introduction

              What is Event Mesh?

              SAP Event Mesh allows applications to communicate through asynchronous events. Experience greater agility and scalability when you create responsive applications that work independently and participate in event-driven business processes across your business ecosystem. In this decoupled integration the producers of the events will not know the consumers of the events.

              Publish business events from SAP and non-SAP sources across hybrid landscapes from the digital core to extension applications through event-driven architecture. Consume business events from SAP and non-SAP sources throughout SAP’s event-driven ecosystem including SAP Extension Suite, SAP Integration Suite, and selected inbound enabled SAP backends. Achieve reliable data transmission for extension and integration scenarios through decoupled communication.

              What is Websocket?

WebSocket is a stateful protocol for client-server communication that lets an application receive data without having to request it from the server, enabling two-way communication. The connection between client and server stays alive until it is terminated by either party. Data is sent and received with far less overhead than separate HTTP request/response exchanges, since each WebSocket frame adds only a few bytes of framing on top of the payload, and WebSocket supports multiple data types. In SAP CI, the AMQP adapter supports the WebSocket transport protocol.

              What is Webhook?

Webhooks are stateless: each event notification is independent and carries all the necessary information about the event. Webhooks are server-to-server communication and are commonly used to perform smaller requests and tasks on top of standard API calls; they follow a POST/push mechanism. In an event-driven architecture, when a producer publishes events to a topic, the queues subscribed to that topic push the event data to the consumer via the webhook, instead of the consumer having to pull events.
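To illustrate the push side, here is a minimal, hypothetical webhook endpoint built with Python's standard library. The port, path, and payload shape are assumptions for the demo, not SAP-defined values:

```python
# Minimal webhook receiver: the broker POSTs an event; we record it and
# acknowledge with a 2xx status, which the caller treats as "delivered".
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

received_events = []  # events delivered by the (hypothetical) broker

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        try:
            received_events.append(json.loads(body))
        except json.JSONDecodeError:
            self.send_response(400)  # malformed payload
            self.end_headers()
            return
        self.send_response(204)  # acknowledge quickly, no response body
        self.end_headers()

    def log_message(self, fmt, *args):
        pass  # silence per-request logging for the demo

def make_webhook_server(port: int = 0) -> HTTPServer:
    """Bind the handler; port 0 picks a free ephemeral port."""
    return HTTPServer(("127.0.0.1", port), WebhookHandler)
```

Because each POST carries the full event, the receiver needs no session state between deliveries, which is exactly the stateless property described above.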

              What is Queue?

A queue stores a message until it is received by the subscribed consumer. A queue does not process messages; it simply stores the message/event. If an event is published to a queue, only one subscriber of that queue receives it. Queues are managed and persistent: when there are no consumers on a queue, messages remain stored in it.

              What is Topic?

Topics are not managed and not persistent. A topic is created on the fly when we start publishing messages to it, and destroyed when no consumers are listening. If no consumer of a topic is running when a message is published, that event is not received. One event producer can publish to one topic, one topic can feed multiple queues, and multiple queues can be subscribed to one topic.
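The queue and topic semantics above can be modelled in a few lines of Python; this is a toy in-memory sketch of the behaviour, not the Event Mesh API:

```python
# Queue: keeps a message until one consumer takes it.
# Topic: delivers only to subscribers present at publish time; otherwise drops.
from collections import deque

class Queue:
    def __init__(self):
        self._messages = deque()

    def publish(self, msg):
        self._messages.append(msg)  # persisted until consumed

    def consume(self):
        return self._messages.popleft() if self._messages else None

class Topic:
    def __init__(self):
        self.subscribers = []  # callables that receive each event

    def publish(self, msg):
        if not self.subscribers:
            return False  # no listener at publish time: event is lost
        for deliver in self.subscribers:
            deliver(msg)
        return True
```

Binding a queue to a topic (as Event Mesh subscriptions do) combines the two: the topic fans out, and each bound queue then persists its copy for its consumer.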

              Prerequisites:

1. BTP cockpit access with an Event Mesh subscription.
2. S/4HANA access to transactions STRUST and SPRO.
3. SAP BTP IS tenant access with the required roles/role collections containing Enterprise Messaging*.

Step-1: Add service plans for Event Mesh in the entitlements of the subaccount, then create the instance and subscription.

              Select the standard and default service plans for the Event Mesh in the Entitlement

              “Standard” for Event Mesh subscription and “default” for instance creation

In Instances and Subscriptions, create the “standard” Event Mesh subscription and the “default” instance on top of the Cloud Foundry space and runtime environment.

While creating the service instance, the instance name and the emname in the JSON descriptor should be the same, and the namespace must follow the required format, e.g. a/b/c.

Step-2: Creation of a service key for the created Event Mesh instance

The clientid, clientsecret, tokenendpoint, protocol, and uri should be noted for the further configurations.
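As an illustration of where those values live, the snippet below picks them out of a service key with the typical shape of the key's messaging section. The actual values shown are made-up placeholders, not a real key:

```javascript
// Hypothetical, truncated Event Mesh service key (placeholder values only).
const serviceKey = {
  messaging: [
    {
      protocol: ['amqp10ws'],
      uri: 'wss://enterprise-messaging-messaging-gateway.cfapps.example.com/protocols/amqp10ws',
      oa2: {
        clientid: 'sb-example-client',
        clientsecret: 'example-secret',
        tokenendpoint: 'https://example.authentication.com/oauth/token'
      }
    }
  ]
};

// Pick the AMQP-over-WebSocket entry and note the values CI will need.
const amqp = serviceKey.messaging.find(m => m.protocol.includes('amqp10ws'));
const { clientid, clientsecret, tokenendpoint } = amqp.oa2;
const url = new URL(amqp.uri);
const host = url.host;       // goes into the adapter's Host field
const path = url.pathname;   // goes into the adapter's Path field
```

Splitting the uri like this makes explicit which part of the key feeds which adapter field in the later steps.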

              Step-3: Creation of new Queue in Instance of Event Mesh

              Step-4: Test publishing and consuming messages on a queue. [Optional]

              Select the queuename and message client name for publishing and consuming the messages.

              Step-5: Creation of OAuth2 client credentials in Security Material of Integration Suite

              Use the TokenEndPoint URL and ClientID and Client Secret from the Event Mesh service Key created.

              Step-6: Produce the events from SAP CI into Event Mesh

A new IFlow needs to be created with either HTTPS as sender (for triggering from Postman) or a timer-based start event with a sample message in a Content Modifier, and AMQP as receiver.

For demo purposes we will use HTTPS as sender and an AMQP receiver with the WebSocket message protocol. The host is taken from the uri in the service key, the port is 443, and the path is the AMQP-over-WebSocket endpoint path derived from the protocol entry of the service key. The credential name comes from the Security Material.

We can publish to topics or queues from the AMQP (WebSocket) adapter of CI.

For a topic, use topic:&lt;topicname&gt; (this topic name should be subscribed to by the queue created in Event Mesh); for a queue, use queue:&lt;queuename&gt;.
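A tiny helper makes the address convention explicit (the topic:/queue: prefixes are the destination format described above; the names are illustrative):

```javascript
// Build the destination string the AMQP adapter expects:
// "topic:<name>" for publishing to a topic, "queue:<name>" for a queue.
function destination(kind, name) {
  if (kind !== 'topic' && kind !== 'queue') {
    throw new Error('kind must be "topic" or "queue"');
  }
  return `${kind}:${name}`;
}
```

For example, `destination('topic', 'a/b/c/mytopic')` yields the address for a namespaced topic, and `destination('queue', 'EMQueue')` the address for a queue.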

Test from Postman using the deployed IFlow endpoint with the clientid and clientsecret of the Process Integration Runtime service key. This is only to understand how the events are published; it is not a real-time scenario.

              Check the CI Message Monitor for successful message to Event Mesh

Check the Event Mesh queue: the message count should have increased from 0 to 1, along with the queue size.

              Step-7: Consume the events from Event Mesh into CI

Consuming the events from Event Mesh into CI can be done either with WebSocket (AMQP) or with a webhook.

In this case we will create a WebSocket (AMQP) connection by configuring a new IFlow with the host and path from the Event Mesh service key, port 443, and the credential name from the Security Material.

Use the queue name in the form queue:&lt;queuename&gt;. When consuming events in CI it is always better to consume from a queue rather than a topic: topics are not persisted, whereas queues are, which gives us guaranteed delivery. As the message is consumed in JSON format, we can convert it into XML as required.
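For flat JSON events, the JSON-to-XML step can be sketched as below. In a real IFlow you would normally use the built-in JSON-to-XML converter or a groovy script rather than hand-rolled code, and the field names here are illustrative:

```javascript
// Convert a flat JSON object into a simple XML element tree.
function jsonToXml(rootName, obj) {
  // escape the XML special characters in text content
  const esc = s => String(s)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
  const fields = Object.entries(obj)
    .map(([k, v]) => `<${k}>${esc(v)}</${k}>`)
    .join('');
  return `<${rootName}>${fields}</${rootName}>`;
}
```

For nested payloads the converter step in CI handles the recursion for you; this sketch only covers the one-level case.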

Once deployed, the message in the queue will be consumed by the CI IFlow.

              Step-8: Create Topic and upload the certificate to produce the Events from S4 HANA onto Event Mesh

              The overall design for S4 HANA to publish and CI to consume can be done in two ways

              A. Websocket Communication

              B. Webhook Communication

WebSocket communication was already explained in the previous steps; we will now check webhook communication.

On top of the queue created in Event Mesh, we need to subscribe it to a topic.

Download the certificate from the token endpoint, along with the clientid and clientsecret generated from the Event Mesh service key.

Upload the certificate in S4 HANA transaction STRUST under SSL Client SSL Client (Standard) and add it to the certificate list.

Step-9: Create an AMQP channel in the SPRO tcode

Go to SAP Reference IMG => ABAP Platform => Enterprise Event Enablement => Administration => Channel Connection Settings => Manage Channel and Parameters

              Use the service key option to create the AMQP channel

              Check the connection and activate the channel

              Step-10: Create outbound binding on the created AMQP channel

              Select the created channel and click on outbound binding and provide the topic name.

              Step-11: Create webhook in the SAP Event Mesh

In order to create the webhook we need an HTTPS URL to be built and deployed. In our case we will create a new IFlow with HTTPS as sender in CI itself.

Create the webhook with the queue name, the deployed CI IFlow endpoint, and authentication using the client id and client secret of the CI Process Integration Runtime.

              Testing:

Test the event from S4 HANA to Event Mesh, which is then pushed through the webhook to CI.

Go to SAP Reference IMG => ABAP Platform => Enterprise Event Enablement => Test Tools => Produce Test Events

              Select the channel name created and trigger the Event.

Check in CI monitoring whether the event triggered from S4 HANA to Event Mesh was pushed to CI through the webhook.

Question 1: What happens when an event triggered from S4 HANA (producer) is published to one topic in Event Mesh, and there are both a WebSocket queue consumer and a webhook queue consumer in CI with the same topic/queue details? (This would not normally happen in a real scenario, but we explore the case here for testing.)

Answer: Since the queue consumer is in continuous consumption mode, it picks the message up more quickly than the webhook, which has to push the message to the configured HTTPS endpoint.

              Evidence***

The deployed consumer IFlow is in continuous consumption mode. The consumer should also subscribe via the queue instead of the topic, for persistence.

Question 2: What happens when errors (mapping/runtime) occur while the consumer is receiving messages from the Event Mesh queue? Is the message deleted from the queue immediately, or does it persist until successful consumption?

Answer: The message remains in the queue until the consumer receives and processes it successfully. During that time the consumer retries continuously.

              Evidence***

When the consumer IFlow was made to fail intentionally with a content exception, the message in the queue remained persisted, and the IFlow was automatically retried almost 8,770 times in less than a minute.


              The post SAP Event Mesh: S4 HANA On-Prem and SAP CI(BTP IS) through WebSocket and Webhooks appeared first on ERP Q&A.

Integration of SAP CI(BTP IS) with IBM MQ through AMQP https://www.erpqna.com/integration-of-sap-cibtp-is-with-ibm-mq-through-amqp/?utm_source=rss&utm_medium=rss&utm_campaign=integration-of-sap-cibtp-is-with-ibm-mq-through-amqp Sat, 02 Sep 2023 11:34:04 +0000 https://www.erpqna.com/?p=77129

              The post Integration of SAP CI(BTP IS) with IBM MQ through AMQP appeared first on ERP Q&A.

              Introduction

              What is IBM MQ?

              IBM MQ is a family of message-oriented middleware products that IBM launched in December 1993. It was originally called MQSeries, and was renamed WebSphere MQ in 2002 to join the suite of WebSphere products. In April 2014, it was renamed IBM MQ. IBM MQ supports the exchange of information between applications, systems, services and files by sending and receiving message data via messaging queues. This simplifies the creation and maintenance of business applications.

              What is SAP CI(BTP-IS)?

              Cloud Integration(BTP-IS) is a set of services and tools provided by SAP on its cloud-based Business Technology Platform (BTP) to enable integration between different systems, applications, and data sources. The key benefit of CI(BTP IS) is that it enables organizations to quickly and easily integrate their systems, data, and applications without the need for extensive coding or custom development. This helps to streamline business processes, reduce costs, and improve operational efficiency.

              How IBM MQ can be integrated?

IBM MQ provides messaging and queuing capabilities across multiple modes of operation: point-to-point and publish/subscribe. IBM MQ has queue managers (QM) in which different types of queues are created. A QM can be connected directly, via a client channel definition table, or via an intermediate queue manager. All of these are associated with channels, which handle the inbound and outbound movement of data from the queues. Along with queues there are also topics, which work with the pub-sub approach. REST APIs, JMS, and MFT can also be leveraged with the IBM MQ package installation.

              How CI integrates with IBM MQ?

The integration between CI and IBM MQ is best done using the AMQP 1.0 protocol. IBM MQ installations in versions 7.5, 8.0, 9.0, 9.1, 9.2, and 9.3 are available in the market today, of which only 9.2 and above support the integration between CI and IBM MQ.

The queues on IBM MQ can be connected to from CI, and the topics published there can also be subscribed to from CI using the AMQP protocol.

              Note: Among the possible integrations with IBM MQ, Message Queues Integration using the AMQP protocol will be explained in detail.

              Integration of SAP CI(BTP IS) with IBM MQ through AMQP

              Prerequisites:

1. Any IBM MQ server with version 9.2 or above. For demo purposes the trial IBM MQ from https://www.ibm.biz/ibmmqtrial is used.
2. SAP Cloud Connector with the required roles to connect the IS tenant and IBM MQ.
3. SAP BTP IS tenant access with the required CI roles.

              Step-1: Install the IBM MQ 9.2 from the downloaded setup file

              Select all the features and install them along with the MQ Explorer

              IBM MQ Setup File

              Once installed successfully open the IBM MQ Explorer which should open as below

              IBM MQ Explorer

Step-2: Create the new Queue Manager (QM1) under the Queue Managers pane on the left

              Queue Manager

              Step-3: Create the new Queue (Q2) under the left side Queue Managers (QM1)

              New Queue in the Queue Manager

              Step-4: Create the new AMQP channel under the left side Queue Manager (QM1)

While creating the AMQP channel, provide the port as 5672 and start the channel. There is a default AMQP channel that comes with the installation; stop it and use our configured AMQP channel instead, so that the queues can be handled explicitly.

              AMQP Channel

              Step-5: Configure the Virtual Mapping to IBM MQ Internal System in Cloud Connector

              Make sure the cloud connector is already connected successfully to the SAP BTP subaccount.

The backend type should be “Non-SAP System” and the protocol must be “TCP”.

In the system mapping, use the same port (5672) that was given in the AMQP channel in IBM MQ, the internal host on which IBM MQ is installed, and the remaining virtual details as required.

              Cloud Connector Virtual Mapping

              Check the result of the reachable status on the internal host and port

              Test the Virtual Mapping

              Step-6: Test the connectivity from SAP Integration Suite to Cloud Connector.

The location ID (TESTCC) is the same name that was given during the connection between the Cloud Connector and the SAP BTP subaccount.

              Cloud Connector Connectivity Test from IS

              Step-7: Test the connectivity from SAP Integration Suite to IBM MQ through AMQP

              The virtual host and virtual port which was given during system mapping in cloud connector should be used along with the same location id.

              AMQP Connectivity Test from IS

              Step-8: Deploy the SASL username/password of IBM MQ in IS Security Material

              SASL Credentials Deployed in Security Material

              Step-9: Create an IFLOW to send the message to the IBM MQ queue using AMQP channel

              Provide the virtual host and port details along with the Location ID as tested earlier.

              The credential name should be from the deployed security material.

              To IBM MQ Through AMQP

              Provide the destination type as Queue and use the same queue (Q2) created in IBM MQ

              Destination and Queue Details

              For testing purpose providing the sample message in the Content Modifier as below.

              Content Modifier with Message Body

              Step-10: Create an IFLOW to read the message from the IBM MQ queue using AMQP channel

              Provide the virtual host and port details along with the Location ID as tested earlier.

              The credential name should be from the deployed security material.

              From IBM MQ through AMQP

              For testing purpose storing the payload read from IBM MQ queue (Q2) using groovy script

              Queue Details to be processed

              Monitoring the Messages in CI

              CI Monitoring Dashboard
              Payload read from IBM MQ Queue

              Monitoring the Messages in IBM MQ

              Before Execution:

              Before Execution with 0 read and write status

              After Execution:

              After Execution with 1 received and 1 sent status

              Conclusion:

AMQP, mostly used for event-broker topic pub-sub, can also be used for exchanging message queue data, which provides a tight integration between CI and IBM MQ.

If the data in IBM MQ topics is to be stored and retrieved, similar configurations can be done, following the pub-sub approach of topics with the AMQP protocol itself.

Test SAP CPI Mappings using Postman https://www.erpqna.com/test-sap-cpi-mappings-using-postman/?utm_source=rss&utm_medium=rss&utm_campaign=test-sap-cpi-mappings-using-postman Tue, 06 Jun 2023 11:00:54 +0000 https://www.erpqna.com/?p=75242

              The post Test SAP CPI Mappings using Postman appeared first on ERP Q&A.

              Overview

In this blog, we will see how you can use Postman to test SAP CPI mappings. You can use this approach for testing your groovy/XSLT mappings as well if you find it useful.

SAP CPI mapping simulation lacks functionality, whereas SAP PO has a better one. In SAP CPI there is no way to save your XML files as test instances, generate XML, copy-paste, and so on. Even in SAP PO, you have to go to the output manually and check the result; there is no way to automate the verification.

              Use Case

              For illustration purpose let us consider this simple Invoice xml with one field.

              xsd:

<?xml version="1.0" encoding="UTF-8"?>
<schema xmlns="http://www.w3.org/2001/XMLSchema" xmlns:tns="http://www.example.org/Shipping/" targetNamespace="http://www.example.org/Shipping/">
    <element name="Invoice" type="tns:Invoice"></element>

    <complexType name="Invoice">
        <sequence maxOccurs="1" minOccurs="1">
            <element name="ProductType" type="string"></element>
        </sequence>
    </complexType>
</schema>

              xml:

              <?xml version="1.0" encoding="utf-8"?>
              <!-- Created with Liquid Technologies Online Tools 1.0 (https://www.liquid-technologies.com) -->
              <ns1:Invoice xmlns:ns1="http://www.example.org/Shipping/">
                <ProductType>Power</ProductType>
              </ns1:Invoice>

              Mapping Artifact: We have simple transformation which is using fix values to transport input.

              Simulate Mapping test

              output xml

              <?xml version="1.0" encoding="UTF-8"?>
              <ns1:Invoice xmlns:ns1="http://www.example.org/Shipping/">
                  <ProductType>P</ProductType>
              </ns1:Invoice>

              Problem

We have to run the mapping manually four times to validate all four values, and we have to check the values manually every time we run the mapping. This might sound easy, but for any incremental change we have to run these tests again manually and check the output by looking at the output XML.

For each simulation test, you have to wait a few seconds to get the output.

What if more than one field uses transform logic and there are more than 10 fix values to test? That is not easy to do manually.

For documentation purposes, you have to capture the inputs and outputs and keep them as a record. That is cumbersome to do manually.

              Solution

              What I need

              1. Automate the mapping simulation test
              2. Automate cross checking output. No need to go to output xml and check it manually.
              3. Automate capturing input and output

Let us create a separate IFlow to test the mapping. Here is a simple IFlow with the HTTPS sender adapter.

              Create request in postman for each test case.

Write a test case that checks the product type for all types. I used ChatGPT to write the test script, as I am new to JavaScript.

// Parse the XML response
const responseXml = pm.response.text();

// Extract the ProductType value (guard against a missing tag so the
// script fails the test instead of throwing a TypeError)
const match = responseXml.match(/<ProductType>(.*?)<\/ProductType>/);
const productType = match ? match[1] : null;

// Check the ProductType value
pm.test("Check ProductType Power", () => {
    pm.expect(productType).to.eql("P");
});

              Run the collection

              Here you go.

So now we are able to automate the mapping test via Postman and verify the output. Creating the test scripts is a one-time activity; once done, you can run them any number of times.
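When there are many fix values, the single hard-coded check above can be generalized into a table-driven one. Sketched in plain JavaScript (the Water/Gas/Waste rows are illustrative placeholders, not values from the actual mapping):

```javascript
// Expected fix values for the mapping. Power -> P is from the example above;
// the remaining rows are hypothetical placeholders for the other values.
const expected = { Power: 'P', Water: 'W', Gas: 'G', Waste: 'S' };

// Extract <ProductType> from a response body, as the Postman script does.
function extractProductType(xml) {
  const m = xml.match(/<ProductType>(.*?)<\/ProductType>/);
  return m ? m[1] : null;
}

// Verify one response against the table; looping this over all inputs
// replaces the four manual simulation runs with a single pass.
function check(input, responseXml) {
  return extractProductType(responseXml) === expected[input];
}
```

In Postman the same idea maps to a collection run where each iteration sends one input and asserts against the corresponding table row, so adding a fix value is just adding a row.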

If you create a Postman workspace with visibility set to Team, you can share it with your team.

              Documentation

For documenting your tests, you can make use of the Newman htmlextra reporter for Postman, which generates an HTML report of your collection runs.

Cloud Integration(CPI) Monitoring made easy with SAP CAI Chatbot to help monitor message processing in Mobile Devices https://www.erpqna.com/cloud-integrationcpi-monitoring-made-easy-with-sap-cai-chatbot-to-help-monitor-message-processing-in-mobile-devices/?utm_source=rss&utm_medium=rss&utm_campaign=cloud-integrationcpi-monitoring-made-easy-with-sap-cai-chatbot-to-help-monitor-message-processing-in-mobile-devices Mon, 29 May 2023 11:51:17 +0000 https://www.erpqna.com/?p=75068

              The post Cloud Integration(CPI) Monitoring made easy with SAP CAI Chatbot to help monitor message processing in Mobile Devices appeared first on ERP Q&A.

              Introduction:

In this blog, we will discuss Cloud Integration monitoring and accessing the message processing logs (MPL). The MPL stores information about the individual processing steps for each processed message on the tenant, thereby providing a detailed overview of every individual message on the tenant.

To access the MPL, there is an OData API called “MessageProcessingLogs” which can be called to get the details.
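As a sketch of what such a call looks like, the helper below builds a typical OData query against MessageProcessingLogs. Status and LogEnd follow the public MPL API's property names, while the host is a placeholder for your tenant:

```javascript
// Build a MessageProcessingLogs query URL (OData v2 style query options).
function mplQuery(host, status, top) {
  const filter = encodeURIComponent(`Status eq '${status}'`);
  const orderby = encodeURIComponent('LogEnd desc');
  return `https://${host}/api/v1/MessageProcessingLogs` +
         `?$filter=${filter}&$orderby=${orderby}&$top=${top}&$format=json`;
}

// e.g. the ten most recently finished failed messages
const url = mplQuery('my-tenant.it-cpi.example.com', 'FAILED', 10);
```

The integration flow in the next steps essentially issues a request of this shape against the tenant's API endpoint, authenticated with the OAuth credentials set up below.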

To call this API we have to do some configuration in the SAP BTP cockpit and add the required roles in BTP to access the MPL. We will deep-dive into the configuration and design steps below.

              Design/Development Steps:

Let’s break the work down into simpler steps:

              1. Create Service Instance and Assign Roles in BTP Cockpit
              2. Develop Integration flow to call the API
              3. Design/ Develop SAP CAI Chatbot to access MPL and display details of message processing

              Step 1. Create Service Instance and Assign Roles in BTP Cockpit:

To access the MPL we need to create a service instance of the SAP BTP service Process Integration Runtime with the api plan. To do that, navigate to the SAP BTP Cockpit and click Services > Instances and Subscriptions under the subaccount used for Cloud Integration.

              Now click on Create to proceed.

              Service Instance Creation_Step1

              Granting required Role:

              Service Instance Creation_Step2

Once the service instance is created, we need to create a service key. To do that, go to the created instance, click the Actions menu (the three dots after the name), and choose Create Service Key as shown below:

              Service Key Creation

Once the key is created, we can utilize the clientid, clientsecret, and tokenurl to create OAuth client credentials in SAP Cloud Integration’s “Security Material” to authenticate the API access.

              Service Key Generated
              OAuth Client Credentials in Security Material
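Under the hood, the credential artifact performs an OAuth2 client-credentials exchange against the tokenurl. A minimal sketch (Node 18+ global fetch; all values are placeholders, not a real key):

```javascript
// Build the Basic auth header from clientid/clientsecret (testable alone).
function basicAuthHeader(clientid, clientsecret) {
  return 'Basic ' + Buffer.from(`${clientid}:${clientsecret}`).toString('base64');
}

// Client-credentials token request; the Security Material artifact does the
// equivalent of this on CI's side and caches the resulting access token.
async function fetchToken(tokenurl, clientid, clientsecret) {
  const res = await fetch(tokenurl, {
    method: 'POST',
    headers: {
      Authorization: basicAuthHeader(clientid, clientsecret),
      'Content-Type': 'application/x-www-form-urlencoded',
    },
    body: 'grant_type=client_credentials',
  });
  return (await res.json()).access_token;
}
```

The returned bearer token is then sent in the Authorization header of each MessageProcessingLogs request.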

              Step 2. Develop Integration flow to call the API:

              • Step 2.1 : Integration Flow Overview:
              Integration Flow

              Runtime Configuration:

              Allowed Headers

Once the integration flow is deployed, we receive an endpoint to call it. We will now utilize this endpoint in the SAP CAI chatbot and call it for MPL data.

              Step 3. Design/ Develop SAP CAI Chatbot to access MPL and display details of message processing:

Log on to SAP CAI and create your chatbot.

              Please follow these steps to create your bot:

              1. Click on “+New bot”.
              2. Select Perform Actions tile with no predefined skills.
              3. Enter a name for your bot something like getmpl.
              4. Choose “Type of data” as Non-personal and store conversation data as store.
              5. You can select the visibility of bot of your choice.

              Now, your bot is created and ready to be configured.

              • Create Intent:

Now let’s create an intent for the bot. To do that, navigate to the Intents tab under Train and click “+New Intent”. Input a name and a description for the intent. Find below the created intent for the bot:

              Intent
              • Create Entity:

Now let’s create an entity. Click “+New Entity” in the Entities tab under Train, name the entity Messageprocessinglogs, and choose it to be a Free entity, because we want the machine-learning algorithm to detect all possible values such as “status”, “logs”, or “state” and map the actual status to the entity #Messageprocessinglogs.

P.S.: SAP Conversational AI provides many predefined entities (gold entities) such as #location, #datetime, #number, etc. These keywords are already there to be used by the bot.

Now we highlight the keywords “status”, “logs”, and “state” to match the entity #Messageprocessinglogs, and do this for all the expressions as below:

              Entity with Expressions

              You can test if the correct intent is being executed based on the sentence you type.

              On the right-hand side, click on the Expression Analysis console. A pop up window will open for testing.

Type some sentences there. If everything is correct, it should detect the keywords “status”, “logs”, and “state” as the entity #Messageprocessinglogs, and the status input will automatically be extracted.

              Expression Analysis
              • Add Skill to Chatbot:

              On the build tab, create a new skill called ‘fetchmonitoringsuggestion’.

              Skill

You can choose the skill type as either Business or Floating, because the technical purpose of both skill types is the same.

Now click on the created skill and add a firing condition, i.e. the condition for which you want this skill to respond. To do that, go to the Triggers tab and add the condition:

              Triggering Conditions

The bot needs enough additional information before executing the action; therefore, the value of status should exist.

Now go to the Requirements tab to add the requirement.

              Requirements for Triggering

If the status is complete, the bot sends monitoring-interval suggestions as a quick reply, and based on the user’s choice the Cloud Integration endpoint is called to fetch the MPLs.

If the status is missing, it sends a card asking the user to provide the status.

Now, once all the requirements are met, we connect external services and consume the API we created in Cloud Integration. The response is formatted as a list output, and the MPLs come in an item-details list.

              API Consumption
              • Test Bot:

Now it is time to test the bot:

If you want to see more details of the status sent, click on any of the list items; that will take you to the screen below:

              MPL Details