SAP Cloud - ERP Q&A
https://www.erpqna.com/tag/sap-cloud/
Trending SAP Career News and Guidelines (Wed, 26 Mar 2025)

Top 10 SAP Cloud ALM News March 2025
https://www.erpqna.com/top-10-sap-cloud-alm-news-march-2025/ (Wed, 26 Mar 2025)

The post Top 10 SAP Cloud ALM News March 2025 appeared first on ERP Q&A.

This March, we’re thrilled to introduce a powerful set of new features designed to elevate your project management, monitoring, and traceability. Discover the top 10 enhancements that will drive your digital transformation forward.

IMPLEMENTATION UPDATES

1. Manage Process Hierarchy Nodes in Solution Activities

A significant enhancement has been introduced to simplify the management of process hierarchies for solution activities. Users can now visually navigate and manage process hierarchy nodes directly from both the list view and the Object Page view.

In the Object Page view, you can edit the details of the selected solution activity's process hierarchy node. You can also add new process hierarchy nodes by selecting Assign, or remove existing ones by selecting one or more entries and choosing Unassign.

This feature provides a more intuitive way to manage and visualize the process hierarchy, making it easier to understand and modify the relationships between different solution activities.

2. Mass Editing of Quality Gate Checklist Items

You can now update multiple checklist items within a quality gate simultaneously. In Edit mode, you can modify the following values for multiple items at once:

• Status
• Planned Completion
• Team
• User Type
• Responsible

This feature allows for more efficient management and customization of your quality gate checklists.

3. Assign Relations in Documents App

You can now assign or unassign relations to configuration library elements directly from within the Documents app. The assigned relations are listed in the table configurations on the object page of a document, providing a seamless, integrated experience for managing your documents and their associated configurations.

4. Streamline Processes with the New "Scope State" Filter

You can now use this new filter to display only those solution processes that are not yet in scope. This supports other tasks, such as using the mass editing feature to set multiple processes into scope at once.

Note

• The Scope State filter is hidden by default but is available under "Adapt Filters".
• If you select the "Out of Scope" value while viewing the list of Open Solution Processes in display mode, the list will appear empty, as this view is already prefiltered to show scoped processes only.

5. Add Template to Project

When you access the "Setup Project" task in a new project that doesn't yet have a template assigned, you can now modify the project title and choose an SAP Activate Roadmap within the Guided Implementation application.

6. New Defects Overview Card Available

Gain a clear perspective on the statuses and priorities of your defects with the new Defects Overview Card. It makes your management process more efficient, lets you focus on resolving issues, ensures quality, and simplifies your workflow.

7. Simplified Test Plan Duplication

You can now duplicate your test plans, making it easier to reuse them for future testing cycles. The duplicate test plan is created within the same project and retains the same attributes as the original. However, any test results linked to the original test plan are not duplicated.

Note: The default title for a copied test plan is Copy of [Original Title] and its status is set to In Preparation.

OPERATIONS UPDATES

8. SAP Analytics Cloud Data in Real User Monitoring

The Real User Monitoring app in SAP Cloud ALM now monitors SAP Analytics Cloud, allowing you to view real-time user interaction data. For example, you can monitor the response times of individual stories (interactive reports). This feature currently focuses on the client-side components of SAP Analytics Cloud, providing instant insight into front-end user experience and enabling quick identification of performance problems.

9. New KPIs Available for SAP Systems

The following KPIs are now available for SAP S/4HANA Cloud Private Edition, SAP S/4HANA, and SAP Business Suite 7:

• Overdue Purchase Requisition Items Blocking PP/PM Orders
• Production Orders Not Settled
• Outbound Delivery Order Items Overdue for Picking

These KPIs are activated automatically if the steps outlined in the Setup for SAP ABAP Systems have been completed and you are using ST-PI version 7.40 SP29.

10. Solving Tips for Exceptions

In Integration & Exception Monitoring, the exception details screen now features a "Solving Tips" tab, which displays potential solutions for the selected exception. The solutions appear as cards with summaries and resource links that you can click for more detailed information. To refine the suggestions, enter additional information, such as a program name or reporting component, in the "Fine-tune results with additional keywords" field; the suggestions are then updated based on the keywords you provide.

Empower your business with SAP Cloud ALM this March and bloom into a season of action with SAP Cloud ALM's latest innovations! These new features are designed to fuel your success and make your digital transformation journey even more exciting.

Keeping Up-to-Date with SAP Cloud ALM
https://www.erpqna.com/keeping-up-to-date-with-sap-cloud-alm/ (Sat, 01 Mar 2025)

          The post Keeping Up-to date with SAP Cloud ALM appeared first on ERP Q&A.

SAP Cloud ALM, tenant extension

Usage rights are included in SAP Cloud Service subscriptions containing Enterprise Support, cloud editions, in SAP Enterprise Support, and in SAP Product Support for Large Enterprises. This includes a baseline of 8 GB of SAP HANA memory and 8 GB of monthly outbound API data transfer.

SAP Cloud ALM, tenant extension is now available on the SAP price list. This material comprises:

• An extra productive SAP Cloud ALM tenant
• 8 GB of SAP HANA memory usage
• 8 GB of outbound API data transfer per calendar month

Framework on SAP Cloud ALM

SAP Cloud ALM helps customers implement and operate their cloud or hybrid solutions in integration with the following tools:

• SAP Signavio: SAP Signavio Process tools are primarily used for process analysis and process mining activities.
• SAP LeanIX: SAP LeanIX is a SaaS application for managing and optimizing enterprise architecture.
• SAP Enable Now: SAP Enable Now is a digital adoption platform (DAP) from SAP that provides in-application help and e-learning content to improve the adoption, productivity, and efficiency of SAP software.
• Tricentis: Accelerates testing of all your applications, data, and business processes and integrates with tools across your delivery pipeline.
• SAP Cloud Integration (CPI): Design your interfaces in CPI and monitor them in Integration Monitoring.

Available features in Project Setup

• SAP Activate Template for the Clean Core Success Plan is available.
• Charts available for project tracking:
  • Table view
  • Gantt chart
  • Cards view
• Adding a project tile for quick access: the tile is added to My Home according to your selection.
• Mass editing of user stories and tasks: select the project tasks and user stories to be modified, then mass-edit them to set the required values.
• Predecessor-successor relationship setup: download the Excel file from SAP Cloud ALM, define the predecessor-successor relationships by filling in the Predecessor column, and upload the file back to SAP Cloud ALM. The relationships are then created accordingly.

Limitations in Project Setup

• The holiday calendar function is not currently supported in project setup.
• Custom cards are not currently supported.
• Resource planning is not currently supported.
• There is currently no workflow or approval process in place to enforce business logic based on document status.

RISE with SAP Methodology Dashboard on SAP Cloud ALM

The dashboard delivers a set of capabilities to monitor and track the status of systems, offering insights and recommendations and ensuring alignment with SAP's best practices.

• Access a detailed overview of system adherence and optimization opportunities for production or non-production S/4HANA systems, directly aligned with SAP's clean core approach.
• Analyze detailed insights and data on the clean core status of systems hosted on SAP Cloud ALM.
• Identify deviations from SAP's recommendations and prepare for future upgrades. The dashboard supports maintaining a streamlined system with minimal customizations.

Transport Checks on Features

Transport checks are now available for the Features application in SAP Cloud ALM.

The checks include Downgrade Protection and a Cross-Reference Check. You can choose to perform an individual check or both.

Documents support for uploading external files

The Documents app now supports uploading external files in different formats (PDF, PPT, etc.) from the SAP Document Management service directly into SAP Cloud ALM.

You can upload, replace, download, and delete external files from SAP Cloud ALM. This new feature is available on the object page of a document under the External File tab.

To use this feature, a connection must be established between the SAP Document Management service and SAP Cloud ALM.

The SAP Document Management service supports the following plans:

Use the standard plan for productive development.

• Default plan with all the standard features.
• Excellent for productive usage.
• 100 GB of data storage.
• API access is available, with API calls in blocks of 50,000.

Use the free plan for evaluation purposes only.

• Plan with basic functionality and relaxed security, excellent for development and try-out purposes.
• 1 GB of data storage.
• 1,000 API calls.

Use the build-code plan if you're using this service as part of SAP Build Code.

• This is a dedicated plan for SAP Build Code.
• For the build-code plan, API calls are calculated in blocks of 1,000.

DevOps tool chain on SAP Cloud ALM

DevOps combines Continuous Integration (CI) and Continuous Deployment (CD) within an agile-led approach, using automation and tools to ease deployment and release management.

SAP Continuous Influence for SAP Cloud ALM

SAP Continuous Influence enables customers to continuously suggest improvements directly to the product development teams.

This program is aimed at customers with productive licenses for SAP products who would like to suggest improvements through a continuously open channel.

Managing Nested CSV Structures in SAP Cloud Integration: A Generic Groovy Script Solution
https://www.erpqna.com/managing-nested-csv-structures-in-sap-cloud-integration-a-generic-groovy-script-solution/ (Wed, 29 Jan 2025)

          The post Managing Nested CSV Structures in SAP Cloud Integration: A Generic Groovy Script Solution appeared first on ERP Q&A.

          Introduction

          File-based integrations are a common use case in SAP Cloud Integration (CI), enabling the exchange of structured or unstructured data between systems using file formats such as CSV, XML, or flat files.

Below are common business use cases for file-based integrations in SAP CI:

1. Employee Data Transfer: A company using SuccessFactors for HR management needs to send employee payroll data in CSV format to an external payroll vendor. SAP CI retrieves the CSV file from an SFTP server, transforms it into the required format (e.g., XML), and delivers it to the vendor's system.
2. Order Management: An e-commerce platform generates daily order details in CSV format, which need to be integrated into an ERP system like SAP S/4HANA. SAP CI reads the CSV file, maps it to the required format, and posts it to the ERP system via an OData or IDoc interface.
3. Inventory Updates: A retail chain sends CSV files with inventory updates from multiple stores to a central warehouse system. SAP CI processes these files, aggregates the data, and updates the inventory database in real time.
4. Bank Statement Processing: Banks provide daily account statements in CSV format, which businesses need to reconcile with their ERP systems. SAP CI retrieves these files, converts them into XML or JSON, and posts them to the ERP system for reconciliation.
5. EDI File Pre-Processing: Trading partners send EDI files embedded within CSV files. SAP CI extracts the EDI content, processes it using Trading Partner Management (TPM), and forwards it to the downstream systems.
6. Data Migration: During a system migration project, legacy data stored in CSV files needs to be imported into a new system. SAP CI reads these files, performs the necessary transformations, and loads the data into the target system.
7. Periodic Reporting: A manufacturing company generates production reports in CSV format and shares them with stakeholders via email or SFTP. SAP CI automates this process by fetching the files, applying transformations, and distributing them to designated recipients.

Handling CSV Files in SAP CI Using the CSV to XML Converter

SAP CI handles CSV files using the CSV to XML Converter, which transforms flat-file data into a structured XML format, making it easier to process and integrate with other systems. The converter allows you to define the structure of the CSV file, including delimiters, headers, and field mappings, and then generates XML output based on this configuration. This transformation simplifies downstream processing, such as applying mapping logic, enriching data, or routing messages. Additionally, the converter supports complex scenarios like multi-line records, optional fields, and data validation, ensuring seamless integration of CSV files into enterprise systems like SAP S/4HANA or third-party applications.

Limitation of the Standard CSV to XML Converter

The standard CSV to XML converter is particularly useful for transforming CSV files into flat XML structures. However, when dealing with input CSV files containing nested structures, such as header and line record values, the standard converter falls short, as it does not support hierarchical transformations.

This document introduces an approach to address such scenarios by utilizing a reusable Groovy script. The script is designed to handle nested CSV structures and convert them into the required XML format. The solution is versatile and can be applied across different CSV files with nested structures, ensuring consistent and efficient transformation.

By implementing this Groovy script, integration developers can overcome the limitations of the standard CSV to XML converter and streamline the handling of complex CSV structures within SAP CI.

Example Scenario

The example CSV file contains customer details (Identifier, Customer Name, ID, City) followed by order details (Identifier, Order ID, Order Name, Order Quantity), along with a footer.
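As an illustration (the original post shows the file as an image, so the values below are hypothetical), such a nested CSV could look like this, with H marking a header line, L a line item, and F the footer:

```text
H,John Doe,1001,Berlin
L,5001,Laptop,2
L,5002,Mouse,5
F,F1,2025-01-29
```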

Expected XML Structure
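The original post shows the expected XML as an image; as a minimal sketch derived from the script's logic (element names follow the property examples listed later, and the values are hypothetical), the nested target structure looks like this:

```xml
<Customer>
  <Record>
    <Customer_Details>
      <Identifier>H</Identifier>
      <Customer_Name>John Doe</Customer_Name>
      <Customer_ID>1001</Customer_ID>
      <Customer_City>Berlin</Customer_City>
    </Customer_Details>
    <Order_Details>
      <Identifier>L</Identifier>
      <Order_ID>5001</Order_ID>
      <Order_Name>Laptop</Order_Name>
      <Order_Quantity>2</Order_Quantity>
    </Order_Details>
    <Footer>
      <Identifier>F</Identifier>
      <Date>2025-01-29</Date>
    </Footer>
  </Record>
</Customer>
```

Each H line opens a new Record element; with Footer_Required set to YES, the footer is appended inside the last record.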

Steps

To achieve the above transformation, a customized Groovy script can be utilized. The script is designed to be reusable for similar requirements without requiring modifications.

The only requirement is to pass the necessary properties to the script using a Content Modifier. This approach ensures flexibility and simplifies configuration, making the script adaptable to various scenarios.

Content Modifier

Create a Content Modifier step, add the exchange properties, and assign values to them as described below.

• Delimiter – The delimiter used in the input file.
• Main_Record_1 – Main record name in the XML structure (ex: Customer).
• Main_Record_2 – Sub-record name in the XML structure (ex: Record).
• Header_Record – Header record name in the XML structure (ex: Customer_Details).
• Header_Columns – Column names of the header record (ex: Identifier, Customer_Name, Customer_ID, Customer_City). Values should be separated by the delimiter.
• Line_Item_Record – Line item record name in the XML structure (ex: Order_Details).
• Line_Items – Column names of the line item record (ex: Identifier, Order_ID, Order_Name, Order_Quantity). Values should be separated by the delimiter.
• Footer_Required – Set to YES or NO depending on whether footer values are needed in the XML structure.
• Footer_Record – If a footer is required, the footer record name in the target XML (ex: Footer); otherwise NA.
• Footer – If a footer is required, the column names of the footer record (ex: Identifier, Date); otherwise NA.
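For the customer/order example, the exchange properties could be filled as follows (a hypothetical configuration matching the examples above):

```text
Delimiter        = ,
Main_Record_1    = Customer
Main_Record_2    = Record
Header_Record    = Customer_Details
Header_Columns   = Identifier,Customer_Name,Customer_ID,Customer_City
Line_Item_Record = Order_Details
Line_Items       = Identifier,Order_ID,Order_Name,Order_Quantity
Footer_Required  = YES
Footer_Record    = Footer
Footer           = Identifier,Date
```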

Groovy Script

Create a Groovy script step after the Content Modifier step and use the following code as-is.

          import com.sap.gateway.ip.core.customdev.util.Message;
          import java.util.HashMap;
          import java.io.StringWriter;
          import java.io.*;
          def Message processData(Message message) {
              //Body 
                 def body = message.getBody(java.lang.String) as String;
                 
                 def h=[]
                 def hn=[]
                 def ln=[]
                 def ft=[]
                 
       //Reading the property values
       def map = message.getProperties();
                 
                 def delimiter=map.get("Delimiter");
                 def HeaderColumn=map.get("Header_Columns");
                 def HeaderRecord=map.get("Header_Record");
                 def LineItems=map.get("Line_Items");
                 def LineItemRecord=map.get("Line_Item_Record");
                 def MainRecord=map.get("Main_Record_1");
                 def SubRecord=map.get("Main_Record_2");
                 def footer=map.get("Footer");
                 def FooterRecord=map.get("Footer_Record");
                 def FooterRequired=map.get("Footer_Required");
                 
                 StringBuilder sb = new StringBuilder();
                 
                 String s1="<"+MainRecord+">"+"<"+HeaderRecord+">";
                 String s2="<"+MainRecord+">"+"<"+SubRecord+">"+"<"+HeaderRecord+">";
                 String s3="</"+LineItemRecord+">"+"</"+MainRecord+">";
                 String s4="</"+LineItemRecord+">"+"</"+SubRecord+">"+"</"+MainRecord+">";
                 String s5="</"+LineItemRecord+">"+"<"+HeaderRecord+">";
                 String s6="</"+LineItemRecord+">"+"</"+SubRecord+">"+"<"+SubRecord+">"+"<"+HeaderRecord+">";
                 String s7="</"+FooterRecord+">"+"<"+HeaderRecord+">";
                 String s8="</"+FooterRecord+">"+"</"+SubRecord+">"+"<"+SubRecord+">"+"<"+HeaderRecord+">";
                 String s9="</"+FooterRecord+">"+"</"+MainRecord+">";
                 String s10="</"+FooterRecord+">"+"</"+SubRecord+">"+"</"+MainRecord+">";
                 
                  sb.append("<").append(MainRecord).append(">");
                 
       //Splitting the CSV line by line (handles both \n and \r\n line endings)
       def b=body.split('\r?\n');
                 
                 hn=HeaderColumn.split(delimiter);
                 ln=LineItems.split(delimiter);
                 ft=footer.split(delimiter);
                 int n=hn.length;
                 int l=ln.length;
                 int k=ft.length;
                 for (int i=0;i<b.size();i++)
                 {
                     h=b[i].split(delimiter);
                        if(h[0].equals('H'))           //header
                        {
                          sb.append("<").append(HeaderRecord).append(">");
                          for (int j=0;j<n;j++)
                          {
                            sb.append("<").append(hn[j]).append(">").append(h[j]).append("</").append(hn[j]).append(">"); 
                          }
                         sb.append("</").append(HeaderRecord).append(">");
                        }
                     
                     if(h[0].equals('L'))             //lineitems
                     {
                        sb.append("<").append(LineItemRecord).append(">"); 
                        for (int j=0;j<l;j++)
                        {
                            sb.append("<").append(ln[j]).append(">").append(h[j]).append("</").append(ln[j]).append(">"); 
                        }
                         sb.append("</").append(LineItemRecord).append(">");
                        
                     }
                     
                     if(h[0].equals('F') && (FooterRequired.equals('YES')))  //footer
                     {
                        sb.append("<").append(FooterRecord).append(">"); 
                        for (int j=0;j<k;j++)
                        {
                            sb.append("<").append(ft[j]).append(">").append(h[j]).append("</").append(ft[j]).append(">"); 
                        }
                         sb.append("</").append(FooterRecord).append(">");
                      }
                 }
               
                 sb.append("</").append(MainRecord).append(">")
          message.setBody(sb.toString().replaceAll(s1,s2).replaceAll(s3,s4).replaceAll(s5,s6).replaceAll(s7,s8).replaceAll(s9,s10));
                 
                 return message;
          }
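To sanity-check the transformation outside SAP CI, the script's core logic can be reproduced in plain Java (a hedged sketch, not the CPI runtime: the Message API and Content Modifier are replaced by a simple Map, and the class name and sample values are hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class NestedCsvToXml {

    // Re-implementation of the Groovy script's core logic for local testing.
    static String convert(String body, Map<String, String> p) {
        String delim = p.get("Delimiter");
        String[] hn = p.get("Header_Columns").split(delim);
        String[] ln = p.get("Line_Items").split(delim);
        String[] ft = p.get("Footer").split(delim);
        String main = p.get("Main_Record_1"), sub = p.get("Main_Record_2");
        String hdr = p.get("Header_Record"), li = p.get("Line_Item_Record");
        String foot = p.get("Footer_Record");
        boolean footerRequired = "YES".equals(p.get("Footer_Required"));

        // First pass: emit one flat record element per CSV line.
        StringBuilder sb = new StringBuilder("<" + main + ">");
        for (String line : body.split("\r?\n")) {
            String[] h = line.split(delim);
            if (h.length == 0) continue;
            if (h[0].equals("H")) append(sb, hdr, hn, h);
            else if (h[0].equals("L")) append(sb, li, ln, h);
            else if (h[0].equals("F") && footerRequired) append(sb, foot, ft, h);
        }
        sb.append("</").append(main).append(">");

        // Second pass: same post-processing as the Groovy replaceAll calls,
        // wrapping each header-plus-line-items group into a sub-record.
        return sb.toString()
            .replace("<" + main + "><" + hdr + ">", "<" + main + "><" + sub + "><" + hdr + ">")
            .replace("</" + li + "></" + main + ">", "</" + li + "></" + sub + "></" + main + ">")
            .replace("</" + li + "><" + hdr + ">", "</" + li + "></" + sub + "><" + sub + "><" + hdr + ">")
            .replace("</" + foot + "><" + hdr + ">", "</" + foot + "></" + sub + "><" + sub + "><" + hdr + ">")
            .replace("</" + foot + "></" + main + ">", "</" + foot + "></" + sub + "></" + main + ">");
    }

    // Wraps one CSV line's values into <record><col>value</col>...</record>.
    static void append(StringBuilder sb, String rec, String[] cols, String[] vals) {
        sb.append("<").append(rec).append(">");
        for (int j = 0; j < cols.length; j++)
            sb.append("<").append(cols[j]).append(">").append(vals[j])
              .append("</").append(cols[j]).append(">");
        sb.append("</").append(rec).append(">");
    }

    public static void main(String[] args) {
        Map<String, String> p = new LinkedHashMap<>();
        p.put("Delimiter", ",");
        p.put("Main_Record_1", "Customer");
        p.put("Main_Record_2", "Record");
        p.put("Header_Record", "Customer_Details");
        p.put("Header_Columns", "Identifier,Customer_Name,Customer_ID,Customer_City");
        p.put("Line_Item_Record", "Order_Details");
        p.put("Line_Items", "Identifier,Order_ID,Order_Name,Order_Quantity");
        p.put("Footer_Required", "YES");
        p.put("Footer_Record", "Footer");
        p.put("Footer", "Identifier,Date");

        String csv = "H,John Doe,1001,Berlin\nL,5001,Laptop,2\nL,5002,Mouse,5\nF,F1,2025-01-29";
        System.out.println(convert(csv, p));
    }
}
```

Running main prints the nested XML for the sample input. Plain String.replace is used here instead of the script's replaceAll, which is equivalent as long as the configured record names contain no regex metacharacters.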

Other Scenarios

The same approach is equally effective for CSV files that contain multiple header records, each followed by its own line items.
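As a hypothetical illustration (the original post shows this file as an image), a multi-header CSV might look like this; each H line starts a new record with its own L lines:

```text
H,John Doe,1001,Berlin
L,5001,Laptop,2
H,Jane Roe,1002,Munich
L,5002,Mouse,5
L,5003,Keyboard,1
F,F1,2025-01-29
```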

Note

• This script assumes that header lines start with "H", line item lines with "L", and footer lines with "F".
• Ensure that the properties in the Content Modifier are defined with exactly the names listed above; the property names are case-sensitive.

Donuts and SAP ALM: The sweet analogy of agile product development for SAP Cloud ALM
https://www.erpqna.com/donuts-and-sap-alm-the-sweet-analogy-of-agile-product-development-for-sap-cloud-alm/ (Thu, 05 Dec 2024)

          The post Donuts and SAP ALM: The sweet analogy of agile product development for SAP Cloud ALM appeared first on ERP Q&A.

Many of us enjoy a freshly baked donut. We think not only of the delicious taste, but also of the shape: a ring. And although you might reflexively assume that this article deals with the analogy between the outer shape of a donut and the Application Lifecycle Management ring, I will not go into that, but rather into the difference between plan-driven and value-driven development and their problems.

          The question of why certain features are still missing from SAP Cloud ALM is particularly interesting. Many customers ask precisely this question: why is this or that feature still not available? I am not trying to defend SAP here, because it could certainly be argued that with more or faster developer resources, some things could be implemented more quickly. But for me, it is more about creating an understanding of the challenges: How do you balance the pressure of customer expectations with long-term vision and quality assurance? This is precisely where the difficulty of combining both stability and flexibility in a plan-driven approach to deliver sustainable value becomes apparent.

          From SAP SolMan to SAP Cloud ALM

          SAP Solution Manager (also affectionately known as SolMan by the community) will no longer be covered by mainstream maintenance from the end of 2027. Although SAP Cloud ALM is not a direct successor to SAP Solution Manager, as there will be no feature parity, it is at least the logical successor.

          And it is precisely here that the connection between a doughnut and SAP Cloud ALM arises. We have already established that almost everyone knows that a doughnut is ring-shaped. Many of you also know what a good doughnut should taste like. And SAP customers know what the SAP Solution Manager looks like and what ALM functions it offers. It is therefore obvious that these are the functions that are expected of the “successor product”.

          The Solution Manager can therefore be compared to a well-known doughnut recipe. It is a recipe and a preparation in which we know exactly which ingredients belong and what the end result should be. It is a proven recipe that is loved by many.

          The development of SAP Cloud ALM, on the other hand, represents a completely new recipe and approach. It is no longer developed in a plan-driven manner, where the end product and its features are fixed from the outset. Rather, it is a value-driven approach that prioritizes customer value and benefit.

          Value-Driven vs. Plan-Driven Development: A Recipe Duel

          Understanding the differences between value-driven and plan-driven development is important to recognize the benefits and challenges. Here is a comparison:

          Plan-Driven Development (PDD)

PDD is based on a systematic, sequential approach in which each phase of software development (requirements, design, implementation, validation) is strictly planned before it actually begins. The scope of services is fixed, while the costs and deadlines are "variable", that is, the levers that can be adjusted.

          • Planning: The main focus is on advance planning. It is expected that all requirements will be determined at the beginning of the project and that little or no change will occur during the course of the project.
          • Documentation: PDD emphasizes the importance of comprehensive documentation. Each step is documented in detail to ensure that all parties involved understand the process and requirements exactly.
          • Flexibility: PDD is usually less flexible with regard to changes, since changes are often costly and time-consuming, especially if they are made late in the process.
          • Risk management: PDD tries to minimize risks early on by planning everything in advance.
          • Examples of methods: Waterfall model, V-model.

          Value-Driven Development (VDD)

          VDD focuses on creating continuous value for the customer or end user by relying on feedback and iterative development. With this approach, the costs and deadlines are “fixed” and the scope of services is variable. Good, agile requirements engineering is a must for VDD.

          • Planning: Instead of strictly adhering to a predetermined plan, VDD flexibly adapts to new information or changing customer needs.
          • Documentation: While documentation is still important, it can be less extensive in VDD than in PDD. The focus is on functional software and customer feedback.
          • Flexibility: VDD is very flexible with regard to changes, since the approach expects that requirements may change over time.
          • Risk management: VDD accepts that risks exist, but relies on minimizing them through regular feedback loops and adjustments.
          • Examples of methods: Agile development methods such as Scrum and Kanban

Plan-driven Development: As familiar as a glazed doughnut, or as SAP Solution Manager

A few years ago, SAP Solution Manager was developed: a platform for implementing and operating SAP solutions (mainly on-premise products). The recipe is fixed, the ingredients are known, and the result is predictable. In other words, the principle behind the development of SolMan was largely the systematic approach of "plan-driven development", similar to an architect who plans every detail before construction begins.

          The challenge of change: SAP Cloud ALM

          Imagine you are standing in a modern doughnut bakery. Instead of just baking a set doughnut, the baker experiments with different ingredients. He first brings you the naked dough ring, the basic structure – a minimum viable product (MVP), so to speak. For some, this is already enough, others wait for the icing, still others for the sprinkles, and based on the feedback, the baker can gradually (in several iterations) refine the doughnut. Still others give feedback and demand a certain filling. Which way is the right one?

          The development of SAP Cloud ALM is reminiscent of the trend towards experimental doughnut flavors. SAP Cloud ALM was also developed for the implementation and operation of SAP solutions, but with a clearer focus: for SAP cloud products.

          Agile development at SAP Cloud ALM is now like the baker stopping by every two weeks and saying, “Try this!” But instead of a full doughnut, there might only be one bite. Some customers are excited about the quick delivery and the opportunity to provide feedback. Others might be disappointed that the doughnut isn’t “done” yet.

          A world of flavors that is constantly changing

          Just as the world of donuts is constantly evolving – just think of trends like cronuts or donut burgers – so is our world of information systems and general conditions. The most important thing is that we are prepared to adapt, provide feedback and be open to the countless possibilities that lie ahead.

          Value-driven development requires flexibility and openness to change. And while some customers appreciate the opportunity to design their own “donuts,” others find it challenging to constantly adapt to new flavors.

          A shift in mindset is needed in at least three ways:

          1. Development teams must adapt so that they are able to actually develop a product in an agile way.
          2. Customers must develop the willingness and acceptance that not every function already adds 100% value, but that sub-functions can be used more quickly and at least add partial value.
          3. Customers must embark on a journey and not just demand functions because they “have always done it that way”, but question whether alternative approaches could bring even greater benefits.

          In a constantly changing (digital) world, we should perhaps all adopt the values of doughnut lovers a little more: curiosity, flexibility, patience and an open mind for the next potentially best and sweetest bite of doughnut innovation – even if one or two attempts don’t taste so good.


          The post Donuts and SAP ALM: The sweet analogy of agile product development for SAP Cloud ALM appeared first on ERP Q&A.

          All the buzz about SAP CLOUD ALM https://www.erpqna.com/all-the-buzz-about-sap-cloud-alm/ Mon, 30 Sep 2024 10:38:42 +0000 https://www.erpqna.com/?p=88032 Structure This blog is structured to understand the following: Read More: SAP Solution Transformation Consultant with SAP Cloud ALM Certification Preparation Guide Cloud ALM Vs Solution Manager Ultimately, the decision between these two tools will depend on the organization specific needs. Overview of Cloud ALM SAP Integrated Tools Overview SAP Readiness Check to move to […]

          The post All the buzz about SAP CLOUD ALM appeared first on ERP Q&A.

          Structure

          This blog is structured to understand the following:

          • Cloud ALM Vs Solution Manager
          • Overview of Cloud ALM
          • SAP Integrated Tools Overview
          • SAP Readiness Check to move to SAP S/4HANA
          • Project Management Approach using SAP Activate Phase
          • Change Management in Cloud ALM
          • Manual Testing Use Case Sample

          Read More: SAP Solution Transformation Consultant with SAP Cloud ALM Certification Preparation Guide

          Cloud ALM Vs Solution Manager

          • SAP Cloud ALM is a good choice for organizations that need an agile, cloud-based solution that integrates easily with other SAP solutions.
          • SAP Solution Manager is a better fit for organizations that need a highly customizable tool with a proven track record.

          Ultimately, the decision between these two tools will depend on the organization’s specific needs.

          • Solution Manager: Mainstream support for the Solution Manager will end in 2027, and Extended Support will end in 2030.
          • CALM: Usage rights are included in SAP Cloud Service subscriptions containing Enterprise Support cloud editions, in SAP Enterprise Support, and in Product Support for Large Enterprises.
          • SAP Cloud ALM includes a baseline of 8 GB SAP HANA Memory.

          Overview of Cloud ALM

          • SAP Cloud ALM enables you to build, run, and optimize your intelligent enterprise.
          • The SAP Activate methodology combined with CALM makes it easier to adapt in phases as we move along the CALM usage journey.

          SAP Integrated Tools Overview

          • SAP has partnered with tool vendors that help with Business Process Mining, Project Management, and Test Automation to make the transition journey easier.
          • These tools can be used at different phases of the transition journey.

          SAP Readiness Check to move to SAP S/4HANA

          • SAP Cloud ALM enables you to set up references between SAP Readiness Check findings and the SAP Cloud ALM project.
          • With this functionality, we can create new follow-ups and assign existing ones to the findings within the SAP Readiness Check analysis and manage them in the SAP Cloud ALM project.
          • This allows you to administer the SAP Cloud ALM project and prepare for a brownfield conversion or greenfield implementation.

          The integration is available for multiple check scenarios.

          • SAP highly recommends running the SAP Readiness Check tools in the production system to get an accurate picture of actual system usage.

          A variety of check analysis information is available.

          How to utilize Cloud ALM Readiness Check Feature:

          • A dedicated role is required to access the Readiness Check feature.

          You might see the below screen when you first log in to the CALM instance.


          • SAP Readiness Check analyzes the existing system using built-in APIs, which may need to be updated to ensure data is pushed to the Cloud ALM readiness check analysis.
          • Each supported scenario provides its own central SAP Note, which guides through the preparation steps required for the specific scenario.
          • To start a new analysis, input the readiness check report.

          • We can then start reviewing the results of the Readiness Check

          Dashboard analysis is available as per the sample readiness check report.

          We can use the readiness check report to create all the tasks, i.e., user stories and subtasks, and do the follow-ups.

          We are now ready to implement a project using SAP Activate Methodology (Phases) and roadmaps.

          Project Management Approach using SAP Activate Phases

          Project Management contains:

          • Definition of teams and roles.
          • Timeboxes and releases.
          • Agile methodology using sprints.

          Project Creation


          Managing Scope


          Managing Requirements


          Managing User story


          Managing Feature and Q-Gates


          Change Management in Cloud ALM

          • Change Management comes with a straightforward flow for changes in SAP Cloud ALM.
          • SAP is planning to integrate APIs with the change management process to bring in more automation. With this, we can easily embed sub-workflows created via SAP Workflow Management.

          Change Management Workflow


          Creating and deploying a transport on Cloud ALM


          Manual Testing Use Case Sample

          Overall steps for Manual testing


          Cloud ALM screen for test management


          Test Preparation

          • Choose Create to create a new test case.

          • Add the test steps as required, change the status to Prepared, and save.

          Test Plan

          • Choose Create to create a new test plan.
          • Choose Assign Test Cases to add one or more test cases to the plan.
          • Assign the person responsible for completing the tests and the start/end dates.
          • Change the status to In Testing when finished, then save.

          Test Execution

          • Execute the test cases that have been prepared in the Test Preparation step.
          • Select the test results as applicable.

          Defect Management

          • Raise a defect if the test results are not as expected.

          An overall Cloud ALM dashboard overview for the project deliverables summary is also available.


          Job and Automation Monitoring features – SAP Cloud ALM https://www.erpqna.com/job-and-automation-monitoring-features-sap-cloud-alm/ Thu, 21 Mar 2024 10:00:31 +0000 https://www.erpqna.com/?p=82785 SAP Cloud ALM is an out-of-box, native cloud solution which serves as the central hub for managing SAP landscapes, offering guided implementation driven by content and highly automated operations. In this blog post, I aim to delve into a very crucial capability of cloud ALM: Job and Automation Monitoring. Through Job and Automation monitoring within […]

          The post Job and Automation Monitoring features – SAP Cloud ALM appeared first on ERP Q&A.

          SAP Cloud ALM is an out-of-the-box, cloud-native solution which serves as the central hub for managing SAP landscapes, offering guided implementation driven by content and highly automated operations. In this blog post, I aim to delve into a very crucial capability of Cloud ALM: Job and Automation Monitoring.

          Through Job and Automation Monitoring within SAP Cloud ALM, it is possible to monitor SAP S/4HANA application jobs, SAP ABAP jobs, SAP BW process chains, SAP Build Process Automation jobs (or SAP Intelligent RPA jobs), SAP Business Workflow jobs, and Job Scheduler service jobs in customer BTP applications.

          Job & Automation Monitoring guarantees uninterrupted business operations while enhancing the quality and performance of business process execution. This is achieved through:

          • Monitoring the health of job executions and detecting anomalies during execution.
          • Promptly alerting both business and IT users about disruptions and exceptions.
          • Providing detailed insights at the job execution level to facilitate fact-based root cause analysis.
          • Minimizing configuration efforts by leveraging historical execution information.

          Having recently implemented Job and Automation Monitoring, I’m excited to outline its standout features that make it exceptionally useful.

          1. Overview – The Overview provides a summary status of job execution for the managed components. It shows the status of the latest execution of every job with regard to technical exceptions, application exceptions, and runtime. It also displays the number of unconfirmed alerts and status events.

          Clicking on a card takes us to the Monitoring view which shows the list of jobs and automations with their latest Execution status, Application Status, Start delay and runtime.

          Rating rules are visible in Configuration section under Monitoring Rating Rules. Execution Status and Application Status are non-modifiable, but Start Delay and Run Time can be modified as per requirement.

          Clicking on the job name allows you to drill down further to the list of executions of the selected job. Additionally, clicking on the information icon provides further details and enables direct navigation to the job within the managed component by selecting the Run ID.

          2. Data collection – Data collection offers a convenient feature to toggle the collection of data for managed objects. Within the configuration section, under Managed Components, there is an option to switch data collection on or off for each managed component.

          3. Heartbeat Status – This feature visualizes the heartbeat status of managed services. Within the configuration section, under Managed Components, users can view the heartbeat status. If there is no data collection, a red-light indicator is displayed.

          4. Alerting – Alerting in Cloud ALM allows you to configure alerts and notifications based on the status of job executions within the managed components. Alerts can be set up for various events, such as execution status, application status, runtime, or start delay, ensuring timely awareness and proactive management of the system.

          The event name in the alert can be adjusted to indicate the type of alert for which the event has been created.

          The execution status rating for the event can also be chosen while setting up the event.

          There is another recently introduced feature “At Every Occurrence”. Normally, consecutive job failures prompt just one notification until the alert is acknowledged. However, this new option allows users to receive alerts for each individual instance of job failure.

          Recipients for email notifications can be maintained in Event Actions under “Send email to” section. Email IDs need to be maintained in the “Notification Management” application.

          Job and Automation Monitoring delivers an enriched notification email:

          • Subject Line: The event name is augmented with the corresponding job name.
          • Content: Detailed information regarding the failed job execution is provided. Additionally, the email body contains a direct hyperlink to the alert, facilitating immediate access to the alert within Cloud ALM.

          5. Integration with Business Service Management (BSM) – Within the Business Service Management application in SAP Cloud ALM for operations, we can see business services, manage assigned cloud services and technical systems, and handle associated status events such as maintenance, disruption, or upgrade. In the homepage Overview tab, there is an option to view the status of events for the managed components. Events are maintained in the Business Service Management application.

          6. Analytics – The Analysis view allows users to pinpoint jobs with extended response times and the highest number of exceptions. For any job, it shows aggregated information on the total and average runtime, the number of executions, the number of failed executions, and the resulting failure rate. By clicking on the arrow at the end of every line, we can drill down into various trend charts for the metrics.



          Modularize your CAPM Project https://www.erpqna.com/modularize-your-capm-project/ Sat, 03 Feb 2024 10:15:20 +0000 https://www.erpqna.com/?p=81363 As you have learned, a CAPM project consist of 3 major components. The first and foremost one is the database. The second one is the service and last the last one is UI5 application(not mandatory). Each of the above components can be created as a separate project, is the way to achieve modularization in CAPM. […]

          The post Modularize your CAPM Project appeared first on ERP Q&A.

          As you have learned, a CAPM project consists of 3 major components. The first and foremost is the database, the second is the service, and the last is the UI5 application (not mandatory).

          Creating each of the above components as a separate project is the way to achieve modularization in CAPM. Now you may ask why we perform this activity.

          Consider the example of the VBAK/VBAP tables in SAP. These tables can be accessed from multiple programs/applications. What if we need to make a modification or enhancement to them? Apply the same concept here. If all three components are in the same project, it is hard to make modifications: once you modify the database, all the dependent services/UI5 applications might crash or stop, and we would need to look into the issues for each of them and rectify them accordingly.

          In order to achieve this, let’s now create a “HANA Database Project”. Use the template wizard to perform this activity.

          Follow the below steps to create an application.

          • Provide the project name: here I am giving “hcm_employee_cntrl”. Press “Next”.
          • You are now requested to add the basic information. BAS will automatically fill in the default module name “db”. You can either use the same or rename it; here I am keeping the same. Click Next to continue.
          • Further set the basic properties for your application. Here I am leaving the namespace blank, as I will be reusing this database in multiple applications. However, there won’t be any problem if you provide a namespace here.
          • You can provide a schema name. Here I am using “DB_EC”.
          • Choose the HANA database version “HANA Cloud”.
          • Select “Yes” for “Bind the database module to a run-time environment service instance?” and click Next.
          • Select “No” for “Create a new HDI service instance?”
          • Choose your Cloud Foundry service from the drop-down. Here I am selecting “dev_hdi_container_shared”. Click Finish.
          • BAS will create your SAP HANA project in a few seconds.

          We can further create a “db” module as we have done earlier. You may note that the naming conventions (extensions) are different here. You can create database tables, views, roles, etc. as part of your SAP HANA native application project.

          • Create a new folder “tables” under the generated folder “db/src”. This is again for modularization purposes. You can even directly create your database table under the “db/src” folder.
          • Once you create the “tables” folder, create a new file with the name “EMPLOYEES.hdbtable”. Here “.hdbtable” defines it as a HANA database table.
          • This file is responsible for generating the database table in the SAP HANA database. As you are aware, we can create column- or row-based tables in SAP HANA. Here we will be using a column-based table.
          • Copy the below code inside your file “EMPLOYEES.hdbtable”:
          COLUMN TABLE "EMP_DBTABLE.EMPLOYEES"(
            ID                    NVARCHAR(36),
            EMP_ID_EXT            NVARCHAR(20),
            NAME                  NVARCHAR(255),
            EMAIL_ID              NVARCHAR(128),
            DEPARTMENT_ID         NVARCHAR(36),
            createdAt             TIMESTAMP,
            createdBy             NVARCHAR(50),
            modifiedAt            TIMESTAMP,
            modifiedBy            NVARCHAR(50),
              PRIMARY key(ID)
          )
          • As per the SAP security guidelines, we should never expose the database table directly. Instead, we need to create a V1 view to expose the data. In order to achieve this, create a new folder “views” under the folder “db/src”.
          • Once you create the “views” folder, create a new file with the name “EMPLOYEES.hdbview”. Here “.hdbview” defines it as a HANA database view.
          • Copy the below code inside your file “EMPLOYEES.hdbview”:
          VIEW "EMP_DBVIEWS.V.V1.EMPLOYEES" AS
          SELECT ID,
            EMP_ID_EXT,
            NAME,
            EMAIL_ID,
            DEPARTMENT_ID,
            createdAt,
            createdBy,
            modifiedAt,
            modifiedBy
          FROM "EMP_DBTABLE.EMPLOYEES";
          • However, we need to create the respective roles to access these artifacts from outside.
          • Create another folder “roles” under “db/src”.
          • Further create a new file “EmployeeReadOnly.hdbrole”. This role is for read-only purposes, which means that once it is assigned, the application can only read the information from the underlying database table.
          • Add the below code in your file “EmployeeReadOnly.hdbrole”:
          {
              "role": {
                  "name": "DB_EC::ReadOnly",
                  "object_privileges": [
                      {
                          "name": "EMP_DBVIEWS.V.V1.EMPLOYEES",
                          "type": "TABLE",
                          "privileges": ["SELECT"]
                      }
                  ]
              }
          }
          • As I mentioned above, we are not exposing the database table directly; instead, we will be exposing the “V1” view we created.
          • Now create another role with the file name “EmployeeModify.hdbrole”.
          • Add the below code inside the file:
          {
              "role": {
                  "name": "DB_EC::ReadOnly",
                  "object_privileges": [
                      {
                          "name": "EMP_DBVIEWS.V.V1.EMPLOYEES",
                          "type": "TABLE",
                          "privileges": ["SELECT", "INSERT", "UPDATE","DELETE"]
                      }
                  ]
              }
          }
          • There are other role concepts, like privileges with grant options, as well. We are not discussing them in much detail here; below is the sample code to create such roles, and you can explore them further. To consume this database view in another application, we must create the below roles as well. Here the “#” at the end of the role name is a must; it identifies the role as a role with additional (grant) privileges.
          {
              "role": {
                  "name": "DB_EC::ReadOnly#",
                  "object_privileges": [
                      {
                          "name": "EMP_DBVIEWS.V.V1.EMPLOYEES",
                          "type": "TABLE",
                          "privileges_with_grant_option":["SELECT"]
                      }
                  ]
              }
          }
          • The above sample code is for read only with grant option.
          {
              "role": {
                  "name": "DB_EC::Modify#",
                  "object_privileges": [
                      {
                          "name": "EMP_DBVIEWS.V.V1.EMPLOYEES",
                          "type": "TABLE",
                          "privileges_with_grant_option": ["SELECT", "INSERT", "UPDATE","DELETE"]
                      }
                  ]
              }
          }
          • The above sample code is for non-read only with grant option.

          Your application is now ready to deploy. As we have done in the previous blogs, follow the same steps.

          • Right-click on the “mta.yaml” file and choose the “Build MTA Project” option.
          • An “mta_archives” folder will be generated automatically.
          • Let’s do the deployment by logging into the Cloud Foundry endpoint.
          • Open a new integrated terminal by right-clicking on the project.
          • Use the command “cf login” and follow the instructions on the screen.
          • Once logged in, right-click on the generated MTA archive file and choose the option “Deploy MTA Archive”.
          • Once the deployment succeeds, you can see the application deployed in CF as shown below. By default, it will be in “stopped” status. You can start it when required.

          At this moment, you have completed the creation and deployment of the DB artifacts. As I mentioned above, we need to use this in an application. In order to consume the DB artifacts in another application, we need to understand “SYNONYMS”.

          Create another CAP project to consume the above DB.

          • Use command “cds init EMP_CENTRAL”
          • Create a data model under “db” folder with name “data-model.cds”
          namespace hcm.empcntrl;
          
          using {
              cuid,
              managed
          } from '@sap/cds/common';
          
          entity EMPCNTRL : cuid, managed {
              EMP_ID_EXT    : String(20);
              NAME          : String(255);
              EMAIL_ID      : String(128);
              DEPARTMENT_ID : String(36)
          }
          • To expose the entity, create a file “EMPCNTRL-service.cds” under the “srv” folder and add the below code:
          using hcm.empcntrl as ec from '../db/data-model';
          
          service EMPCNTRLService {
              entity empcntrl as select from ec.EMPCNTRL;
                  
              
          }
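As an aside (a sketch of my own, not part of the original walkthrough): if a consuming application should only read employee data, CAP’s `@readonly` annotation can restrict the projection at the service level, complementing the read-only HDI role created earlier:

```cds
using hcm.empcntrl as ec from '../db/data-model';

// Hypothetical variant: expose the entity read-only so consumers
// cannot insert, update, or delete through the OData service.
service EMPCNTRLService {
    @readonly
    entity empcntrl as select from ec.EMPCNTRL;
}
```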
          • Use the command “cds add mta” to include the MTA file, which is needed to specify the remote source to connect with our CAP application.
          • Create a “Run Configuration”, following the steps in the previous blogs.
          • Open “package.json” to add the HANA dependency, or use the command “cds add hana”. You can also add it manually using the below code:
          "cds": {
              "requires": {
                "db": "hana"
              }
            }
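For context, the `cds` section sits at the top level of `package.json`, alongside the dependencies. The package names are real, but the versions below are placeholders, not taken from the project:

```json
{
  "name": "emp_central",
  "dependencies": {
    "@sap/cds": "^7",
    "hdb": "^0.19"
  },
  "cds": {
    "requires": {
      "db": "hana"
    }
  }
}
```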
          • Click on “db-hana” to execute the run configuration.
          • After the execution of the run configuration, open the “mta.yaml” file. Here I am making some corrections:
            • Under “module”, I am changing the name to “HCM_BE_EMPLOYEE_CENTRAL”, a more meaningful name.
            • Similarly, I am changing the db name as well, to “HCM_DB_EMPLOYEE_CENTRAL”. Please find the below screenshot.
          • Now we come to the main point, which is consuming the remote service.
          • Add the below code in the “mta.yaml” file under the resources section:
          resources:
            - name: EMP_CNTRL-db
              type: com.sap.xs.hdi-container
              parameters:
                service: hana
                service-plan: hdi-shared
              properties:
                hdi-container-name: ${service-name}
            - name: cross-containerservice-empcntrl    
              type: org.cloudfoundry.existing-service
              parameters:
                service-name: hdi_Employee_Central #Cross container name must be used here
                properties:
                  the-service-name: ${service-name}
          • Now mention the “hdi-container-name” in the module as well; refer to the last 2 lines.
          modules:
            - name: HCM_BE_EMPLOYEE_CENTRAL
              type: nodejs
              path: gen/srv
              parameters:
                buildpack: nodejs_buildpack
              build-parameters:
                builder: npm
              provides:
                - name: srv-api # required by consumers of CAP services (e.g. approuter)
                  properties:
                    srv-url: ${default-url}
              requires:
                - name: EMP_CNTRL-db
          
            - name: HCM_DB_EMPLOYEE_CENTRAL
              type: hdb
              path: gen/db
              parameters:
                buildpack: nodejs_buildpack
              requires:
                - name: EMP_CNTRL-db
                  properties:
                    TARGET_CONTAINER: ~{hdi-container-name}
          • The default container is not sufficient; we need one more resource entry to run the DB module.
          - name: cross-containerservice-empcntrl
                  properties:
                    key: db-ec
                    service: {the-service-name}
                  group: SERVICE_REPLACEMENTS
          • Add the above code as part of the “modules” section of “mta.yaml”.
          • Make sure your tab indentation is correct; otherwise you will get an error.
          • Now we need to add a “SYNONYM” under the “srv” folder.
          • Create a file with the “.hdbsynonym” extension. Here, I will be using the name “ec.hdbsynonym”. Add the below code inside the new file:
          {
              "HCM_EMPCNTRL_EMPCNTRL": {
                  "target": {
                      "object": "HCM_EMP_DB.V_V1_RA_EMP",
                      "schema": "HCM_RA_EMP"
                  }
              }
          }
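Since the synonym file is plain JSON, its shape can be sanity-checked before deployment. The following is a minimal sketch of my own (not an SAP tool; the validation rules are an illustration, and the entry mirrors the example above):

```javascript
// Minimal sketch (not an SAP tool): sanity-check the shape of an
// .hdbsynonym document before deployment.
const synonymDoc = {
  HCM_EMPCNTRL_EMPCNTRL: {
    target: {
      object: "HCM_EMP_DB.V_V1_RA_EMP",
      schema: "HCM_RA_EMP"
    }
  }
};

// Every synonym needs a non-empty name and a target object;
// "schema" may be omitted when the default container schema applies.
function validateSynonyms(doc) {
  return Object.entries(doc).every(
    ([name, def]) =>
      name.length > 0 &&
      def !== null &&
      typeof def === "object" &&
      def.target &&
      typeof def.target.object === "string"
  );
}

console.log(validateSynonyms(synonymDoc)); // prints true
```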
          • Create another file, “ec.hdbgrants”, to grant the necessary authorizations to the consuming application. Add the below code in it:
          {
              "db-ec" : {
                  "object_owner": {
                      "container_roles" :["RA_EMP_Modify#"]
                  },
                  "application_user": {
                      "container_roles" :["RA_EMP_Modify"]
                  }
              }
          }
          • One final change to make in the “.cds” file, in order to specify that the database table must not be created but instead derived from the existing database. Refer to the sample code below; here, “@cds.persistence.exists” does that job:
          namespace hcm.empcntrl;
          
          using { cuid, managed } from '@sap/cds/common';
          @cds.persistence.exists
          entity EMPCNTRL : cuid, managed {
              EMP_ID_EXT    : String(20);
              NAME          : String(255);
              EMAIL_ID      : String(255);
              DEPARTMENT_ID : String(36);
          }
          • Let’s deploy now. Before that, do not forget to run “npm install” to add all the dependent components.
          • If required, you can add code to convert the API output to OData V2 as well by adding the proxy.
          • Right-click on “mta.yaml” and choose “Build MTA Project”.
          • Wait until the build completes. Once the MTA archive is ready, right-click on the archive file and choose “Deploy MTA Archive”.
          • Wait until the deployment completes and the application status turns to “started”, as this is a lengthy process.
          • Once the deployment succeeds, please check the HANA Database Explorer to validate the synonyms. Instead of creating a new database table, the application creates a relationship with the SAP HANA native application using cross-container access.


          How to create a simple SUM scenario for the Fiori Launchpad https://www.erpqna.com/how-to-create-a-simple-sum-scenario-for-the-fiori-launchpad/ Sat, 18 Nov 2023 11:16:02 +0000 https://www.erpqna.com/?p=79397 This blog will guide you through the creation of a simple monitoring scenario for Synthetic User Monitoring in SAP Cloud ALM for Operations. Prerequisites To create your simple SUM scenario with SAP Cloud ALM, you need to follow these 3 main steps: 1. Create the script 1.1. Record the sequence In a web browser configured […]

          The post How to create a simple SUM scenario for the Fiori Launchpad appeared first on ERP Q&A.

          This blog will guide you through the creation of a simple monitoring scenario for Synthetic User Monitoring in SAP Cloud ALM for Operations.

          Prerequisites

• Have configured a runner on your Synthetic User Monitoring.
          • Have a web browser configured with the Selenium IDE plugin.
          • Have an SAP Cloud ALM user with the role Synthetic User Monitoring Scenario Expert or Synthetic User Monitoring Scenario Administrator.

          To create your simple SUM scenario with SAP Cloud ALM, you need to follow these 3 main steps:

          • Create the script with Selenium,
          • Upload the obtained script in SAP Cloud ALM,
          • Create and configure the SUM monitoring scenario in SAP Cloud ALM.

          1. Create the script

          1.1. Record the sequence

          In a web browser configured for Selenium IDE:

1. (Recommended, to properly record the logon) Clean up the environment to ensure no open session exists.

          2. Open the Selenium IDE.

          This opens the Selenium IDE application and the welcome popup.

          3. Choose Record a new test in a new project.
4. Give the project a name (e.g. Fiori Launchpad) and click OK.

          5. Indicate the URL for your Fiori Launchpad and click START RECORDING.
          E.g. https://<host:port>/sap/bc/ui2/flp?#Shell-home

          This opens a new browser page, normally on the logon page of the Launchpad you want to monitor.

If you are directly logged on, your script-creation environment is likely not properly prepared.
For instance: your SSO was applied automatically because your browser session was already connected.

6. Fill in the credentials and press <Enter>.

You are logged on to the Launchpad.
          And in the Selenium IDE UI, you can see that your actions were recorded.

          7. Add a wait for element visible to ensure, at runtime, that the logon is successful.
          For instance, in the Launchpad:

          i. Put the mouse cursor over a tile and open the context menu (right click).
          ii. Select Selenium IDE → Wait For → Visible.

          This adds a wait for element visible in the Selenium IDE.

Hint: the added command is not fully valid yet: it lacks the maximum amount of time to wait (in milliseconds).

          In the Selenium IDE, in the wait for element visible command, set 30000 in the Value field.

          8. (Optionally) record some navigations.
          9. In the Launchpad, click the profile icon and click Sign out.

          10. Click OK in the confirmation popup.

          You are now signed out.

          11. Like in step 7, add a wait for element visible on the goodbye screen.
          (Do not forget to manually add the maximum amount of wait time)
          12. In the Selenium IDE, stop the recording.

          13. Choose a name for the sequence (test) you created.

          14. (Optional) You may want to save your temporary script.
          This will create a SIDE file.

          Warning
At this moment, the SIDE file you save still contains the logon credentials in clear text.

          1.2. Check and adjust the recorded sequence

          During the recording, Selenium IDE recorded all the actions you performed on the UI.

          It’s now time to replay the sequence and, if required, adjust it by:

• Adjusting the commands’ targets,
          • Adding missing actions,
          • Deleting useless actions.

After modifying the script, it’s advised to replay the sequence again to verify that everything is correct.

          1.2.1. Replay the sequence

          In Selenium IDE, click the Run current test button.

          A browser session is opened and the script is executed.

          It could be that Selenium IDE replay fails on an error like:
          “A listener indicated an asynchronous response by returning true, but the message channel closed before a response was received.”

          Generally, SUM can handle that properly. In Selenium IDE, you can avoid the error by:

1. Setting a breakpoint on the faulty action by clicking on the row number of the corresponding command.
          2. Executing the script normally by clicking the Run current test button.
          3. When the execution pauses on the faulty command, wait for the application to display the screen.
            Then either:
            i. Continue the execution automatically by clicking the Resume test execution button.
            ii. Or continue the execution command-by-command by using the Step over current command button.

1.2.2. Adjust the commands’ targets

During the recording, Selenium IDE tried to figure out the best way to identify the target element of each action.
It supports many ways to identify elements: id, css selector, xpath, …

          This corresponds to the Target field of the commands.

Generally, Selenium IDE proposes several options. The main one is the one you see in the UI, but you can see the others via the Target field dropdown.

If, during the replay, an element cannot be found even though it is visible in the browser, you may want to adjust the command target.

          • Option 1 – Have Selenium IDE try to propose another target automatically,
• Option 2 – Define a new target yourself using one of the identification methods supported by Selenium IDE.

          1.2.3. Add missing actions

          During the recording, you may have missed some actions. E.g. some Wait for …
          You can edit the sequence to add them.

For instance, you may want to wait for the login screen to be displayed before trying to perform the logon.

          You can add the missing commands:

          Either manually …

          In Selenium IDE:

          1. Select the command that should follow the missing action
          2. In the context menu, choose Insert new command
            This adds a new empty command.

          3. Select the new command row and fill-in the Command field.
          Choose wait for element visible.

          4. Then fill-in the Target field.

          You can either enter it manually or use the Select target in page button.
In our case, “wait for the login screen to be displayed”, we already know a possible target: we can use the same one as the type command used to type in the username.

          5. (Optionally) You can try the entered target by clicking the Find target in page button.
          This will highlight the targeted element.

          … or by recording the missing actions.

          In Selenium IDE:

          1. Put a breakpoint on the command that should follow the missing action.
          2. Execute the sequence by clicking the Run current test button.
            The execution will halt just before the execution of the command.

          3. Stop the execution by clicking the Stop test execution button.

          4. Click the Start recording button.

          5. Record the missing actions.

6. Stop the recording by clicking the Stop recording button.

          The commands have been inserted in the sequence.
          (For the wait for … commands, do not forget to indicate the maximum wait time in the command Value field)

          1.2.4 Delete useless actions

          During the recording, you may have performed some useless actions that Selenium IDE recorded.
          You may not want to keep them in the sequence. For that:

          1. Identify the useless commands.
For instance, by replaying the sequence step by step.
            E.g. During the logon phase, the click on the username field is generally useless (like the click on the password field).
          2. Select the command to remove.
          3. In the context menu, select Delete.

          Warning

Selenium IDE has no “undo” for command deletion, so be careful when deleting commands.

          1.3. Prepare the script for SUM

          1.3.1. Adjust the Test suite name

When the SIDE file is imported in SUM, the scenario name is taken from the Test suite name.
By default, Selenium IDE names Test suites “Default Suite”.

          To have another name for your script, in Selenium IDE:

          1. Go to the Test suites section.
            You can see the Default Suite Selenium IDE created.
          2. Click the 3 vertical dots to open the context menu (put the mouse cursor above the Default Suite to see them).
          3. Choose Rename.

          4. Enter a new name and click RENAME.

          1.3.2. Define the SUM steps

          During the Scenario execution, SUM captures metrics for all the SUM steps.

          By default, the script defines only one step.

Here I would like to have a separate step for each functional action in the recorded sequence:

          • Entrance → the opening of the application URL,
          • Login,
          • Logout.

The first (default) SUM step is named after the Selenium test, currently named myTest.
          To change the Test name, in Selenium IDE:

          1. Go back to the Test section.
            You can see the test you created at the end of the recording.
          2. Click the 3 vertical dots to open the context menu (put the mouse cursor above the test name to see them).
          3. Choose Rename.

          4. Enter a new name and click RENAME.

          For the next SUM steps, you can use the SUM annotation @sap.sum.step=”your step name”
          c.f. SAP Cloud ALM for Operations Expert Portal → Synthetic User Monitoring → Creating Synthetic User Monitoring Scripts # Add SUM Steps to Selenium Tests

          For instance, to declare a SUM Step for the Login phase:

          1. Select the first command of the Step.
            In our case, the type of the user name.
          2. In the command Description field, enter @sap.sum.step=”Login”

Do the same operation for the Logout step which, in our case, starts with the click command on id=meAreaHeaderButton.

          Hint

          It’s advised to end a step with a Wait for … command to ensure that the monitored UI displays the correct information before continuing with the next step (or finishing the script)

          1.3.3. Declare the SUM variables

You may want some fields not to be hardcoded in the Selenium script but fed by the SUM scenario configuration instead.
This is typically the case for sensitive information like credentials.

          To have a SUM variable for the login password of our example, in Selenium IDE:

          1. Select the type command where the password is entered.
          2. Replace the Value field with a dummy value.

          Note

If a replay is attempted in Selenium IDE, you will need to temporarily set the actual password again.

          3. In the Description field, add the @sap.sum.secure.variable=password SUM annotation.

          For more information about the SUM variables, c.f. SAP Cloud ALM for Operations Expert Portal → Synthetic User Monitoring → Creating Synthetic User Monitoring Scripts # Define SUM Variables in Selenium Tests

          Your script is ready, you can save your project.

          This creates the SIDE file to import in SUM.
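For reference, in the saved SIDE file (a plain JSON document) these annotations end up in each command’s comment field. An illustrative fragment of two commands (the IDs, targets, and values below are hypothetical, not taken from a real recording):

```json
[
  {
    "id": "cmd-login",
    "comment": "@sap.sum.step=\"Login\"",
    "command": "type",
    "target": "id=USERNAME_FIELD-inner",
    "value": "MY_USER"
  },
  {
    "id": "cmd-password",
    "comment": "@sap.sum.secure.variable=password",
    "command": "type",
    "target": "id=PASSWORD_FIELD-inner",
    "value": "dummy"
  }
]
```

This also makes the earlier warning concrete: until the password value is replaced by a dummy and declared as a secure variable, it sits in the file in clear text.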

          2. Upload the script as a SUM resource

          1. Logon to your SAP Cloud ALM tenant.
          2. Start Synthetic User Monitoring.

          3. Open the configuration panel and, on the Resources section, enter the edit mode.
          This opens the SUM configuration popup.

          4. Click the Upload button.

          5. Browse to the SIDE file you created.
          6. Click Upload.

          7. The Upload Report popup lists the scripts SUM detected in the SIDE file.

Your Selenium IDE script (SIDE file) is now imported and visible in the resources.

          3. Create and configure the SUM monitoring scenario

          In the Synthetic User Monitoring Configuration popup:

          3.1. Create a new Scenario using the script you previously created

          1. Go to the Scenarios section.
          2. Click the Add button.

          This opens the Add Scenario popup.

          3. In the General section, choose a name for the scenario and enter a description.
          By default, the new scenario will be automatically added to your SUM scope.
          4. In the Execution section, use the drop-down to select the script you imported via the SIDE resource.
          5. In the Business Services section, select a business service to associate your scenario to.
          This is used to indicate which applications are monitored by your scenario.
If you haven’t specified business services yet, you can use the SUM default one.
          6. Click the Add button.

          Your new scenario is now created. You still have to configure it.

          3.2. Configure the SUM variables

          In the Scenarios list:

          1. Click on the scenario you created.

          This opens the Scenario Details.

          2. Go to the Variables tab.
          3. Enter the variable values to use at execution time. For instance, enter the value for the password variable you defined during the script creation.
          4. Save.

          3.3. Associate the Scenario to a Runner

          In the Scenario Details:

          1. Go to the General tab.
          2. In the Runners section, click the Assign button.
            This opens the Assign Runners popup.

          3. Select one (or several) Runners to assign to the Scenario, then click Assign.

          The popup is closed and the runner is added to the list of Assigned Runners.

          4. Activate the Scenario on the Runner.
          5. Save.

          The Scenario is now scheduled to be regularly executed on the Runner.

          Because you already added the scenario in your SUM Scope during the scenario creation, the SUM overview displays your new scenario.

          3.4. (optional) Adjust the time interval between two executions

          In the Scenario Details:

          1. Go to the Scheduling tab.
          2. Untick the Use default checkbox.
          3. Change the Execution Period.
            E.g. set 900 (900 seconds interval) for an execution every 15 minutes.
          4. Save.

          3.5. (optional) Enable the capture of screenshots during the execution

          In the Scenario Details:

          1. Go to the Selenium tab.
          2. In the Screenshots section, for the Automatic screenshots field:
            i. Untick the Use default checkbox.
            ii. Choose At each step in the drop-down.
          3. Save.

You have now learned how to create a simple Synthetic User Monitoring scenario to monitor an SAP Fiori Launchpad.

Migrating dangerous goods data in SAP S4/HANA: not as simple as it seems! https://www.erpqna.com/migrating-dangerous-goods-data-in-sap-s4-hana-not-as-simple-as-it-seems/ Wed, 15 Feb 2023 11:05:22 +0000

          The post Migrating dangerous goods data in SAP S4/HANA: not as simple as it seems! appeared first on ERP Q&A.

          When switching from one system to another, it’s important to make sure that all data is transferred correctly. We ran into some interesting challenges when migrating our own dangerous goods data.

Unfortunately, because these processes are relatively new in S/4HANA, only a small amount of information about them could be found online. This inspired us to write this blog post to contribute to the community and help others who are struggling with the same difficulties. Think of it as a step-by-step guide for your own data migration.

          Below we explain each step in the process of migrating dangerous goods data.

          The first step is to create a migration project.

          We use the Migrate Your Data app in the Migration Cockpit.

Figure 1: Migrate Your Data – Migration Cockpit

          Once you’ve created your migration project and its various migration objects, you can start migrating your dangerous goods data.

          Make sure you upload this data in the following order:

          • PC – Product compliance info
          • DG – Assessment for unpackaged products (content-based)
          • DG – Assessment for packaged products

          Select your template from the drop-down list for the migration object you want to upload.

          Select Download Template -> Download CS Files.

Figure 2: Download the data migration template

          Remark

          Make sure the material is compliance-relevant before you start uploading. You can change this setting in the material master data. Open the Manage Product – Master Data app.

Figure 3: Manage Product – Master Data

          Select your material and click Edit. You can now make changes in the Product Compliance tab.

Figure 4: Decide if your product is compliance-relevant or not

          Product compliance Info

          The first two tabs (Introduction and Field List) contain useful information about this process, so be sure to read through it carefully. Because this information is so thorough, I won’t discuss it here.

          The most important tab in this file is Chemical Compliance View, which contains the following columns.

Figure 5: Excel file Product Compliance Info

          Start by entering the Internal Number. This is a unique code containing letters (A-Z) and up to 80 characters. This number is for internal use only and will not be shared with clients or external parties.

          You can enter the number in the following path:

Figure 6: Path to customising internal number range
Figure 7: Define internal number range

          The second column (Responsible Unit) shows the group of employees responsible for the specific packaged or unpackaged product. You can configure this information in the following path:

Figure 8: Path to customising responsible units for dangerous goods
Figure 9: Define responsible units for dangerous goods

          The third column (Responsible Unit for Dangerous Goods) can have the same value as the previous column.

          The next tabs (until the Product Master Assignment) can be populated with information about the product itself.

          In the tab Product Master Assignment you can link the internal number you just entered to the product number. This product number is a unique code derived from the product master that can be shared with clients.

          In the next column you can choose to enter an X or to leave it blank. If you enter an X, it will retrieve the name from the product master linked to the product number.

          The next three tabs contain optional information that does not have to be entered for a successful upload process. In the last tab you can assign a ‘goal’ to your product. Enter the unique internal number (A-Z) and link it to a goal. This can be a goal defined in the system.

          Once you have double-checked all entered data, save it as an XML file. You can also save it as an XLS file if you want to make changes to it later.

          Now you can start the upload process.

          Go to the migration object and click Upload File.

Figure 10: Step 1 in the data upload

          Click Upload.

Figure 11: Upload your XML file

          Select the file you just created.

          If you selected the right template, a successful transfer notification will appear.

Figure 12: Success notification

          When you return to the main menu, the system will automatically recommend the next step:

Figure 13: Step 2 in the data upload

          Click on Prepare. The following message will appear:

Figure 14: Preparing staging tables

          Click on Prepare Staging Tables.

          The following message will appear:

Figure 15: Information message during the preparation of the staging tables

          Click on the Monitoring tab to see the status.

Figure 16: How to open the monitoring view

          If any errors appear, they can be viewed in detail here.

          These errors must first be resolved before the process can continue.

Figure 17: Success notification

          If the staging tables were successfully uploaded, the system will suggest the next step:

Figure 18: Step 3 in the data upload

          Click here to confirm any statuses. This is not always necessary.

          Once you’ve completed this process, the system will suggest the next step:

Figure 19: Step 4 in the data upload

          Select Start Simulation.

Figure 20: Starting the simulation

          A message will appear when the process has started.

          Return to the Monitoring tab to view the status and check for any errors.

Figure 21: Simulation notification

          If there are no errors, the following notification will appear:

Figure 22: Success notification

          In the Options column, click Show Messages for a detailed overview of the upload process.

Figure 23: Additional option to see all the details

          Open the My Unpackaged Dangerous Goods app

          and click To Be Classified for an overview of your materials.

Figure 24: App to see result after uploading the product compliance file
Figure 25: Result after uploading the product compliance file

          Unpackaged product

          Once you complete this step, you can continue uploading the next file: Assessment for Unpackaged Products (Content-Based).

Figure 26: Excel file Assessment for Unpackaged Products (Content-Based)

          Before the release of SAP S/4HANA 2020, you had to upload the text-based regulation and enter the dangerous goods texts in different languages. The current content-based regulation contains all information for the UN numbers.

          Like the previous file, this one also includes an introduction tab. Be sure to read this information carefully.

          In the first tab (Product), enter the internal number of the unpackaged product. This is a unique code containing letters (A-Z) and up to 80 characters. This number is for internal use only and will not be shared with clients or external parties. It is the same number as the one in the first tab of the Product Compliance file (Chemical Compliance View). The same number must be entered in the Purpose Management tab.

          In the second tab (Basic Classification), enter the same internal number as in the previous tab.

In the second column (Compliance Requirement Version ID), enter an R value for the transport type. For example, R00925 is linked to ADR and R00926 to IMDG.

          This information can be found in the Manage Compliance Requirements app. The field containing these values is called the ID of the Business Configuration Object. This is hidden by default and can be opened in the Settings tab (see Figure 28).

Figure 27: Manage Compliance Requirements – Dangerous Goods app
Figure 28: How to add the ID of the business configuration object

          The third column must be populated according to the process status of the classification. This is a required field with two possible options: RE for ‘released’ and IP for ‘in progress’. In the Transport Permissions column, 01 stands for ‘allowed’ and 02 stands for ‘not allowed’.

          The next column concerns whether a product can be classified as a dangerous good. If you enter 01, it will not be classified as a dangerous good. If you enter 02, it will be classified as a dangerous good. The ID is a four-digit number used to identify dangerous substances during transport.

          In the next column, enter the prefix that precedes the identification of the dangerous good. This is usually UN, NA or ID.

          In the Packaging Group field, enter a Roman numeral from the Product Safety Data Sheet. Finally, enter the variant from the Dangerous Goods List. This is only required if a UN number is linked to multiple variants in the regulation (e.g. if a product is transported by road and water, it is subject to two different regulations). You can find this information in the Manage Compliance Requirements – Dangerous Goods app.
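Putting the columns of the Basic Classification tab together, a single row could look like the following sketch (all values are hypothetical examples for illustration, not taken from a real regulation or product):

```text
Internal Number | Compl. Req. Version ID | Status | Transport Perm. | DG Relevance | ID Number | Prefix | Packaging Group | Variant
PRODUCT-A       | R00925 (= ADR)         | RE     | 01 (allowed)    | 02 (DG)      | 1263      | UN     | III             | (only if the UN number has several variants)
```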

Figure 29: Manage Compliance Requirements – Dangerous Goods app

Go to the Dangerous Goods List tab. The variant is hidden by default but can be made visible in the settings.

Figure 30: How to add the variant

          Repeat the same migration steps you took for the Product Compliance Info file. Use the migration object DG – Assessment for Unpackaged Product (Content-Based).

          The technical product name can also be added. In our example we chose not to do this. This information can be found in the Product Safety Data Sheet.

          You can find your material in the My Unpackaged Dangerous Goods app.

Figure 31: My Unpackaged Dangerous Goods app

          Go to the app Analyse Unpackaged Dangerous Goods.

          Here you will find your material and the associated data you uploaded.

Figure 32: Analyse Unpackaged Dangerous Goods app
Figure 33: Details unpackaged dangerous goods

          Packaged product

          When this data has been successfully migrated, you can move on to the final step: migrating the packaged product. Just like the previous files, the first tab contains an introduction with a lot of useful background information.

          The Product tab contains the following fields:

Figure 34: Excel file assessment for packaged product

          Enter the internal number here. In the next column you can describe the outer packaging. The column after that displays the total quantity of the outer packaging.

          Enter the units of measure in the next table. You can also enter a description of the inner packaging, if applicable. Sometimes, dangerous goods are transported in single-layer packaging, depending on the quantity and the mode of transport.

          In this case, there is no need to mention the outer packaging. This information can be found on the Product Safety Data Sheet.

As with the outer packaging, enter the quantity and units here, as well as the number of units in a single outer packaging.

          Go to the Regulations tab.

Figure 35: Excel file: Assessment for Packaged Product

          In the first column, enter the internal number to create the link. The other columns can be filled in the same way using the information about the unpackaged file.

          Go to the Modes of Transport tab.

Figure 36: Modes of Transport tab in the file Assessment for Packaged Product

          Create a link to the internal number. Enter the ID number, depending on the transport mode. See the previous sections above to find out where this information can be located.

          Complete the other columns as described.

          Once you have entered all information, you can start the upload process. Follow the same steps as the previous files. Use the migration object DG – Assessment for Packaged Product.

For results, go to the app My Packaged Dangerous Goods – To Be Classified.

Figure 37: My Packaged Dangerous Goods – To Be Classified

          Here you can follow your progress. Once everything has been uploaded correctly, it will be removed from the My Packaged Dangerous Goods – To Be Classified list.

Figure 38: Progress during upload packaged products
Integrate Onboarding 1.0 data to Custom MDF object / portlet in Employee Central – COVID-19 Vaccine Tracking portlet https://www.erpqna.com/integrate-onboarding-1-0-data-to-custom-mdf-object-portlet-in-employee-central-covid-19-vaccine-tracking-portlet/ Tue, 01 Mar 2022 11:39:34 +0000

          Context:

Many of us might have had customer requirements to integrate Onboarding 1.0 data into a custom MDF object / portlet in Employee Central. For example, integrating COVID-19 Vaccine Tracking data from Onboarding 1.0 into an MDF COVID-19 Vaccine Tracking portlet. While this requirement has been identified as a product enhancement, there is an alternate way to integrate the data across modules into the custom MDF object / portlet in Employee Central.

This post explains a workaround that can be implemented to integrate data collected during the Onboarding process into a custom MDF object / portlet in Employee Central, using COVID-19 vaccination tracking as the reference example. This is achieved by leveraging Intelligent Services for the ‘Employee Recruitment’ event and the Integration Centre.

          Prerequisites:

          1. Enable Employee Central
2. Enable Intelligent Services Centre (ISC) using the Upgrade Centre
          3. Integration Centre
          4. COVID-19 Vaccine Tracking MDF is either enabled (SAP standard portlet) or configured as a custom object through Admin Center> Configure Object Definitions
          5. COVID-19 Vaccine Tracking portlet is configured through Admin Center> Manage Configuration UI and added to People Profile.
          6. COVID-19 Vaccine Tracking portlet fields are configured in Onboarding panels.

          Configuration:

          1. Configure necessary fields in one of the HRIS elements in Employee Central:

          • Admin Center> Manage Business Configuration> Select [HRIS element]

In this example, the fields are created under Personal Information because COVID-19 data is classified as personal data.

Note: The above fields are not granted RBP permissions for any role, to keep them hidden in the Employee Profile. These fields appear while hiring a candidate under Manage Pending Hires and can be hidden through a business rule (KBA: https://launchpad.support.sap.com/#/notes/0002080655).

          2. Map the fields from Onboarding 1.0 to respective fields in Employee Central through Admin Center > Field Mapping tool for integration with Employee Central

          3. Configure a business rule in Job Information portlet – To trigger Intelligence Service during New Hire

          • Create a Basic rule with ‘Employment Information Model’ as the Base Object
          • Set the rule condition to execute ‘Trigger New Hire Event’

          4. Assign the Business rule

          • Navigate to Admin Center> Manage Business Configuration> ‘jobInfo’ element
          • Scroll all the way down to the ‘Trigger Rules’ section
• Add the business rule as an ‘onPostSave’ rule under ‘Job Information Model’ and Save
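The trigger chain configured in steps 3 and 4 can be modelled as a small publish/subscribe sketch: saving Job Information fires the onPostSave rule, which publishes the New Hire event that Intelligent Services listens for. This is only a toy model of the event flow; the real rule and subscription are configured entirely in the SuccessFactors UI, and all names here are illustrative.

```python
# Toy model of the step 3-4 trigger chain (illustrative names only).
subscribers = {}

def subscribe(event, handler):
    """Register a handler for an event (what ISC does for the flow rule)."""
    subscribers.setdefault(event, []).append(handler)

def publish(event, payload):
    """Deliver the event to every registered handler."""
    for handler in subscribers.get(event, []):
        handler(payload)

def job_info_on_post_save(employee):
    # The basic rule's action is simply "Trigger New Hire Event".
    publish("Employee Recruitment", employee)

# Steps 5-6 effectively attach the Integration Centre flow as a subscriber:
subscribe("Employee Recruitment", lambda emp: print(f"sync {emp} to MDF"))
job_info_on_post_save("jdoe")  # prints: sync jdoe to MDF
```

The point of the model is that the business rule never talks to the Integration Centre directly; it only raises the event, and the flow rule configured later decides what runs in response.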

          5. Configure Intelligent Services for ‘Employee Recruitment’ event – Create and assign Integration Centre flow

          • Navigate to Admin Center> Intelligent Services Centre (ISC)
          • Search and select the Event> Employee Recruitment
          • Under ‘Activities’, Select ‘Integration’ tab
          • Click ‘OK’ and ‘Create a new integration’ in the pop-up window
          • In the ‘Choose Integration Type’ pop-up, select the following:
            • Destination Type: SuccessFactors
            • Format: OData v2
          • Click ‘Create’
          • The system redirects to the Integration Centre
          • Enter an Integration Name and Description
          • Click ‘Next’
          • Switch to Field Mapping view
          • Click ‘+’ in the ‘Destination Fields’ section
• Search for, select, and add ‘[Custom MDF object]’ as the starting node in the Integration Centre. For example, COVID-19 Vaccine Tracking
          • Map and Assign the fields
• This is done by dragging fields from Source Fields and dropping them onto Destination Fields in the custom MDF object
          • The other field mappings should have the following associations: userNav/empInfo/personNav/personalInfoNav/*/

In my experience, never search for the fields, as the system picks the logged-in user’s mapping rather than the userNav path. Instead, navigate the entity tree one node at a time to reach userNav/empInfo/personNav/personalInfoNav/*/[custom fields you have configured].
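To make the association chain concrete, the helper below assembles the full navigation path for a custom field under the prefix described above. The field name cust_vaccineStatus is a hypothetical example, not a field from the original post.

```python
# Navigation prefix from the field-mapping step; the trailing "*" stands
# for the collection segment you step through in the entity tree.
PREFIX = "userNav/empInfo/personNav/personalInfoNav"

def mapping_path(field: str) -> str:
    """Return the full OData navigation path for a custom field."""
    return f"{PREFIX}/*/{field}"

# Hypothetical custom field name for illustration:
print(mapping_path("cust_vaccineStatus"))
# -> userNav/empInfo/personNav/personalInfoNav/*/cust_vaccineStatus
```

Spelling the path out like this is a quick sanity check before mapping: if the destination field's path does not start with userNav, the integration will read the logged-in user's data instead of the new hire's.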

Note: If you have a picklist field to be mapped, ensure that its externalCode is selected.

• Once all fields are mapped, Click ‘Next’
          • Skip ‘Filter’, Click ‘Next’ and move to ‘Review and Run’
          • Click Save> Save

          6. Set the Intelligent Services Flow Rule

          • Navigate to Admin Center> Intelligent Services Centre (ISC)
          • Search and select the Event> Employee Recruitment
          • Under ‘Activities’, Select ‘Integration’ tab
          • In the pop-up window, Select the integration definition configured and click ‘Add Integration’
          • This adds the new integration to the Flow Rule.
          • Set the ‘Timing’ to ‘When event is published’
          • Make sure you have saved the changes under Actions > Save Flow

Every time a new hire happens in Employee Central, the ISC triggers the Integration Centre job, which then syncs the data stored under the Personal Information portlet to the custom MDF portlet.
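One way to verify that the sync landed is to query the custom MDF entity through the SuccessFactors OData v2 API. The sketch below only builds the query URL; the host, entity name cust_COVID19VaccineTracking, and external code are hypothetical placeholders you would replace with your tenant's values.

```python
from urllib.parse import urlencode

# Hypothetical tenant host and custom MDF entity name -- substitute the
# real values from your landscape and Configure Object Definitions.
BASE = "https://api.successfactors.example/odata/v2"
ENTITY = "cust_COVID19VaccineTracking"

params = {
    "$filter": "externalCode eq '100042'",  # the new hire's user ID
    "$format": "json",
}
# urlencode percent-encodes the "$" in system query options ($filter -> %24filter),
# which the OData endpoint accepts.
url = f"{BASE}/{ENTITY}?{urlencode(params)}"
print(url)
```

Issuing a GET against such a URL (with valid API credentials) right after a test hire is a simple way to confirm the Integration Centre flow wrote the record, without opening the portlet in the UI.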


          The post Integrate Onboarding 1.0 data to Custom MDF object / portlet in Employee Central – COVID-19 Vaccine Tracking portlet appeared first on ERP Q&A.
