Filtering Data Dynamically in SAP CPI using XSLT Mapping

You may have encountered scenarios where you configured several filters and created separate branches, yet performed the same task in each of them. For example, the first filter checks whether the department equals HR, while the others check for Finance, Sales, and Audit, respectively. Even though the data flows through different branches, the same processing is executed in each one.

Complex Integration

As you can see above, the integration appears complex. The architect who built it can debug it, but other architects would need considerable time to understand how it works. Maintaining it is also costly: every change to the filter criteria requires reviewing each branch individually, which further increases the complexity of the integration.

The design also hurts performance: the many steps to be executed increase the overall processing time of the integration, and the repeated API calls made to perform the same task raise the load on the SuccessFactors system.

Simplified Integration

Solution:

To reduce the complexity of the integration, decrease the processing time, and alleviate the load on the SuccessFactors API, we can use an ‘XSLT Mapping’ step within the integration. This step enables us to filter data, map fields, and perform the required aggregations in a single place.

What is XSLT Mapping?

XSLT (Extensible Stylesheet Language Transformations) mapping in SAP CPI refers to the process of using XSLT to transform and map data between different formats during integration scenarios. XSLT is a language used to transform XML documents into various formats.

In the context of SAP CPI, XSLT mapping allows you to:

  1. Transform Data: Convert XML data from the source format to the target format using XSLT templates. This is particularly useful when you need to map data from one XML structure to another.
  2. Perform Complex Mapping: XSLT provides a powerful way to perform complex transformations, conditional mapping, and data enrichment during integration.

Overall, XSLT mapping is a powerful tool in SAP CPI that lets you manage complex data transformations and mappings in integration scenarios, enabling seamless communication between different systems with varying data formats.

Methodology:

In this scenario, we are going to group records by cost center and calculate the sum of course costs for all the users in each, storing it as Total Cost. Generally, most customers have multiple cost centers within their organization, and new cost centers are created and removed from the system at regular intervals.

It is not practical to maintain multiple hard-coded filters and update them manually whenever cost centers are created or deleted after go-live.

We are going to use the “group by” function, a mechanism in XSLT mapping that allows you to group and aggregate data based on specific criteria. This is particularly useful when you are dealing with XML data and need to perform operations on grouped data sets.

Here’s how the “group by” function works in XSLT within the context of SAP CPI:

  1. Grouping Data: The “group by” function lets you group XML elements based on a common attribute or element value. This means that elements with the same value for the specified attribute or element are treated as a group.
  2. Aggregating Data: Once the data is grouped, you can use aggregate functions (like sum, count, average, etc.) to perform calculations on the elements within each group. This allows you to calculate values for each group separately.
  3. Output Transformation: After grouping and aggregating, you can define how the resulting grouped and aggregated data should be transformed into the desired output format.

Here’s a simplified example of how you might use the “group by” function in XSLT within SAP CPI:

<!-- Group employees by Cost Center -->
<xsl:for-each-group select="{Entity Name}/{Entity Name}" group-by="{Enter Field to Filter}">
    <CostCenter><xsl:value-of select="current-grouping-key()"/></CostCenter>
    <!-- Calculate the Total Cost within the Cost Center -->
    <Total_Cost><xsl:value-of select="sum(current-group()/{Enter Field to Sum})"/></Total_Cost>
</xsl:for-each-group>

Suppose you have the below input XML data having Cost Center Details:

Input XML:
<CostDetails>
    <CostDetails>
        <externalCode>1982</externalCode>
        <cust_costcenter>3748</cust_costcenter>
        <course_cost>960</course_cost>
    </CostDetails>
    <CostDetails>
        <externalCode>1984</externalCode>
        <cust_costcenter>3748</cust_costcenter>
        <course_cost>850</course_cost>
    </CostDetails>
    <CostDetails>
        <externalCode>1986</externalCode>
        <cust_costcenter>3746</cust_costcenter>
        <course_cost>740</course_cost>
    </CostDetails>
    <CostDetails>
        <externalCode>1988</externalCode>
        <cust_costcenter>3746</cust_costcenter>
        <course_cost>630</course_cost>
    </CostDetails>
</CostDetails>

If you want to group the records by cost center, total the course costs, and list the cost per user, you could use the following XSLT code.

Code:
<xsl:stylesheet version="2.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <xsl:template match="/">
        <output>
            <xsl:for-each-group select="CostDetails/CostDetails" group-by="cust_costcenter">
                <Center><xsl:value-of select="current-grouping-key()"/></Center>
                <Total_Cost><xsl:value-of select="sum(current-group()/course_cost)"/></Total_Cost>
                <employees>
                    <xsl:for-each select="current-group()">
                        <employee>
                            <userId><xsl:value-of select="externalCode"/></userId>
                            <cost><xsl:value-of select="course_cost"/></cost>
                            <costcenter><xsl:value-of select="cust_costcenter"/></costcenter>
                        </employee>
                    </xsl:for-each>
                </employees>
            </xsl:for-each-group>
        </output>
    </xsl:template>
</xsl:stylesheet>

The output payload below groups the records into 2 cost centers, calculates the total course cost for each, and provides the required output.

Output XML:
<?xml version="1.0" encoding="UTF-8"?>
<output>
    <Center>3748</Center>
    <Total_Cost>1810</Total_Cost>
    <employees>
        <employee>
            <userId>1982</userId>
            <cost>960</cost>
            <costcenter>3748</costcenter>  
        </employee>
        <employee>
            <userId>1984</userId>
            <cost>850</cost>
            <costcenter>3748</costcenter>   
        </employee>
    </employees>
    <Center>3746</Center>
    <Total_Cost>1370</Total_Cost>
    <employees>
        <employee>
            <userId>1986</userId>
            <cost>740</cost>
            <costcenter>3746</costcenter>   
        </employee>
        <employee>
            <userId>1988</userId>
            <cost>630</cost>
            <costcenter>3746</costcenter>
        </employee>
    </employees>
</output>

Scenario 2:

If you want to apply a filter within another filter, you can perform a ‘group by’ operation on both the cost center and the status in XSLT mapping. To achieve this, you can use nested ‘xsl:for-each-group’ elements, one for each level of grouping.

<!-- Within each Cost Center, group by Status -->
<xsl:for-each-group select="current-group()" group-by="{Enter Field to Filter}">
    <Status><xsl:value-of select="current-grouping-key()"/></Status>
    <!-- ... further aggregation within the status group ... -->
</xsl:for-each-group>

Suppose you have the below input XML data having Cost Center Details and Status:

Input XML:
<CostDetails>
    <CostDetails>
        <externalCode>1982</externalCode>
        <course_status>Active</course_status>
        <cust_costcenter>3748</cust_costcenter>
        <course_cost>960</course_cost>
    </CostDetails>
    <CostDetails>
        <externalCode>1984</externalCode>
        <course_status>Active</course_status>
        <cust_costcenter>3748</cust_costcenter>
        <course_cost>550</course_cost>
    </CostDetails>
    <CostDetails>
        <externalCode>1981</externalCode>
        <course_status>Completed</course_status>
        <cust_costcenter>3748</cust_costcenter>
        <course_cost>950</course_cost>
    </CostDetails>
    <CostDetails>
        <externalCode>1983</externalCode>
        <course_status>Completed</course_status>
        <cust_costcenter>3748</cust_costcenter>
        <course_cost>250</course_cost>
    </CostDetails>
    <CostDetails>
        <externalCode>1986</externalCode>
        <course_status>Completed</course_status>
        <cust_costcenter>3746</cust_costcenter>
        <course_cost>740</course_cost>
    </CostDetails>
    <CostDetails>
        <externalCode>1988</externalCode>
        <course_status>Completed</course_status>
        <cust_costcenter>3746</cust_costcenter>
        <course_cost>630</course_cost>
    </CostDetails>
    <CostDetails>
        <externalCode>1987</externalCode>
        <course_status>Active</course_status>
        <cust_costcenter>3746</cust_costcenter>
        <course_cost>330</course_cost>
    </CostDetails>
    <CostDetails>
        <externalCode>1989</externalCode>
        <course_status>Active</course_status>
        <cust_costcenter>3746</cust_costcenter>
        <course_cost>430</course_cost>
    </CostDetails>
</CostDetails>

The XSLT code groups employees first by cost center and then, within each cost center, by status. It calculates the total cost for each status within each cost center and outputs structured XML accordingly.

Code:
<xsl:stylesheet version="2.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <xsl:template match="/">
        <output>
            <xsl:for-each-group select="CostDetails/CostDetails" group-by="cust_costcenter">
                <Center><xsl:value-of select="current-grouping-key()"/></Center>
                <xsl:for-each-group select="current-group()" group-by="course_status">
                    <Status><xsl:value-of select="current-grouping-key()"/></Status>
                    <Total_Cost><xsl:value-of select="sum(current-group()/course_cost)"/></Total_Cost>
                    <employees>
                        <xsl:for-each select="current-group()">
                            <employee>
                                <userId><xsl:value-of select="externalCode"/></userId>
                                <cost><xsl:value-of select="course_cost"/></cost>
                                <costcenter><xsl:value-of select="cust_costcenter"/></costcenter>
                                <status><xsl:value-of select="course_status"/></status>
                            </employee>
                        </xsl:for-each>
                    </employees>
                </xsl:for-each-group>
            </xsl:for-each-group>
        </output>
    </xsl:template>
</xsl:stylesheet>

As you can see below, the XSLT performed the ‘group by’ function for both the Cost Center and Status successfully.

Output XML:
<?xml version="1.0" encoding="UTF-8"?>
<output>
    <Center>3748</Center>
    <Status>Active</Status>
    <Total_Cost>1510</Total_Cost>
    <employees>
        <employee>
            <userId>1982</userId>
            <cost>960</cost>
            <costcenter>3748</costcenter>
            <status>Active</status>
        </employee>
        <employee>
            <userId>1984</userId>
            <cost>550</cost>
            <costcenter>3748</costcenter>
            <status>Active</status>
        </employee>
    </employees>
    <Status>Completed</Status>
    <Total_Cost>1200</Total_Cost>
    <employees>
        <employee>
            <userId>1981</userId>
            <cost>950</cost>
            <costcenter>3748</costcenter>
            <status>Completed</status>
        </employee>
        <employee>
            <userId>1983</userId>
            <cost>250</cost>
            <costcenter>3748</costcenter>
            <status>Completed</status>
        </employee>
    </employees>
    <Center>3746</Center>
    <Status>Completed</Status>
    <Total_Cost>1370</Total_Cost>
    <employees>
        <employee>
            <userId>1986</userId>
            <cost>740</cost>
            <costcenter>3746</costcenter>
            <status>Completed</status>
        </employee>
        <employee>
            <userId>1988</userId>
            <cost>630</cost>
            <costcenter>3746</costcenter>
            <status>Completed</status>
        </employee>
    </employees>
    <Status>Active</Status>
    <Total_Cost>760</Total_Cost>
    <employees>
        <employee>
            <userId>1987</userId>
            <cost>330</cost>
            <costcenter>3746</costcenter>
            <status>Active</status>
        </employee>
        <employee>
            <userId>1989</userId>
            <cost>430</cost>
            <costcenter>3746</costcenter>
            <status>Active</status>
        </employee>
    </employees>
</output>

Using Google Workspace SDK to process purchase order information and create a sales order in SAP S/4HANA Cloud

Use Case: Using Google Workspace SDK to process purchase order information and create a sales order in SAP S/4HANA Cloud.

Prelude:

SAP Process Automation enables automation of Google Workspace products such as Gmail, Google Drive, Google Calendar, Google Sheets, Google Slides, Google Docs, and Google Cloud Storage. In addition, it can automate ML-based services such as Google Document AI and Vision AI. The SDKs below can be used in automations to automate Google services:

  1. Google Authorization SDK
  2. Google Workspace SDK
  3. Google Cloud Storage SDK
  4. Google Document AI SDK
  5. Google Vision AI SDK

In this blog I will use the Google Authorization SDK and the Google Workspace SDK to show how you can automate Google applications such as Google Drive, Gmail, and Google Sheets. I am using a sales order creation use case in an SAP S/4HANA system to explain these features.

Detailed Use case for our automation:

  • Use the Google Workspace SDK to search the Gmail account for purchase order emails
  • Use the Google Workspace SDK to read the email and download the attachments
  • Use the SAP Document Information Extraction (DOX) service to read the PDF and create an output schema
  • Use the Google Workspace SDK to fill values from the PDF into a pre-created Google Sheet
  • Use the Google Workspace SDK to read values from the Google Sheet to create the sales order
  • Use the pre-built SAP sales order creation bot to create the sales order using the above values
  • Use the Google Workspace SDK to reply to the original email with the created sales order ID

Recommended SDK version: 1.31.49

Pre-requisites:

  • A purchase order in PDF format
  • An email in a Gmail account containing a PO (purchase order) in the above format
  • A sales order input template in Google Sheets
  • A sales order creation automation that takes input from the above template

I am using a prebuilt sales order automation bot. The bot expects a data structure as shown below and returns the Standard Order ID as output.

Sample expected Input by the prebuilt Sales Order Automation

Step – 1: Create a new Document Template as shown in the figure below and define its schema, which will be used in the subsequent activities.

Sample Purchase Order document Template

Step – 2: I will use a predefined template, populate it with the data read from the purchase order, and upload it to Google Drive for the bot to use.

Sample sales order input template

Now, we are ready with the pre-requisite steps. Let us now start building our Automation.

Step – 3: Create a new automation to read purchase order data from the Google products, and add the Google packages as dependencies in this automation.

New Automation

Now select the Dependencies tab and click Add Dependency, then click Add a Business Process project Dependency and, from the drop-down, select the Google Authorization SDK & Google Workspace SDK.

Step – 4: Add the following steps to the automation to search for the purchase order email and extract data from the attached PDF.

Our automation will search for an email in the user’s Gmail account based on the query input and then read the email contents. It will download the attachments to the local file system and then extract data from the PDF attachment to fill into the sales order template.

Then, the bot will use the data from the template to create the sales order using our pre-built sales order automation.

  • Add the Google Authorization activity with the proper scopes. For Google Sheets, the Google Drive scope is sufficient.
  • Add the Search Email activity to search for the purchase order email. This activity uses the Gmail API to search for emails based on the filters specified in the search query parameters. You can optionally specify whether the search should include the spam and trash folders. The output of this activity is a list of metadata for the emails that match the search criteria.
  • Add the Read Email activity to read the contents of the email. This activity needs a message ID and a location to download media attachments (if any). Using this activity, you can download the attachments of an email to your local file system, or you can upload the attachments to Google Drive. We can use the output of the Search Email activity (previous step) as the messageId input of this activity. The output of this activity is the content and details of the targeted message.
  • Add the Extract Data activity to extract the purchase order details from the PDF attachment found in the previous step. This activity needs the document schema and document template that we created in the pre-requisites. It also needs a document path, which we can fetch from the previous Read Email step as shown in the figure. The output of this activity is the extracted data as per the schema defined in the prerequisite document template.

Now the first part of our use case, extracting purchase order data from the user’s email, is complete. In the next step, we will use this data to fill the sales order template that is required by our pre-built sales order bot.

  • Add a custom data type variable of type Any and initialize it with an empty list, as shown in the figure. This custom variable is used as an array of string arrays (a 2D string array) for input to the Google Sheets set activity in the later steps.
  • Add a for-each loop to iterate over the extracted data’s line items (output of the 4th activity).
  • Inside the for-each loop, add an Add Item (List) activity to append an item to the above data structure variable. Set the value to [currentMember.materialNumber.value, Step6.currentMember.quantity.value]. Here we build the row values to be inserted into the template as a string array representing one row of the spreadsheet (see the example after this list).
  • Add the Set Cell Values (Google Sheets) activity to push the values from our data structure variable into the Google spreadsheet template for our sales order bot.
  • Add an output parameter ThreadId to the automation to pass the Gmail thread ID to further steps. Map this output parameter to the thread ID that is output by the Search Email activity.
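For illustration, after the loop completes, the variable holds one string array per line item, for example (using the product and quantity values that also appear in the sample script further below; the actual values come from the extracted PDF):

[["TG11", "11"], ["TG12", "12"]]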

With this step, we have an automation that searches for an email, extracts purchase order data from it using the Google Workspace SDK, and fills it into a sales order template in a Google spreadsheet residing in Google Drive.

Now, let’s build an automation that reads the data from the sales order template and creates the data model required by our pre-built sales order automation.

Step – 5: Create a new automation and add the following steps to read the sales order information from the template and trigger the sales order creation bot.

  • Add a Get Cell Values (Google Sheets) activity to fetch the header data values of the template. This activity expects a range and a spreadsheet ID as mandatory inputs. To get the spreadsheet ID of the targeted template, open it in Google Drive and copy the ID from the web URL as shown in the figure below.

The output of this activity is a two-dimensional array of strings, which can be row major (i.e., each list of strings represents a row in the spreadsheet) or column major (i.e., each list of strings represents a column in the spreadsheet), depending on the majorDimension input of the activity.

  • Add a Get Cell Values (Google sheet) activity to fetch Item-Data values of the template.
  • Create a custom script to map the values into the declared custom variable as shown below. Edit the script and add the mapping code below into it. This custom script maps the data read from the sales order sheet through the above activities into the data model expected by the pre-built sales order creation bot.
Sample Script:

{
    "SalesOrderType": "OR",
    "SalesOrganization": "1010",
    "DistributionChannel": "10",
    "Division": "00",
    "SoldToParty": "10100007",
    "OrderReason": "002",
    "Items": [
        {
            "Product": "TG11",
            "RequestedQuantity": "11"
        },
        {
            "Product": "TG12",
            "RequestedQuantity": "12"
        }
    ]
}
  • Call the pre-built automation with the output of the custom script above.
  • Add an output parameter to the automation and map it to the output of the pre-built automation step.

With this, we now have an automation that reads sales order data from a Google spreadsheet and creates a sales order with this data in SAP S/4HANA Cloud.

Ideally, these two bots can run on a trigger, continuously searching for emails and creating sales orders for the requested purchase orders.

Now, let’s create a master automation and reply to the email with the sales order ID that we get as output from the previous automation.

Step – 6: Create a master automation to test and demo our bots. Add the steps below to this automation:

  • Add “Extract Purchase Order From Google Workspace” automation as created in step 4 above.
  • Add “Create Sales Order using template” automation as created in step 5 above.
  • Add Reply All (Gmail) activity and add custom input data as below in the reply parameter input.

With this, our demo use case is complete. We can run the master automation to create a sales order in SAP S/4HANA Cloud, using the Google Workspace SDK to extract the purchase order information and then replying to the customer with the created Standard Order ID.

Running the master automation:

On executing the master automation that we just created, the automation replies to the requestor’s email on the same thread, as shown in the figure below.

Successful Response on execution

C_CPE_13: Tested Study Guide to Get Your SAP Cloud Platform Certification

What’s the path to passing the C_CPE_13 exam? Should you rely on PDF Dumps to score better or enroll for practice tests? Get the best study guide through this article and ace the C_CPE_13 exam.

What Is the C_CPE_13 Certification All About?

C_CPE_13, or the SAP Certified Development Associate – SAP Extension Suite certification exam, proves that the candidate has the fundamental and core knowledge needed to work in the SAP Extension Suite profile.

The C_CPE_13 certification also confirms that the candidate has an overall understanding and the in-depth technical skills to join a project team as a member under a mentor’s guidance. The knowledge tested by the C_CPE_13 certification builds on the related SAP training.

What Do You Need to Learn through the Certification Preparation?

The C_CPE_13 exam covers the following topics:

  • Web development standards
  • Connectivity
  • UI
  • SAP Continuous Integration and Delivery
  • SAP BTP, Cloud Foundry environment
  • SAP Cloud Application Programming model
  • Tools
  • SAP BTP
  • SAP Authorization and Trust Management

Preparation Tips for the C_CPE_13 Exam:

Enroll Your Name for the C_CPE_13 Exam:

Enroll your name with Pearson Vue for the C_CPE_13 exam. Many candidates think about appearing for the C_CPE_13 exam but abandon their preparation halfway for various reasons. Registering first strengthens your commitment to the exam preparation.

Get Your Grip Stronger on C_CPE_13 Syllabus Topics:

Exploring the SAP Cloud Platform syllabus is very important. The topics are mentioned in this article, but you should go through the official website for complete details. If your grip on the syllabus topics is strong, taking the exam becomes relatively simple. The syllabus weighting is fairly even: every section holds almost equal importance, and a candidate must learn all of them.

Use A Schedule to Keep Track of Your Preparation:

Completing the syllabus can feel like a chore for many aspirants, but following a schedule makes it manageable. In the schedule, write down the topics to be covered on a particular day, the sections you are facing difficulty with, topic-related links, and more.

If you focus on daily learning, the pressure will be low during the last two to three weeks before the exam. Develop the habit of learning two to three topics daily and writing down notes on them. Learning something and writing it down boosts your recall during the exam.

Move towards Your C_CPE_13 Certification Dream through Practice Test:

If you are studying for the C_CPE_13 exam, your dream must be to become an SAP Certified Development Associate – SAP Extension Suite professional. To make this dream come true, take practice exam attempts and learn about your weaknesses and strengths across the syllabus areas. The main purpose of the C_CPE_13 practice test is to evaluate a candidate’s preparation level and gradually boost it through rigorous practice.

Overview of SAP Cloud:

Intelligent enterprises need nothing but the convergence of the skillsets and strengths of humans and machines, which form the building blocks of an innovative business model. To reach this intelligent state, enterprises face challenges such as integrating data and processes, security, compliance, transparency, and the immediate availability of new applications and methods. SAP Cloud Platform comprises all the elements needed to address these challenges and paves the path towards digital transformation.

Benefits of Using SAP Cloud Platform:

Empower Developers with Various Tools:

SAP Cloud Platform, an open-standard platform, offers IT experts the opportunity to leverage a range of development tools. The platform makes it much easier for developers to build customized business services and apps using their technical skills. Another important feature of the cloud platform is the SAP Cloud Application Programming Model, which allows you to quickly create business applications based on your domain logic.

Get Trouble-Free Interactions:

In addition to offering smooth and seamless interaction between SAP’s software-as-a-service (SaaS) offerings and core SAP systems, SAP Cloud Platform also blends in the SAP HANA Data Management Suite tools. The solution ensures hassle-free retrieval of data from any disparate system. Flawless integration of hybrid system landscapes with existing solutions is another benefit of SAP Cloud Platform.

Get Easy Access to Preconfigured Applications:

Users get access to multiple preconfigured applications through SAP Cloud Platform. This brings the opportunity to build and integrate innovative enterprise-class applications using the available in-built application components, with minimal effort and in less time. With SAP Cloud Platform, they also enjoy the benefit of being part of SAP’s complete ecosystem, where they can access partner solutions.

Get User-Friendly Experience:

Any business’s success depends on user acceptance. Therefore, a user-friendly application interface is a crucial part of what an app should offer across all devices. SAP Cloud Platform comes to the rescue with all the software development kits needed to build a customized interface according to business demands.

Bottom Line:

Pave the path to earning more knowledge of SAP Cloud Platform, and build up your career as an associate. The C_CPE_13 exam requires a 64% score and helps you gravitate towards a better job.


C_CPE_12 Certification Practice Test to Make You an SAP Associate in No Time

Earning the C_CPE_12 certification and using SAP Cloud Platform could lead to making up an intelligent enterprise. But combining the C_CPE_12 study guide with a reliable practice test is the key to success.

SAP Cloud Platform brings together the strengths and skills of people and machines, thus creating the foundation for innovative business services and models.

Overview of the C_CPE_12 Certification:

The C_CPE_12 or SAP Certified Development Associate – SAP Extension Suite certification exam confirms that the candidate possesses the core and fundamental skills needed for the SAP Extension Suite profile.

The C_CPE_12 certification also confirms that the candidate has the overall skills and in‐depth technical knowledge to work as a project team member under any mentor’s guidance. This certificate builds on the basic knowledge gained through related SAP training.

What Topics Are Covered under the C_CPE_12 Exam:

A candidate needs to study the following topics:

  • Side-by-side extensibility
  • SAP Cloud SDK
  • In-App extensibility
  • APIs
  • Security
  • The Intelligent Enterprise
  • Cloud-Native Application Development

Explore the Study Guide to Pass the C_CPE_12 Exam:

Put the First Effort with Registration

Pearson Vue conducts the C_CPE_12 exam, and a candidate can register directly on Pearson Vue’s site. Registering for the C_CPE_12 exam first makes you aware of the exam date and makes planning your preparation easier.

Learn about the C_CPE_12 Exam Structure:

It is always wise to explore the exam before jumping into preparation. Learning the exam objectives and structure keeps you mentally prepared to face the exam confidently.

The C_CPE_12 exam is 180 minutes long and comprises 80 multiple-choice questions. A candidate needs a 63% mark to pass the exam.

Completion of the C_CPE_12 Syllabus Is Essential:

How many questions you can answer in the C_CPE_12 exam depends entirely on completing the syllabus. The SAP syllabus is always distributed in almost equal percentages, so a candidate must plan how many topics to cover every day. For sure success, following a syllabus completion schedule is essential. Whatever your learning speed or study situation, take at least two months to get ready for the C_CPE_12 exam.

Learn from the Experts:

SAP offers training for all related exams, and it is highly suggested to include hands-on training to enhance your knowledge and crack the exam.

Use C_CPE_12 Practice Test to Assess Yourself:

What is your preparation level? Getting the answer, along with a detailed analysis, is very important for further improvement. A candidate can evaluate himself quickly through the C_CPE_12 practice test. The practice tests are expertly designed to point out a candidate’s weaknesses and strengths. They offer valuable insights into the different sections, and if you have time, you can work on those areas to score better in the future.

What Is SAP Cloud Platform?

Intelligent business models and methods will be the driving force behind successful organizations in the future. SAP Cloud Platform can help accelerate growth and simplify the digital transformation into a smart enterprise, offering multiple benefits in the process.

Here Are the Benefits of SAP Cloud Platform:

Effortless Integration of Existing Solutions and Hybrid System Scenarios:

SAP Cloud Platform ensures smooth interaction between core SAP systems and SAP’s software-as-a-service (SaaS) offerings, and the SAP HANA Data Management Suite tools and the SAP Data Hub solution likewise integrate effortlessly. SAP Cloud Platform makes it possible for users to call up data from disparate systems and solutions in a minimal time span, and it brings heterogeneous solutions and hybrid system landscapes together under one roof.

Faster Deployment of Preconfigured Services and Applications:

With SAP Cloud Platform, organizations have access to a wide range of preconfigured business methods. With ready-made application components and integrated functions for provisioning and authentication, customers can build and blend innovative applications in next to no time at all. And that’s not everything: they can also consume partner-built solutions and apps at the touch of a button on the platform, getting the maximum benefit from the growing SAP ecosystem.

Simple Development of Essential Business Applications:

SAP Cloud Platform relies on open standards and offers IT experts a range of development tools and possibilities. Programmers, for example, can leverage their existing ABAP know-how together with agile methods and DevOps principles. Thanks to the integrated programming framework, the Cloud Application Programming Model for SAP Cloud Platform, they can build new business services and apps while concentrating exclusively on the business logic for new applications and processes.

Final Words:

Associate-level certifications are essential to building your knowledge and pushing you up the career ladder. Therefore, grab the C_CPE_12 certificate to stay ahead of your peers.


Part 2: SAP Cloud Platform Integration for Data Services (CPI-DS)

This is the second blog post on CPI-DS, after the introduction post. Refer to Part 1 for configuring the Data Services Agent, creating a datastore, and importing metadata objects.

Tasks, Processes, and Projects:

Follow the integration development workflow to set up and test moving data to the cloud.

Task: A task is a collection of one or more data flows that extract, transform, and load data to specific targets, and the connection and execution details that support those data flows.

Process: A process is an executable object that allows you to control the order in which your data is loaded.

Available Actions in Processes and Tasks: Some actions are possible for both processes and tasks, but some actions are possible only for one or the other.

Replicate a Task or Process: You can replicate an existing task or process to the same or different project.

Edit a Task or Process: An administrator or developer can edit a task or process as needed within the UI.

Create a Project:

Select Projects tab -> Create New Project.

You can select Save & Close, which creates the project, or select Save and Create Task, which creates a task along with the project.

Add a Task to the Project:

A task is a collection of one or more data flows that extract, transform, and load data to specific targets, and the connection and execution details that support those data flows. You can create tasks from scratch or from predefined templates.

Tasks contain the following information:

  • Name, description, and the project they belong to (Details tab).
  • Source and target datastores to be used in the task’s data flows (Connections tab).
  • One or more data flows (Data Flows tab).
  • Scripts and global variables applicable to all data flows in the task (Execution Properties tab).

Select the source datastore and target datastore, and click the “Test Connection” button. Then click “Save” and select “Save and Define Dataflow”.

Add a Data Flow to the Task:

A data flow defines the movement and transformation of data from one or more sources to a single target. Within a data flow, transforms are used to define the changes to the data that are required by the target. When the task or process is executed, the data flow steps are executed in left-to-right order.

Although a data flow can have more than one data source, it can have only one target. This target must be an object in the target datastore that is associated with the data flow’s parent task.

The degree of parallelism defines how many times each transform within a data flow replicates to process a parallel subset of data. The default value is 2.

Source & Target Mapping:

The Automap option maps all columns from the Input pane to columns with the same name in the Output pane (target).

Note: If the Input pane contains more than one source, Automap cannot be used.

Transform Operations:

A transform step applies a set of rules or operations to transform the data. You can specify or modify the operations that the software performs.

Data transformation can include the following operations:
  • Map columns from input to output
  • Join data from multiple sources
  • Choose (filter) the data to extract from sources
  • Perform functions on the data
  • Perform data nesting and unnesting
  • Construct XML Map iteration rules
  • Define a web service response

Join Tables:

You can use the Join tab to join two or more source tables. You specify join pairs and join conditions based on primary/foreign keys and column names.

To join two or more tables:

  • In the Edit Data Flow view, select the transform in which you want to perform the join.
  • If the tables you want to join are not already available in the Input pane, click New to add additional tables.
  • In the Transform Details, in the Join tab, click the plus icon to add a new join.
  • Select the tables you want to join and the join type.
  • Type a join condition.
  • Click Save.
  • If needed, create additional join conditions. Subsequent join pairs take the results of the previous join as the left source.

Note: In an ABAP Query, mixed inner and left outer joins are not supported.
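For illustration (assuming two hypothetical sources ORDERS and CUSTOMERS; the column names are made up), a join condition relates columns of the joined sources:

ORDERS.CUSTOMER_ID = CUSTOMERS.CUSTOMER_ID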

Sort Data:

You can sort the order of your data by using the Order By tab.

To sort your data:

  • In the Edit Data Flow wizard, select the transform in which you want to sort your data. Sorting is supported in the Query, ABAP Query, and XML Map transforms.
  • Click the Order By tab.
  • From the Input pane, drag the column containing the data you want to use for sorting and drop it into the Order By table.
  • Specify whether you want to sort in ascending or descending order.
  • Add additional columns to the Order By tab and arrange them as necessary.
  • For example, you might choose to sort your data first by country in ascending order, and then by region in descending order.

Note: The data will be sorted in the order that the columns are listed in the Order By tab.

Filter Data:

You can filter or restrict your data using the Filter tab.

To filter your data:

  • In the Edit Data Flow wizard, select the transform in which you want to add a filter.
  • Click the Filter tab.
  • From the Input pane, drag the column containing the data you want to filter and drop it into the Filter field.
  • As needed, type filter conditions or use the built-in functions.

Filter conditions are comparison expressions over the input columns; some illustrative examples are shown below.
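For illustration (assuming a hypothetical source table SALES; the column names are made up), typical filter conditions look like this:

SALES.AMOUNT > 1000
SALES.COUNTRY = 'US'
SALES.STATUS = 'OPEN' AND SALES.AMOUNT > 1000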

Open the Data Flow Editor:

Open the data flow editor to design and debug data flows. Follow the steps below to open a data flow for editing:

  • From the Projects tab, expand the project that contains the task and data flow you want to edit.
  • Select the task that contains the data flow you want to edit and click Edit.
  • From the Data Flows tab of the task, select the data flow and click Actions > Edit.

The data flow editor opens.

Transform Types in Dataflow:

Validate the Data Flow:

Select Validate.

Run the Task:

Select the task and click “Run Now”.

Replicate a Task or Process:

You can replicate an existing task or process to the same or different project.

To replicate a task or process, select the task in the Projects tab and choose Replicate from the More Actions menu.

When you replicate a task, copies of the task and all data flows that it contains are created and added to the target project you select as the replication target.

When you replicate a process, copies of the process (including references to data flows), scripts and execution properties are created and added to the target you select as the replication target.

Versioning Tasks and Processes:

A new version is created each time you promote a task or process. You can also create a custom version if needed.

Versions allow you to keep track of major changes made to a task or process. You can consult the version history and return to a previously promoted or saved version to roll back unwanted or accidental changes.

It is recommended that you give each version a unique name and a meaningful description. They can remind you of the changes you made to the task or process, help you decide whether you want to roll back to a previous version, and decide which version you want to roll back to.

Caution: After you roll back to a previous version of a task, it is recommended that you check all processes that reference the task’s data flows to ensure that the references were maintained.

Scripts and Global Variables:

Scripts and global variables can be used in tasks and processes.

Scripts are used to call functions and assign values to variables in a task or process.

Global variables are symbolic placeholders. When a task or process runs, these placeholders are populated with values.
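For illustration (a minimal sketch; the global variable name and the message text are made up), a script can assign a value to a global variable and print it using built-in functions of the scripting language:

$G_LOAD_DATE = sysdate();
print('Load date set to [$G_LOAD_DATE]');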

Create Custom Calendars:

Custom calendars allow you to specify a customized schedule for running tasks or processes.

In the Administration tab, click Calendars. Click the plus button (+) to create a new custom calendar. Enter a name and, optionally, a description for your calendar.

Add the dates you want a task or process to run by doing one of the following:

  • Manually enter the dates,
  • Select dates by using the calendar button,
  • Upload a calendar file.

Then click Save.

Configure Email Notification (Task, Process & Agent):

Tasks and processes must be set up to run on a scheduled basis.

Note:

  • Email notifications for tasks or processes can be set for production environments; notifications are not available for Sandbox. Enter the email addresses that should receive notifications, using a semicolon to separate multiple addresses.
  • Agent downtime notifications are sent for all environments, including sandbox, production, and additional environments such as development or test. Downtime is a period of five minutes or longer; the server checks every 15 minutes.

Monitoring Production Status:

In the Dashboards, the production status displays whether your production tasks and processes succeeded or failed over a given period of time.

From the production status, you can:

  • Set the time period for which you want to analyze results.
  • Click on an area of the pie chart to filter the tasks and processes displayed in the table.
  • Click on a task or process in the table to view its history and log data.


Part 1: SAP Cloud Platform Integration for Data Services (CPI-DS)

Introduction:

SAP Cloud Platform Integration for data services allows you to efficiently and securely use ETL (extract, transform, load) tasks to move data between on premise systems and the cloud.

SAP Cloud Platform Integration interacts with your local SAP landscape via the SAP Data Services Agent over secure HTTPS and RFC connections. SAP ERP systems are registered with the agent as datastores via RFC.

Before you can use SAP Cloud Platform Integration, your administrator needs to prepare your environment.

Agents:

The SAP Data Services Agent provides connectivity to on-premise sources in your system landscape.

At design-time, the agent is used to provide metadata browsing functionality for on-premise sources to the web based user interface. At run-time, the agent manages the secure data transfer from your on-premise sources to your cloud-based target application.

Agent groups ensure high-availability by clustering one or more agents and making sure tasks and processes get assigned only to available agents in the group.

Download & Install the Agent:

Create a new agent and download the configuration file (AgentConfig.txt).

Select the Agents tab -> click “New Agent”.

Enter the Name and Description for the Agent. Select the Agent Group.

An agent group is a collection of agents that are logically grouped. When scheduling or running tasks and processes, an agent group is specified. At run-time, tasks and processes are automatically assigned to an available agent within the group.

Download the configuration file and save it on the local system where you want to install the agent.

After downloading the configuration file, select Save and Close. You can then see the agent information under the group “GBOBBILI”.

Install the agent by right-clicking the downloaded installer and running it with elevated/admin rights.

After the above step, you will see the screen below. Provide your Windows user name and password there and click Install.

After clicking the Install button, you will see the screen below; click Finish and Yes.

Then you will see the screen below. Provide your CPI-DS user name and password, attach the AgentConfig.txt file, provide the proxy host & port, and then click Upload and Yes.

Log in to CPI-DS and check whether the agent is active.

Create Datastore:

Datastores are the objects that connect SAP Cloud Platform Integration for data services to your cloud and on-premise applications and databases.

Through these connections, SAP Cloud Platform Integration for data services can access metadata from and read and write data to your applications and databases.

Within the Datastores tab, you can create and manage datastores, which connect SAP Cloud Platform Integration for data services to your applications and databases.

From this tab, you can:

  • Create and delete datastores
  • Test the connection to a datastore
  • View and edit a datastore’s configuration options (Configuration)
  • Browse a datastore’s metadata objects (File Formats or Tables)
  • Import and manage file format or table metadata objects for a datastore
  • View data loaded to a table in a target datastore to ensure it is correct

Select the Datastores tab -> click the “+” icon to create a new datastore.

SAP CPI-DS supports datastores for the following types of applications and databases:

  • SAP Business Suite applications
  • SAP BW sources
  • SAP HANA application clouds
  • SAP HANA cloud applications such as SAP IBP and SuccessFactors BizX
  • Applications that have pre-packaged or user-written adapters
  • Databases
  • File format groups
  • SOAP and REST Web services

Once the datastore is created, test the connection.

Import Metadata Objects:

Importing metadata objects adds the table and file names from your source and target databases and applications to your datastores.

In the Datastores area, select a datastore.

Open the Tables or File Format tab (which one appears depends on the datastore type).

Do one of the following:

If the datastore has a Tables tab, click Import Objects or Import Object by Name and select the tables whose metadata you want to import. (To import a web service object, the web service must be up and running.)

If it has a File Formats tab, click Create File Format and select the option you want to create.

Select the type of metadata to be imported and enter the name.

Table: A table is a collection of related data held in a table format within an SAP (or non-SAP) system. It consists of columns, and rows.

Extractor: An extractor is a pre-defined SAP program that gathers data from various tables in an SAP source system (typically SAP ECC) then processes this data to create specific business content for insertion into another SAP system (such as SAP BW or SAP IBP).

Function: An SAP Function (or Function Module) is a pre-written custom program that typically extracts data from an SAP system and writes this to output fields or tables (that can be read by SAP CPI-DS).

Once the table metadata is imported, we can see the column names and data types in the Details tab, as shown below.

Create the target datastore by following the above steps and import the target table metadata.


ABAP Units in ABAP RESTful Programming Model

Objective and Advantages of writing ABAP Units
  1. ABAP units help in testing an individual, independent unit of code without needing the full application.
  2. Testing can be done even if fields are not exposed in the UI or a UI application is not available.
  3. Mock data is created in ABAP units, which means the actual data in the tables is not modified. So, you can test on the same data multiple times without locking issues and prevent inconsistencies in the actual data. This is achieved by creating a test double environment.
  4. ABAP units are transported from the development system all the way to the production system. So, you can run them on the production system without modifying actual production data. This is very helpful for incident resolution, where we cannot change production data but still need to debug and analyze the root cause.
  5. A few ABAP objects, such as instance methods of a class, cannot be executed directly (independently) in SAP. With ABAP units, you can execute and test these independent units of code.

Setup and Creation of ABAP Unit Test Doubles

In the behavior implementation class, navigate to the test classes section, type “test”, and then press Ctrl+Space. Select “testClass” and press Enter.

Provide a test class name. For our example, we will use the test class name “ltcl_scarr”.

As the implementation methods for determinations, validations, and actions are private methods of the behavior implementation class, we have to make our test class a friend of the behavior implementation class so that we can call the private implementation methods inside our test class.

Now we will create the test double environment for our entities and tables. This is done in the setup and teardown methods of the test class. The test double environment for CDS entities is created using the interface “if_cds_test_environment”. For tables outside these entities, such as customizing tables or any other tables from which the implementation methods select data, we use the interface “if_osql_test_environment”. A minimal sketch of such a test class skeleton follows below.
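The skeleton below is a sketch only: the entity names “SCARR” and “BOOKING” and the table “EKPO” follow this example, and the exact CDS view names will differ in your system. The test class is assumed to be declared as a friend of the behavior implementation class, as described above.

CLASS ltcl_scarr DEFINITION FINAL FOR TESTING
  DURATION SHORT RISK LEVEL HARMLESS.
  PRIVATE SECTION.
    CLASS-DATA:
      cds_test_environment  TYPE REF TO if_cds_test_environment,
      osql_test_environment TYPE REF TO if_osql_test_environment.
    CLASS-METHODS:
      class_setup,    " creates the test doubles once per test class
      class_teardown. " destroys the test doubles
    METHODS:
      setup,          " resets the doubles before every test method
      teardown.       " rolls back any changes after every test method
ENDCLASS.

CLASS ltcl_scarr IMPLEMENTATION.
  METHOD class_setup.
    " Test doubles for the root entity and its child entity
    cds_test_environment = cl_cds_test_environment=>create_for_multiple_cds(
        i_for_entities = VALUE #( ( i_for_entity = 'SCARR' )
                                  ( i_for_entity = 'BOOKING' ) ) ).
    " Test double for an additional table read in the implementation methods
    osql_test_environment = cl_osql_test_environment=>create(
        i_dependency_list = VALUE #( ( 'EKPO' ) ) ).
  ENDMETHOD.

  METHOD setup.
    " Start every test method with empty doubles
    cds_test_environment->clear_doubles( ).
    osql_test_environment->clear_doubles( ).
  ENDMETHOD.

  METHOD teardown.
    " Roll back changes made through EML during the test
    ROLLBACK ENTITIES.
  ENDMETHOD.

  METHOD class_teardown.
    cds_test_environment->destroy( ).
    osql_test_environment->destroy( ).
  ENDMETHOD.
ENDCLASS.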

In the above test class, we have created the test double environment for the parent entity “SCARR” and its child entity “BOOKING”, as well as for the table “EKPO”. Here we have used the consumption and TP layer for creating the test double environment, but you can also use the TP and basic layers; in that case, you may not be able to access extended fields from the extension views of these entities.

ABAP Units for determination, validation and action in ABAP RESTful Programming Model (RAP)

Now, we will create a unit test method for the determination “setBookingTime”, in which we determine the booking time.

Now, we will create a unit test method for the validation “checkBookingTime”, in which we check that the booking time is not initial.

Now, we will create a unit test method for the action “markComplete”, in which we mark the current booking as completed. A hedged sketch of such a test method is shown below.
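
In this illustrative sketch, the test method drives the behavior through EML against the test doubles created above. The table type zbooking_tab, the field names, and the status value 'C' are assumptions for this example; determinations and validations follow the same pattern, triggered via CREATE/UPDATE operations or COMMIT ENTITIES depending on their trigger time.

METHOD markcomplete_sets_status.
  " Arrange: insert mock data into the CDS test double.
  cds_environment->insert_test_data(
    i_data = VALUE zbooking_tab( ( carrid = 'AA' bookid = '00000001' ) ) ).

  " Act: invoke the action via EML.
  MODIFY ENTITIES OF zi_scarr
    ENTITY booking
      EXECUTE markComplete FROM VALUE #( ( carrid = 'AA' bookid = '00000001' ) )
    FAILED DATA(failed)
    REPORTED DATA(reported).

  " Assert: the action must not fail and must set the expected status.
  cl_abap_unit_assert=>assert_initial( act = failed msg = 'Action markComplete failed' ).

  READ ENTITIES OF zi_scarr
    ENTITY booking
      FIELDS ( status ) WITH VALUE #( ( carrid = 'AA' bookid = '00000001' ) )
    RESULT DATA(bookings).

  cl_abap_unit_assert=>assert_equals(
    act = bookings[ 1 ]-status
    exp = 'C'
    msg = 'Booking was not marked as completed' ).
ENDMETHOD.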

Handle draft cases in ABAP RESTful Programming Model (RAP)

The above cases prepare mock data for the active tables only and will not work with the draft tables. There can be cases where we explicitly need to check whether the draft indicator (%is_draft) is set and only then perform the operation. In such cases we use the “Edit” action to create a draft for our mock data, and then use a rollback in the teardown method to roll back all the changes, as sketched below.
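
A hedged sketch of the draft case, again with placeholder names: the standard “Edit” action turns the active mock instance into a draft, which can then be addressed via %is_draft.

METHOD operates_on_draft_instance.
  cds_environment->insert_test_data(
    i_data = VALUE zscarr_tab( ( carrid = 'AA' ) ) ).

  " Create a draft for the mock data via the standard Edit action.
  MODIFY ENTITIES OF zi_scarr
    ENTITY scarr
      EXECUTE Edit FROM VALUE #( ( carrid = 'AA' ) )
    FAILED DATA(failed).

  " Address the draft instance explicitly via %is_draft.
  READ ENTITIES OF zi_scarr
    ENTITY scarr
      ALL FIELDS WITH VALUE #( ( %is_draft = if_abap_behv=>mk-on
                                 carrid    = 'AA' ) )
    RESULT DATA(drafts).

  cl_abap_unit_assert=>assert_not_initial(
    act = drafts
    msg = 'No draft instance was created' ).

  " teardown( ) undoes everything via ROLLBACK ENTITIES.
ENDMETHOD.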

Common mistakes which we make while writing ABAP units

  1. We skip the creation of test doubles in ABAP Units and pass keys directly from the actual tables while testing in the development and test environments. In that case, when the ABAP Units are executed, the records exist and the keys are valid, so the code reads from the actual tables and performs the action on actual data. Please refrain from passing actual table keys in ABAP Units; if we do so, we will face the following issues:
    • You will be able to test the complete functionality in the development and test environments, after which the code is transported to the integration/quality environment before being transported to production. You will not be able to test the ABAP Units in the integration/quality environment because the keys do not exist there.
    • As the keys do not exist in the integration/quality environment, most of the ABAP Units will fail, resulting in ATC warnings.
  2. We usually provide only one test case while writing ABAP Units. If there are multiple conditions in your code, such as if/case statements, provide a test case for each condition and loop the method call over the test cases. The idea is to achieve the maximum possible code coverage. In the production environment you have restricted debugging access: you cannot use functions such as "go to statement" or edit variable values. If code coverage is low, you will not be able to exercise most of the code in production and will not be able to analyze the root cause of an incident.
  3. We skip the calls to standard assertion methods such as assert_equals and assert_initial of class “cl_abap_unit_assert”. These methods compare the actual values with the expected values; skipping them means the ABAP Units always execute successfully, even when the expected and actual results differ.
  4. We pass the same variable as both the expected and the actual result to assertion methods such as assert_equals, which likewise always results in successful execution even when the real expectation is not met (see the sketch below).
  5. We use statements such as “COMMIT WORK” or “COMMIT ENTITIES” in our ABAP unit methods. Refrain from using these: always work on mock data and never change the actual data in the tables. Instead, use “ROLLBACK WORK” or “ROLLBACK ENTITIES” in the teardown method to roll back all changes made in the ABAP Units.
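
To make points 3 and 4 concrete, here is a minimal sketch of a correct assertion; the variable booking-status and the expected value 'C' are illustrative:

" Wrong: comparing a variable with itself always succeeds.
" cl_abap_unit_assert=>assert_equals( act = booking-status exp = booking-status ).

" Right: act comes from the code under test, exp is defined independently.
cl_abap_unit_assert=>assert_equals(
  act = booking-status
  exp = 'C'
  msg = 'markComplete did not set the completed status' ).
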
Accessing Roles, Groups and Role Collections in NEO https://www.erpqna.com/accessing-roles-groups-and-role-collections-in-neo/ Mon, 22 Feb 2021 10:06:00 +0000

It should be noted that it is not the intention of SAP Cloud Platform to implement a Governance, Risk Management & Compliance (GRC) solution; rather, enterprises should consider using their existing single sign-on solutions, which will have all the reports and controls in place to manage and report user access and roles to the cloud platform, either directly or through the Identity Authentication service (IAS).

Terminology

In essence, there are three pieces to authorization on the NEO platform: the user, the role, and the group.

Central to this is the user. A user can have roles assigned to them manually, or they can be assigned to zero or more groups, in which case they inherit the roles defined in those groups. Each group is thus composed of users and roles.

A role is defined by the scopes that it contains. SAP provides a number of default roles, which can be further customized in the “Platform Roles” panel.

When assigning roles, you will see that there are also an account name and an application name. Getting your head around what are termed provider and consumer accounts can be challenging and quite confusing initially.

SAP provides a number of services as part of the platform, such as WebIDE, Workflow, or Integration. These services are managed in what is termed a provider subaccount. When they are deployed to your subaccount, activated by a contract or entitlements, your subaccount becomes the consumer account. As part of this, the applications that belong to the provider account are deployed to your tenant, so access to these applications needs to be controlled. This is done through roles: adding a role to a user or a group requires the provider account and the application name, which together determine the set of roles that can be assigned to the user or group.

Services that require a provider account expose a number of applications in your cockpit (some visible, some not). These applications in turn have a set of roles that define the operations a user holding those roles is allowed to perform in the specific application. This means that adding a role to a user requires both the provider subaccount and the application: the two define the roles that are available for assignment, and the applications check those roles when the user attempts to execute an operation protected by one of them.

The API for authorization management consists of endpoints to extract all three of these. From the User API you can get a user’s groups and roles, from the Group API the users and roles, and from the Roles API the users and groups assigned for a specific provider subaccount and application.

Since I have already explained how to create an OAuth token in the two previous posts, I will skip that process, as I am sure you are all bored of it by now.

Extracting Groups

As per the above document listing all the APIs, to access the groups we go to

https://api.<data centre>.hana.ondemand.com/authorization/v1/accounts/<technical account name>/groups

This returns a list of the group names in the current subaccount.

What is slightly unclear in the documentation is how to access the users and the roles of a group.

The group is passed as a URL parameter called groupName, as follows:

https://api.<data centre>.hana.ondemand.com/authorization/v1/accounts/<technical account name>/groups/users?groupName=<group name>

So, using the previous example, we can get all the users in the group by using the group name “Integration Dev”.

Likewise, we can repeat the exercise to get all the roles assigned to the group:

https://api.<data centre>.hana.ondemand.com/authorization/v1/accounts/<technical account name>/groups/roles?groupName=<group name>

Here we get the roles of the “Integration Dev” group as defined in the subaccount.
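
Any REST client works for these calls; purely as an illustration, a hedged ABAP sketch using the standard HTTP client might look as follows. The technical account name, group name, and token value are placeholders, and the group name must be URL-encoded:

DATA lo_client TYPE REF TO if_http_client.
DATA(lv_token) = `<oauth access token>`.

" Placeholders: data centre, technical account name, URL-encoded group name.
DATA(lv_url) = `https://api.eu1.hana.ondemand.com/authorization/v1/accounts/` &&
               `<technical account name>/groups/roles?groupName=Integration%20Dev`.

cl_http_client=>create_by_url(
  EXPORTING url    = lv_url
  IMPORTING client = lo_client
  EXCEPTIONS OTHERS = 1 ).

lo_client->request->set_header_field(
  name  = 'Authorization'
  value = |Bearer { lv_token }| ).

lo_client->send( EXCEPTIONS OTHERS = 1 ).
lo_client->receive( EXCEPTIONS OTHERS = 1 ).

" JSON payload listing the roles assigned to the group.
DATA(lv_json) = lo_client->response->get_cdata( ).
lo_client->close( ).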

Extracting Users

For a user, it is possible to get their roles and their groups through the API. Here the query parameter is called userId rather than groupName.

Similar to what we had above, we can use the URL:

https://api.<data centre>.hana.ondemand.com/authorization/v1/accounts/<technical account name>/users/roles?userId=<user id>

Here you can see the list of roles that are assigned to the user.

Likewise, you can repeat the exercise for the groups the user belongs to.

https://api.<data centre>.hana.ondemand.com/authorization/v1/accounts/<technical account name>/users/groups?userId=<user id>

And as would be expected this matches what you see in the cockpit.

Extracting Roles

The last point of the triangle is the role. This is the most complex part, and for the most part you will not need it. A role is defined both by the application that defines it and, if the role is managed by an SAP service, by the provider account. As you can see, when you assign a role to a user in the cockpit, you need to select both the provider account (service) and the application (in your subaccount), and then select the role.

So far I have not found an API to extract the list of applications or provider accounts, so to find the roles you will need to supply the provider subaccount and the application name in order to extract the roles, users, and groups assigned to that application and provider. You will thus need to look these up in the cockpit. Fortunately, they do not change.

As you can see, the URL

https://api.eu1.hana.ondemand.com/authorization/v1/accounts/<technical account name>/apps/<appname>/roles?providerAccount=<provider account>

provides the list of roles that the provider and application define for consumption in the consumer account.

Likewise, the following URL returns the list of users who have that role in your subaccount:

https://api.eu1.hana.ondemand.com/authorization/v1/accounts/<technical account name>/apps/<appname>/roles/users?providerAccount=avrhcid&roleName=IntegrationOperationServer.developer

Finally, we can repeat the exercise to get a list of the groups that have that role assigned to them:

https://api.eu1.hana.ondemand.com/authorization/v1/accounts/a40abb808/apps/d0209tmn/roles/groups?providerAccount=avrhcid&roleName=IntegrationOperationServer.developer

As you can see from the above, there is a group called “workshopp” that has the role assigned to it.

SAP Intelligent Robotic Process Automation | Bot which Adds and Imports Transport Request in SAP System https://www.erpqna.com/sap-intelligent-robotic-process-automation-bot-which-adds-and-imports-transport-request-in-sap-system/ Mon, 15 Feb 2021 04:43:43 +0000

Introduction

With SAP Intelligent Robotic Process Automation, you can automate your workflows and bring together data from disparate systems in real time. Adding machine learning capabilities enables SAP Intelligent Robotic Process Automation to learn from previous experiences, make more accurate decisions, and ultimately accelerate business processes. Employees are then able to focus on higher-value activities such as defining business strategies and building customer relationships.

SAP Intelligent Robotic Process Automation is a hybrid solution made of three components:

● Design the automation processes with the on-premise Desktop Studio.

● Orchestrate the automation processes with the cloud Factory.

● Execute the automation processes with the on-premise Desktop Agent(s).

Bot Title

An idea for a basic bot that adds and imports a transport request in an SAP system.

Bot Overview

The bot will do the following:

  1. It opens the SAP Logon pad and selects a particular connection.
  2. It then enters credentials to log on to the system.
  3. It enters transaction ‘STMS_IMPORT’ and opens its session.
  4. It adds a transport request, taking the data from a source (Excel).
  5. Once the TR is added, it sets a filter and imports the transport request.

Bot Benefits

SAP Intelligent Robotic Process Automation can help you automate tasks that require a lot of manual effort. With this bot:

  1. Manual effort is reduced.
  2. Human errors and multiple clicks are avoided.
  3. Process execution is smooth and fast.

Designing a Workflow

Overall Workflow

How to Start making your Bot?

1. Open Studio. Go to File Tab. Select New Project.

2. Enter the details & Save.

3. Go to File Tab. Select Edit Project. Click ‘Libraries’. Check ‘Excel Integration’ and ‘SAP GUI Scripting’.

4. Add an application. Select the technology ‘UI Automation’. Click on ‘SAP Logon 760’, then ‘Save and Capture Page’.

The next step is to start capturing your pages, which is covered below.

Capturing Pages

This Bot contains 9 pages. They are captured as below:

Page Capturing

Page:1 pLOGONPAD

Page:2 pLOGINDETAILS

Page:3 pHOMEPAGE

Page:4 pImportQueue

Page:5 pAddTransport

Page:6 pAddTRconfirmation

Page:7 pSetFilter

Page:8 pImportSelectedTR

Page:9 pStartImporting

Please note that you have to set unique criteria for every item. Criteria are rules set by the user and used by the Desktop connectors to recognize defined elements (applications, pages, or items) in the running entities (web pages, Windows items, and so on). You can have multiple alternatives for capturing and defining criteria.

Understanding Workflow

Step 1: The bot starts the SAP Logon application.

Step 2: It identifies a system, selects it, and clicks the Logon button.

Step 3: The bot enters the values set for User and Password and is redirected to the home page.

Step 4: The bot runs transaction code ‘STMS_IMPORT’ and opens its session.

Step 5: The ‘Add TR’ option is selected from the menu.

Step 6: It starts the Excel operation.

Step 7: It opens the Excel file which you have set in its properties.

Step 8: A start loop is set for the Excel data.

Step 9: An exit condition is applied.

Step 10: A particular sheet of the Excel file is activated.

Step 11: One value is read from the Excel sheet as per the start and exit conditions.

Step 12: The value is then entered on the ‘Add Transport’ page.

Step 13: It clicks ‘Yes’ and the transport is added to the queue.

Step 14: The ‘Wait until application is closed’ method is used so that the bot does not proceed with the next step until the previous one has completed.

Step 15: Now it clicks on the ‘Request’ column and selects the ‘Set Filter’ option.

Step 16: The same value is then set in the filter, and the bot clicks ‘OK’.

Step 17: The ‘wait until’ method is used again to avoid errors during execution of the bot.

Step 18: Now the bot selects the TR and clicks on the ‘Import Queue’ option.

Step 19: A page is displayed and the bot confirms it as per the set conditions.

Step 20: It then clicks ‘Yes’ to start importing the request.

Step 21: The ‘Wait until application is closed’ method is used to avoid errors.

Step 22: The bot loops back to the start block.

Step 23: It closes the Excel file when the exit condition is reached.

Step 24: It closes the SAP Logon application.

Note: To edit activities, double-click a particular page, drag and drop an activity, and set the item values according to the requirements.

Build and Execution

After you have completed your workflow, build your project.

Execute your Project through Desktop Agent.

Deployment

1. Go to File in the Desktop Studio and select Export Project.

2. Enter the login credentials of your SAP Intelligent RPA Cloud Factory account. It will redirect you to the home page.

3. Go to Packages and click on Import.

4. Enter the details of the desktop package.

5. Go to the Environment tab. Select your environment or create one. Then deploy your package.

The bot is now successfully deployed to the SAP Intelligent RPA Cloud Factory.

Setting up SAP Cloud Platform Business Rules Service in the Cloud Foundry Environment using SAP Business Application Studio https://www.erpqna.com/setting-up-sap-cloud-platform-business-rules-service-in-the-cloud-foundry-environment-using-sap-business-application-studio/ Wed, 03 Feb 2021 04:35:34 +0000

Overview

This blog post will cover the step-by-step process for setting up SAP Cloud Platform (SCP) Business Rules in the Cloud Foundry (CF) environment using SAP Business Application Studio (BAS).

In this sample scenario, I will create an application using BAS that allows users to access the Business Rules service in the CF environment and lets consuming applications (such as a REST client) call the Business Rules REST API to invoke the rules.

At a high-level, the following topics will be covered:

  1. Enable SCP Business Rules service and BAS in the SCP Cloud Foundry environment
  2. Create Role Collections and assign to the user
  3. Create a Business Rules service instance and service key
  4. Create a project in BAS and configure Business Rules
  5. Create a sample Business Rules project
  6. Test the service with Postman

Prerequisites:

  • You are a global administrator in your SCP global account (required for subscriptions)
  • You are an org and space member (required for configuring entitlements and creating service instances)
  • You are a security administrator (required for role assignments/creation)

NOTE: I am using a production account in the Australia, Sydney (ap10) region.

1. Enable SCP Business Rules service and BAS in the SCP Cloud Foundry environment

Configure Entitlement for the Business Rules Service

  • Navigate to the organization > Entitlements > Configure Entitlements > Add Service Plan
  • Select Business Rules Service and select a service plan (basic) > Add and Save

Subscribe to Business Application Studio

  • Subscriptions > SAP Business Application Studio > Subscribe

2. Create Role Collections and assign to the User

Assign the relevant BAS Developer Role Collection

  • Security > Role Collections > Business_Application_Studio_Developer
  • Edit > Add User (enter ID based on configured IdP. eg: email if using the default SAP ID service) > Save

Assign relevant Business Rules Roles/Role Collection

We need to add roles with the following application identifier: bpmrulebroker*.

There are two roles with such application identifier: RuleRepositorySuperUser and RuleRuntimeSuperUser.

There are two ways to assign these roles to your user:

1. If your account has a subscription to the Workflow Management service, these role collections will be automatically created. Assign your user to one of these role collections (if applicable):

2. Otherwise, create a custom role collection and add the required roles (RuleRepositorySuperUser and RuleRuntimeSuperUser).

3. Create a Business Rules Service Instance and Service Key

Create a Business Rules Service instance

  • Go to your space > Services > Service Instances > Create Instance

Service: Business Rules

Service Plan: basic

Instance Name: <any name>, in my case: businessrules

Leave Parameters blank (step 2) and choose ‘Create Instance’.

Create a service key for the created service instance

This service key will be used later to call the Business Rules API.

  • Open the created service instance > Service keys (Create) > Enter a Service Key Name > Create
  • Open the created service key and copy the credentials to a separate file for now. We will use these credentials when testing the Business Rules service in Step 6; an illustrative sketch of the key’s shape follows below.
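
For orientation, the service key typically looks roughly like the illustrative sketch below (values redacted; the exact nesting may vary by plan). The four values used in Step 6 are rule_runtime_url, url, clientid, and clientsecret:

{
  "rule_runtime_url": "https://bpmruleruntime.cfapps.ap10.hana.ondemand.com",
  "rule_repository_url": "https://bpmrulerepository.cfapps.ap10.hana.ondemand.com",
  "uaa": {
    "url": "https://<subdomain>.authentication.ap10.hana.ondemand.com",
    "clientid": "sb-xxxxxxxx",
    "clientsecret": "xxxxxxxx"
  }
}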

4. Create a project in BAS and configure Business Rules

  • Go to subscriptions > Business Application Studio > Go to Application

Create a new Fiori workspace

Create a project from a template

I will be using a standalone SAP Fiori Free Style Project to generate the boilerplate code for the application.

The most important configuration objects are the approuter configuration (xs-app.json) and the mta.yaml. We will remove the modules that are not required for this demo.

NOTE: This step can be done in a different way (eg: manual creation of objects from a basic MTA project template).

  • In your empty workspace > Create new project from a template
  • Select SAP Fiori Freestyle Project

Target Environment: Cloud Foundry

Template: SAPUI5 Application

Project Name: <any>, rulesmanager in my case

HTML5 application runtime: Standalone Approuter

HTML5 module name: <any>

Do you want to add authentication: Yes

Enter a namespace: <any valid namespace>

Enable Karma Test: No

View Name: <any>

Add data service: No

Configure the Business Rules service in the application

  • After the project is created, open the mta.yaml file.
  • Remove the modules and resources that are not required (highlighted in yellow). You can also delete the generated HTML5 folder in the project.
  • Add the created Business Rules Service instance (businessrules, Step 3) as a required resource

My mta.yaml file:

_schema-version: "3.2"
ID: rulesmanager
version: 0.0.1
modules:
- name: rulesmanager-approuter
  type: approuter.nodejs
  path: rulesmanager-approuter
  requires:
  - name: rulesmanager_html_repo_runtime
  - name: uaa_rulesmanager
  - name: businessrules
  parameters:
    disk-quota: 256M
    memory: 256M
resources:
- name: rulesmanager_html_repo_runtime
  type: org.cloudfoundry.managed-service
  parameters:
    service: html5-apps-repo
    service-plan: app-runtime
- name: uaa_rulesmanager
  type: com.sap.xs.uaa
  parameters:
    path: ./xs-security.json
    service: xsuaa
    service-name: rulesmanager-xsuaa-service
    service-plan: application
- name: businessrules
  type: org.cloudfoundry.existing-service
build-parameters:
  before-all:
  - builder: custom
    commands:
    - npm install

  • Enter the following configuration in the xs-app.json of the approuter:
{
    "welcomeFile": "comsapbpmrule.ruleeditor/index.html",
    "authenticationMethod": "route",
    "routes": [
    ]
}

Build and Deploy the Application

  • Right click on the mta.yaml file > Build MTA
  • Set the correct target CF space. To set it, run cf login in the BAS terminal or use the highlighted shortcut below.
  • Once the build is complete and the correct target has been set, open the mta_archives folder and deploy the generated MTAR archive to the CF space.

After deployment, check Applications in the space and open the newly created application.

You should be able to see the Business Rules landing page.

5. Create a sample Business Rules project

Next, we will create a sample Business Rules project. This is not the focus of this blog, so I will just provide a summary of the required steps and some screenshots for reference.

The service we will create reads a user’s organization and returns the approver details based on a decision table.

Procedure:

  1. Create a project
  2. Create data objects (ie. define input/output structure)
  3. Create a rules service (which references the data objects)
  4. Create rules/decision tables
  5. Create rulesets (and assign rules)

Create data objects (ie. define input/output structure)

NOTE: after creation each object must be Validated and Activated (ie: Status = Active).

Create a rules service (which assign the data objects as Input or Result)

Create rules/decision tables

e.g. if the organization of the user is C, then the approver group is SCPGroupC and the approver name is ApproverC

Create rulesets (and assign rules and rule service)

After creating and activating all the entities, Activate the whole project.

6. Test the service with Postman

Once the Business Rules project is activated, we can test the rules service using a REST client such as Postman.

To do this, we will need the service key credentials which we created in Step 3.

NOTE: any value enclosed in <> should be taken from the service key credentials.

Operation: POST

URL: <rule_runtime_url>/rules-service/rest/v2/workingset-rule-services

Authorization

Type: OAuth 2.0

Token Name: any

Grant Type: Client Credentials

Access Token URL: <url>/oauth/token

Client ID: <clientid>

Client Secret: <clientsecret>

Client Authentication: Send as Basic Auth header

Get a new Access Token and use it

Body

{
  "RuleServiceId": "5641e858cf2341e6b380849d47aed340",
  "Vocabulary": [
    {
      "User": {
        "organization" : "C"
      }
    }
  ]
}

where

  • “User” and “organization” are the input structure and its attribute, respectively.
  • RuleServiceId: the ID of the rule service. (Add the ID to the visible columns, as it is not visible by default.)

Result:

As you can see, when we pass a user with ‘C’ as the organization, we get the expected approver as per the decision table.

Hooray!
