Fiori Elements Analytical Table with CAP: Does it work with V2 and V4? (ERP Q&A, Mon, 03 Apr 2023)

Introduction

In this blog post, I will show you how to create an analytical table using a CAP OData service. Please note that this blog post is focused on List Report, and not Analytical List Report (ALP).

Recently, I was tasked with displaying an analytical table based on an OData V4 service built with CAP. While I was able to display the data on the table using some annotations, I encountered an issue with displaying totals at the bottom. Unfortunately, this issue is still unresolved.

Nevertheless, I would like to share what I have accomplished so far and would appreciate any advice on how to address this issue.

Analytical table with OData V4 (total is not displayed)
Analytical table with OData V2 (total is displayed)

CAP Project

Below is a CAP project with the minimum data model and annotations for showing an analytical table. The code is available on GitHub.

Data model

The data model was created with the cds add samples command, and the entity BooksAggregate was created on top of it. This entity will be used for displaying the analytical table.

namespace my.bookshop;

entity Books {
  key ID : Integer @title: 'ID';
  title  : String @title: 'Title';
  category: String @title: 'Category';
  stock  : Integer @title: 'Stock';
}

entity BooksAggregate as projection on Books {
  ID,
  title,
  category,
  stock
}

Service

The service simply exposes the two entities in read-only mode.

using my.bookshop as my from '../db/data-model';

service CatalogService {
  @readonly entity Books as projection on my.Books;
  @readonly entity BooksAggregate as projection on my.BooksAggregate;
}

Annotations

The following annotations were added to make the analytical table work. The first block is for OData V4, and the second is for OData V2.

using CatalogService from './cat-service';

//aggregation annotations
// v4
annotate CatalogService.BooksAggregate with @(
  Aggregation.ApplySupported
){
  stock @Analytics.Measure @Aggregation.default: #SUM
}

// v2
annotate CatalogService.BooksAggregate with @(
  sap.semantics: 'aggregate'
){
  ID @sap.aggregation.role: 'dimension';
  category @sap.aggregation.role: 'dimension';
  title @sap.aggregation.role: 'dimension';
  stock @sap.aggregation.role: 'measure';
};

Without these annotations, the analytical table for OData V4 will throw an error and will not display any records. The error occurs at the start of the application, before sending a $batch request.

OData V4 without annotations leads to an error

On the other hand, the analytical table for OData V2 shows records, but does not display totals.

OData V2 without annotations shows no totals

List Report (V4)

A List Report was created on the OData V4 service with the table type set to AnalyticalTable.

"targets": {
  "BooksAggregateList": {
    "type": "Component",
    "id": "BooksAggregateList",
    "name": "sap.fe.templates.ListReport",
    "options": {
      "settings": {
        "entitySet": "BooksAggregate",
        ...
        "controlConfiguration": {
          "@com.sap.vocabularies.UI.v1.LineItem": {
            "tableSettings": {
              "type": "AnalyticalTable"
            }
          }
        }
      }
    }
  },

When you execute the application, a $batch request is sent and a response is returned successfully, but totals are not displayed.

$batch request (V4)

The response contains each record but not the totals.

$batch response
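For reference, a service supporting the OData V4 data-aggregation extension returns totals when the client sends a `$apply` transformation. The helper below is illustrative only (Fiori Elements would normally construct such URLs itself; the entity and property names follow the blog's model), sketching the kind of request one would expect to see in the $batch if the V4 table did ask for totals:

```javascript
// Illustrative only: builds OData V4 $apply query strings like the ones a
// client would send to request aggregated data. "BooksAggregate" and "stock"
// follow the blog's data model; the alias names are invented here.
function buildAggregateUrl(entitySet, measure, alias) {
  // Grand total over the whole entity set.
  const apply = `aggregate(${measure} with sum as ${alias})`;
  return `${entitySet}?$apply=${apply}`;
}

function buildGroupByUrl(entitySet, dimensions, measure, alias) {
  // Per-group subtotals, e.g. stock summed per category.
  const apply = `groupby((${dimensions.join(",")}),aggregate(${measure} with sum as ${alias}))`;
  return `${entitySet}?$apply=${apply}`;
}

console.log(buildAggregateUrl("BooksAggregate", "stock", "totalStock"));
// BooksAggregate?$apply=aggregate(stock with sum as totalStock)
console.log(buildGroupByUrl("BooksAggregate", ["category"], "stock", "totalStock"));
// BooksAggregate?$apply=groupby((category),aggregate(stock with sum as totalStock))
```

Comparing the actual $batch traces against requests of this shape is one way to confirm whether the V4 table ever attempts a totals request at all.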

List Report (V2)

Next, another List Report was created on the OData V2 service with the table type set to AnalyticalTable.

"pages": {
  "ListReport|BooksAggregate": {
    "entitySet": "BooksAggregate",
    "component": {
      "name": "sap.suite.ui.generic.template.ListReport",
      "list": true,
      "settings": {
        ...
        "tableSettings": {
          "type": "AnalyticalTable"
        },
        "dataLoadSettings": {
          "loadDataOnAppLaunch": "always"
        }
      }
    },

When you execute the application, you will see that two requests are contained in a single $batch request. The total is displayed at the bottom of the table.

$batch request (V2)

The first request retrieves the sum of the stock.

Getting the sum of stock

The second request retrieves each record.

Getting each record

Closing

While the analytical table for OData V2 retrieves totals and records in a $batch request, the analytical table for OData V4 does not seem to attempt to retrieve totals. This leads me to doubt whether the analytical table for OData V4 actually supports totals. If you have information on how to display totals in the analytical table for OData V4, please share it in the comment section below.

Complex OData Annotation Editing using CDS Graphical Modeler (ERP Q&A, Thu, 05 Jan 2023)

Complex OData Annotation Editing using CDS Graphical Modeler

In this blog post, we will show you how to create complex OData annotations for CDS projections in a separate app/annotations.cds file using the CDS Graphical Modeler, so that you don’t need to remember the intricate details of the OData annotation terms.

Let’s say we have an ESPM model and a service model, defined in a service CDS file that defines all the projections:

[Screenshot]

and we can open the ESPM service model using the CDS modeler:

[Screenshot]

and we want to annotate the projections and manage the annotations in a separate CDS file, for example, app/annotations.cds.

First create app/annotations.cds and open the file using the CDS modeler, and you will see an empty CDS model because the CDS file doesn’t contain any CDS objects:

[Screenshot]

Now let’s import the service model to the annotations.cds file. Click “Import” -> “External File” from the toolbar:

[Screenshot]

In the file selection dialog, select srv/cat-service.cds which is your service CDS file:

[Screenshot]

After the import is complete, the modeler will show the details of the imported service:

[Screenshot]

Let’s say we want to annotate the service entity “Customers” in the service, we click the entity “Customers” in the canvas and select “Pin to the Main Namespace” context menu. By doing so, CDS modeler will place the imported entity “Customers” to the main namespace so that we can directly operate on it:

[Screenshot]

After the “pin” is done, you will be able to see the entity “Customers” from the CatalogService directly in the main canvas in annotations.cds, just like any other regular entities:

[Screenshot]

Since we want to annotate this entity, we click the “Manage Annotation” context menu item as shown in the above screenshot, and the annotation dialog shows up:

[Screenshot]

You can see that both the entity/projection and its properties are shown in the annotation editor, so you can annotate both the entity and its properties directly from the app/annotations.cds file.

Now let’s show how to create complex OData annotations for this projection. First, click the “+” button to show all the annotation terms in a list for you to choose from:

[Screenshot]

Let’s try to add term @ApplySupported from the Aggregation vocabulary defined in https://oasis-tcs.github.io/odata-vocabularies/vocabularies/Org.OData.Aggregation.V1.html:

[Screenshot]

and try to provide the values for @Aggregation.ApplySupported term:

[Screenshot]

Please note that the annotation editor takes care of the OData annotation schema itself and assists the user as much as possible; the user just needs to pick the value from the list and doesn’t need to remember the details of the term definition.

Click the “Update” button to close the dialog, and we can now check app/annotations.cds:

[Screenshot]

You will see that this complex annotation “@Aggregation.ApplySupported” has been created successfully in the CDS file.

Now let’s reopen the annotation editor and choose “@Capabilities.ChangeTracking” for this entity:

[Screenshot]

Now provide values for this term:

[Screenshot]

until we get the following:

[Screenshot]

and the app/annotations.cds:

[Screenshot]

As you can see, very complex OData annotations can be generated very easily in the annotations.cds file.

In addition to annotating the entity, we can also annotate its properties using either OData annotations or CAP annotations:

[Screenshot]

and app/annotations.cds has:

[Screenshot]
Creating a Tax Decision Tree to Automate Determination of Tax Code (ERP Q&A, Thu, 03 Nov 2022)

Tax Decision Tree is a feature which helps in creating custom rules for the automatic determination of the tax code. It uses a tree-like model of decisions and their possible consequences to determine the tax code. This blog explains how to create a tax decision tree using SAP Cloud Applications Studio.

Prerequisites

  • You have installed the latest SAP Cloud Applications Studio release for creating the tax decision tree.
  • The country/region you want to configure is available in the tax country table already delivered by SAP.

Steps to create a tax decision tree

Following are the steps to create a tax decision tree:

a. Open SAP Cloud Applications Studio, go to the Project/Solution Explorer, and right-click the solution to which you want to add the decision tree. From the menu that appears, click Add and then open New Item.

b. A new window will open where you select the Tax Decision Tree option and give the tax decision tree a name:

1. Go to Business Configuration option.
2. Select Tax Decision Tree option.
3. Enter a name for your tax decision tree. (Do not change the .bcctax part)
4. Click the Add button.

After clicking the Add button, the decision tree create wizard will open.

c. On the first screen, under basic information,

1. Select the Withholding Tax checkbox if the decision tree is to be created for the withholding tax scenario.
2. Enter a Name for the decision tree.
3. Enter a Description for the decision tree.
4. Select the Country/Region for which the decision tree is applicable.
5. Choose a Valid From date.
6. Click Next.

d. A new window opens where you can define the decision tree structure.

1. Click Add Question.
2. Click on the First Question button. A question form will open, here we can configure the question.
3. In the question form, select the Select Type of the question. Depending on the select type, different options are shown in the form. For this illustration, we will set the Select Type to ‘Question’.
4. Enter a text for the Question.
5. Choose the Test Parameter.
6. Further, there are options to configure Context Parameters and Comparison Values (shown in the colored box in the image below).
7. Click Next.

e. Click Finish.

Steps for deploying the tax decision tree

1. Activate the created tax decision tree from the Solution Explorer.

2. Right click on the solution folder and select the option Deploy Business Configuration.

More Information on the test parameters for the selection type ‘Question’

For Product Tax

1. DirectionIndicator

DirectionIndicator test parameter is used to include a check on the kind of business transaction being processed in the decision tree.

It has two comparison values:

  • 1 – Sale/ Outgoing, for sales scenarios.
  • 2 – Purchasing/ Incoming, for purchase scenarios.

Sample Question: Is the transaction a sale or a purchase?

Depending on the result, further processing of the decision tree continues.

2. ProductTaxability

ProductTaxability test parameter is used to include a check on the kind of products being processed in the decision tree.

It does not have context parameters but has five comparison values:

001 – Good / Material

002 – Service

003 – Asset / Individual Material

004 – Warranty

006 – Entitlement Product

Sample Question: Is the transaction a sale of goods or a purchase of services?

3. Country

Country test parameter is used to include a check on the location of the business partner or the service provider or the Ship-To / Ship-From party in the decision tree.

It has one context parameter called PartnerRole which can take the following values:

CF – Contract-From party

CT – Contract-To party

SF – Ship-From party

ST – Ship-To party

SL – Service Point Location

Please note: You must use the country ISO as the comparison values.

Sample Question: Is the Ship-To party in DE (Germany)?

4. ServicePointTaxableIndicator

ServicePointTaxableIndicator test parameter is used to include a check to see whether the service is taxable at the point of service or not in the decision tree.

It has three context parameters:

  1. TaxType
  2. TaxableCountry
  3. TaxableRegion (remains empty if the tax is not region specific)

Please note: For these context parameters, a value help is available based on the SAP delivered content and partner created content.

Sample Question: Is the service taxable at the point of service?

5. ProductTaxExemptionReasonCode

ProductTaxExemptionReasonCode test parameter is used to include a check to see whether there is any tax exemption for the product in the decision tree.

It has three context parameters:

  1. TaxType
  2. TaxableCountry
  3. TaxableRegion (remains empty if tax exemption is not region specific)

Please note: For these context parameters, a value help is available based on the SAP delivered content and partner created content.

As comparison values we must use the tax exemption reason codes already defined in the system. There is no value help for this.

Sample Question: Is the service subject to reverse charge?

6. PartnerTaxability

PartnerTaxability test parameter is used to check the business partner taxability.

It has one context parameter called the PartnerRole which can have the following values:

  • CF – Contract-From party
  • CT – Contract-To party
  • SF – Ship-From party
  • ST – Ship-To party
  • SL – Service Point Location

Sample Question: Is the service recipient a company? (In this case the Partner Role must be either CT or ST and the comparison value must be 100)

7. PartnerTaxGroup

PartnerTaxGroup test parameter is used to include a check on the rate type assigned to the business partner in the decision tree.

It has four context parameters:

  1. TaxType
  2. TaxableCountry
  3. TaxableRegion (remains empty if tax exemption is not region specific)
  4. PartnerRole

Please note: For these context parameters, a value help is available based on the SAP delivered content and partner created content.

Use ‘X’ as the comparison value if the question is answered positively.

PartnerRole can have the following values:

  • CF – Contract-From party
  • CT – Contract-To party
  • SF – Ship-From party
  • ST – Ship-To party
  • SL – Service Point Location

Sample Question: Is the supplier classified for tax rate type code 1?

8. PartnerTaxNumber

PartnerTaxNumber test parameter is used to include a check on the partner tax number type in the decision tree.

It has three context parameters

  1. CountryRole
  2. PartnerRole
  3. PartnerTaxNumberType

In the comparison value field, enter the tax number type which needs to be checked for the partner.

Please note: PartnerRole and CountryRole can have the following values:

  • CF – Contract-From party
  • CT – Contract-To party
  • SF – Ship-From party
  • ST – Ship-To party
  • SL – Service Point Location

For Withholding Tax

1. ProductTaxTypeIndicator

ProductTaxTypeIndicator test parameter is used to include a check on the tax type assigned to the product in the decision tree.

It has three context parameters:

  1. TaxType
  2. TaxableCountry
  3. TaxableRegion (remains empty if the tax is not region specific)

Please note: For these context parameters, a value help is available based on the SAP delivered content and partner created content.

Use ‘X’ as the comparison value if the question is answered positively.

Sample Question: Is the product classified for tax type 3?

2. PartnerTaxGroup

PartnerTaxGroup test parameter is used to include a check on the rate type assigned to the business partner in the decision tree.

It has four context parameters:

  1. TaxType
  2. TaxableCountry
  3. TaxableRegion (remains empty if the tax is not region specific)
  4. PartnerRole

Please note: For these context parameters, a value help is available based on the SAP delivered content and partner created content.

Use ‘X’ as the comparison value if the question is answered positively.

PartnerRole can have the following values:

  • CF – Contract-From party
  • CT – Contract-To party
  • SF – Ship-From party
  • ST – Ship-To party
  • SL – Service Point Location

Sample Question: Is the supplier classified for tax rate type code 1?

3. IncomeType

IncomeType test parameter is used to include a check on the income type.

It has two context parameters:

  1. TaxableCountry
  2. TaxType (assigned to the product or the business partner)

Comparison values use income types defined already in the system for the corresponding country code and tax type provided in the context parameter.

Sample Question: Is the product classified for income type A, B, or C?

More Information on Other Select Types

In addition to the select type Question, we have two more select types: Result and Error.

1. Result

Result select type is used to include a result node in the decision tree. It has four parameters:

1. Result
2. TaxEvent
3. TaxFromRole
4. TaxToRole

Please note:

In the Result parameter we need to provide a text string giving a brief description of the result represented by the node.

For the TaxEvent parameter, a value help is available based on the SAP delivered content and partner created content.

TaxFromRole and TaxToRole can have the following values:

  • CF – Contract-From party
  • CT – Contract-To party
  • SF – Ship-From party
  • ST – Ship-To party

2. Error

Error select type is used to include an error node in the decision tree. It accepts a text to be shown as the error message.
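Taken together, the three select types form a tree of question nodes whose answers lead to further questions, results, or errors. The following sketch is purely illustrative (a simplified model with invented node and field names, not SAP Cloud Applications Studio's actual runtime) of how such a tree determines an outcome:

```javascript
// Illustrative model of a tax decision tree: "question" nodes test a
// parameter of the transaction context, "result" nodes carry a tax event,
// "error" nodes carry a message. All names here are invented for the sketch.
const tree = {
  type: "question", param: "DirectionIndicator",
  branches: {
    "1": { type: "result", taxEvent: "Domestic Sale" },              // sale/outgoing
    "2": { type: "question", param: "ProductTaxability",             // purchase/incoming
           branches: { "002": { type: "result", taxEvent: "Purchase of Services" } },
           fallback: { type: "error", message: "Unsupported product type" } }
  }
};

function evaluate(node, context) {
  if (node.type === "result") return { taxEvent: node.taxEvent };
  if (node.type === "error") throw new Error(node.message);
  // Follow the branch matching the answer to this question, or the fallback.
  const next = node.branches[context[node.param]] || node.fallback;
  if (!next) throw new Error(`No branch for ${node.param}`);
  return evaluate(next, context);
}

console.log(evaluate(tree, { DirectionIndicator: "2", ProductTaxability: "002" }));
// { taxEvent: 'Purchase of Services' }
```

The wizard's question forms populate the question nodes, the Result select type populates the result nodes, and the Error select type populates the error nodes.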

Summary

You can now create a tax decision tree by following the steps in this blog to automate tax code determination. To verify that the new rules are being considered for the determination, check the Tax Trace for the business transaction (for example, supplier invoice creation).

Below is a sample tax decision tree.

How to get authenticated user information with CAP in three different ways – Setup project for development (ERP Q&A, Wed, 20 Jul 2022)

Introduction

Secure cloud software should always rely on some sort of authentication and authorization mechanism to let users benefit from its functionality and protect it from attackers and/or malicious usage.

With such a mechanism, users must first authenticate with a login and password (safer systems even use two-factor authentication with an extra identity check, such as security tokens) to prove their identity. The application then checks whether they are authorized to use its functionality and, if so, which parts of it they are allowed to use. This is basically the definition of authentication (first part) and authorization (second part).

Authorizations are provided through a role-based mechanism in which a set of “roles” is assigned to users, granting them authorization to perform operations in the system (i.e. create, read, and/or update some data). So, a user with a role of, let’s say, “Viewer” would be allowed to access data in read-only mode, whilst a user with a role of “Admin” would also be allowed to write data to the system.

Therefore, the application often needs to know who the authenticated user is (based on some user ID) and use that information to protect its functionality from unauthorized usage.

The most common example of such a requirement is an application that executes transactions triggered by the end user (i.e. list orders; create, update, and/or delete items; etc.). In such a scenario, it would be nice to adjust the UI according to the user’s permissions (roles) and display only the controls that trigger authorized transactions.

And that’s why having information about the current authenticated user is definitely useful.
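As a plain illustration of the role-based check described above (the role names and role-to-permission mapping are invented here; in a real CAP application the authenticated user and its roles come from the request), an authorization gate could look like:

```javascript
// Minimal sketch of role-based authorization: each role grants a set of
// operations, and a user is authorized if any of their roles grants the
// requested operation. Roles and operations are invented for illustration.
const permissions = {
  Viewer: ["read"],
  Admin: ["read", "create", "update", "delete"]
};

function isAuthorized(userRoles, operation) {
  return userRoles.some(role => (permissions[role] || []).includes(operation));
}

console.log(isAuthorized(["Viewer"], "read"));   // true
console.log(isAuthorized(["Viewer"], "delete")); // false
console.log(isAuthorized(["Admin"], "delete"));  // true
```

A UI following this idea would render, say, a Delete button only when the check for "delete" passes for the current user's roles.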

Access SAP Business Application Studio

We are going to start by accessing the selected IDE for development (in this case, SAP Business Application Studio).

Access your trial subaccount

Figure 1 – BTP trial access

On the left-hand side expand Services and click on Instances and Subscriptions

Figure 2 – Instances and Subscriptions

On the right, under Subscriptions, click on the small icon next to the SAP Business Application Studio subscription to go to the application

Figure 3 – Go to Application

Start your previously created Dev Space by clicking on the Start Icon on the right

Figure 4 – Start Dev Space

After the Dev Space has started (status RUNNING), go into it by clicking on the Space Name

Figure 5 – Go into Dev Space

Login to Cloud Foundry

Before starting the development, you need to make sure to be properly logged in to the Cloud Foundry environment on your BTP subaccount.

From the top menu, click on View and select Find Command…

Figure 6 – Find Command

In the Find Command dropdown, search for CF: Login to Cloud Foundry and select it

Figure 7 Login to Cloud Foundry

Confirm the Cloud Foundry endpoint, by pressing Enter

Figure 8 – Cloud Foundry endpoint

Enter the e-mail address you have used to create your trial account and press Enter

Figure 9 – Enter e-mail address

Enter the password you have used to create your trial account and press Enter

Figure 10 – Enter password

Select your trial organization and press Enter

Figure 11 – Select organization

Select your Cloud Foundry dev space and press Enter

Figure 12 – Select Cloud Foundry space

Click on the small notifications icon in the bottom right corner to view the login notifications to make sure the login has been successful

Figure 13 – Confirm successful login

Create the CAP Project

Now it’s time to create the CAP Project.

From the top menu, click on Terminal and select New Terminal

Figure 14 – New Terminal

In the terminal window at the bottom, type cd projects and press Enter. The terminal should look like this:

Figure 15 – Terminal at projects folder

Now type cds init user-info and press Enter. The output should look like this:

Figure 16 – Project initialization

Notice that user-info is the project and application name and also the project folder name. Now, let’s open the project as a workspace.

From the top menu, click on File and select Open Workspace…

Figure 17 – Open Workspace

In the dialog, select the user-info folder under projects and the VS Code Workspace (*.code-workspace) from the Format dropdown, then click Open

Figure 18 – Set Workspace

SAP Business Application Studio will restart and, after a few seconds, you should see the following structure at the bottom of the left-hand side pane:

Figure 19 – Project Structure

As the Dev Space has restarted, execute the procedure to open a New Terminal again.

Create an Index Page

To test each method for getting the authenticated user information, let’s create an index page (index.html) which will represent our application’s basic front-end (UI).

In a new Terminal, type the following commands pressing Enter after each one.

mkdir app/root
touch app/root/index.html

The index.html file will be created under the app/root folder:

Figure 20 – index.html created

Now, open up that file and copy & paste the following content into it:

<!DOCTYPE html>
<html>
    <head>
        <meta http-equiv="X-UA-Compatible" content="IE=edge" />
        <meta charset="UTF-8" />
        <meta name="viewport" content="width=device-width, initial-scale=1.0" />
        <title>User Info Demo App</title>
    </head>
    <body>
        Three different ways for getting the authenticated user info:
        <ul>
            <!-- TODO: Add the links to each respective method -->
            <li>1. Directly from the UI (HTML5 app)</li>
                <ul>
                    <li>1a. Using the <b>/currentUser</b> endpoint</li>
                    <li>1b. Using the <b>/attributes</b> endpoint</li>
                </ul>
            <li>2. Using the CAP request object</li>
            <li>3. Using the XSUAA API</li>
        </ul>
    </body>
</html>
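The links in the list above will later point at user-info endpoints that return JSON about the current user. As a hypothetical illustration of consuming such a response on this page (the firstname/lastname/email fields are an assumption here, not a documented contract), a formatting helper might look like:

```javascript
// Illustrative: format a user-info JSON payload for display on the page.
// The response shape (firstname/lastname/email) is an assumption.
function formatUser(userInfo) {
  const name = [userInfo.firstname, userInfo.lastname].filter(Boolean).join(" ");
  return name ? `${name} <${userInfo.email}>` : userInfo.email;
}

console.log(formatUser({ firstname: "Ada", lastname: "Lovelace", email: "ada@example.com" }));
// Ada Lovelace <ada@example.com>
```

In the finished app, a fetch of the respective endpoint would feed its JSON into a helper of this kind before rendering.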

Create a Standalone AppRouter

To get user information, the user must first be authenticated into your application. For that to happen, the authentication flow must be completed. This flow follows the OAuth 2.0 protocol and is taken care of by the so-called AppRouter. So, let’s add it to the project!

We will start by creating a package.json file into the app folder.

In the Terminal, type these two commands, pressing Enter after each one:

cd app
npm init

For the second one provide the following information:

  • package name: app
  • version: 1.0.0
  • description: User Info Demo App
  • entry point: index.html
  • test command: node node_modules/@sap/approuter/approuter.js
  • author: <your name>
  • license: ISC

Leave all the other parameters blank.

Open up the newly created package.json under the app folder and replace the “test” script name with “start”. Your final content should look like this:

Figure 21 – app/package.json content

The next step is to install the AppRouter node package, by running the following command:

npm install @sap/approuter

Now the AppRouter needs to know the application’s entry point and which routes it should map to that application. For this, it uses a configuration file named xs-app.json. Therefore, we need to create that file under the app folder by running the following command:

touch xs-app.json

Now, open up that file and copy & paste the following JSON content into it:

{
    "welcomeFile": "index.html",
    "authenticationMethod": "route",
    "routes": [
        {
            "source": "/user-info/(.*)",
            "destination": "srv-api",
            "csrfProtection": true,
            "authenticationType": "xsuaa"
        },
        {
            "source": "/(.*)",
            "localDir": "root",
            "authenticationType": "xsuaa"
        }
    ]
}

For now, in this configuration, we just map the root folder as the location for static content such as the index page (second route) and the CAP backend service as a destination named “srv-api“ (although you won’t need the latter if you don’t want to experiment with either of the two approaches based on it – request object or XSUAA API). A new route will be added in the instructions of the blog post regarding the first approach.
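Note that the routes are evaluated in declaration order and the first matching source regular expression wins, which is why the catch-all `/(.*)` route comes last. A simplified, standalone sketch of that matching logic (the real AppRouter does considerably more, e.g. URL rewriting and token handling):

```javascript
// Simplified route matching: sources are regular expressions
// tested in declaration order; the first match wins.
const routes = [
  { source: "/user-info/(.*)", destination: "srv-api" },
  { source: "/(.*)", localDir: "root" }
];

function matchRoute(path) {
  return routes.find(r => new RegExp("^" + r.source).test(path));
}

console.log(matchRoute("/user-info/userInfo()")); // matches the srv-api destination route
console.log(matchRoute("/index.html"));           // falls through to the local "root" route
```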

Create an XSUAA Service Instance

We shall remember that, in Cloud Foundry, security (meaning authentication and authorization) is governed by the UAA (User Account and Authentication) mechanics which in SAP BTP is implemented by the XSUAA service (Authorization and Trust Management Service).

This means that only having the AppRouter properly set up in the application is not enough to achieve the complete OAuth 2.0 authentication flow as it depends on the XSUAA service for that.

Therefore, we need to create an XSUAA service instance on BTP. But first it requires a configuration file that’s provided upon instance creation. That file is named xs-security.json and will be created under the project’s root folder.

In the Terminal, type these two commands, pressing Enter after each one:

cd ..
touch xs-security.json

Now, open-up that file and copy & paste the following JSON content into it:

{
    "xsappname": "user-info",
    "tenant-mode": "dedicated",
    "description": "Security profile of User Info Demo App",
    "scopes": [
        {
            "name": "uaa.user",
            "description": "UAA"
        }
    ],
    "role-templates": [
        {
            "name": "Token_Exchange",
            "description": "UAA",
            "scope-references": [
                "uaa.user"
            ]
        }
    ],
    "oauth2-configuration": {
        "redirect-uris": [
            "https://*.hana.ondemand.com/**",
            "https://*.applicationstudio.cloud.sap/**"
        ]
    }
}

The uaa.user scope associated with the Token_Exchange role template is required to perform the OAuth 2.0 authentication flow executed by the AppRouter. Basically, before loading the HTML5 application (in this case just the index.html page), the router authenticates the user logged in to SAP BTP via OAuth 2.0 and gets a JWT (JSON Web Token), which is then forwarded to the HTML5 application and, from there, to the CAP service, making sure that only authenticated users have access to the application.
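To make the JWT part concrete: a JWT is just three base64url-encoded segments (header, claims payload, signature) joined by dots, and the claims segment is what ultimately carries the user information. A standalone sketch with entirely made-up claims (real XSUAA tokens carry many more fields and a cryptographic signature):

```javascript
// Build and decode a toy JWT to illustrate its structure.
// All claim values here are made up; real tokens are issued by XSUAA.
const header = { alg: "RS256", typ: "JWT" };
const claims = { user_name: "jdoe", email: "jdoe@example.com", scope: ["uaa.user"] };

const toy = [
  Buffer.from(JSON.stringify(header)).toString("base64url"),
  Buffer.from(JSON.stringify(claims)).toString("base64url"),
  "signature-placeholder" // a real token carries an RS256 signature here
].join(".");

// Any receiver (like the CAP service) can decode the claims segment:
const decoded = JSON.parse(Buffer.from(toy.split(".")[1], "base64url").toString());
console.log(decoded.user_name); // jdoe
```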

The “redirect-uris” list in the “oauth2-configuration” defines the allowed endpoints (hosts and paths, which can use wildcards) used as callbacks to which the OAuth flow redirects when authentication is completed. URIs that are not listed are considered “invalid” by the OAuth flow, so the above configuration makes sure the authentication flow works both when the application is deployed to Cloud Foundry (*.hana.ondemand.com) and when running locally on Business Application Studio (*.applicationstudio.cloud.sap).

Now, we can create the XSUAA service instance by running the command below:

cf create-service xsuaa application user-info-xsuaa -c xs-security.json

This should be the expected outcome:

Figure 22 – XSUAA service instance created

Bind the XSUAA Service Instance to the AppRouter

Upon successful creation of the service instance, on the left-hand pane of BAS click on the Cloud Foundry icon (small light bulb), then on the refresh button at the top right of the pane.

The newly created instance should be displayed at the end of the list, like demonstrated below:

Figure 23 – Newly created service instance

Right click on the user-info-xsuaa service instance and select “Bind service to a locally run project”, then, in the dialog, make sure the app folder is selected and click on the “Select folder for .env file” button.

Figure 24 – Select app folder

Click again on the Explorer icon in the left-hand pane. You should see the .env file that has been created like demonstrated below:

Figure 25 – .env file for services binding

Now, we need to rename that file to default-env.json and transform its contents into JSON format (as it’s just defining a regular environment variable called VCAP_SERVICES which is not in the expected JSON format), like demonstrated below:

Figure 26 – Rename .env and adjust to JSON format
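A small standalone sketch of that transformation — the .env file holds a single VCAP_SERVICES=&lt;json&gt; line, while default-env.json must be a proper JSON document with VCAP_SERVICES as a key (the binding content below is a dummy placeholder, not real credentials):

```javascript
// Convert .env-style "KEY=<json>" lines into the default-env.json structure.
function envToDefaultEnv(envContent) {
  const result = {};
  for (const line of envContent.split("\n")) {
    const idx = line.indexOf("=");
    if (idx < 0) continue;
    // The value after "=" is itself JSON (e.g. the VCAP_SERVICES object).
    result[line.slice(0, idx).trim()] = JSON.parse(line.slice(idx + 1));
  }
  return result;
}

// Dummy binding standing in for the real XSUAA credentials:
const env = 'VCAP_SERVICES={"xsuaa":[{"name":"user-info-xsuaa","credentials":{"clientid":"dummy"}}]}';
console.log(JSON.stringify(envToDefaultEnv(env), null, 2));
```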

NOTE: if you do not intend to experiment with any of the backend service approaches (request object or XSUAA API) you can safely skip all the next steps.

Configure the Backend (CAP) Service Destination

You may recall that, in the xs-app.json configuration, the first route points to a destination named “srv-api“. The AppRouter needs to read that destination’s configuration, and it will look for it in the same default-env.json file, in an environment variable (key) named destinations alongside VCAP_SERVICES.

To create that configuration, just add the following line right before “VCAP_SERVICES” in the default-env.json file:

"destinations" : [{ "name" : "srv-api", "url" : "http://localhost:4004", "forwardAuthToken" : true }],

After the modification, your default-env.json file should look like this:

Figure 27 – Destination added to default-env.json

You may have noticed that the URL points to the locally run CAP service (http://localhost:4004) and the router should forward the authentication token (JWT) to it after execution of the OAuth 2.0 flow.

One important thing to mention is that everything configured in default-env.json is only effective in a local development environment, as this file is not deployed to BTP with the application and shall not be included in version control repositories (such as Git) since it holds sensitive information like credentials.

Create the Service Definition

The second and third approaches to get the authenticated user information in this blog post series are implemented in the backend service of the solution, meaning the CAP service.

Therefore, the first thing that’s necessary is to create the usual CDS service definition in the “srv” folder. For that, just run the command below in the Terminal:

touch srv/user-info-service.cds

Open-up the user-info-service.cds file, then copy and paste the following content into it:

service UserInfoService @(path : '/user-info', requires: 'authenticated-user') {
    function userInfo() returns String; // using req.user approach (user attribute - of class cds.User - from the request object)
    function userInfoUAA() returns String; // using the XSUAA API
}

In the CAP service, we won’t actually define any OData entities (as usually expected), but just two functions (one for each approach). You may notice that the service is served on the “/user-info” path and requires authentication (as expected).

Create the Function Handler

Now, we need to create a “skeleton” for the implementation of the two functions described in the service definition. In the Terminal, run the command:

touch srv/user-info-service.js

Open-up the user-info-service.js file, then copy and paste the following content into it:

const cds = require('@sap/cds');

module.exports = cds.service.impl(async function () {
    // using req.user approach (user attribute - of class cds.User - from the request object)
    this.on('userInfo', req => {
        return "";
    });

    // using the XSUAA API
    this.on('userInfoUAA', async () => {
        return "";
    });
});

As you can see, in this “skeleton”, both functions are just returning an empty string, thus waiting for the corresponding implementation of each approach.
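As a tiny preview of the request-object approach (just a sketch with a mocked request — in the real implementation, developed later in this series, req.user is a cds.User instance populated from the forwarded JWT):

```javascript
// Sketch of a userInfo handler reading from the request object.
// The mock below stands in for the real CAP request; all values are made up.
function userInfoHandler(req) {
  return JSON.stringify({ id: req.user.id, attributes: req.user.attr });
}

const mockReq = { user: { id: "jdoe@example.com", attr: { email: "jdoe@example.com" } } };
console.log(userInfoHandler(mockReq));
```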

Install Project Dependencies

Both approaches will require the CAP backend service to handle security (basically the JWT – JSON Web Token – provided after authentication via XSUAA), thus the need to install the appropriate dependencies for it to accomplish that goal.

In the Terminal, run the following commands:

npm install @sap/xssec
npm install passport

And to install all other dependencies from the generated package.json just run:

npm install

Bind the XSUAA Service Instance to the CAP Service

The binding of the XSUAA service instance to the CAP service is slightly different from the approach taken to bind it to the AppRouter. For a local development environment, CAP leverages what’s called hybrid testing approach through the cds bind command.

In the Terminal, run the following command:

cds bind uaa --to user-info-xsuaa:key --kind xsuaa

This should be the expected outcome:

Figure 28 – XSUAA service bound to CAP project

The command generates a new configuration file in the project: .cdsrc-private.json.

Figure 29 – .cdsrc-private.json generated

Now, we need to enhance this file for the service authentication to work properly. So, open it and add the following block right after the “uaa” : { … } block:

,
            "auth": {
                "kind": "jwt-auth"
            }

After that modification, your .cdsrc-private.json file should look like this:

Figure 30 – Modified .cdsrc-private.json

Notice that the [hybrid] profile, defined at the very beginning, states that the configuration must emulate the VCAP_SERVICES environment variable. However, it does not replace the default-env.json file.

So, make sure to copy that file from the app folder to the root folder of your project (same folder as the .cdsrc-private.json file):

Figure 31 – default-env.json together with .cdsrc-private.json

And with that, we have completed the setup of the project and are ready to start developing the different approaches to get the authenticated user information with CAP. We are now good to go!

The post How to get authenticated user information with CAP in three different ways – Setup project for development appeared first on ERP Q&A.
Importing an external data model using CDS Graphical Modeler for Visual Studio Code (ERP Q&A, Fri, 13 May 2022)
In this blog post, we’ll demonstrate how to import an external OData data model from an edmx file to an existing CDS model, and create a relationship to one of the OData entities from this external data model through the CDS Graphical Modeler for Visual Studio Code.

Preparing the External Data Model

First install the CDS Graphical Modeler for VSCODE from the VSCODE Marketplace at https://marketplace.visualstudio.com/items?itemName=SAPSE.vscode-wing-cds-editor-vsc.

After preparing your CDS project in VSCODE, open your CDS model with the CDS Graphical Modeler. Then create an entity by clicking the “Add Entity” toolbar button:

Create a “Customers” entity via the entity dialog:

Click “Create” button and close the dialog, and you will see that the new “Customers” entity has been created:

Now before importing the external data model, copy the edmx file that contains the OData model to your project:

You can also look into the OData model through the OData CSDL Modeler which is also a VSCODE extension by installing the tool at https://marketplace.visualstudio.com/items?itemName=SAPSE.vsc-extension-odata-csdl-modeler.

Open the edmx file using the CSDL modeler:

Since the imported edmx file contains a complex OData model, you can locate the entity you need by clicking the “Search” button in the toolbar, which launches the entity search dialog:

Suppose you’re importing the “BusinessPartner” entity into your CDS model, now type “businesspartner” to locate your entity:

Click the entity type in the list:

The entity will be shown in the canvas. You can then check the details of this entity from both the canvas as well as the property sheet:

Importing an External Data Model using CDS Graphical Modeler

Let’s go back to the CDS model. Click “Import” button on the model:

Click “External Files” menu item:

Select the edmx file and click “Open” button to close the dialog:

You can optionally provide an alias for the imported data model; then click the “OK” button to finish the import. The CDS model will navigate to the imported namespace automatically:

Click the “Home” button to go back to your main CDS namespace:

To create an association from the newly created “Customers” entity to the business partner entity, select the entity and click the “Add Relationship” context menu:

You will see the linkage line moving along with your mouse.

This will launch the relationship dialog:

In the “Target Entity Type” drop down list, type in “businesspartner” to filter the entity types:

Select the business partner entity from the drop down list:

Click “Create” button and close the dialog:

Now we have successfully created a managed association from our data model to an external entity that is from an imported OData model in an edmx file.
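In the underlying CDS source, the result is a managed association pointing into the imported namespace. As a rough sketch (the namespace, entity name and association name depend on the edmx file you imported and on how you named the relationship — here we assume the standard S/4HANA Business Partner API):

```cds
entity Customers
{
    businessPartner : Association to one API_BUSINESS_PARTNER.A_BusinessPartner;
}
```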

Creating entity relationship through CDS Graphical Modeler for VSCODE (ERP Q&A, Tue, 03 May 2022)
In this blog post, we’ll demonstrate how to create entity relationships for CDS models by using the CDS Graphical Modeler for Visual Studio Code.

CDS Entity Relationship Modeling using CDS modeler for VSCODE

CDS Graphical Modeler is now available on Visual Studio Code as well, and if you want to use the modeler in VSCODE platform you can download and install the modeler extension from https://marketplace.visualstudio.com/items?itemName=SAPSE.vscode-wing-cds-editor-vsc.

Now there is a much easier way of creating entity relationships using simple drag and drop, which lets developers build complex relationships quickly, even in large CDS models.

Suppose we have the CDS model below for espm:

namespace com.sap.cds.espm.db;

using
{
    cuid,
    managed,
    temporal
}
from '@sap/cds/common';

entity Customers : cuid, managed
{
    name : String;
    officialName : String;
    companyName : String;
    address1 : String;
    address2 : String;
    industryRanking1 : String;
    industryRanking2 : String;
    industryRanking3 : String;
    sector1 : String;
    sector2 : String;
    sector3 : String;
    revenue : Decimal;
    forecast1 : Integer;
    forecast2 : Integer;
    payment1 : String;
    payment2 : String;
    payment3 : String;
    vertical1 : String;
    vertical2 : String;
    certical3 : String;
}

entity Orders : cuid, managed
{
    name : String;
    name1 : String;
    name2 : String;
    paymentMethod1 : String;
    paymentMethod2 : String;
    deliveryAddress1 : String;
    deliveryAddress2 : String;
    deliveryAddress3 : String;
    total1 : Double;
    total2 : Double;
}

aspect OrderItems : cuid, managed
{
    name1 : String;
    name2 : String;
    name3 : String;
    listPrice1 : String;
    actualPrice1 : String;
    pic1 : String;
    pic2 : String;
    pic3 : String;
}

entity Products : cuid, managed
{
    name : String;
    officialName : String;
    longName1 : String;
    longName2 : String;
    title1 : String;
    title2 : String;
    image1 : String;
    image2 : String;
    manufacture : String;
    isOem : Boolean;
    validTo : Date;
    validTo2 : Date;
}

entity ProductDetails : temporal
{
    internalName1 : String;
    internalName2 : String;
    internalName3 : String;
    designDate : Date;
    designDate2 : Date;
    detail1 : String;
    detail2 : String;
    detail3 : String;
}

We now want to create the entity relationships among those entities using the CDS modeler. Open the CDS file using CDS Graphical Modeler for Visual Studio Code:

Now if we want to create a to-many association relationship from the Customers entity to the Orders entity, click the Customers entity and select “Add Relationship” context menu:

You will see a pointer line that moves along with your mouse pointer:

Move the line end point to the target Orders entity and release your mouse:

The relationship dialog shows up:

Change the “Relationship type” to “To-Many”; you will notice that a backlink property is automatically filled in:

Click “Create” button to dismiss the dialog, you will see the relationship has been created between the Customers entity and the Orders entity:

Checking the CDS file content you will see the corresponding associations have been created as well:

entity Customers : cuid, managed
{
    name : String;
    officialName : String;
    companyName : String;
    address1 : String;
    address2 : String;
    industryRanking1 : String;
    industryRanking2 : String;
    industryRanking3 : String;
    sector1 : String;
    sector2 : String;
    sector3 : String;
    revenue : Decimal;
    forecast1 : Integer;
    forecast2 : Integer;
    payment1 : String;
    payment2 : String;
    payment3 : String;
    vertical1 : String;
    vertical2 : String;
    certical3 : String;
    orders : Association to many Orders on orders.customers = $self;
}

entity Orders : cuid, managed
{
    name : String;
    name1 : String;
    name2 : String;
    paymentMethod1 : String;
    paymentMethod2 : String;
    deliveryAddress1 : String;
    deliveryAddress2 : String;
    deliveryAddress3 : String;
    total1 : Double;
    total2 : Double;
    customers : Association to one Customers;
}

Now we want to create a to-many composition from the Orders to the OrderItems entity:

and the relationship dialog pops up:

Since OrderItems is an aspect, the only relationship type allowed here is Composition. Change the relationship type to “To-Many”:

Click “Create” button to close the dialog, and you will see the relationship has been created between the Orders and OrderItems:

We can also verify that by checking the CDS file content:

entity Orders : cuid, managed
{
    name : String;
    name1 : String;
    name2 : String;
    paymentMethod1 : String;
    paymentMethod2 : String;
    deliveryAddress1 : String;
    deliveryAddress2 : String;
    deliveryAddress3 : String;
    total1 : Double;
    total2 : Double;
    customers : Association to one Customers;
    orderItems : Composition of many OrderItems;
}

aspect OrderItems : cuid, managed
{
    name1 : String;
    name2 : String;
    name3 : String;
    listPrice1 : String;
    actualPrice1 : String;
    pic1 : String;
    pic2 : String;
    pic3 : String;
}

Now let’s create a to-one managed association from the OrderItems to Products entity:

Keep everything default and click “Create” button to dismiss the dialog, and the relationship is setup:

If we want to keep some of the product detail information in a separate ProductDetails entity to keep the Products entity clean, we need to set up a bi-directional association between the two entities. First, create a to-one managed association from the ProductDetails entity to the Products entity:

In the relationship dialog that pops up, leave everything as default:

Click “Create” button to finish the relationship setup:

Now make the relationship bi-directional so that we can navigate from the Products to ProductDetails by creating a to-one association from the Products entity to the ProductDetails:

In the relationship dialog, keep everything as default:

Click “Create” button to finish the relationship setup:

The last thing we want to do is make the “products” association in the ProductDetails entity the key property. Click the “products” property, then click the “Hide/Show Property Sheet” toolbar button:

You will see the property sheet has shown up:

Check the “Key” checkbox and notice the products to-one managed association has now become the key property for the entity:

Now all the relationships for the CDS model have been successfully created using the CDS Graphical Modeler:

Consuming SOAP Web Services in SAP Cloud Application Programming Model (CAP) (ERP Q&A, Tue, 12 Apr 2022)
NOTE: This blog post is intended for developers who have previous experience in developing CAP applications which consume either OData or REST APIs using SAP Business Application Studio and SAP BTP destinations, as well as handling communication scenarios and communication arrangements in S/4HANA Cloud.

Introduction

You may have realized that most modern cloud applications are built to interact with other external solutions via OData and/or REST APIs (and you may have already done it yourself on several occasions). But there are some solutions that still do not offer 100% of their services as OData or REST, providing some interfaces via SOAP web services – this is the case, for example, of S/4HANA Cloud, which exposes a variety of OData services but, for some cases, still sticks with SOAP web services.

If you started programming in the last 5 years, you might not even know what I’m talking about – or if you know, it’s the fault of some legacy system you have to deal with.

SOAP, or Simple Object Access Protocol (not to be confused with SOA), is a protocol for creating distributed and decentralized web services – the famous web services – which allow the exchange of information between systems using the XML format. The latest version (1.2) dates from 2007, and SOAP was the most popular way of connecting systems in the first decade of the 2000s, before the emergence of REST – an architectural style that encouraged web APIs with JSON and, later, microservices – lighter ways to exchange information between systems over the web.
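To make this concrete: every SOAP call is an HTTP POST whose body is an XML “envelope” wrapping the actual payload. A minimal illustrative sketch (the operation element used here is just a placeholder; real envelopes also carry operation namespaces and, often, headers):

```javascript
// Build a minimal SOAP 1.1 envelope around an XML payload.
function buildEnvelope(bodyXml) {
  return (
    '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">' +
    "<soapenv:Body>" + bodyXml + "</soapenv:Body>" +
    "</soapenv:Envelope>"
  );
}

console.log(buildEnvelope("<QueryBusinessUserIn/>"));
```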

As previously mentioned almost all the latest web and mobile technologies offer native (or close to) support for handling REST Web APIs, but little is said about SOAP connections. The purpose of this blog post is precisely to help those who need to deal with NodeJS CAP applications on the client and SOAP on the server.

Application Architecture

We are going to build a simple CAP application that reads Business Users from S/4HANA Cloud (which are exposed exclusively via a SOAP web service).

You can view the application architecture below:

Figure 1 – Application Architecture

Preparation Steps

Before we move forward with the instructions, let’s first get prepared for development by completing the following two basic steps:

  1. Set up a destination to your S/4HANA Cloud system using a communication user (not a business user). You can name your destination S4HC just to be compatible with the further instructions in the post.
  2. Jumpstart a new CAP project in SAP Business Application Studio: after opening your dev space, open a new terminal window, move to the “projects” directory and initialize a new blank project with cds init business-users. Open the newly created project in a workspace to get ready for development.

Download Web Service Definition

The first thing we need to do is to get the web service definition which is described in WSDL (Web Services Description Language).

WSDL is an XML notation for describing a web service. A WSDL definition tells a client how to compose a web service request and describes the interface that is provided by the web service provider.

We can get such definition from SAP API Business Hub, so access https://api.sap.com.

In the home page, first make sure to login to API Business Hub (NOTE: you must login for the download process to work properly), then, after login, click on the S/4HANA Cloud tile.

Figure 2 – Login to SAP API Business Hub and access S/4HANA Cloud section

In the next page, click on the APIs tab, then, in the find box, type “business user read” and press enter. Finally, click on the Business User – Read tile.

Figure 3 – Locate Business User – Read SOAP API

In the next page, click on API Specification.

NOTE: make sure your communication user (set up in your destination) has access to communication scenario SAP_COM_0193 or SAP_COM_0093, via some communication arrangement, as indicated below:

Figure 4 – Access API Specification

Click on the down arrow next to WSDL and save the QUERYBUSINESSUSERIN.wsdl file in your local computer.

Figure 5 – Download QUERYBUSINESSUSERIN.wsdl

Now, go to your CAP project in Business Application Studio and create a folder named “external” under the “srv” folder, then drag & drop the recently downloaded QUERYBUSINESSUSERIN.wsdl file into it like demonstrated below:

Figure 6 – QUERYBUSINESSUSERIN.wsdl file imported to CAP project

This definition will be used later upon creation of the web service client.

Create Reference to Web Service Host

The next step is to reference the web service in the project’s package.json file. So, open it up and insert the following code snippet right before the last curly bracket:

,
  "cds": {
    "requires": {
      "UserRead": {
        "kind": "SOAP",
        "credentials": {
          "destination": "S4HC",
          "path": "/sap/bc/srt/scs_ext/sap/querybusinessuserin"
        }
      }
    }
  }

After the insertion, your package.json file should look like the screenshot below:

Figure 7 – Reference to web service in package.json

Notice that the reference is pointing to the destination that has been previously created in BTP with the path to the actual service location.

Although the destination already points to the S/4HANA Cloud host, as the WSDL file has been downloaded from SAP API Business Hub (not directly from a communication arrangement in the actual S/4HANA Cloud tenant), the service location within it is “generic”, meaning it’s just a “placeholder” in the format of “https://host:port/<service path>”.

Therefore, we still need to replace that placeholder with the actual S/4HANA Cloud tenant host. So, open up the WSDL file, search for “location” (it’s usually at the end of the file) and replace all occurrences of “host:port” with your S/4HANA Cloud tenant host (set up in your destination), like demonstrated below:

Figure 8 – Replace host:port with actual tenant host

Now, the project is fully set up to communicate with the web service via the application code.

Bind Destination and XSUAA Services

Before moving on to binding the services to your project, make sure to login to Cloud Foundry either via the BAS command palette (View > Find Command) or command line interface (CLI) in the terminal.

After login, create the destination and XSUAA service instances using the command lines below:

  • cf create-service destination lite busers-dest
  • cf create-service xsuaa application busers-xsuaa

Upon successful creation of the service instances, on the left-hand pane of BAS click on the Cloud Foundry icon (small light bulb), then on the refresh button at the top right of the pane.

The newly created instances should be displayed at the end of the list, like demonstrated below:

Figure 9 – Newly created service instances

Right click on the busers-dest service instance and select “Bind service to a locally run project”, then, in the dialog, make sure the business-users folder is selected and click on the “Select folder for .env file” button.

Figure 10 – Select business-users folder

Now, repeat the same procedure for the busers-xsuaa service instance.

Click again on the Explorer icon in the left-hand pane and, then, in the refresh button at the right of the project name. You should see the .env file that has been created like demonstrated below:

Figure 11 – .env file for services binding

Now, we need to rename that file to default-env.json and transform its contents into JSON format (as it’s just defining a regular environment variable called VCAP_SERVICES which is not in the expected JSON format), like demonstrated below:

Figure 12 – Rename .env and adjust to JSON format

Install Dependencies

To facilitate the creation of SOAP clients and avoid too much effort preparing a SOAP envelope in XML format to invoke the web service’s methods via HTTP request, we can use a pre-built NodeJS SOAP client manager from the “soap” package available on npm (in Java there’s already native support for SOAP web services).

Also, to easily fetch and manipulate the destination data from BTP we will utilize the “core” node package from the SAP Cloud SDK.

So, let’s install those dependencies by running the following commands:

  • npm install soap
  • npm install @sap-cloud-sdk/core

NOTE: if you intend to deploy your application to Cloud Foundry later you must also install the “passport” package, as the latest CDS version is enforcing security on bootstrap when deployed to production.

And, finally, to install the other dependencies from the original jump-started package.json we just run:

  • npm install

And with that, we are now fully ready to start the development!

Create Module to Handle SOAP Clients

In a productive scenario, we would usually have more than one web service manipulated throughout the application code, so it’s convenient to create a module with a function that facilitates the creation and handling of SOAP clients and can be widely reused.

Under the “srv” folder create a folder named “lib” and, in that folder, create a file named “soap-service.js” like demonstrated below:

Figure 13 – Create soap-service.js file

Open-up that file and paste the following code snippet into it:

const cds = require('@sap/cds');
const soap = require('soap');
const core = require('@sap-cloud-sdk/core');

// Helper to create a soap service through the BTP destination service
async function getSoapService(service, wsdl) {
    // Get the service reference
    var definition = cds.env.requires[service];
    // Get the destination data from BTP
    const dest = await core.getDestination(definition.credentials.destination);
    // Create an httpClient which connects over the BTP destination
    var httpClient = {
        request: async function (url, data, callback, exheaders, exoptions) {
            core.executeHttpRequest(dest, {
                method: 'POST',
                url: url,
                data: data,
                headers: exheaders
            }, exoptions).then((result) => {
                callback(null, result, result.data);
            }).catch((e) => {
                callback(e);
            });
        }
    }

    // Instantiate the service using that http client
    return soap.createClientAsync(wsdl, { httpClient: httpClient });
}

module.exports = {
    getSoapService
}

The code is quite simple: the getSoapService function receives the service name (defined in package.json) and the location of the WSDL file with the service description. It then fetches the destination data from BTP and uses it to create an HTTP client, which is passed to the createClientAsync function from the soap package. That function returns a promise that resolves to the actual SOAP client later in the application code.
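The promise-to-callback bridging is the one subtle part: the soap package expects a callback-style request function, while the Cloud SDK’s executeHttpRequest returns a promise. Detached from the SDK, the adapter can be sketched as follows (makeHttpClient and the fake transport are illustrative names, not part of the original code):

```javascript
// Hypothetical standalone sketch of the httpClient adapter from soap-service.js:
// it wraps a promise-returning HTTP function into the callback-style
// request(url, data, callback, exheaders, exoptions) the soap package expects.
function makeHttpClient(executeFn) {
  return {
    request: function (url, data, callback, exheaders, exoptions) {
      executeFn({ method: 'POST', url: url, data: data, headers: exheaders }, exoptions)
        .then(result => callback(null, result, result.data))
        .catch(e => callback(e));
    }
  };
}

// Try it with a fake transport that just echoes the payload back upper-cased
const client = makeHttpClient(async cfg => ({ status: 200, data: cfg.data.toUpperCase() }));
client.request('https://example.com/ws', '<soap/>', (err, res, body) => {
  console.log(err, body); // null '<SOAP/>'
});
```

In the real module, core.executeHttpRequest bound to the BTP destination plays the role of the fake transport.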

Create Handler to Read Business Users

Now, we must create a handler function which will invoke the web service method that returns the business users from S/4HANA Cloud as its response.

In the “lib” folder, create a file named “handlers.js” like demonstrated below:

Figure 14 – Create handlers.js file

Open up that file and paste the following code snippet into it:

const { getSoapService } = require('./soap-service');

let userReadServicePromise = null;

(async function () {
    // Connect to external S/4HC SOAP service
    userReadServicePromise = getSoapService('UserRead', './srv/external/QUERYBUSINESSUSERIN.wsdl');
})();

/*** HANDLERS ***/

async function readBusinessUser(req) {
    try {
        // Get the SOAP client for the UserRead service
        const userReadService = await userReadServicePromise;
        // Set the parameters for the QueryBusinessUserIn method of the service
        const param = {
            BusinessUser: {
                PersonIDInterval: {
                    IntervalBoundaryTypeCode: 9,
                    LowerBoundaryPersonID: "0000000000"
                },
                BusinessPartnerRoleCodeInterval: {
                    IntervalBoundaryTypeCode: 9,
                    LowerBoundaryBusinessPartnerRoleCode: "000000"
                }
            },
            QueryProcessingConditions: {
                QueryHitsUnlimitedIndicator: true
            }
        };

        // Invoke QueryBusinessUserIn method asynchronously and wait for the response
        const resp = await userReadService.QueryBusinessUserInAsync(param);

        // Prepare the actual service response
        const busUsers = [];
        if (resp && resp[0] && resp[0].BusinessUser) {
            resp[0].BusinessUser.forEach(busUser => {
                busUsers.push({
                    ID: ((busUser.User) ? busUser.User.UserID : busUser.PersonID),
                    FirstName: busUser.PersonalInformation.FirstName,
                    LastName: busUser.PersonalInformation.LastName,
                    PersonFullName: busUser.PersonalInformation.PersonFullName,
                    BusinessPartnerRoleCode: busUser.BusinessPartnerRoleCode,
                    HasUser: ((busUser.User) ? true : false)
                });
            });
        }

        return busUsers;
    } catch (err) {
        req.error(err.code, err.message);
    }
}

module.exports = {
    readBusinessUser
}

Upon module loading, a promise for the service is created by the getSoapService function; it is then used in the readBusinessUser handler to get the actual SOAP client for the UserRead service referenced in package.json. The QueryBusinessUserIn method of the service is invoked asynchronously, passing a few required parameters (basically filters and query processing definitions), and the response is finally formatted into the expected CAP service response.
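The response-shaping part of the handler can be exercised in isolation by extracting it into a small mapping function and feeding it a hand-made mock of the SOAP response (mapBusinessUsers and all sample values below are invented for illustration; the field names follow the snippet above):

```javascript
// Standalone sketch of the response mapping in readBusinessUser,
// run against a hand-made mock of the SOAP response array.
function mapBusinessUsers(resp) {
  const busUsers = [];
  if (resp && resp[0] && resp[0].BusinessUser) {
    resp[0].BusinessUser.forEach(busUser => {
      busUsers.push({
        ID: busUser.User ? busUser.User.UserID : busUser.PersonID,
        FirstName: busUser.PersonalInformation.FirstName,
        LastName: busUser.PersonalInformation.LastName,
        PersonFullName: busUser.PersonalInformation.PersonFullName,
        BusinessPartnerRoleCode: busUser.BusinessPartnerRoleCode,
        HasUser: Boolean(busUser.User)
      });
    });
  }
  return busUsers;
}

// Mock response: one user with a UserID, one person without a user account
const mockResp = [{
  BusinessUser: [
    { User: { UserID: 'JDOE' }, PersonID: '1', BusinessPartnerRoleCode: 'BUP003',
      PersonalInformation: { FirstName: 'Jane', LastName: 'Doe', PersonFullName: 'Jane Doe' } },
    { PersonID: '2', BusinessPartnerRoleCode: 'BUP003',
      PersonalInformation: { FirstName: 'Max', LastName: 'Mustermann', PersonFullName: 'Max Mustermann' } }
  ]
}];
console.log(mapBusinessUsers(mockResp).map(u => u.ID)); // [ 'JDOE', '2' ]
```

Note how the ID falls back to PersonID when the business user has no user account attached, which is also what drives the HasUser flag.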

Create the CAP Service Definition

Now that we have the complete business logic of the application, we can create the CAP service definition to expose the data fetched from S/4HANA Cloud via a service entity and attach the handler function to its READ event.

In the “srv” folder, create a file named “busers-service.cds” like demonstrated below:

Figure 15 – Create busers-service.cds file

Open up that file and paste the following code snippet into it:

namespace sap.extensions.soap.busers;

service BusinessUsers @(path : '/business-users') {
    entity BusinessUser {
        ID : String(12);
        FirstName : String(128);
        LastName : String(128);
        PersonFullName : String(258);
        BusinessPartnerRoleCode : String(6);
        HasUser : Boolean;
    }
}

Here we are just exposing a non-persistent entity named BusinessUser containing only some relevant fields to be fetched from S/4HANA Cloud.

Attach the Handler Function to the READ Event

The last step is to attach the handler function to the READ event of the BusinessUser entity, so the CAP service can be called to retrieve the business users data from S/4HANA Cloud and return it to the client as an OData v4 response.

In the “srv” folder, create a file named “busers-service.js” like demonstrated below:

Figure 16 – Create busers-service.js file

Open up that file and paste the following code snippet into it:

const cds = require('@sap/cds');
const {
    readBusinessUser
} = require('./lib/handlers');

module.exports = cds.service.impl(async function () {
    /*** SERVICE ENTITIES ***/
    const {
        BusinessUser
    } = this.entities;

    /*** HANDLERS REGISTRATION ***/
    // ON events
    this.on('READ', BusinessUser, readBusinessUser);
});

As you can see, the readBusinessUser handler is attached to the ON READ event of the BusinessUser entity.

Test the Application

Finally, we are all set! So, let’s test the application by running cds watch and then clicking on the http://localhost:4004 link to open up the CAP service home page in a new tab:

Figure 17 – CAP service home page

Now, click on the BusinessUser entity link and, after a few seconds, you should get the JSON containing the Business Users from your S/4HANA Cloud tenant, like demonstrated below:

Figure 18 – Business users from S/4HANA Cloud tenant
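Trimmed down, the payload is a plain OData V4 collection along the lines of the following (all values here are invented for illustration):

```json
{
  "@odata.context": "$metadata#BusinessUser",
  "value": [
    {
      "ID": "JDOE",
      "FirstName": "Jane",
      "LastName": "Doe",
      "PersonFullName": "Jane Doe",
      "BusinessPartnerRoleCode": "BUP003",
      "HasUser": true
    }
  ]
}
```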

And that’s it! A fully working CAP service consuming a SOAP web service from S/4HANA Cloud.


The post Consuming SOAP Web Services in SAP Cloud Application Programming Model (CAP) appeared first on ERP Q&A.

Extend SAP SuccessFactors on SAP BTP with CAP – Add Business Logic https://www.erpqna.com/extend-sap-successfactors-on-sap-btp-with-cap-add-business-logic/ Fri, 25 Mar 2022 10:36:47 +0000

Create Code File

The business logic of the application is implemented via custom service handlers for the various operations executed on its entities (create, read, update, delete, etc.). Those handlers are defined in a module within a JavaScript file with the same name as the service but with the .js extension.

So, let’s create it!

1. On the left-hand pane of SAP Business Application Studio, select the srv folder, then click on the three dots to the right of the project name and select New File

Figure 1 – Create New File

2. On the dialog name the file projman-service.js and click OK

Figure 2 – Set File Name

Service Module Coding

Copy and paste the code snippet below into the recently created file:

const cds = require('@sap/cds');

module.exports = cds.service.impl(async function () {
    /*** SERVICE ENTITIES ***/
    const {
        Project,
        Member,
        SFSF_User,
    } = this.entities;

    /*** HANDLERS REGISTRATION ***/
    // ON events

    // BEFORE events

    // AFTER events
});

Here we import the @sap/cds dependency and reference it as cds. Then, we implement the service module and, inside it, we reference three entities: Project, Member and SFSF_User, as we will develop handlers for them.

Finally, we add some comments as placeholders to mark where we will later put our code.

Organize Your Code

It is a best practice to organize your code into files representing the nature of the code (e.g. utility functions go into a “utils” file, handlers into a “handlers” file, and so on). Those files represent your “code library”, so it’s appropriate to store them in a “lib” folder.

So, let’s create the lib folder and its contents.

1. In the Terminal press CTRL+C to terminate the cds watch command

Figure 3 – Terminate cds watch

2. Type cd srv and press Enter

Figure 4 – Change to srv directory

3. Type mkdir lib and press Enter

Figure 5 – Create lib directory

4. Type touch lib/handlers.js and press Enter

Figure 6 – Create handlers.js file
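After those commands, the srv folder should look roughly like this (the projman-service files were created in earlier parts of the series):

```
srv/
├── lib/
│   └── handlers.js
├── projman-service.cds
└── projman-service.js
```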

Handlers Coding

Now, let’s develop the required service handlers according to the business rules that have been defined in the series introduction and preparation.

SFSF_User Read Handler

1. On the left-hand pane expand the lib folder, then click on the handlers.js file to open it

Figure 7 – Open handlers.js file

2. Copy and paste the following code snippet into handlers.js:

const cds = require('@sap/cds');

let userService = null;
let assService = null;

(async function () {
    // Connect to external SFSF OData services
    userService = await cds.connect.to('PLTUserManagement');
    assService = await cds.connect.to('ECEmployeeProfile');
})();

/*** HELPERS ***/

// Remove the specified columns from the ORDER BY clause of a SELECT statement
function removeColumnsFromOrderBy(query, columnNames) {
    if (query.SELECT && query.SELECT.orderBy) {
        columnNames.forEach(columnName => {
            // Look for column in query and its respective index
            const element = query.SELECT.orderBy.find(column => column.ref[0] === columnName);
            const idx = query.SELECT.orderBy.indexOf(element);

            if (idx > -1) {
                // Remove column from order by list
                query.SELECT.orderBy.splice(idx, 1);
                if (!query.SELECT.orderBy.length) {
                    // If list ends up empty, remove it from query
                    delete query.SELECT.orderBy;
                }
            }
        });
    }

    return query;
}

/*** HANDLERS ***/

// Read SFSF users
async function readSFSF_User(req) {
    try {
        // Columns that are not sortable must be removed from "order by"
        req.query = removeColumnsFromOrderBy(req.query, ['defaultFullName']);

        // Handover to the SF OData Service to fetch the requested data
        const tx = userService.tx(req);
        return await tx.run(req.query);
    } catch (err) {
        req.error(err.code, err.message);
    }
}

module.exports = {
    readSFSF_User
}

3. Back to projman-service.js add the following lines right under const cds = require(‘@sap/cds’);

const {
    readSFSF_User
} = require('./lib/handlers');

4. Add the following line right under the comment // ON events

this.on('READ', SFSF_User, readSFSF_User);

Your projman-service.js code should now look like this:

Figure 8 – Current stage of projman-service.js

Let’s quickly analyze what’s been done.

In the lib/handlers.js file we connected cds to the SAP SuccessFactors external services, coded one helper function that removes undesired columns from the “order by” clause of a query’s select statement and, finally, coded the handler function to read the users from SAP SuccessFactors using the PLTUserManagement service.

The code logic is well explained in the detailed comments.
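To see exactly what the order-by clean-up does, the helper can be run standalone against a hand-built query object (the shape below only approximates CAP’s CQN representation of a SELECT):

```javascript
// Copy of the helper from lib/handlers.js, exercised with a mocked CQN SELECT
function removeColumnsFromOrderBy(query, columnNames) {
  if (query.SELECT && query.SELECT.orderBy) {
    columnNames.forEach(columnName => {
      // Look for column in query and its respective index
      const element = query.SELECT.orderBy.find(column => column.ref[0] === columnName);
      const idx = query.SELECT.orderBy.indexOf(element);
      if (idx > -1) {
        // Remove column from order by list
        query.SELECT.orderBy.splice(idx, 1);
        if (!query.SELECT.orderBy.length) {
          // If list ends up empty, remove it from query
          delete query.SELECT.orderBy;
        }
      }
    });
  }
  return query;
}

// Approximate CQN shape of: SELECT from SFSF_User ORDER BY defaultFullName, userId desc
const query = {
  SELECT: {
    from: { ref: ['SFSF_User'] },
    orderBy: [{ ref: ['defaultFullName'], sort: 'asc' }, { ref: ['userId'], sort: 'desc' }]
  }
};
removeColumnsFromOrderBy(query, ['defaultFullName']);
console.log(query.SELECT.orderBy); // [ { ref: [ 'userId' ], sort: 'desc' } ]
```

If the last sortable column is removed, the whole orderBy property disappears from the query, so the external service never sees an empty clause.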

Now, let’s test that handler.

5. In the Terminal type cd .. and press Enter to go back to the project root directory

Figure 9 – Change back to project root

6. Once again type cds watch and press Enter. Then CTRL+Click on the http://localhost:4004 link to launch the application home page

Figure 10 – Application home page

7. Click on the SFSF_User link

Figure 11 – Users from SAP SuccessFactors

Now, you should be able to view the users that are being read from SAP SuccessFactors via the User entity from the PLTUserManagement service.

Other Handlers

That was the most important handler we should first implement as it’s the one responsible for bringing the SAP SuccessFactors’ employees into our application.

Now, we can “fast forward” and implement all the other handlers of our application at once.

1. Open the lib/handlers.js file, then copy and paste the following code over the current content:

const cds = require('@sap/cds');

let userService = null;
let assService = null;

(async function () {
    // Connect to external SFSF OData services
    userService = await cds.connect.to('PLTUserManagement');
    assService = await cds.connect.to('ECEmployeeProfile');
})();

/*** HELPERS ***/

// Remove the specified columns from the ORDER BY clause of a SELECT statement
function removeColumnsFromOrderBy(query, columnNames) {
    if (query.SELECT && query.SELECT.orderBy) {
        columnNames.forEach(columnName => {
            // Look for column in query and its respective index
            const element = query.SELECT.orderBy.find(column => column.ref[0] === columnName);
            const idx = query.SELECT.orderBy.indexOf(element);

            if (idx > -1) {
                // Remove column from oder by list
                query.SELECT.orderBy.splice(idx, 1);
                if (!query.SELECT.orderBy.length) {
                    // If list ends up empty, remove it from query
                    delete query.SELECT.orderBy;
                }
            }
        });
    }

    return query;
}

// Helper for employee create execution
async function executeCreateEmployee(req, userId) {
    const employee = await cds.tx(req).run(SELECT.one.from('Employee').columns(['userId']).where({ userId: { '=': userId } }));
    if (!employee) {
        const sfsfUser = await userService.tx(req).run(SELECT.one.from('User').columns(['userId', 'username', 'defaultFullName', 'email', 'division', 'department', 'title']).where({ userId: { '=': userId } }));
        if (sfsfUser) {
            await cds.tx(req).run(INSERT.into('Employee').entries(sfsfUser));
        }
    }
}

// Helper for employee update execution
async function executeUpdateEmployee(req, entity, entityID, userId) {
    // Need to check whether column has changed
    const column = 'member_userId';
    const query = SELECT.one.from(entity).columns([column]).where({ ID: { '=': entityID } });
    const item = await cds.tx(req).run(query);
    if (item && item[column] != userId) {
        // Member has changed, then:
        // Make sure there's an Employee entity for the new assignment
        await executeCreateEmployee(req, userId);

        // Create new assignment
        await createAssignment(req, entity, entityID, userId);
    }
    return req;
}

// Helper for assignment creation 
async function createAssignment(req, entity, entityID, userId) {
    const columns =  m => { m.member_userId`as userId`, m.parent(p => { p.name`as name`, p.description`as description`, p.startDate`as startDate`, p.endDate`as endDate` }), m.role(r => { r.name`as role` }) };
    const item = await cds.tx(req).run(SELECT.one.from(entity).columns(columns).where({ ID: { '=': entityID } }));
    if (item) {
        const assignment = {
            userId: userId,
            project: item.parent.name,
            description: item.role.role + " of " + item.parent.description,
            startDate: item.parent.startDate,
            endDate: item.parent.endDate
        };
        console.log(assignment);
        const element = await assService.tx(req).run(INSERT.into('Background_SpecialAssign').entries(assignment));
        if (element) {
            await cds.tx(req).run(UPDATE.entity(entity).with({ hasAssignment: true }).where({ ID: entityID }));
        }
    }
    return req;
}

// Helper for cascade deletion
async function deepDelete(tx, ID, childEntity) {
    return await tx.run(DELETE.from(childEntity).where({ parent_ID: { '=': ID } }));
}

/*** HANDLERS ***/

// Read SFSF users
async function readSFSF_User(req) {
    try {
        // Columns that are not sortable must be removed from "order by"
        req.query = removeColumnsFromOrderBy(req.query, ['defaultFullName']);

        // Handover to the SF OData Service to fetch the requested data
        const tx = userService.tx(req);
        return await tx.run(req.query);
    } catch (err) {
        req.error(err.code, err.message);
    }
}

// Before create/update: member
async function createEmployee(req) {
    try {
        // Add SFSF User to Employees entity if it does not exist yet
        const item = req.data;
        const userId = (item.member_userId) ? item.member_userId : null;
        if (userId) {
            await executeCreateEmployee(req, userId);
        }
        return req;
    } catch (err) {
        req.error(err.code, err.message);
    }
}

// After create: member
async function createItem(data, req) {
    try {
        // Create assignment in SFSF
        console.log('After create.');
        await createAssignment(req, req.entity, data.ID, data.member_userId);
        return data;
    } catch (err) {
        req.error(err.code, err.message);
    }
}

// Before update: member
async function updateEmployee(req) {
    try {
        // Need to check if team member was updated
        if (req.data.member_userId) {
            const ID = (req.params[0]) ? ((req.params[0].ID) ? req.params[0].ID : req.params[0]) : req.data.ID;
            const userId = req.data.member_userId;
            await executeUpdateEmployee(req, req.entity, ID, userId);
        }
        return req;
    } catch (err) {
        req.error(err.code, err.message);
    }
}

// Before delete: project or member
async function deleteChildren(req) {
    try {
        // Cascade deletion
        if (req.entity.indexOf('Project') > -1) {
            await deepDelete(cds.tx(req), req.data.ID, 'Activity');
            await deepDelete(cds.tx(req), req.data.ID, 'Member');
        } else {
            const item = await cds.tx(req).run(SELECT.one.from(req.entity).columns(['parent_ID']).where({ ID: { '=': req.data.ID } }));
            if (item) {
                await deepDelete(cds.tx(req), item.parent_ID, 'Activity');
            }
        }
        return req;
    } catch (err) {
        req.error(err.code, err.message);
    }
}

// After delete/update: member
async function deleteUnassignedEmployees(data, req) {
    try {
        // Build clean-up filter
        const members = SELECT.distinct.from('Member').columns(['member_userId as userId']);
        const unassigned = SELECT.distinct.from('Employee').columns(['userId']).where({ userId: { 'NOT IN': members } });

        // Get the unassigned employees for deletion
        let deleted = await cds.tx(req).run(unassigned);

        // Make sure result is an array
        deleted = (deleted.length === undefined) ? [deleted] : deleted;

        // Clean-up Employees
        for (var i = 0; i < deleted.length; i++) {
            const clean_up = DELETE.from('Employee').where({ userId: { '=': deleted[i].userId } });
            await cds.tx(req).run(clean_up);
        }
        return data;
    } catch (err) {
        req.error(err.code, err.message);
    }
}

// Before "save" project (exclusive for Fiori Draft support)
async function beforeSaveProject(req) {
    try {
        if (req.data.team) {
            // Capture IDs and users from saved members
            let users = []
            req.data.team.forEach(member => { users.push({ ID: member.ID, member_userId: member.member_userId }); });

            // Get current members
            let members = await cds.tx(req).run(SELECT.from('Member').columns(['ID', 'member_userId']).where({ parent_ID: { '=': req.data.ID } }));
            if (members) {
                // Make sure result is an array
                members = (members.length === undefined) ? [members] : members;

                // Process deleted members
                const deleted = [];
                members.forEach(member => {
                    const element = users.find(user => user.ID === member.ID);
                    if (!element) deleted.push(member);
                });
                for (var i = 0; i < deleted.length; i++) {
                    // Delete members' activities
                    await cds.tx(req).run(DELETE.from('Activity').where({ assignedTo_ID: { '=': deleted[i].ID } }));
                    if (req.data.activities) {
                        let idx = 0;
                        do {
                            idx = req.data.activities.findIndex(activity => activity.assignedTo_ID === deleted[i].ID);
                            if (idx > -1) {
                                req.data.activities.splice(idx, 1);
                            }
                        } while (idx > -1)
                    }
                }

                // Process added members
                const added = [];
                users.forEach(user => {
                    const element = members.find(member => user.ID === member.ID);
                    if (!element) added.push(user);
                });
                for (var i = 0; i < added.length; i++) {
                    executeCreateEmployee(req, added[i].member_userId);
                }

                // Process updated members
                const updated = [];
                users.forEach(user => {
                    const element = members.find(member => user.ID === member.ID);
                    if (element) updated.push(user);
                });
                for (var i = 0; i < updated.length; i++) {
                    await executeUpdateEmployee(req, 'Member', updated[i].ID, updated[i].member_userId);
                }
            }
        }
        return req;
    } catch (err) {
        req.error(err.code, err.message);
    }
}

// After "save" project (exclusive for Fiori Draft support)
async function afterSaveProject(data, req) {
    try {
        if (data.team) {
            // Look for members that don't have an assignment yet
            let unassigned = await cds.tx(req).run(SELECT.from('Member').columns(['ID', 'member_userId']).where({ parent_ID: { '=': data.ID }, and: { hasAssignment: { '=': false } } }));
            if (unassigned) {
                // Make sure result is an array
                unassigned = (unassigned.length === undefined) ? [unassigned] : unassigned;

                // Create SFSF assignment
                for (var i = 0; i < unassigned.length; i++) {
                    await createAssignment(req, 'Member', unassigned[i].ID, unassigned[i].member_userId);
                }
            }
        }
        deleteUnassignedEmployees(data, req);

        return data;
    } catch (err) {
        req.error(err.code, err.message);
    }
}

module.exports = {
    readSFSF_User,
    createEmployee,
    createItem,
    updateEmployee,
    deleteChildren,
    deleteUnassignedEmployees,
    beforeSaveProject,
    afterSaveProject
}

We just added four helper functions: two for employee creation/update, one for the special assignment creation in SAP SuccessFactors, and one for cascade deletion of child entities.

Then, we added the required handlers for the before and after events.

The code logic is well explained in the detailed comments.
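The core of beforeSaveProject is a set comparison between the members being saved in the draft and the members currently persisted. Stripped of all database calls, that diffing can be sketched as below (diffMembers is a hypothetical name; the handler inlines the same logic):

```javascript
// Standalone sketch of the member diffing performed in beforeSaveProject
function diffMembers(saved, current) {
  // In "current" but not in "saved": the member was removed in the draft
  const deleted = current.filter(m => !saved.find(u => u.ID === m.ID));
  // In "saved" but not in "current": the member was newly added
  const added = saved.filter(u => !current.find(m => m.ID === u.ID));
  // In both: the member may have been reassigned and needs an update check
  const updated = saved.filter(u => current.find(m => m.ID === u.ID));
  return { deleted, added, updated };
}

// Invented sample data: member 2 was removed, member 3 was added, member 1 kept
const saved = [{ ID: 1, member_userId: 'anna' }, { ID: 3, member_userId: 'carl' }];
const current = [{ ID: 1, member_userId: 'anna' }, { ID: 2, member_userId: 'bob' }];
const diff = diffMembers(saved, current);
console.log(diff.deleted.map(m => m.ID), diff.added.map(m => m.ID), diff.updated.map(m => m.ID));
// [ 2 ] [ 3 ] [ 1 ]
```

The handler then deletes the activities of removed members, creates Employee records for added ones, and re-checks the assignment of the remaining ones.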

2. Open the srv/projman-service.js file, then copy and paste the following code over the current content:

const cds = require('@sap/cds');
const {
    readSFSF_User,
    createEmployee,
    updateEmployee,
    createItem,
    deleteChildren,
    deleteUnassignedEmployees,
    beforeSaveProject,
    afterSaveProject
} = require('./lib/handlers');

module.exports = cds.service.impl(async function () {
    /*** SERVICE ENTITIES ***/
    const {
        Project,
        Member,
        SFSF_User,
    } = this.entities;

    /*** HANDLERS REGISTRATION ***/
    // ON events
    this.on('READ', SFSF_User, readSFSF_User);

    // BEFORE events
    this.before('CREATE', Member, createEmployee);
    this.before('UPDATE', Member, updateEmployee);
    this.before('DELETE', Project, deleteChildren);
    this.before('DELETE', Member, deleteChildren);
    this.before('SAVE', Project, beforeSaveProject); // Fiori Draft support

    // AFTER events
    this.after('CREATE', Member, createItem);
    this.after('UPDATE', Member, deleteUnassignedEmployees);
    this.after('DELETE', Project, deleteUnassignedEmployees);
    this.after('DELETE', Member, deleteUnassignedEmployees);
    this.after('SAVE', Project, afterSaveProject); // Fiori Draft support
});

Here we just attached the handler functions to their corresponding events.

And, with that, we completed the coding of the business logic for our application.


Exploring SAP Business Application Studio for low-code development and deploying a multichannel sample app to mobile https://www.erpqna.com/exploring-sap-business-application-studio-for-low-code-development-and-deploying-a-multichannel-sample-app-to-mobile/ Tue, 04 Jan 2022 11:27:12 +0000

In this blog post I want to showcase the new offering in a hands-on fashion, making use of a multichannel sample application that will be deployed into the cloud as well as on mobile.

Prerequisites

To get started, you’ll need a productive Cloud Foundry global account on SAP Business Technology Platform, and for the time being, this account must be located in the EU10 (AWS) Data Center in Frankfurt, Germany. Other Data Centers are currently not supported. In the coming weeks we will be adding support for Free-Tier as well. Note that we will not provide this offering on the trial landscape.

In the steps below, I assume you have some exposure to the global account and subaccount cockpit pages and know how to add service entitlements. Please also be aware of potential cost aspects, as we will be using charged services (e.g. HANA Cloud database).

Low-code Booster

To make it easier to configure your subaccount, we provide a Booster that helps you subscribe to the necessary services and defines the roles and role collections required for starting your development journey. You’ll have the option to create a brand new subaccount, or apply the Booster to an existing subaccount. In your global account cockpit, please select Boosters and start the Booster called Prepare an Account for Low-Code / No-Code App Development.

The Booster provides a quick overview, lists the components involved and links to documentation. Select Start.

On the next page, the prerequisites are checked. If your global account does not have the entitlements required, it will alert you about this. You can add the entitlements through the Control Center.

On the next page, select whether you want to create a new subaccount, or select an existing subaccount to configure.

In my case, I choose to Create Subaccount. You will see exactly which entitlements and plans are going to be added, and you can change the generated names if you want.

Go through the remaining pages and the Booster will create the subaccount specified. At the end of the creation phase, a dialog will be shown. Select Go to Application. This will open the Application Development page in a new browser tab. You can now start your low-code / no-code development. However, for deployment of your application, you will need to prepare more on your subaccount. Therefore, please go back to the browser tab where you’ve started the Booster and close the dialog.

Note: if you quickly want to get started developing apps, you can skip the below set up steps. However, please remember to come back here once you are ready to deploy your application.

Workflow Management Booster

The low-code environment also supports the development of workflows. For this you will need to configure your environment with the Workflow Management Booster. In your global account, select Boosters and search for Set up account for Workflow Management.

Select the subaccount you’ve created or updated with the previous Booster.

Once this Booster completes successfully, select Navigate to Subaccount.

HANA Cloud Database

For previewing your application, we will use a ‘locally’ running SQLite database available within the SAP Business Application Studio environment. However, once you are going to deploy your application, you will need a HANA Cloud database to store your application’s data. If you already have a subaccount set up with a HANA Cloud database instance, then you can use this subaccount as deployment target for your applications as well. However, this really depends on how your organisation manages the cloud environments.

In my case, I am going to use the subaccount created earlier as target environment as well. I therefore need to add the HANA Cloud database to this subaccount, using the following steps.

Assign the entitlement SAP HANA Cloud to the subaccount.

And also add the entitlement SAP HANA Schemas & HDI Containers.

In your subaccount’s cockpit, go to the Service Marketplace and search for HANA. Select SAP HANA Cloud and click Create.

In the following dialog, you will not be able to select Create. Instead, click the link to manage SAP HANA Cloud instances.

This will bring you to another configuration page in the Space you’ve selected. Select Create, and SAP HANA database.

This will bring you to the SAP HANA Cloud Central. Once loaded, select the type of instance you need. In this case, select SAP HANA Cloud, SAP HANA Database.

Ensure the correct Cloud Foundry organisation is selected, provide the instance name and admin credentials. Then, click Create Now.

This will bring you to a page showing all database instances and their status. Take note that the creation will take time (typically more than 10 minutes). This page will not update the status automatically. Hit the refresh button to check the status.

Once the creation is done, the newly created database will be in RUNNING state.

Cloud Foundry Runtime

For the initial development steps, you won’t need this, but in order to deploy and run processes on the SAP Business Technology Platform, you will need to configure Cloud Foundry Runtime (memory). If you had to set up a subaccount for deployment in the previous step, then you will also need to add the runtime here.

Depending on the size of your application, you might need to configure more runtime units. In my case, I have added 2 units (equivalent to 2 GB) to my subaccount.

SAP Mobile Services

The last service we need to add is SAP Mobile Services. This service requires a separate license. In your subaccount’s cockpit, add the Mobile Services entitlement. We need this service in order to deploy the mobile application that is part of the multichannel sample app demonstrated here.

Application Development

Once the subscriptions are in place on your subaccount and the roles are configured for your user, you can access the Application Development page. This is the central entry point for all your low-code and no-code development. From here you are able to create projects in SAP AppGyver as well as SAP Business Application Studio for low-code development.

The top of the page shows a few quick start links that allow you to explore what is possible. For this blog post, we will be using the link to create an e-commerce application that we have published on GitHub. All the code is available for you to explore and modify for your learning. Select the tile Create a sample E-commerce Application.

A dialog appears for creating a Business Application Project, where you provide a project name and a short description that will be visible in the project overview on the Application Development page.

Once you hit the Create button, a special instance of SAP Business Application Studio opens:

SAP Business Application Studio for low-code development

Many of you will already be familiar with Business Application Studio. However, to make it easier for citizen developers to get started, we have created a simplified perspective that is able to gradually grow into a more advanced environment as the developer learns and grows into it. The first time you start your development environment, it takes a few minutes to load while the container is created and spun up.

Once fully loaded, you will see a tab called Guided Development, which introduces you to the basic steps in developing your first app. However, since we are opening a sample e-commerce app, we can skip this for now and select the Home tab instead. This will bring you to the Project Homepage.

Project Home Page

The Project Homepage provides a quick overview of the project’s contents. The ESPM project contains a full-stack application comprising a data model, data services, sample data and a user interface application. The parts can be shown in a default tile view, but also in a list view. From here, you are also able to navigate to the Storyboard.

Data Model Editor

Selecting one of the data entities will open the CDS Graphical Modeler in a low-code mode. Those who have read my blog posts covering Mobile Backend Tools will notice that the editor is looking quite similar. Indeed, this editor is based on the very same core and is capable of handling CAP data models and service definitions.

Besides the data model that defines the database tables and relationships, the editor can also be used to create data services to expose this data through OData endpoints. And the code to enable this in your application will be completely generated; no coding required.

Sample Data Editor

Exposing an empty database through data services is not going to help you develop a user interface on top of it. You need some sample data to get started. For this, we have added a sample data editor. Using this editor you can manually add sample data, or let the editor generate it for you. The editor will validate that the content matches the data types. The resulting data is stored in csv files that are used by CAP.
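CAP picks up these CSV files by convention: one file per entity, placed in the project's data folder and named after the fully qualified entity name. As an illustrative sketch (the entity name and columns here are hypothetical, not taken from the ESPM project), a file such as `db/data/my.namespace-Products.csv` could look like:

```csv
ID;name;price;currency
1;Notebook Basic 15;956.00;EUR
2;ITelO Vault;299.00;EUR
```

CAP accepts either `;` or `,` as the delimiter, and the header row must match the entity's element names.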

Project Tree

The standard Business Application Studio chrome shows several icons for the various built-in features. For low-code development, we have reduced this to only a few icons in the left-hand pane. One of them is called the Project Tree, which you can open to browse a semantic overview of the project’s contents. The Data Models and Services section shows the data entities and service entities. The Mobile-Centric, Freestyle Application section shows the building blocks of the MDK-based UI application. From here you can explore and modify the application’s pages, actions and rules, as some of you already know from MDK development. When you select the Overview page, it will open the MDK editor as shown below.

Preview

After exploring the project, let’s have a quick look at the resulting application. We can do this without any deployment step. On the Project Home page, click the Preview button. This will trigger the installation of some dependent modules, and generate / update some of the project’s files. We will be using a locally running (that is, locally within the cloud development environment) SQLite database to temporarily store data. The user interface is served from within the development environment as well.
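Under the hood, CAP's development profile typically wires the preview up to SQLite through the project configuration. A minimal sketch of such a configuration is shown below; the exact settings generated by the low-code tools may differ:

```json
{
  "cds": {
    "requires": {
      "db": {
        "kind": "sqlite",
        "credentials": { "database": ":memory:" }
      }
    }
  }
}
```

With an in-memory database like this, the sample data is loaded fresh on every start, which is why changes made in Preview are not persisted.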

Once the preview is generated, a launchpad is opened, showing the application’s tile as it would be available in a Launchpad. Besides that, you can explore the data services exposed in this application. You can view the data in ‘raw’ JSON format, or consumed through a simple Fiori Elements application, allowing you to filter the information you need.

When you select the ESPM tile, the MDK based web application will be loaded. You will be able to browse the (sample) data and add or modify the data using the application. However, do take note that data is only stored in a temporary database while in Preview.

Deployment

Once we are done with developing the application, it is time to deploy it to Cloud Foundry. For this, go to the Project Home page and hit the Deploy button. If you have not logged into your target subaccount yet, you will be prompted to do this first. In my case, I will use the same subaccount as created for development.

If the deployment fails, you are advised to open the log file available in the user folder and check the reason for the failure. Make sure to complete all prerequisites mentioned in the setup.

Once the application is deployed, you will find 2 HTML5 applications in your subaccount’s cockpit: ESPMLaunchpad, which is the same Launchpad as available in Preview, and ESPM, which is the actual UI application.

In this case the application’s data is stored in the HANA Cloud database and changes or additions made through the UI apps are persisted.

Deployment to Mobile Services

What we have running until now, is basically a web application. Using the exact same source code, we can turn this into a native mobile app for iOS and Android.

In the Project Tree, right click on the Application in Mobile-Centric, Freestyle Application. In the context menu, select Deploy to Mobile Services. This will create an application definition in SAP Mobile Services. It also triggers a build that generates a metadata bundle which is uploaded into SAP Mobile Services. From there SAP Mobile Services will take care of the distribution of the app to mobile devices, and act as a proxy to interface between your mobile devices and your OData services.

While waiting for the build and deployment to finish, let’s make sure you have the required SAP Mobile Services Client app on your mobile device. Please visit the Apple AppStore or Google Play store and download the app onto your device.

Once the deployment is finished, you will see the below message, with the revision number. Every time you deploy and update, this revision number will automatically increase.

Open the SAP Mobile Services App on your device.

Using the mobile app, scan the deployed application’s QR code. Where do you find the QR code? In the Project Tree, left-click on the Application to open the Application Editor. You will notice there is a button to show a QR code for onboarding with the mobile client app.

Scan the QR code shown.

Enter your SAP account’s credentials on your device and you will see the e-commerce app as native app on your device. The app is responsive and will adapt its layout depending on your device’s orientation. It supports phones and tablets as well.

Rapid iterations

Now that you have gone through the whole flow and have the mobile app running on your device, you can easily make changes to the app and deploy the updates to Mobile Services. Once the SAP Mobile Services Client app detects a revision change, it will automatically ask whether you want to update the mobile app to the latest revision. The mobile app can also be shared with other users for easy validation.


The post Exploring SAP Business Application Studio for low-code development and deploying a multichannel sample app to mobile appeared first on ERP Q&A.

]]>
Create SAP CAP service with Low-Code No Code (LCNC) https://www.erpqna.com/create-sap-cap-service-with-low-code-no-code-lcnc/?utm_source=rss&utm_medium=rss&utm_campaign=create-sap-cap-service-with-low-code-no-code-lcnc Tue, 07 Dec 2021 10:27:15 +0000 https://www.erpqna.com/?p=57477 In this blog, how to create SAP CAP service using Low-Code No Code from the TechEd 2021 session. Pre-requisites SAP BTP environment with Low-Code/No-Code Entitlements. SAP AppGyver Step 1: Open Development Lobby, Click on Create and choose Business Application and provide the project name and click on Create. Provide the project details as below Step […]

The post Create SAP CAP service with Low-Code No Code (LCNC) appeared first on ERP Q&A.

]]>
In this blog, I will show how to create an SAP CAP service using Low-Code No Code, based on the TechEd 2021 session.

Pre-requisites

  1. SAP BTP environment with Low-Code/No-Code Entitlements.
  2. SAP AppGyver

Step 1: Open the Development Lobby, click on Create, choose Business Application, provide the project name and click on Create.

Provide the project details as below

Step 2: The project will open in a separate window as below. Here we can see that the LCNC platform provides the capability to add a Data Model, Service, Sample Data, UI and Workflow to our project directly.

Step 3: Now let us create a data model by clicking on the + icon under Data Models.

Step 4: It will launch the Data Model Editor with a dialog to provide entity details. Provide the entity details for Product as below and click on Create.

We can see our entity in the data model editor. Follow the same steps to add the other entities by choosing Add Entity under the + sign.

Supplier Entity

Customer Entity

Order Entity

Step 5: Now we need to create the relationships (associations) between these entities. Click on the Customer entity; an overlay will appear on the right side. Click on the relationship icon (the 3rd icon from the top); a new dialog will open. Provide the details as below.

Follow the same steps for Product and Supplier.
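Behind the graphical editor, these entities and relationships are persisted as CDS definitions in the project. A simplified sketch of what the generated model could look like is shown below; the namespace, field names and types are illustrative, and the generated file will reflect the details you entered in the dialogs:

```cds
namespace my.lcnc;

entity Customers {
  key ID  : UUID;
  name    : String(100);
  orders  : Association to many Orders on orders.customer = $self;
}

entity Suppliers {
  key ID  : UUID;
  name    : String(100);
}

entity Products {
  key ID   : UUID;
  name     : String(100);
  price    : Decimal(9,2);
  supplier : Association to Suppliers;
}

entity Orders {
  key ID   : UUID;
  quantity : Integer;
  customer : Association to Customers;
  product  : Association to Products;
}
```

Note that the relationship dialog in Step 5 is effectively creating the `Association to` elements shown here.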

Step 6: Now let us create a service. Go to the home screen and click on the + icon under Service.

It will launch the Service Editor with the dialog box opened as below. Provide the required details and click on Create.

Also check the Draft Editing checkbox if you want to enable draft functionality.

Repeat the same for the other services.
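In CDS terms, the service editor generates a projection over the data model entities, and checking Draft Editing corresponds to the `@odata.draft.enabled` annotation. A hedged sketch of such a generated service definition (the service name, import path and entity names here are illustrative):

```cds
using my.lcnc from '../db/schema';

service CatalogService {
  @odata.draft.enabled
  entity Customers as projection on my.lcnc.Customers;

  entity Products  as projection on my.lcnc.Products;
  entity Suppliers as projection on my.lcnc.Suppliers;
  entity Orders    as projection on my.lcnc.Orders;
}
```

CAP serves each such service as an OData endpoint automatically; no further coding is required.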

Step 7: Let’s add some test data by clicking on the + icon under Sample Data.

The dialog box below will open. Choose Create, select the entity from the dropdown and click on Create.

It will launch the screen as below. Click on Add Row on the top right and start entering the details as below. Data will be saved automatically; if not, use Ctrl+S.

Customer

Repeat the same for the others, in the sequence Product, Supplier and Order.

Order

Supplier

Step 8: Let us run the service now. Go to the home screen and click on the Preview button in the top right corner. After a few seconds, the server will start; we can see that under Task Preview. Click on the link shown under Task Preview.

Step 9: The service will be launched in a new window as below. We have multiple actions, like metadata, service details and data, which are highlighted.

Step 10: Let us add a UI to our project. Click on the + icon under User Interface.

Enter the UI details as below

Step 11: Now, let us check the code generated so far based on the activities we performed.

Step 12: Let us create a new service, which we will use later in the next blog.

Go to the Home tab (if closed already, press Ctrl+P and search for Home).

From the Home tab ==> Go to the Service Editor ==> Click on New Service at the top

Now click on Add Service and add the details as below

Step 13: To use this service through AppGyver, we need CORS support. Open the terminal and execute the command below:

npm i cors

We also need to add a new file, server.js, in the srv folder with the code below:

"use strict";
const cds = require("@sap/cds");
const cors = require("cors");

// Register the CORS middleware on the underlying Express app
// before the CDS services are mounted
cds.on("bootstrap", app => app.use(cors()));

// Delegate everything else to the default CDS server implementation
module.exports = cds.server;
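To see what the cors() middleware effectively does for our AppGyver client, here is a minimal dependency-free stand-in. This is a simplified illustration of the package's default behavior (it only covers the Access-Control-Allow-Origin header, not preflight handling), exercised with stub req/res objects so no server is needed:

```javascript
// Simplified stand-in for the default cors() middleware: it adds an
// Access-Control-Allow-Origin header to every response, which is what
// a browser-based client like AppGyver needs to call the service.
function simpleCors(req, res, next) {
  res.setHeader("Access-Control-Allow-Origin", "*");
  next();
}

// Exercise the middleware with stub objects (no HTTP server required)
const headers = {};
const res = { setHeader: (key, value) => { headers[key] = value; } };
let nextCalled = false;

simpleCors({}, res, () => { nextCalled = true; });

console.log(headers["Access-Control-Allow-Origin"]); // "*"
console.log(nextCalled);                             // true
```

The real cors() package additionally answers OPTIONS preflight requests and lets you restrict allowed origins, which is advisable before going to production.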

Step 14: Now we will deploy our project to SAP BTP. Open the Home tab; on the top right we can see an option to deploy. Click on the Deploy button. If we are not logged in already, it will prompt for CF credentials, and the project will get deployed.

Step 15: Once the app is deployed, we need to assign the default role to our user ID so that we can access the service.

Go to the SAP BTP cockpit ==> CF subaccount ==> Users ==> Search for our user ==> Add Role Collection ==> Search for the project name and add that role to the user ID.

Go to HTML5 Applications ==> Search for the project name; we should be able to see our project.



]]>