SAPUI5 - ERP Q&A (https://www.erpqna.com)

Upload attachments in a Freestyle UI5 and CAP app
https://www.erpqna.com/upload-attachments-in-a-freestyle-ui5-and-cap-app/ (22 Jan 2024)

Introduction

Uploading files in UI5 requires a bit more effort compared to adding an input field on a view. Depending on the backend, some specific configuration is needed. In CAP, for example, we need to create an entry for each file before we can start uploading.

Sometimes the business scenario requires different flows for uploading files, e.g. when creating new data objects in the backend, you need to create that data object first before you can upload a file that’s related to it. The upload needs the ID of the data object to know where to upload to.

Therefore, I’m sharing one of the most common cases of uploading files with freestyle UI5 and CAP without draft. Why am I referring to draft? If draft were enabled, you could upload directly to the draft entry. Without draft, you cannot upload without having the ID of the created entry, so you have to save the data before you can trigger the upload.

As an example, I created a small CAP project with one Book entity and one Attachment entity. The two entities have a relationship that connects attachments to books. On top of that CAP project, I have a UI5 app running for creating a Book with an attachment in one go.

Data model – Backend (CAP)

The data model is quite simple for this demo project, just two entities:

  • Books entity for storing books, with a relationship to Attachments
  • Attachments entity for storing attachments; it has a relationship to Books and is the media entity that files are uploaded to.

The relationship will allow us to upload attachments that are related to a book.

OData config in UI (UI5)

Let’s start at the beginning by configuring the OData model in the UI5 app so that we have full control over when changes are submitted. Therefore, we configure the model accordingly and use a dedicated group ID as the updateGroupId:
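Purely as an illustration of the same configuration, here is a minimal sketch of creating the OData V4 model in code with a dedicated update group; the service URL and the group name "bookcreate" are assumptions based on the rest of this article, and in a real app this would normally live in the manifest model settings instead:

sap.ui.define(["sap/ui/model/odata/v4/ODataModel"], function (ODataModel) {
    "use strict";

    // Sketch: changes are collected in the "bookcreate" update group and are only
    // sent to the backend when submitBatch("bookcreate") is called explicitly
    return new ODataModel({
        serviceUrl: "/odata/v4/catalog/",   // assumption: path of the CAP service
        synchronizationMode: "None",        // required in older UI5 versions, deprecated in newer ones
        autoExpandSelect: true,
        updateGroupId: "bookcreate"
    });
});

Whether this lives in manifest.json or in code, the important part is that nothing is written to the backend until submitBatch is called with this group ID.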

UploadSet – View (UI5)

Next, we add the UploadSet to the view and bind it to the Attachments entity. The binding is not strictly needed for creating/uploading files; it is rather there for displaying the attachments. The leading slash is omitted because we use the association for showing the attachments: we only want to show the attachments of the book that is shown on the detail page.

A few properties need to be changed to make it work with CAP without using drafts:

  • instantUpload: First of all, “instantUpload” needs to be disabled. By disabling this, the file is not immediately uploaded when you select it from your computer. This should only happen after the main Book entry is created. Without the Book entry, we have no book ID and can’t upload it as an attachment for that book.
  • httpRequestMethod: The property “httpRequestMethod” needs to be adapted to the value “Put” because the real upload to CAP will be done during a PUT operation and not during POST.

All the other properties can be configured as you want:

Initialization entry – Controller (UI5)

That’s it for the view; all the other magic happens in the controller, starting in the onRouteMatched function. It creates a new binding context for the book using the same group ID as defined in the manifest. The binding context is connected to the view to capture all input fields from the view when saving to the backend.

I also create a binding context for creating attachments when saving a book. We could also create this on the fly; it is just an example of how a keep-alive context can be used.
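As a rough sketch (the entity names Books and Attachments and the group ID "bookcreate" come from this article; everything else is an assumption, and the keep-alive aspect is left out), the onRouteMatched logic could look like this:

onRouteMatched: function () {
    const oModel = this.getView().getModel();

    // List binding whose changes are parked in the "bookcreate" update group
    this._oBooksBinding = oModel.bindList("/Books", undefined, undefined, undefined, {
        $$updateGroupId: "bookcreate"
    });

    // create() returns a transient context; nothing is sent until submitBatch("bookcreate")
    const oBookContext = this._oBooksBinding.create({});

    // Connect the context to the view so the input fields write into the new entry
    this.getView().setBindingContext(oBookContext);

    // Separate binding used later to create the Attachment entries
    this._oAttachmentsBinding = oModel.bindList("/Attachments", undefined, undefined, undefined, {
        $$updateGroupId: "bookcreate"
    });
}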

Save book and upload attachment – Controller

The “onSave” function, which is called when the user clicks the Create button in the view, does the following:

  • Trigger a submitBatch using the group ID “bookcreate”. This creates an entry for the book with the values bound to the view.
  • If successful, the function “createAttachmentEntry” is triggered for each newly added attachment of the UploadSet; it creates an entry for each file in the Attachments entity (just a create, not yet an upload).
  • A second submitBatch is triggered for the group ID “bookcreate” to send the created attachment entries to the backend. In this case, it contains only the requests that create attachment entries.
  • After the second submitBatch, the promises returned by “createAttachmentEntry” resolve and the upload of the files starts through a PUT operation.
  • The upload success message is shown once the upload is completed, in the “onUploadCompleted” event handler. If no files are selected, this handler is never reached; therefore, we call the success function directly when no files are selected.

The order of the code does not follow the logical sequence, because the upload of the attachments (4) waits for the creation of the Attachment entries (2), which in turn wait for a submitBatch (3) to be resolved.
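A condensed sketch of this flow (the UploadSet ID, the showSuccess helper and the _iExpectedUploads counter are made up for illustration; the group ID "bookcreate" comes from the article):

onSave: async function () {
    const oModel = this.getView().getModel();
    const oUploadSet = this.byId("uploadSet");            // assumption: ID of the UploadSet control
    const aItems = oUploadSet.getIncompleteItems();       // files selected but not yet uploaded

    // (1) First submitBatch: creates the Book from the values bound to the view
    await oModel.submitBatch("bookcreate");

    if (aItems.length === 0) {
        // (5) No files selected: onUploadCompleted never fires, so finish here
        return this.showSuccess();
    }
    this._iExpectedUploads = aItems.length;                // evaluated later in onUploadCompleted

    // (2) Queue one Attachment entry per file (create only, no upload yet)
    const aPending = aItems.map((oItem) => this.createAttachmentEntry(oItem));

    // (3) Second submitBatch: sends the queued Attachment creates to the backend
    await oModel.submitBatch("bookcreate");

    // (4) Each promise resolves once its entry exists and the PUT upload has been started
    await Promise.all(aPending);
}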

The “createAttachmentEntry” function creates an entry for each file in the Attachments entity using the book ID. The book is saved first to obtain this ID; otherwise we wouldn’t be able to create the attachment for it. The function does the following:

  • Create an entry for the attachment with the book ID (which is created with the first submitBatch) and the filename. (It uses the binding created in the onRouteMatched function.)
  • Wait for the attachment to be created. The create itself is triggered with the second submitBatch in the “onSave” function (3).
  • Once created, it sets the upload URL with the ID of the created attachment.

(The check for localhost is just to make the upload work in my local IDE; this would normally be handled by the UI5 tooling. As I’m using the cds plugin for running UI5, I’m not able to configure this in the UI5 tooling.)
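A sketch of createAttachmentEntry along these lines (the property names book_ID and filename, the content URL pattern and the control ID are assumptions about the CAP model, and the localhost handling mentioned above is omitted):

createAttachmentEntry: function (oItem) {
    const oBookContext = this.getView().getBindingContext();

    // Create the Attachment entry with the book ID and file name;
    // it is only sent with the next submitBatch("bookcreate")
    const oAttachmentContext = this._oAttachmentsBinding.create({
        book_ID: oBookContext.getProperty("ID"),
        filename: oItem.getFileName()
    });

    // Resolves once the entry exists in the backend
    return oAttachmentContext.created().then(() => {
        // Point the item at the media URL of the created entry and start the PUT upload
        const sId = oAttachmentContext.getProperty("ID");
        oItem.setUploadUrl(`/odata/v4/catalog/Attachments(${sId})/content`);
        this.byId("uploadSet").uploadItem(oItem);
    });
}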

In the “onUploadCompleted” function we call the success function. Once this function is reached, we are sure that the creation of the book and the uploads are completed. It won’t be reached when no files are uploaded; that’s why we also need to call the success function in “onSave” (5) when no attachments are uploaded.

The onUploadCompleted handler is called for every attachment, so we need to check whether all attachments have been uploaded to avoid showing multiple success messages:
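For example (the counter and the showSuccess helper are illustrative, matching the onSave sketch above):

onUploadCompleted: function () {
    // _iExpectedUploads was set in onSave to the number of selected files
    this._iUploaded = (this._iUploaded || 0) + 1;

    if (this._iUploaded === this._iExpectedUploads) {
        this._iUploaded = 0;
        this.showSuccess();   // message and refresh of the book list, as described below
    }
}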

The success function shows a message that the book has been created and refreshes the list of books.

Upload finished

And that’s how to use the UploadSet in a UI5 freestyle app using TypeScript on top of CAP.

Developing a CAP Application in VSCode using CSV data
https://www.erpqna.com/developing-a-cap-application-in-vscode-using-csv-data/ (5 Dec 2023)

Pre-requisite:

1. Install Node.js

2. Install CDS Tools for CAP Development ( npm i -g @sap/cds-dk)

Step 1: Go to View tab and Click on Command Palette.

Step 2: Search for Open Template Wizard

Step 3: Select CAP Project and Click on Start.

Step 4: Provide your CAP Application name and runtime as Node.js and click Finish.

The above image shows the skeleton generated for my CAP application, containing the “app”, “db” and “srv” folders.

Step 5: Right click on “db” and select new file.

Step 6: Provide your new file name with “.cds” extension.

In my case it is “datamodels.cds”

Step 7: Provide name space. In my case it is “yournamespace”

Step 8: Provide your table name. In my case it is “yourtablename”

Step 9: Now provide the fields with their type and length. Make sure to maintain at least one key field as shown below. The field names must match the columns of your Excel file.

Step 10: Now right click on your “db” and select new folder.

Provide new folder name as “csv”

Step 11: In that csv folder, create a file following the naming convention “your_namespace-your_tablename.csv”. In my case it is “yournamespace-yourtablename.csv”. So far, my application skeleton looks like below.

Step 12: Now copy your Excel file data, including the field names taken from the Excel file, and paste it into the .csv file you have created, as shown below.

Step 13: Now open your application terminal by following the screen shot above.

Step 14: And run the following Command “npm install”.

Step 15: Right click on “srv” folder and select new file.

Step 16: Keep the name of the new file created as “service.cds”

Now write the code accordingly as shown below. Give your entity name as you like.

In my case it is “yourentityname”.

Step 17: Now run the following command to deploy to sqlite database.

Step 18: Run the following command “cds watch” .

Step 19: Now click on Follow link to see the output.

Step 20: Now select Fiori preview.

Step 21: Now select the fields you want to display on your screen.

Step 22: Next, we link our CAP app with a Fiori elements application to display the data in a List Report.

Step 23: Now repeat Step 1 and Step 2 and Select SAP Fiori Application and click start.

Step 24: Select SAP Fiori and select List Report page and select Next.

Step 25: Follow the above image

Step 26: Choose your cap project to link up with and select your service as shown above.

Step 27: Select your entity from drop down and select “Yes” radio button then Next.

Step 28: Now provide your Fiori application details as shown above and click finish.

The skeleton of my fiori app looks like below.

Now, to terminate the initial cds watch run, press “Ctrl+C”.

You should then see the screen above; run the cds watch command again to see the output in a List Report application.

After running “cds watch” .

Again follow step 19.

Step 29: This time select your web application as shown below.

Below is the output of our list Report containing the data we provided.

Step 30: Click on a particular record it will navigate to object page which looks like below.

Deploying Fiori/UI5 Projects to S/4HANA On-Premise ABAP server via VSCode
https://www.erpqna.com/deploying-fiori-ui5-projects-to-s-4hana-on-premise-abap-server-via-vscode/ (23 Aug 2023)

In this blog post, I’d like to guide you step by step through the process of deploying a Fiori/UI5 application to the S/4HANA on-premise ABAP repository via Visual Studio Code (VS Code).

Previously, we deployed our projects via SE38 with the program /UI5/UI5_REPOSITORY_LOAD.


We would select the Upload checkbox, provide the name, package and description of our application, and upload it to the SAP server.


Now we are using Visual Studio Code (VS Code), and we have another way to deploy our projects to the ABAP server.

First, we open a new terminal.


Select your project and run npm run build.


npm run build

VS Code creates a dist folder for us in the project root.


We set up the uploader for our project with the command “npm install nwabap-ui5uploader --save-dev”.

npm install nwabap-ui5uploader --save-dev

Now we need to create a .nwabaprc file in the root folder with the following properties:


{
    "base": "./dist",
    "conn_usestrictssl" : false,
    "conn_server": "http://<hostname>:<port>/",
    "conn_client" : "client",
    "conn_user": "UserName",
    "conn_password": "Password",
    "abap_package": "$TMP",
    "abap_bsp": "ZDEMO_APP",
    "abap_bsp_text": "UI5 Deployment Demo"
}

Then we add (or change) the “deploy”: “npx nwabap upload” script in the package.json file.


"deploy": "npx nwabap upload",

As a last step, we run the command “npm run deploy” in the terminal.

npm run deploy

Now you can go to the SAP GUI and check your application with transaction SE80.

Filter in Function import using oData and UI5
https://www.erpqna.com/filter-in-function-import-using-odata-and-ui5/ (8 Jun 2023)

I am writing this blog for beginners on how to filter data using a function import in SAPUI5 with OData, which will be helpful for both front-end and back-end developers.

Function imports are used to perform GET and POST operations for requirements that are not possible with the standard operations available in OData.

Sometimes we may want to filter the data based on a name or a city (a non-key value); in that case we can use a function import.

We will implement the function import code inside EXECUTE_ACTION method by redefining this method in the DPC Extension class.

The steps to create a function import are as follows.

First we will create the function import in the OData service.

Step 1:- Create a table using transaction SE11.


Creating a table: enter the required information.

Here we create the fields and assign a data element to each.

Click on the data element of the Phone field.

Click Yes to create the data element.

Here you have to give a domain name.

Here, give the data type and the number of characters.

Then activate it.

Table

Next we have to set the enhancement category:

Click on More > Extras > Enhancement Category.

Choose the enhancement category and select the data class.

Now activate the table.

After this, enter data into the table:

More > Utilities > Table Contents > Create Entries.

Table entries

Step 2:- Go to SEGW and create a new project.

Give the project a name and save it in a local object or a package.

The project is created and will now look like this.

Step 3:- Right click on Data Model, select Import and then select DDIC Structure.

Give a name and the table name and click Next.

Select the table name and click Next.

Mark the key field (“Is Key”) and click Finish.

Now generate the runtime artifacts.

runtime artifacts generated

Step 4:- Right click on Data Model, select Create and then click on Function Import.

Give Function import name and click on continue.

Fill in the details as shown. The project now looks like this.

Click on function import folder and fill the following.

Give the parameter the name of the field you want to search by.

After this, generate runtime artifacts.

Step 5:- Click on DPC_EXT and redefine EXECUTE_ACTION.

Write this code in execute_action method.

Come back to segw.

Click on Service Maintenance and register the service. Choose Local, select Continue and then select the local package.

After this, Click on SAP Gateway client and select yes.

Step 6:- Enter the function import name followed by the parameter value for which you want to display the data.

Here I have given Designation=’DEVELOPER’.

So only developer data will be visible.

This is about oData function import.

—————————————————————————————————————————-

Function Import using UI5 in VS code

If you have not installed VS Code yet, install VS Code and Node.js.

Open VS Code, click on View, then Command Palette, then explore and install the generators.

Install all the generators.

Step 1:- Create a new project (click on View, then Command Palette, then click on Fiori: Open Application Generator).

Select the Basic template.

Step 2:- Select the service of the function import.

Data source: connect to a system.

System: your destination configuration.

Service: the SEGW function import service.

Click on Next.

Give the view name and project name and click on Finish.

After this project template will be generated.

Step 3:- First we will create a JSON model to store the data.

Click on the model folder and create a file called data.json.

json model

Configure it in manifest.json.

Here “data” is the named model.

Step 4:- The view.

The view name is Home.view.xml.

Here I have used a Select control to select the key and a Table to display the data based on the selection.

<mvc:View xmlns:core="sap.ui.core" controllerName="functionimport.controller.Home"
xmlns:mvc="sap.ui.core.mvc" displayBlock="true" xmlns="sap.m">
<App >
<Page id="page" title="{i18n>title}">
<customHeader>
<Bar >
 <contentLeft>
<Label />
<Title text="Designation"/>
<Select selectedKey="SELECT" id="designationsearchid" change="onPress">
<items>
<core:Item text="-----Select one------" key="SELECT"></core:Item>
<core:Item text="DEVELOPER" key="DEVELOPER"></core:Item>
<core:Item text="TESTER" key="TESTER"></core:Item>                            
</items>
</Select>
</contentLeft>
</Bar>
</customHeader>
<content>
<Table items="{data>/}">
<columns>
    <Column >
    <Text text="NAME"></Text>
    </Column>
    <Column >
    <Text text="EMAIL"></Text>
    </Column>
    <Column >
     <Text text="PHONE"></Text>
    </Column>
    <Column >
    <Text text="AGE"></Text>
    </Column>
    <Column >
    <Text text="DESIGNATION"></Text>
    </Column>
    </columns>
        <items>
           <ColumnListItem >
              <cells>
                <Text text="{data>Name}"></Text>
              </cells>

              <cells>
                 <Text text="{data>Email}"></Text>
              </cells>

               <cells>
                  <Text text="{data>Phone}"></Text>
               </cells>

               <cells>
                  <Text text="{data>Age}"></Text>
                </cells>

                <cells>
                  <Text text="{data>Designation}"></Text>
                </cells>

            </ColumnListItem>
         </items>
      </Table>

            
        </content>
    </Page>
   </App>
</mvc:View>

Step 5:- The controller.

The controller name is Home.controller.js.

In the controller, we first get the selected key and the JSON model.

Then we have to use callFunction, where we need to provide the function import name, the method and the URL parameters.

onPress:function(){
                debugger
  var selectedKey=this.getView().byId("designationsearchid").getSelectedKey()
                var oModel=this.getView().getModel('data')
                var oContext=this;
                var oDataModel=this.getOwnerComponent().getModel()
                var arr=new Array();

                oDataModel.callFunction(
                    "/ZAP_FunctionDemo", {
                        method:"GET",
                        urlParameters:{
                            DESIGNATION:selectedKey
                        },
                        success:function(oData,response){

                            arr=oData.results

                            oModel.setData(arr)
                            oContext.getView().setModel(oModel);
                            debugger
                        },
                        error:function(oError)
                        {

                        }
                    }
                );
            }
        });

————————————output——————————–

Run the application from the terminal using npm start.

output view
on select of developer
on select of tester
SAP Fiori Standard App configuration in Launchpad: Manage Purchase Orders (Version 2)
https://www.erpqna.com/sap-fiori-standard-app-configuration-in-launchpad-manage-purchase-orders-version-2/ (3 Jun 2023)

Link to Fiori Apps Library: Manage Purchase Orders (Version 2)

https://fioriappslibrary.hana.ondemand.com/sap/fix/externalViewer/#/detail/Apps(‘F0842A’)/S22OP

This demo is mainly for the beginners who are curious to learn SAP Fiori Launchpad Configurations.

Check the required installations under the Implementation Information tab in the SAP Fiori Apps Library.

We can then check that the listed front-end and back-end components are installed in the gateway system.

Activate the below nodes in SICF and the OData service in /n/IWFND/MAINT_SERVICE.

Now open Launchpad Designer using tcode: /n/UI2/FLPD_CUST from gateway system. This opens the browser.

Create a custom catalog.

Open the technical catalog SAP_TC_PRC_COMMON from the Fiori Apps Library in the Launchpad Designer and create references to its tiles and target mappings in the above custom catalog.

Verify the added tiles and target mappings in the custom catalog:

Create a custom group and select the required tile:

Make sure to capture the Catalog & Group in Customizing TR

You need to have the front and backend roles assigned with the custom catalog and group to your user ID by Security team.

SAP Fiori Launchpad Output:

End to End UI5 application on SAP BTP
https://www.erpqna.com/end-to-end-ui5-application-on-sap-btp/ (1 May 2023)

The blog will take you through all the steps required to deploy a client and server side application on SAP BTP.

Pre-requisites:

  • Basic knowledge of UI5
  • Basic node-express knowledge

Login/Register on the SAP BTP platform

SAP Business Technology Platform (formerly SAP Cloud Platform) is a PaaS offered by SAP to host your applications in the cloud. To get started, log in or register using the URL below:

https://account.hana.ondemand.com/#/home/welcome

Once you login for the first time, you should be able to see the below prompt appear.

Select “Continue to Trial Home”.

Next, you should see the following pop-up appear.

I will select Singapore as that’s the server closest to my current location and click on “Create Account”. The account creation takes a while, you should be able to see the below screen eventually.

Once you click “Continue”, you should be able to see the BTP Home Screen.

Click on “Go To Your Trial Account”. You will be redirected to a new url. Bookmark this for future use.

A “trial” sub-account was already created while creating our account. We will create our own.

Click on “Create”–>Subaccount. In the pop up that appears, give your sub-account some meaningful name and select the “US East AWS region”. Hit “Create”.

Create Cloud Foundry Environment

SAP BTP offers 2 environments to host/manage application viz. Cloud Foundry and Kyma. We will be using Cloud Foundry here. Go inside the newly created sub-account and click on “Enable Cloud Foundry”. Nothing needs to be changed in the pop up that follows. Hit “Create”.

Once enabled, we will need to create a “space” to deploy our application.

Click on “Create Space” and give it some meaningful name. Now your BTP platform is ready to have applications deployed onto it.

Install Cloud Foundry CLI

Download the package for your respective OS from below URL and install the same on your local machine. We will be using cloud foundry cli to push our applications to btp.

https://github.com/cloudfoundry/cli/wiki/V6-CLI-Installation-Guide

Once installed, open cmd and run “cf --help”. If installed properly, you should see a list of available commands.

The Application

Our application will consist of 3 layers:

  • Database: PostgreSql
  • Business Logic: Node Js
  • UI: UI5

Provisioning the database

Inside your subaccount, click on Service Marketplace and search for “PostgreSql”.

Come back to cmd and login to your cloud foundry account using “cf login -a https://api.cf.us10-001.hana.ondemand.com/”. Provide your username and password which was used to create the btp account.

Create an instance of postgresql using the following command: “cf create-service postgresql-db trial my-post-db-02”. Here “postgresql-db” is the name of the service, “trial” is the service plan and “my-post-db-02” is the instance name. The instance creation takes about 10 mins. Once completed, run the command “cf services” and you should be able to see the service listed for your account.

Creating the Node backend application

Now that our database instance is provisioned, we need to create our backend node application and bind the same to our database service.

Before initializing our node project, we will need to update our postgre service with a tag name which will be used by our node application to identify and bind to the service.

Open cmd and run the following command: cf update-service my-post-db-02 -t “my-post-db-02”

Create a new Node application using the “npm init -y” command. The main challenge here is to establish binding with the postgre service. We will be using the @sap/xsenv node module to achieve this.

Below is the dbConn.js file.

const promise = require('bluebird')
const xsenv = require('@sap/xsenv')
//const env = require('../default-env.json')

const optionsDbPromise = {
    promiseLib: promise
}
const pgp = require('pg-promise')(optionsDbPromise)

var conn_service = {}

try {
    //let envCredentials = env.VCAP_SERVICES['my-postgre-db'][0].credentials

    xsenv.loadEnv()
    const envCredentials = xsenv.getServices({
        "my-post-db-02": { "tag": "my-post-db-02" }
    })["my-post-db-02"]

    conn_service.host = envCredentials.hostname
    conn_service.port = envCredentials.port
    conn_service.database = envCredentials.dbname
    conn_service.user = envCredentials.username
    conn_service.password = envCredentials.password
    conn_service.ssl = {
        rejectUnauthorized: false,
        ca: envCredentials.sslrootcert
    }

} catch (error) {
    console.log('error')
}

console.log(conn_service)

const db = pgp(conn_service)

db.func('version')
    .timeout(2000)
    .then((data) => {
        console.log('DB connection success ', data)
    }).catch((error) => {
        console.log(error)
    })

module.exports = {
    db,
    pgp
}

Above we are fetching our “my-post-db-02” service details using the tag filter which we updated earlier. Once these details are fetched, a db connection is created using the hostname, port, ssl root certificate, db name, username and password details for the service. The db instance is exported to be used in other files.

Next we need to write the apis which will directly interact with the db. Here we will only showcase creating a table, inserting data, fetching data and updating a record.

Below is our apis.js file.

const { db } = require("./dbConn")

const setupDB = (req, res, db, pgp) => {
    const QueryFile = pgp.QueryFile

    db.any(new QueryFile('./lib/sqlSetup.sql'), [])
        .then(data => {
            res.status(200).send({ "message": "DB setup successful" })
        }).catch(error => {
            res.status(500).send({ "message": error })
        })
}

const getAllData = (req, res, db) => {
    return db.any("select * from ORGANIZATION_MGT.USER_LIST")
        .then(data => {
            res.status(200).send({ "message": data })
        }).catch(error => {
            res.status(500).send({ "message": error })
        })
}

const updateUserData = (req, res, db) => {
    const id = req.params.id.toString()
    const location = req.query.location
    console.log('-------------------------------------------Request----------------------------------------')
    console.log(req)
    return db.any(`update ORGANIZATION_MGT.USER_LIST set baselocation='${location}' where id='${id}'`)
        .then(data => {
            res.status(200).send({ "message": `User ${id} base location updated successfully!` })
        }).catch(error => {
            res.status(500).send({ "message": error })
        })
}

module.exports = {
    setupDB,
    getAllData,
    updateUserData
}

In the above file we have used a sqlSetup.sql file which is used to setup the table schema and insert data first time. Below is the script for the same.

CREATE SCHEMA IF NOT EXISTS ORGANIZATION_MGT;

DROP TABLE IF EXISTS ORGANIZATION_MGT.USER_LIST;
CREATE TABLE ORGANIZATION_MGT.USER_LIST( id VARCHAR(10) NOT NULL PRIMARY KEY, username VARCHAR(40), designation VARCHAR(20), baselocation VARCHAR(20) );
INSERT INTO ORGANIZATION_MGT.USER_LIST VALUES ( '835825', 'Archisman', 'SCON', 'Pune' ), ( '835826', 'Anuj', 'LCON', 'Nagpur' ), ( '835827', 'Vaishali', 'LCON', 'Pune' ), ( '835828', 'Ritwika', 'LCON', 'Pune' );

Now we will setup the routes in our index.js file which is the entry point of our application. Below is the code for the same.

const express = require('express')
const cors = require('cors')
const { setupDB, getAllData, updateUserData } = require('./lib/apis')
const { db, pgp } = require('./lib/dbConn')

const app = express()
app.use(cors(), (req, res, next) => {
    next()
})
app.use(express.json())
const port = process.env.PORT || 3000

app.post('/setup', (req, res) => {
    setupDB(req, res, db, pgp)
})

app.get('/users', (req, res) => {
    getAllData(req, res, db)
})

app.post('/users/:id', (req, res) => {
    updateUserData(req, res, db)
})

app.listen(process.env.PORT || 3000, () => {
    console.log(`App is running on port ${port}`)
})

Next, we will need to create the manifest.yaml file which consists of some essential information required by cloud foundry while creating the application in btp. Below is the code for the same.

applications:
- name: demo-node-app-01
  memory: 512M
  path: ./
  buildpack: nodejs_buildpack
  health-check-type: port
  services:
    - my-post-db-02

Upon creation of all the above files, this is how my project structure looks.

Deploying the Node application to BTP

Now that we have created our Node application locally, it is time to push it to the cloud. Open cmd and navigate to the location where the manifest.yaml file (for this Node app) is located.

Run “cf push”. This will push the application to BTP and you should be able to see a route generated as shown below.

As a further check, run the command “cf logs <app-name> --recent”. You should be able to see that the connection to the PostgreSQL service was successfully established.

Testing our apis using POSTMAN

Now that our backend is established, we should test our apis using postman before integrating with the client side application.

We have 3 rest end points to test.

  • POST /setup
  • GET /users
  • POST /users/:id?location

Below are snapshots for the same in postman.

Integrate api with client side application

Once we are satisfied that the apis are working properly, we can go ahead and integrate them to our client side application. I have provided the controller and view code snippet below showcasing the same.

sap.ui.define([
	'sap/ui/core/mvc/Controller',
	'sap/ui/model/json/JSONModel',
	'sap/m/MessageBox'
], function (Controller, JSONModel, MessageBox) {
	"use strict";

	var oController
	return Controller.extend("Sample.Quickstart.controller.View1", {
		onInit: async function () {
			oController = this
			const oUserModel = new JSONModel()
			oController.getView().setModel(oUserModel, 'oUserModel')
			const sUrl = `https://demo-node-app-01.cfapps.us10-001.hana.ondemand.com/users`
			await oUserModel.loadData(sUrl, {}, true, 'GET')
		},
		_onUpdateLocation: async (oEvent) => {

			const sPath = oEvent.getSource().getBindingContext('oUserModel').sPath
			const oUserModel = oController.getView().getModel('oUserModel')

			const id = oUserModel.getProperty(`${sPath}/id`)
			const location = oUserModel.getProperty(`${sPath}/baselocation`)

			try {
				const oUserUpdateModel = new JSONModel()
				const sUrl = `https://demo-node-app-01.cfapps.us10-001.hana.ondemand.com/users/${id}?location=${location}`
				const oParams = {
					location
				}

				await oUserUpdateModel.loadData(sUrl, oParams, true, 'POST')

				MessageBox.success(oUserUpdateModel.getData()['message'])

			} catch (error) {
				MessageBox.error(error.message)
			}
		}
	});
});
<mvc:View controllerName="Sample.Quickstart.controller.View1"
	xmlns:mvc="sap.ui.core.mvc" displayBlock="true"
	xmlns="sap.m">
	<App id="app">
		<pages>
			<Page id="page" title="Org Data" titleAlignment="Center">
				<content>
					<Table items="{oUserModel>/message}">
						<columns>
							<Column>
								<Label text="User Id" design="Bold"/>
							</Column>
							<Column>
								<Label text="Name" design="Bold"/>
							</Column>
							<Column>
								<Label text="Designation" design="Bold"/>
							</Column>
							<Column>
								<Label text="Location" design="Bold"/>
							</Column>
						</columns>
						<ColumnListItem>
							<cells>
								<Text text="{oUserModel>id}"/>
								<Text text="{oUserModel>username}"/>
								<Text text="{oUserModel>designation}"/>
								<HBox>
									<Input width="80%" value="{oUserModel>baselocation}"/>
									<Button type="Emphasized" text="Udpdate" press="_onUpdateLocation"/>
								</HBox>
							</cells>
						</ColumnListItem>
					</Table>
				</content>
			</Page>
		</pages>
	</App>

</mvc:View>

Before pushing this application to the cloud, we once again need to create a manifest.yaml file to help cloud foundry with the deployment. Below is the code for the same.

applications:
- name: demo-ui5-app-01
  memory: 512M
  path: ./
  buildpack: staticfile_buildpack

The project structure now looks as below.

Pushing the client side application to cloud

In the same manner that we deployed our node application, open cmd and navigate to the location where the manifest.yaml file is situated.

Run “cf push”. Again a route will be generated for the application as shown below.

Now run the route in your browser and you should be able to see your application up and running. (Please note the /webapp/index.html that has been appended to the above route in the screenshot below)

Deploy the UI5 Fiori app on SAP ABAP repository with BAS & WEBIDE, create Fiori app Tile using Launchpad Designer
https://www.erpqna.com/deploy-the-ui5-fiori-app-on-sap-abap-repository-with-bas-webide-create-fiori-app-tile-using-launchpad-designer/ (25 Feb 2023)

In this blog we will learn how to deploy a UI5 Fiori app to the SAP ABAP repository with BAS and WebIDE, and also learn about Fiori app tile creation in the Launchpad Designer.

What is SAP UI5

SAPUI5 is a framework consisting of libraries that is used for creating responsive apps.

Fiori Launchpad

SAP Fiori launchpad is a shell that hosts SAP Fiori apps, and provides the apps with services such as navigation, personalization, embedded support, and application configuration.

The Fiori launchpad is responsive because it adapts its layout to the device screen.

Prerequisites

  1. User must have access to SAP Logon.
  2. User must have access to the transactions SE80, /UI2/FLP, /UI2/FLPD_CUST, /UI2/FLPD_CONF, /UI2/FLPCM_CUST, /UI2/FLPCM_CONF, /UI2/SEMOBJ and PFCG.
  3. You need access to the SAP BTP cockpit.

Deploy the Fiori app using Business application Studio (BAS)

SAP Business application studio is a cloud based developer tool to develop UI based applications

  1. Launch the BAS from Cloud cockpit

2. Select your desired project –> open the Terminal –> Go to the Project Dir (cd project dir) –> run the command ‘npm run deploy-config’ or ‘npx fiori add deploy-config’.

3. Give the required information

Choose the Target –> ABAP

Select the Destination (If destination is not created, create a destination on SAP BTP Cockpit)

Provide the SAPUI5 ABAP Repository Name and Description.

Provide the Package detail

Provide the Transport Request and execute the command “npm run deploy”.

You will get a message “SAP UI5 Application has been uploaded and registered successfully”.

4. Go to the given link and check the deployed fiori app

5. Now you will get the Fiori app on SAP system.

Deploy the Fiori app using WEBIDE

1. Go to the WEBIDE, right click on your project –> Select Deploy –> Choose your System (Deploy to SAPUi5 ABAP Repository)

2. You will be on next page (Deployment Option), Here select your required ABAP or Cloud system –> Next.

Note: If you are deploying for the first time, select “Deploy a new application”. If the application is already deployed, select “Update an existing application”.

3. Provide application Name, Description, Package, in – Deploy a New Application Screen and select Next.

4. If the selected package is local, Choose Finish. If it requires TR, select TR for your application.

A notification message displays once the application is deployed successfully.

5. Go to SAP Logon Pad –> Tcode SE80 –> Select BSP Application from Dropdown –> find your deployed application.

Steps to create a Tile and Target mapping in Fiori launchpad Designer

1. Create Semantic Object using TCode – /n/ui2/semobj.

Click on Edit Button

Select New Entries.

Provide the new Semantic Object, Object Name and Description and Click on Save Button.

2. Go to SAP LogOn –> Execute the TCode – /ui2/FLPD_CUST (Launchpad Designer).

3. Click on Setting Button –> Select the TR –> choose Ok.

4. If catalog information is not available, create a catalog –> Click on Plus Button –> provide the required information –> click on Save.

5. If the catalog already exists, select the catalog –> click on the Add Tile plus button.

6. Select a Tile template, here I am going to select App launcher – Static.

7. Provide the Title, information, Semantic Object, Action and other required Information Click on Save Button.

8. Create a target mapping –> go to Target Mapping –> click on Create Target Mapping.

9. Provide the required information Semantic Object, Action, Title, URL, ID and Click on Save Button.

Note: How to get URL –> Go to the WEBIDE –> your project –> manifest.json file –> Find the URL info.

How to get ID: Go to the WEBIDE –> your project –> manifest.json file –> find the ID.

10. The Fiori app tile has been created successfully. Go to the Fiori launchpad –> App Finder –> search for your application.

Note: If the Fiori app is not found on the Fiori launchpad, we have to assign the PFCG role for the catalog and contact the security team to assign the role.

Hosting a server-less UI5 application on AWS
https://www.erpqna.com/hosting-a-server-less-ui5-application-on-aws/ (28 Dec 2022)

The following blog will guide you through deploying a UI5 application on AWS S3, with DynamoDB as the backend database, Lambda as the business logic layer and API Gateway acting as the connection between the client (UI) and Lambda.

Architecture

Now let’s tackle these layers one by one.

  • Amazon S3:

Head over to your AWS account and search for S3. Click on the Create Bucket option as shown below.

Create Bucket

In the window that appears, type in a unique name for your bucket. Please note that the name must be unique across all accounts and regions, which means that if I choose my bucket name to be demo-app-archisman, you will not be able to use the same name although you are working on your own private account(and maybe in a separate region).

Uncheck “Block All Public Access” as shown below.

Keeping the rest of the settings as default, hit the Create Bucket button. You should be able to see a new public bucket created.

Next, we will navigate to our bucket and edit the bucket policy to “Allow” Get access to all the objects stored on our bucket.

Type Of Policy: S3 Bucket Policy

Effect: Allow

Principal: *

Action: GetObject

ARN: arn:aws:s3:::demo-app-archisman/* (make sure to paste your own unique Amazon Resource Name here and append /* at the end which means we are granting GetObject access to all the objects within our bucket)

Once you have correctly added the above statement, generate the policy.

You should be able to see a window pop up with the generated policy in JSON format. Copy it to clipboard.

Paste the above copied JSON into the bucket policy text area as shown below and save the changes.

Now that we have got the bucket policy out of our way, let us upload our UI5 web application onto S3. Navigate to Objects tab and click on the Upload button.

Select all the files and drag and drop them to the Upload Section.

Click on Upload. Once upload is completed, you will be able to see the uploaded files and folders in the Objects tab.

Now that the web application files are uploaded, we need to explicitly tell S3 that we are going to be hosting a static website.

Navigate to the Properties tab and in the very last section click on the Edit button for “Static Website Hosting”.

Edit the below properties and save changes.

Go back to the same section of Static website hosting and you should be able to see an endpoint Url. Click on the Url and your UI5 application is up and running.

For me, it’s a simple chat application interface.

Now that we have a static website up and running, let’s work on the backend to make our application interactive.

  • Dynamo DB:

Our choice of database here is DynamoDB, which closely resembles the NoSQL database MongoDB. Keeping with the theme of our application, this database again is server-less, so we do not need to worry about provisioning the right server to host our database.

Head over to the AWS service Dynamo DB and click on Create Table.

Type in your table name, the partition key (synonymous with the primary key in SQL) and the sort key (which is optional). If you are unsure about the read capacity units and write capacity units, keep them as default. However, it is important to understand these while designing a production-ready application, as they determine how well your database performs. These units help DynamoDB decide the number of partitions to create and, in turn, how efficiently the database returns the output of a query.

Click on Create Table.

Once the table is created, navigate to the table and create a few items (rows).

The JSON view for the above item. I wanted to show this DynamoDB view of the JSON especially because an understanding of the structure will be needed when you write your Lambda functions. “S” denotes a string, “L” denotes a list/array and “M” denotes a map/object.

Create as many tables as your application demands and move on to creating the business logic layer i.e lambda functions.

  • Lambda Functions:

Our lambda functions will basically interact with our above database tables to perform CRUD operations. Head over to AWS Lambda service and create your lambda functions as per your application requirement. Here we have used Node as the scripting language.

GET Operation:

The below Lambda function shows how to fetch all the records from the database table. The highlighted “unmarshall” function converts each item from a DynamoDB object (as shown above) into a plain JSON object that we are familiar with.
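A minimal sketch of such a handler, assuming the AWS SDK for JavaScript v3 and a hypothetical table called ChatMessages:

const { DynamoDBClient, ScanCommand } = require("@aws-sdk/client-dynamodb");
const { unmarshall } = require("@aws-sdk/util-dynamodb");

const client = new DynamoDBClient({});

exports.handler = async () => {
    // Scan returns items in DynamoDB's attribute-value format ({ "S": ... }, { "L": ... }, ...)
    const data = await client.send(new ScanCommand({ TableName: "ChatMessages" }));

    // unmarshall turns each item into a plain JSON object
    const items = (data.Items || []).map((item) => unmarshall(item));

    // Non-proxy integration: API Gateway reshapes this object later via the mapping template
    return { headers: { "Content-Type": "application/json" }, body: items };
};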

GET Operation by Id

The below lambda function shows how to fetch records from the database table based on primary key.

POST Operation

The below Lambda function shows how to insert a record into the database.
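A comparable sketch for the insert, again assuming SDK v3 and illustrative attribute names; marshall is the counterpart of unmarshall and converts a plain object into DynamoDB's attribute-value format:

const { DynamoDBClient, PutItemCommand } = require("@aws-sdk/client-dynamodb");
const { marshall } = require("@aws-sdk/util-dynamodb");

const client = new DynamoDBClient({});

exports.handler = async (event) => {
    // Write the incoming payload as one item; id and message are illustrative attribute names
    await client.send(new PutItemCommand({
        TableName: "ChatMessages",
        Item: marshall({ id: event.id, message: event.message })
    }));

    return { body: `Item ${event.id} created` };
};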

Now that we have our UI and backend logic set up, the only thing remaining is to connect the two. That is where API Gateway comes into the picture.

  • API Gateway

Head over to AWS service API Gateway and click on Create API. Choose the REST API service and click on Build.

Give your API a name and make the endpoint type as Edge optimized. Click on Create API.

Once the API is created, click on Actions and hit Create Resource.

Type in the Resource Name and click Create Resource.

Now on this resource we want to create a GET method.

In the following window, type in your lambda function name and hit save.

Once saved, you should be able to see this sort of a pipeline created.

Click on Test. In the following window again click the Test button. You should be able to see the output of your lambda function being returned as shown below.

Now suppose your team is divided into different front-end and back-end teams, and your Lambda function returns some redundant information which you do not want to percolate to your UI layer. In the above scenario, headers is a section that my client side has no interest in knowing about. API Gateway provides a modeling feature with which we can reshape/filter the output structure as required by the client-side application. It uses the Velocity Template Language; a simple version is shown below.

Go to the Models section and create a new model.

Create a JSON schema of the output being returned by the lambda function and save your model.

Now come back to Resources and click on Integration Response.

Add an “application/json” type template and from the Generate Template dropdown select the model that you created in the previous step.

Below is the template generated from the model.

In the above template, I will remove the headers section (as I don’t want that information to flow to my UI), rename the “body” key to “results” and replace TODO with body. The dummy values also need to be replaced with the actual names of the keys. The changed template will look as below.

Save the template, return to Method Execution and again test the GET API. The response will be changed from the one that we previously tested.

Click on Action, and Enable CORS so that the API is accessible to the client application.

Again click on Actions and select Deploy API.

A URL will be generated. Plug this URL into your application and voilà, you have now created an end-to-end application.

Excel File Upload Download in SAPUI5
https://www.erpqna.com/excel-file-upload-download-in-sapui5/ (22 Aug 2022)

Introduction

On many occasions, users need to upload an Excel file from the local system and extract its content to send to the SAP backend. There are multiple ways to extract data from an Excel file and send the content to the SAP system. In this article, I am going to explain how one can download the content of an SAP table in the form of an Excel file. We will also discuss in detail how one can upload an Excel file filled with table content and send the extracted data to the SAP backend, and finally how to convert the encoded data there. We are going to use SAP Gateway stream methods to extract SAP data and to upload presentation server data to SAP.

Development Environment

  • SAP UI5 version 1.71.33
  • SAP Gateway version 740, SP0026

Gateway Project Building

Let us first create the SAP Gateway project and suitable entities to handle the stream methods. In this project we create two entities:

1. TableStructure – properties are – (a). TableName, (b). MimeType

2. TableList – properties are – (a). NameOfTable, (b). TableDesc

Gateway_Project_Entities

We have generated the corresponding entity sets. For this project we need the TableStructure set to be a media entity, hence we mark the entity as media-enabled in the entity set properties.

Marking Entity as Media Type

Also do not forget to update the entity structure with correct operations you need, example given – creatable, updateable etc.

Mark Operations for EntitySet

The first entity, TableStructure, has two properties: TableName and MimeType. The TableName property holds the name of the table which we want to download or upload. The MimeType property holds the type of the file, e.g. JPEG, XLS, CSV, XLSX, TEXT. You can find the possible MIME types in the standard table SDOKFEXTT. In this blog we are going to build the case for the file type XLS; the corresponding MIME type is application/vnd.ms-excel.

Service Method Implementations

After generating the project go to the MPC extension class and redefine the DEFINE method like shown below –

DATA: lo_property    TYPE REF TO /iwbep/if_mgw_odata_property,
        lo_entity_type TYPE REF TO /iwbep/if_mgw_odata_entity_typ.

  super->define( ).

  lo_entity_type = model->get_entity_type( iv_entity_name = 'TableStructure' ).

  IF lo_entity_type IS BOUND.
    lo_property = lo_entity_type->get_property('MimeType').
    lo_property->set_as_content_type( ).
  ENDIF.

Now, go to the DPC extension class and redefine method GET_STREAM and CREATE_STREAM to handle download and upload functions, respectively. The relevant implementation of GET_STREAM as shown below.

DATA: ls_stream      TYPE ty_s_media_resource,
        lv_data_string TYPE string,
        lv_xstring     TYPE xstring,
        ls_header      TYPE ihttpnvp,
        lt_fields      TYPE STANDARD TABLE OF dfies.

* Get Tablename passed from frontend
  DATA(ls_key_tab) = VALUE #( it_key_tab[ name = 'TableName' ] OPTIONAL ).

* Get Column Lists from Table name
  IF ls_key_tab-value IS NOT INITIAL.

    CALL FUNCTION 'DDIF_FIELDINFO_GET'
      EXPORTING
        tabname        = CONV ddobjname( ls_key_tab-value )
      TABLES
        dfies_tab      = lt_fields
      EXCEPTIONS
        not_found      = 1
        internal_error = 2
        OTHERS         = 3.

    IF sy-subrc <> 0.

      "Handle Suitable Error Here

    ELSE.


* Build String with table coulmn names
      LOOP AT lt_fields INTO DATA(lt_fields_line).
        IF sy-tabix = 1.
          lv_data_string = |{ lt_fields_line-fieldname }|.
        ELSE.
          lv_data_string = |{ lv_data_string }\t{ lt_fields_line-fieldname }|.
        ENDIF.
        CLEAR: lt_fields_line.
      ENDLOOP.

* Convert String to xstring
      CALL FUNCTION 'HR_KR_STRING_TO_XSTRING'
        EXPORTING
          unicode_string   = lv_data_string
        IMPORTING
          xstring_stream   = lv_xstring
        EXCEPTIONS
          invalid_codepage = 1
          invalid_string   = 2
          OTHERS           = 3.

* Pass back stream values
      ls_stream-value     = lv_xstring.
      ls_stream-mime_type = 'application/msexcel'.


* Set Header for Response
      DATA(lv_filename) = |{ ls_key_tab-value }-{ sy-datum }-{ sy-uzeit }.xls|.
      lv_filename = escape( val = lv_filename format = cl_abap_format=>e_url ).

      ls_header-name = 'Content-Disposition'.
      ls_header-value = |outline; filename="{ lv_filename }"|.
      me->set_header( ls_header ).

      copy_data_to_ref(
        EXPORTING is_data = ls_stream
        CHANGING cr_data = er_stream ).


    ENDIF.

  ENDIF.

Here, we receive a table name and extract the table structure from SAP. We create an Excel file of type .XLS and send it across to SAPUI5. Only the template of the table is downloaded; there is no table content.

Note – We have used MIME type from the media type standard table. The file will be downloaded with name – <tableName>-<Date>-<Time>.XLS.

Implement Download Method in SAP UI5

Let’s implement the corresponding SAP UI5 code for this. I have created an Object Page application with 3 simple sections.

  1. Custom Table Selection
  2. Browse Data File
  3. Execute Upload
Final Designed View

Under the custom table selection section, I created an input field for passing the name of the table for which we want the template to be downloaded. To trigger the download, a “Download Template” button has been introduced.

Let’s see the XML view code for input field and download button.

<ObjectPageSubSection id="customTableSel">
						<blocks>
							<layout:VerticalLayout>
								<m:Input id="tableInput" width="100%" type="Text" placeholder="Enter Table Name ..." showSuggestion="true" suggestionItems="{/TableListSet}"
									showValueHelp="true" valueHelpRequest="onValueHelpRequest" suggest="onSuggest" liveChange="onLiveChange">
									<m:suggestionItems>
										<core:Item text="{NameOfTable}"/>
									</m:suggestionItems>
								</m:Input>
								<m:Button text="Download Template" press="onDownTempPressed" enabled="true"></m:Button>
							</layout:VerticalLayout>
						</blocks>
					</ObjectPageSubSection>

Now, let's implement the button press function in the corresponding controller. On pressing the Download Template button, the “onDownTempPressed” method is triggered. If the input field has a value, we call the following URL to download the template.

/sap/opu/odata/SAP/XXX_GENTAB_MAINT_SRV/TableStructureSet('<table name>')/$value

Construct the URI as shown below and open it via the window.open function.

onDownTempPressed: function (oEvent) {
			var inputTableName = this.getView().byId("tableInput").getValue();
			var mainUri = this.getOwnerComponent().getManifestEntry("/sap.app/dataSources/mainService").uri;

			// Check whether table name field is not initial
			if (inputTableName === "") {
				this.getView().byId("tableInput").setValueState("Error");
				MessageToast.show("Please Provide a Table Name");
			} else {
				//Reset the value state for input field in case it was error
				this.getView().byId("tableInput").setValueState("None");
				var wStream = window.open(mainUri + "TableStructureSet('" + inputTableName + "')/$value");
				if (wStream === null) {
					MessageToast.show("Error generating template");
				} else {
					MessageToast.show("Template Downloaded Successfully");
				}

			}
		},

File Uploading in SAPUI5

Let us understand how to upload an XLS file, extract its content in SAPUI5, and make a call to SAP with the data so that it gets posted to an SAP table. To do that, let us design a file uploader control in SAPUI5. The file uploader control comes with a browse function which allows us to select a file from the local computer. While designing the control in SAPUI5 we can specify which file types are allowed for upload; in this article we allow only the XLS file type. There are several events associated with the file uploader control, and one can make use of standard events like typeMissmatch and change for a better user experience. For the brevity of this article, we have only used the typeMissmatch and change events on the file uploader.

My view for the file uploader looks like below –

<ObjectPageSection id="sectionBrowseDataFile" title="Upload Data" importance="Medium" titleUppercase="true">
				<subSections>
					<ObjectPageSubSection id="browseDataFile" title="Browse Data File">
						<blocks>
							<u:FileUploader id="fileUploader" name="myFileUpload" tooltip="" fileType="xls" uploadOnChange="false" typeMissmatch="handleTypeMissmatch"
								change="onChange"/>
						</blocks>
					</ObjectPageSubSection>
				</subSections>
			</ObjectPageSection>
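
The typeMissmatch and change handlers referenced in the view are not shown in the original snippet. A minimal sketch of what they could look like, assuming MessageToast is already available in the controller as it is in the other handlers, is:

handleTypeMissmatch: function (oEvent) {
			// Warn the user when a file other than the allowed XLS type is selected
			var sFileType = oEvent.getParameter("fileType");
			MessageToast.show("File type *." + sFileType + " is not allowed. Please select an XLS file.");
		},

		onChange: function (oEvent) {
			// Confirm the selection so the user knows which file will be uploaded
			var aFiles = oEvent.getParameter("files");
			if (aFiles && aFiles.length > 0) {
				MessageToast.show("File \"" + aFiles[0].name + "\" selected.");
			}
		},

Both handlers only provide user feedback and do not change the upload flow itself.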

After the selection is completed, the user needs to choose which operation to perform via the radio buttons designed at the end of the application. Once the operation is selected and the user presses the “Execute Upload” button, the file is read and the extracted data is sent to SAP through a stream.

My view for file selection and executing the operation is designed as shown below –

<ObjectPageSection id="executeUpload" title="Execute Upload" importance="Medium" titleUppercase="true">
				<subSections>
					<ObjectPageSubSection id="executeUploadSubSection" title="">
						<blocks>
							<layout:VerticalLayout>
								<m:RadioButtonGroup id="radioButtonGroup">
									<m:buttons>
										<m:RadioButton id="radioButton1" text="Insert"/>
										<m:RadioButton id="radioButton2" text="Modify"/>
										<m:RadioButton id="radioButton3" text="Delete"/>
									</m:buttons>
								</m:RadioButtonGroup>
								<m:Button text="Execute Upload" press="onExecuteUpload" type="Emphasized" enabled="true"></m:Button>
							</layout:VerticalLayout>
						</blocks>
					</ObjectPageSubSection>
				</subSections>
			</ObjectPageSection>

I am using the FileReader API and its corresponding events to read the file. In this article we read the XLS file as a binary string, parse the string with the specified regex operations, and convert it into a string array. Remember to send your content from SAPUI5 in such a way that it becomes easy for the SAP back end to parse and convert the data into a table for further operations. You can also use readAsDataURL for reading different media types; in that case the content becomes base64 encoded and you must decode it in the SAP back end in order to decipher the data and make use of it.

The corresponding JS function attached to “Execute Upload” button is shown below –

onExecuteUpload: function (oEvent) {
			var oFileUploader = this.byId("fileUploader");
			var xlsDomRef = oFileUploader.getFocusDomRef();
			var xlsFile = xlsDomRef.files[0];
			var that = this;

			this.fileName = xlsFile.name;
			this.fileType = xlsFile.type;

			//Get selected Radio Button
			for (var j = 0; j < this.getView().byId("radioButtonGroup").getButtons().length; j++) {
				if (this.getView().byId("radioButtonGroup").getButtons()[j].getSelected()) {
					this.operation = this.getView().byId("radioButtonGroup").getButtons()[j].getText();
				}
			}

			var oReader = new FileReader();
			oReader.onload = function (oReadStream) {
				//Get the number of columns present in the file uploaded & convert the regex unformatted stream
				//to array. This will be parsed at the backend SAP
				var noOfcolumn = oReadStream.currentTarget.result.match(/[^\r\n]+/g)[0].split("\t").length;
				var binContent = oReadStream.currentTarget.result.match(/[^\r\n\t]+/g);

				//Provide the binary content as payload. This will be received as an XSTRING in
				//SAP under the CREATE_STREAM method of media resource structure
				var payload = {
					"Content": binContent
				};

				//Provide additional details through header. Column No, Filename + File Type + TableName + Operation
				var header = {
					"slug": noOfcolumn + "," + that.fileName + "," + that.fileType + "," + that.operation
				};

				//Call a CREATE_STREAM activity
				that.getModel().create("/TableStructureSet", payload, {
					headers: header,
					success: function (oData, oResponse) {
						MessageToast.show("Data Uploaded Successfully!");
					},
					error: function (oError) {
						MessageToast.show("Data Uploaded Failed!");
					}
				});

			};

			// Read the file as a binary string. If you use readAsDataURL instead, the content is base64 encoded and must be decoded in the back end
			oReader.readAsBinaryString(xlsFile);

		},

Handle Uploaded File Data in Gateway

The string that we send from SAPUI5 is received in XSTRING format in the back end. You need to write the necessary code to convert the XSTRING to binary content, followed by a binary-to-string conversion. Once you receive the content in string format, you need to remove the unnecessary parentheses, colons, braces and quotes from the string. With the parsed string you then dynamically build the internal table before inserting the data into the database table. Practice caution and perform the required validations both in SAPUI5 and in the gateway code before uploading data to the database table. It is a good idea to implement a busy dialog in SAPUI5 while the back-end operation is running, so that the user stays informed about the ongoing activity. Send a success message if the operation completes successfully, or suitable error messages if any difficulties arise while processing or uploading the data.
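
The busy dialog itself is not part of the original controller. A minimal sketch, assuming "sap/m/BusyDialog" is added to the controller's dependency list and wrapped around the existing create call from onExecuteUpload, could look like this:

// Open a busy dialog before triggering the OData create call and close it in both callbacks
var oBusyDialog = new BusyDialog({ text: "Uploading data, please wait..." });
oBusyDialog.open();

that.getModel().create("/TableStructureSet", payload, {
	headers: header,
	success: function (oData, oResponse) {
		oBusyDialog.close();
		MessageToast.show("Data Uploaded Successfully!");
	},
	error: function (oError) {
		oBusyDialog.close();
		MessageToast.show("Data Upload Failed!");
	}
});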

The GW code for handling the file data is shown below – this has been implemented under CREATE_STREAM method.

* Data Declarations
  DATA: lv_xstring         TYPE xstring,
        lv_string          TYPE string,
        lv_length          TYPE i,
        lv_strcomp_counter TYPE i,
        li_bin_tab         TYPE STANDARD TABLE OF x255,
        lt_dref            TYPE REF TO data,
        ls_dref            TYPE REF TO data.

  FIELD-SYMBOLS: <lfs_dyntab> TYPE STANDARD TABLE,
                 <lfs_dyntab_line>    TYPE any.


* Get the column count and table name for upload
  SPLIT iv_slug AT ',' INTO TABLE DATA(li_slug_data).
  DATA(lv_colcount)  = VALUE #( li_slug_data[ 1 ] OPTIONAL ).
  DATA(lv_operation) = VALUE #( li_slug_data[ 4 ] OPTIONAL ).
  DATA(lv_fileName)  = CONV string( VALUE #( li_slug_data[ 2 ] OPTIONAL ) ).
  SPLIT lv_fileName AT '-' INTO TABLE DATA(li_fileName).
  DATA(lv_tableName) = VALUE tabname16( li_fileName[ 1 ] OPTIONAL ).

* Read the stream information
  lv_xstring = is_media_resource-value.


* XString to Binary Conversion
  CALL FUNCTION 'SCMS_XSTRING_TO_BINARY'
    EXPORTING
      buffer        = lv_xstring
    IMPORTING
      output_length = lv_length
    TABLES
      binary_tab    = li_bin_tab.

* Binary to String Conversion
  CALL FUNCTION 'SCMS_BINARY_TO_STRING'
    EXPORTING
      input_length = lv_length
    IMPORTING
      text_buffer  = lv_string
    TABLES
      binary_tab   = li_bin_tab
    EXCEPTIONS
      failed       = 1
      OTHERS       = 2.
  IF sy-subrc <> 0.
* Implement suitable error handling here
  ENDIF.

* Format String to Readable format to Parse in Table
  REPLACE ALL OCCURRENCES OF '{' IN lv_string WITH ''.
  REPLACE ALL OCCURRENCES OF '}' IN lv_string WITH ''.
  REPLACE ALL OCCURRENCES OF ':' IN lv_string WITH ''.
  REPLACE ALL OCCURRENCES OF '[' IN lv_string WITH ''.
  REPLACE ALL OCCURRENCES OF ']' IN lv_string WITH ''.
  REPLACE ALL OCCURRENCES OF '"' IN lv_string WITH ''.
  REPLACE ALL OCCURRENCES OF 'Content' IN lv_string WITH ''.

  SPLIT lv_string AT ',' INTO TABLE DATA(li_values).


  CREATE DATA lt_dref TYPE TABLE OF (lv_tableName).
  CREATE DATA ls_dref TYPE (lv_tableName).

* Assign field symbol with table type of DDIC
  ASSIGN lt_dref->* TO <lfs_dyntab>.
* Assign field symbol with Structure type of DDIC
  ASSIGN ls_dref->* TO <lfs_dyntab_line>.


* Initialize Counter
  lv_strcomp_counter = 1.

* Dynamically read the values and build internal table
  LOOP AT li_values ASSIGNING FIELD-SYMBOL(<lfs_values>).

    ASSIGN COMPONENT lv_strcomp_counter OF STRUCTURE <lfs_dyntab_line> TO FIELD-SYMBOL(<lfs_tgt_data>).
    <lfs_tgt_data> = <lfs_values>.

    lv_strcomp_counter = lv_strcomp_counter + 1.

*   LineBreak Counter for checking column count
    DATA(lv_linebrk_counter)  = sy-tabix MOD ( lv_colcount ).
    IF lv_linebrk_counter EQ 0.
      lv_strcomp_counter = 1.
      APPEND <lfs_dyntab_line> TO <lfs_dyntab>.
      CLEAR: <lfs_dyntab_line>.
    ENDIF.

  ENDLOOP.

* Delete the header row of the table
  DELETE <lfs_dyntab> INDEX 1.

* Create Entry to Actual Table
  MODIFY (lv_tableName) FROM TABLE <lfs_dyntab>.
  IF sy-subrc <> 0.
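* Implement suitable error handling here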

  ENDIF.

Some of UI annotations/Local annotations for LROP Fiori application

There are two ways we can write annotations:

  • Using CDS views
  • Using the annotation modeler in SAP Web IDE

We can overwrite the annotations coming from the backend CDS view using the annotation modeler.

In this blog I am focusing purely on local annotations that are useful while developing a Fiori elements List Report Object Page (LROP) application. Only some of the annotations are covered here.

We can utilize the annotation modeler to overwrite existing annotations or create new ones based on our needs. This post gives you insights into where each local annotation shows its result on the screen.

Through this post I walk through some of the local annotations step by step on the list report and object page, so that anyone building a Fiori elements app can easily understand where the different local annotations fit on the screen.

Note: Here I have used a standard purchase order (PO) service to demonstrate the UI annotations.

How to link the annotation.xml file to our service:

– Create an annotation.xml file (the file type should be .xml) inside the webapp folder. Path: right-click on the webapp folder → New → Annotation File. In the next window we have to choose the corresponding OData service; this creates a local annotation file for that service and the corresponding code is added to the manifest.json file.

Manifest.json link for the annotation file:

Path: sap.app → dataSources

Inside the dataSources section, a local annotations entry is created; its uri shows where the annotation file exists in the webapp folder.

1) Selection fields

<Annotation Term="UI.SelectionFields">
<Collection>
<PropertyPath>Supplier</PropertyPath>
<PropertyPath>CompanyCode</PropertyPath>
</Collection>
</Annotation>

Result:

2) Value List:

Here we always have to match the ValueListProperty with the LocalDataProperty (i.e. the property coming from the entity bound via bindElement).

<Annotations Target="MM_PUR_PO_MAINT_V2_SRV.C_PurchaseOrderTPType/Supplier">
<Annotation Term="Common.ValueList">
<Record Type="Common.ValueListType">
<PropertyValue Property="CollectionPath" String="C_MM_SupplierValueHelp"/>
<PropertyValue Property="Parameters">
<Collection>
<Record Type="Common.ValueListParameterInOut">
<PropertyValue Property="LocalDataProperty" PropertyPath="Supplier"/>
<PropertyValue Property="ValueListProperty" String="Supplier"/>
</Record>
<Record Type="Common.ValueListParameterDisplayOnly">
<PropertyValue Property="ValueListProperty" String="SupplierName"/>
</Record>
<Record Type="Common.ValueListParameterIn">
<PropertyValue Property="LocalDataProperty" PropertyPath="CompanyCode"/>
<PropertyValue Property="ValueListProperty" String="CompanyCode"/>
</Record>
</Collection>
</PropertyValue>
</Record>
</Annotation>
</Annotations>

Result:

3) Line Item:

Shows the line items (the table on the list report page).

<Annotation Term="UI.LineItem">
<Collection>
<Record Type="UI.DataField">
<PropertyValue Property="Value" Path="PurchaseOrder"/>
</Record>
<Record Type="UI.DataField">
<PropertyValue Property="Value" Path="Supplier"/>
</Record>
<Record Type="UI.DataField">
<PropertyValue Property="Value" Path="CompanyCode"/>
</Record>
</Collection>
</Annotation>

Result:

4) Identification:

<Annotation Term="UI.Identification">
<Collection>
<Record Type="UI.DataField">
<PropertyValue Property="Value" Path="PurchaseOrder"/>
</Record>
<Record Type="UI.DataField">
<PropertyValue Property="Value" Path="CompanyCode"/>
</Record>
</Collection>
</Annotation>

Result:

5) Header Info:

<Annotation Term="UI.HeaderInfo">
<Record Type="UI.HeaderInfoType">
<PropertyValue Property="TypeName" String="{@i18n&gt;PURCHASE_ORDER}"/>
<PropertyValue Property="TypeNamePlural" String="{@i18n&gt;PURCHASE_ORDERS}"/>
<PropertyValue Property="Title">
<Record Type="UI.DataField">
<PropertyValue Property="Value" Path="PurchaseOrder"/>
</Record>
</PropertyValue>
<PropertyValue Property="Description">
<Record Type="UI.DataField">
<PropertyValue Property="Value" Path="PurchaseOrderType"/>
</Record>
</PropertyValue>
<PropertyValue Property="ImageUrl" String="/sap/bc/ui5_ui5/sap/mm_po_manages1/images/purchaseorder.jpg"/>
</Record>
</Annotation>

Result:

6) Data Point:

<Annotation Term="UI.DataPoint" Qualifier="postatus">
<Record Type="UI.DataPointType">
<PropertyValue Property="Value" Path="PurchaseOrderStatus"/>
<PropertyValue Property="Title" String="po status"/>
</Record>
</Annotation>
<Annotation Term="UI.DataPoint" Qualifier="netamunt">
<Record Type="UI.DataPointType">
<PropertyValue Property="Value" Path="PurchaseOrderNetAmount"/>
<PropertyValue Property="Title" String="NetAmount"/>
</Record>
</Annotation>
<Annotation Term="UI.DataPoint" Qualifier="progress">
Record Type="UI.DataPointType">
<PropertyValue Property="Value" Path="NumberOfOverduePurOrdItm"/>
<PropertyValue Property="Visualization" EnumMember="UI.VisualizationType/Progress"/>
<PropertyValue Property="Title" String="progress data"/>
<PropertyValue Property="TargetValue" Int="100"/>
</Record>
</Annotation>
<Annotation Term="UI.DataPoint" Qualifier="rating">
<Record Type="UI.DataPointType">
<PropertyValue Property="Value" Path="NumberOfOverduePurOrdItm"/>
<PropertyValue Property="Visualization" EnumMember="UI.VisualizationType/Rating"/>
<PropertyValue Property="Title" String="Rating"/>
</Record>
</Annotation>

Result:

7) Header Facets:

Map the data point annotation information here: in the example below, the rating and progress indicator data point annotations are mapped into the header facet as reference facets.

<Annotation Term="UI.HeaderFacets">
<Collection>
<Record Type="UI.ReferenceFacet">
<PropertyValue Property="Target" AnnotationPath="@UI.DataPoint#netamunt"/>
</Record>
<Record Type="UI.ReferenceFacet">
<PropertyValue Property="Target" AnnotationPath="@UI.DataPoint#postatus"/>
</Record>
<Record Type="UI.ReferenceFacet">
<PropertyValue Property="Target" AnnotationPath="@UI.DataPoint#rating"/>
</Record>
<Record Type="UI.ReferenceFacet">
<PropertyValue Property="Target" AnnotationPath="@UI.DataPoint#progress"/>
</Record>
</Collection>
</Annotation>

Result:

8) Object page: item's line item annotations:

Result:

Annotations:

Step 1) Prepare the line item annotation of the corresponding item entity with a qualifier.

<Annotation Term="UI.LineItem" Qualifier="item">
<Collection>
<Record Type="UI.DataField">
<PropertyValue Path="PurchaseOrderItem" Property="Value"/>
<Annotation Term="UI.Importance" EnumMember="UI.ImportanceType/High"/>
</Record>
</Collection>
</Annotation>

Step 2) Add the qualifier from step 1 as a reference facet in the entity's UI.Facets annotation.

<Annotation Term="UI.Facets">
<Collection>
<Record Type="UI.ReferenceFacet">
<PropertyValue Property="Label" String="Second Facet"/>
<PropertyValue Property="Target" AnnotationPath="to_PurchaseOrderItemTP/@UI.LineItem#item"/>
</Record>
</Collection>
</Annotation>

9) Navigate to sub-object page:

To navigate to a sub-object page from an object page, the code below is used in the manifest.json file.

"pages": {
"ObjectPage|to_PurchaseOrderItemTP": {
"navigationProperty": "to_PurchaseOrderItemTP",
"entitySet": "C_PurchaseOrderItemTP",
"component": {
"name": "sap.suite.ui.generic.template.ObjectPage"
}}}

10) Hide object page data fields:

We can use the UI.Hidden annotation to hide any data field on the screen.

<Annotation Term="UI.Identification">
<Collection>
<Record Type="UI.DataField">
<PropertyValue Property="Value" Path="MaterialGroup"/>
<PropertyValue Property="Label" String="material group"/>
<Annotation Term="UI.Hidden" Bool="true"/>
</Record>
</Collection>
</Annotation>

11) Field group annotation:

Here we can group several data fields together using the field group annotation.

Result:

Step 1) Add a field group annotation with a qualifier:

<Annotation Term="UI.FieldGroup" Qualifier="subobjectfg1">
<Record Type="UI.FieldGroupType">
<PropertyValue Property="Data">
<Collection>
<Record Type="UI.DataField">
<PropertyValue Property="Value" Path="OverdelivTolrtdLmtRatioInPct"/>
<PropertyValue Property="Label" String="over delivery"/>
</Record>
</Collection>
</PropertyValue>
</Record>
</Annotation>

Step 2) Add this qualifier as a reference facet under the item's entity in the annotation file:

<Annotations Target="MM_PUR_PO_MAINT_V2_SRV.C_PurchaseOrderItemTPType">
<Annotation Term="UI.Facets">
<Collection>
<Record Type="UI.CollectionFacet">
<PropertyValue Property="ID" String="GeneralInformation"/>
<PropertyValue Property="Label" String="{@i18n&gt;@GeneralInfoFacetLabel}"/>
<PropertyValue Property="Facets">
<Record Type="UI.ReferenceFacet">
<PropertyValue Property="Target" AnnotationPath="@UI.FieldGroup#subobjectfg1"/>
<PropertyValue Property="Label" String="Delivery"/>
</Record>
</Collection>
</Annotation>
</Annotations>

12) Add table in sub-object page:

If we have to add a table on the sub-object page, we can make use of the annotations below.

Result:

Here, inside the item object page, a table for schedule lines is added using annotations.

Step 1) Add the line item annotation for the corresponding entity:

<Annotations Target="MM_PUR_PO_MAINT_V2_SRV.C_PurOrdScheduleLineTPType">
<Annotation Term="UI.LineItem">
<Collection>
<Record Type="UI.DataField">
<PropertyValue Property="Value" Path="ScheduleLine"/>
<PropertyValue Property="Label" String="scheduleline"/>
</Collection>
</Annotation>
</Annotations>

Step 2) Add the above line item annotation as a reference facet under UI.Facets for the corresponding entity (the entity of the currently navigated page).

<Annotations Target="MM_PUR_PO_MAINT_V2_SRV.C_PurchaseOrderItemTPType">
<Annotation Term="UI.Facets">
<Collection>
<Record Type="UI.CollectionFacet">
<PropertyValue Property="ID" String="GeneralInformation"/>
<PropertyValue Property="Label" String="{@i18n&gt;@GeneralInfoFacetLabel}"/>
<PropertyValue Property="Facets">
<Collection>
</Collection>
</PropertyValue>
</Record>
<Record Type="UI.ReferenceFacet">
<PropertyValue Property="Target" AnnotationPath="to_PurOrdScheduleLineTP/@UI.LineItem"/>
<PropertyValue Property="Label" String="schedule lines"/>
</Collection>
</Anotation>
</Annotations>

13) More link:

Inside the General Information tab, when we have multiple sections of data, we can show or hide some of them using the Show More / Show Less buttons. Setting UI.PartOfPreview to false places the referenced facet behind the Show More link.

Result:

<Record Type="UI.ReferenceFacet">
<PropertyValue Property="Target" AnnotationPath="@UI.FieldGroup#DeliveryInvoiceGroup2"/>
<PropertyValue Property="Label" String="delivery group1"/>
<Annotation Term="UI.PartOfPreview" Bool="false"/>
</Record>