Connecting Multiple Databases and OData Services to SAP BTP using BAS

Introduction

In this post, I will explain how to connect multiple databases and OData services to SAP BTP using SAP Business Application Studio (BAS). This guide covers configuring UI5 applications to consume multiple OData services from both SAP S/4HANA (on-premise) and the SAP BTP ABAP environment, ensuring seamless data integration.

Context

I aim to integrate three OData services from different back-end sources:

  1. On-premise S/4HANA OData V2 services
    • ZPR_EMP_DETAILS_SRV: fetches employee details.
    • ZFI_TEST_ODATA_SRV: fetches financial data.
  2. SAP BTP ABAP environment OData V4 service
    • ZUI_TAB_TRAVDAT_O4_AKS_001: fetches travel-related data using the RAP model.

Steps to Connect Multiple OData Services

Step 1: Add Data Sources in UI5 Application

  1. Create a UI5 application in VS Code.
  2. While creating the application, provide the service URL to configure an OData service automatically.
  3. Navigate to webapp/manifest.json.
  4. Modify the dataSources section to include multiple OData services as follows:
"dataSources": {
  "mainService": {
    "uri": "/sap/opu/odata/sap/ZPR_EMP_DETAILS_SRV/",
    "type": "OData",
    "settings": {
      "odataVersion": "2.0"
    }
  },
  "subService": {
    "uri": "/sap/opu/odata/sap/ZFI_TEST_ODATA_SRV/",
    "type": "OData",
    "settings": {
      "odataVersion": "2.0"
    }
  },
  "rapService": {
    "uri": "/sap/opu/odata4/sap/zui_tab_travdat_o4_aks_001/",
    "type": "OData",
    "settings": {
      "odataVersion": "4.0"
    }
  }
}

Step 2: Configure Models in manifest.json

Each model corresponds to an OData service:

"models": {
  "": {
    "dataSource": "mainService",
    "preload": true,
    "settings": {}
  },
  "subModel": {
    "dataSource": "subService",
    "preload": true,
    "settings": {}
  },
  "rapModel": {
    "dataSource": "rapService",
    "preload": true,
    "settings": {
      "operationMode": "Server",
      "autoExpandSelect": true,
      "earlyRequests": true
    }
  }
}
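As an illustration of how these named models can be consumed (a minimal sketch; the entity set names EmployeeSet, FinancialDataSet and Travel are placeholders, not taken from the actual services), a controller could read from each service like this:

onInit: function () {
    // OData V2 model bound to "mainService" (the default, unnamed model)
    var oMainModel = this.getOwnerComponent().getModel();
    oMainModel.read("/EmployeeSet", {
        success: function (oData) { console.log("Employees:", oData.results); },
        error: function (oError) { console.error(oError); }
    });

    // OData V2 model bound to "subService"
    var oSubModel = this.getOwnerComponent().getModel("subModel");
    oSubModel.read("/FinancialDataSet", {
        success: function (oData) { console.log("Financial data:", oData.results); }
    });

    // OData V4 model bound to "rapService" - V4 has no read(), so use a list binding
    var oRapModel = this.getOwnerComponent().getModel("rapModel");
    oRapModel.bindList("/Travel").requestContexts(0, 10).then(function (aContexts) {
        aContexts.forEach(function (oContext) { console.log(oContext.getObject()); });
    });
}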

Step 3: Configure ui5.yaml for Backend Connections

In ui5.yaml, configure the destinations for OData services:

server:
  customMiddleware:
    - name: ui5-middleware-simpleproxy
      afterMiddleware: compression
      configuration:
        ui5:
          path:
            - /resources
            - /test-resources
          url: https://ui5.sap.com
        backend:
          # On-premise SAP system destination 1 - replace with your URL
          - scp: true
            path: /sap/opu/odata/sap/ZPR_EMP_DETAILS_SRV/
            url: https://your-onpremise-system-hostname:port/
            destination: S4HANA_CONN
            client: 'your client ID'
            authenticationType: BasicAuthentication

          # On-premise SAP system destination 2 - replace with your URL
          - scp: true
            path: /sap/opu/odata/sap/ZFI_TEST_ODATA_SRV
            url: https://your-onpremise-system-hostname:port/
            destination: S4HANA_CONN
            client: 'your client ID'
            authenticationType: BasicAuthentication

          # SAP BTP ABAP environment destination - replace with your instance URL
          - scp: true
            path: /sap/opu/odata4/sap/zui_tab_travdat_o4_aks_001/
            url: https://your-abap-instance-guid.abap.region.hana.ondemand.com
            destination: abap-cloud-default_abap-trial-9a87acc7trial-dev
            client: 'your client ID'
            authenticationType: OAuth2UserTokenExchange

Destinations:

S4HANA_CONN is used for the two OData V2 services hosted in the on-premise S/4HANA system, as shown in the screenshot below.

abap-cloud-default_abap-trial-9a87ace7trial-dev is used for the RAP-based OData V4 service hosted on SAP BTP, as shown in the screenshot below.

Authentication Types:

  • On-premise S/4HANA services use BasicAuthentication, which is simpler but less secure for cloud-based scenarios.
  • The SAP BTP service uses OAuth2UserTokenExchange, enabling secure token-based communication.

All the connected OData services are listed in the Service Manager, as shown in the screenshot below.

Avoid connecting directly to OData services via the Service Manager, as this can lead to data inconsistency issues (for example, all requests may unintentionally retrieve data only from the last edited destination). Instead, I prefer using the Destination service (as shown in the configuration above) for:

  • Stable & secure connections
  • Proper authentication handling (Basic/OAuth2)
  • Consistent data retrieval (prevents mixing/overwriting datasets).

Conclusion

By following these steps, you can successfully integrate multiple OData services in an SAP BTP-based Fiori application. This setup allows seamless data flow between on-premise S/4HANA and the cloud-based SAP BTP ABAP environment, ensuring a modern, scalable architecture.

How to enhance system generated ODATA service in SAP BW4HANA

Overview: This blog will guide you through the steps to enhance a system-generated OData service in SAP BW/4HANA.

Prerequisites:

  • SAP BW4HANA – SAP BW Query
  • SAP BW4HANA – OData Services

In BW/4HANA, an OData service can be generated automatically from a BW query by selecting the "By OData" checkbox. This service can then be activated in any SAP Gateway system and consumed from there.

If the system-generated service needs to be enhanced, the steps below can be followed.

Steps:

Step 1: Generate OData service from BW Query.

Step 2: Create a project in t-code SEGW. Please remember that <Project Name>_SRV is going to be the new service name, so name the project accordingly.

Step 3: Right click on the Data Model and choose "Redefine" -> "OData Service (SAP GW)".

Step 4: On this screen, you are asked for a service. Select the system-generated service that needs to be enhanced and click "Next".

Step 5: Click "Select All" and then click "Finish".

Step 6: Enhance the result set with the new field and save.

Step 7: Right click on the project and click on "Generate Runtime".

Runtime artifacts will be generated as shown below.

Step 8: Go to the DPC_EXT class (data provider extension class).

Select the GET_ENTITYSET method and click the "Redefine" button.

Step 9: Write the logic to populate the newly added field here. The superclass GET_ENTITYSET code is shown by default; start the new logic after the default code. Please note that this code will be executed for each row in the output.

Step 10: Test the output of OData service.

Go to transaction /IWFND/GW_CLIENT, enter the URL in the Request URI field, and click Execute.
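For example (the service and entity set names below are placeholders for the generated ones), a request URI of the following form can be used to check the enhanced service:

GET /sap/opu/odata/sap/<PROJECT_NAME>_SRV/<EntitySet>?$top=10&$format=json

The response should contain the newly added field populated by the redefined GET_ENTITYSET logic.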

Validate the output against the written logic.

Integration Between SAP Datasphere and SAP Analytics Cloud

In this blog, I want to provide the steps we followed to create an OData service connection to send data from SAP Datasphere to SAP Analytics Cloud (SAC). By integrating Datasphere actuals into SAC, we lay the foundation for strategic decision-making, driving business growth and agility.

Sending Data from Datasphere to SAC via OData Service

Our goal for this task is to consume actuals from Datasphere into SAC for reporting and planning purposes. We will combine the actuals from Datasphere with planning data in SAC.

Here is a list of steps we took to import Datasphere actual data into SAC:

1. Identify the OData Source (Data Service URL) in Datasphere

To identify the Data Service URL, we went to the following link. Within this URL, you can see all the assets available within the spaces in your Datasphere tenant.

Within this URL, we copied the assetRelationalDataUrl for the dataset we required. This is your Data Service URL. We then looked up our chosen assetRelationalDataUrl using the URL in the figure below, which returns the metadata of the OData service.

NOTE: To view the data from your chosen asset, add the dataset name again at the end of the URL.

2. Look up your Redirect URI in SAC

The redirect URI in SAC is a specific endpoint URL. It is the destination to which the authentication server redirects the user's browser after authentication has been completed successfully.

We added an OData Services connection in our SAC tenant and changed the Authentication Type to OAuth 2.0 Authorization Code. At the bottom of this window, we can see the Redirect URI.

3. Create the OAuth Client in Datasphere

We now want to create an OAuth client in Datasphere, using the Redirect URI from the step above. We do this by going to System >> Administration >> App Integration and clicking "Add a New OAuth Client". Once you paste the Redirect URI and click Add, you get the following information, which you need to note down:

  • OAuth Client ID
  • Secret
  • Authorization URL
  • Token URL

4. Create the OData Connection in SAC

Within SAC, we go to Connections >> Add a Connection and choose OData Services. You now need to add the following connection parameters:

  • Data Service URL = assetRelationalDataUrl
  • OAuth Client ID
  • Secret
  • Token URL
  • Authorization URL

5. Import from the OData Connection

We now need to create a model in SAC. We choose OData Services as the data source and select the new connection we just made. We then create a new query and select which dimensions and measures we want when building that query.

You can see that we have now consumed actuals data from Datasphere into SAC. At this stage you can clean the data and fix any issues. You have successfully imported the actuals data from the OData connection.

6. Create a Story and Test

As you can see here, we created a story and added a table to it. We chose the query we created above as the data source for the table. We added "Category" as a column, and using the version management functionality we can choose both the Actual and Plan categories. This shows actual data from Datasphere and plan data within SAC.

Service Consumption Model 2 for OData Client Proxy

      Introduction

This blog post will describe the Service Consumption Model 2 for OData. I will describe its benefits, provide a description of the model, take a look at the OData Client Proxy at runtime, and dive into the ABAP cross-trace integration.

If you want to receive data from or send data to an OData V2 or V4 service within SAP BTP, ABAP Environment or SAP S/4HANA ABAP Environment, you can use the "Service Consumption Model 2" for OData. As of 2311, it is also possible to consume complex types, complex collections, bound actions and bound functions. Of course, the consumption of entity types and entity sets is possible as well.

In addition, you get fewer generated artifacts. The persistence of the underlying model has changed completely. In the first version you get an abstract CDS view for each entity type, so for large services with multiple entity types you could end up with a lot of artifacts. Now only one class with type definitions is created.

      Many artifacts in the first version of the Service Consumption Model for OData

      Use case

If you have an SAP BTP, ABAP Environment or SAP S/4HANA ABAP Environment system, you can use the Service Consumption Model for OData. From there you can connect to a cloud or an on-premise system.

      Scenario

      In my scenario I am consuming the /dmo/travel service from another Cloud system. A call to the EntitySet Travel returns the following data:

      GET /sap/opu/odata4/dmo/api_travel_u_v4/srvd_a2x/dmo/travel_u/0001/Travel
      {
          "@odata.context": "$metadata#Travel",
          "@odata.metadataEtag": "W/\"20230919122803\"",
          "@odata.nextLink": "/sap/opu/odata4/dmo/api_travel_u_v4/srvd_a2x/dmo/travel_u/0001/Travel?$skiptoken=100",
          "value": [
              {
                  "@odata.etag": "W/\"SADL-202305191948080000000C~20230519194808.0000000\"",
                  "AgencyID": "70041",
                  "AgencyName": "Maxitrip",
                  "BeginDate": "2023-06-03",
                  "BookingFee": 40.0,
                  "CurrencyCode": "USD",
                  "CustomerID": "594",
                  "CustomerName": "Ryan",
                  "EndDate": "2024-03-31",
                  "LastChangedAt": "2023-05-19T19:48:08Z",
                  "Memo": "Vacation",
                  "SAP__Messages": [
                  ],
                  "Status": "P",
                  "StatusText": "Planned",
                  "TotalPrice": 1889.0,
                  "TravelID": "1"
              },
              ...

      Service Consumption Model 2 for OData

As of 1808, you can use the Service Consumption Model for OData. From 2311 on, the wizard in ADT automatically uses the new version.

      Wizard

Select File -> New -> Service Consumption Model in ADT and choose OData as the consumption mode.

The consuming system requires a representation of the remote service. This knowledge is used to create the URL and to write and read the JSON from the HTTP requests and responses. Therefore, the wizard needs the EDMX file of the remote service (a service metadata document that describes the data model exposed by the service as an HTTP endpoint; OData uses EDMX as the format for this description). You can get it by adding $metadata to the end of the service document URL; in my case it is the following URL:

      GET /sap/opu/odata4/dmo/api_travel_u_v4/srvd_a2x/dmo/travel_u/0001/$metadata
      

I saved this file on my computer to use it in the wizard. In addition, I chose ZBG_TRAVEL_SCM as the class name. This class is the model representation and will contain all the types for my client.

      EDMX import and class name

The EDMX file is analyzed beforehand to identify potential problems. Certain artifacts may be ignored, for example because parts of them violate the OData metadata rules. In my case, the EDMX file describes several entity types and entity sets, a complex type and a bound action.

Analysis of the EDMX file

The next step looks for the OptimisticConcurrency annotation of the Org.OData.Core.V1 vocabulary. If an EntitySet has this annotation, modifying requests must use an ETag. If an EntitySet does not have this annotation, you can enable ETag support here.

      ETag support

      Finally, I get the Service Consumption Model 2 for OData:

      Service Consumption Model 2 for OData

In the upper left section, you can see the model class that describes the /dmo/travel service. I use the code snippet for the Travel entity set with "Read List" as the operation as a starting point for my OData client.

      Model class

      The model class has the following parts:

      1. Type definitions that can be used in my client code. I use a table of zbg_travel_scm=>tys_travel_type to retrieve the travel data.
      2. To find the corresponding types, constants are created for EntitySets, EntityTypes, ComplexTypes, Actions and Functions. Here is the ABAP doc for the EntityType constant, including the link to the type:

3. The model definition is done in several method implementations. The types are used to define all the artifacts of the remote service. In addition, a mapping between the ABAP and the EDMX names is done here. ABAP artifact names are limited to 30 characters; in EDMX they can be up to 128 characters long in camel case.

      If you need to customize the result of the wizard, e.g. because you need to adapt to certain naming conventions, you can modify the source code of the generated class and adapt it to your needs.

      Support for Action and Functions

To call an action or a function, structures and tables for the parameters are generated. With 2311, the ADT integration is still missing, so I can't select a bound action, an action import, a bound function or a function import and use a code snippet from ADT. However, the parameter structure is there and I can use it at runtime to call the operation.

      Connecting to a remote service

In a cloud system, I need an outbound communication scenario, an outbound service and a communication arrangement for the HTTP connection. The tutorial "Prepare Consuming System and Service Consumption Model" describes the steps to achieve this.

      OData Client Proxy at Runtime

For my client, I have used the code snippet from ADT (right-hand side in the Service Consumption Model) to read the Travel entity set. The client code uses the types from the model. I changed three things after that:

      1. Establish the HTTP connection using the communication scenario and the outbound service.
      2. Change the IV_RELATIVE_SERVICE_ROOT parameter of the factory (line 56 in the screenshot below) to point to the OData service. This path depends on the communication system.
      3. Add the out->write() statement at the end (line 81).

And voilà, the GET request and the transformation of the JSON response to ABAP were done for me.

      OData Client Proxy at Runtime

      Conclusion

The Service Consumption Model 2 for OData supports more OData features and generates fewer artifacts. The model class can be adapted to my needs if the result of the wizard does not meet my expectations. It would be great to get feedback from you if this is indeed the case, to further improve the wizard.

      Tip for developers

Sometimes it is good to know what the OData Client Proxy does under the hood; especially connecting to another system can be tricky. Therefore, the OData Client Proxy is part of the ABAP cross trace (in ADT: Window -> Show View -> ABAP Cross Trace):

      Activate OData Client Proxy in cross trace

      In the trace result you can see for example the response payload and the CSRF token fetch:

      http payload in cross trace
Filter in Function import using oData and UI5

I am writing this blog for beginners on how to filter data using a function import in SAP UI5 with OData. It will be helpful for both front-end and back-end developers.

Function imports are used to perform GET and POST operations for requirements that are not possible with the standard operations available in OData.

Sometimes we may want to filter the data based on a name or a city (a non-key value); in that case we can use a function import.

We will implement the function import code inside the EXECUTE_ACTION method by redefining it in the DPC extension class.

The steps to create a function import are as follows.

First, we will create the function import in OData.

Step 1:- Create a table using t-code SE11.

      
Create the table: enter the table information, then create the fields and assign data elements. Click on the data element for the Phone field and confirm with Yes; you then have to give a domain name, the data type and the number of characters. Activate it.

Next, maintain the enhancement category of the table: choose More -> Extras -> Enhancement Category, select the enhancement category, and select the data class. Now activate the table.

After this, enter data into the table via More -> Utilities -> Table Contents -> Create Entries.

Step 2:- Go to SEGW and create a new project.

Give the project a name and save it as a local object or in a package. The project is created and will look like this.

Step 3:- Right click on Data Model, select Import, and then select DDIC Structure.

Give a name and the table name and click Next. Select the table name and click Next. Select "Is Key" for the key field and click Finish.

Now generate the runtime artifacts. The runtime artifacts are generated.

Step 4:- Right click on Data Model, select Create, and then click on Function Import.

Give the function import a name and click Continue, then fill in the details. Now the project looks like this.

Click on the Function Import folder and fill in the following: give a name for the parameter you want to search by.

After this, generate the runtime artifacts.

Step 5:- Click on DPC_EXT and redefine EXECUTE_ACTION.

Write the filtering logic in the EXECUTE_ACTION method.

Come back to SEGW. Click on Service Maintenance and register the service: choose Local, select Continue, and then select the local package.

After this, click on SAP Gateway Client and select Yes.

Step 6:- Give the function import name followed by the parameter value you want to filter the data by.

Here I have given Designation='DEVELOPER', so only data for developers will be visible.
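For reference, the Gateway Client request has the following form (the service name is a placeholder; the function import name and parameter must match what was defined in SEGW, and the value needs single quotes for a string parameter):

GET /sap/opu/odata/sap/<PROJECT_NAME>_SRV/ZAP_FunctionDemo?Designation='DEVELOPER'&$format=json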

This is about the OData function import.

      —————————————————————————————————————————-

Function Import using UI5 in VS Code

If you have not installed VS Code, install VS Code and Node.js.

Open VS Code, click on View -> Command Palette, search for the option to explore and install generators, and install the generators.

Step 1:- Create a new project (click on View -> Command Palette -> Fiori: Open Application Generator) and select the Basic template.

Step 2:- Select the service of the function import: choose "Connect to a System" as the data source, the destination configuration as the system, and the SEGW function import service as the service. Click Next.

Give a view name and a project name and click Finish. After this, the project template will be generated.

Step 3:- First we will create a JSON model to store the data.

In the model folder, create a file called data.json.

Then configure it in manifest.json. Here "data" is the model name.

Step 4:- The view.

The view name is Home.view.xml. Here I have used a Select control to choose the key and a Table to display the data based on the selection.

      <mvc:View xmlns:core="sap.ui.core" controllerName="functionimport.controller.Home"
      xmlns:mvc="sap.ui.core.mvc" displayBlock="true" xmlns="sap.m">
      <App >
      <Page id="page" title="{i18n>title}">
      <customHeader>
      <Bar >
       <contentLeft>
      <Label />
      <Title text="Designation"/>
      <Select selectedKey="SELECT" id="designationsearchid" change="onPress">
      <items>
      <core:Item text="-----Select one------" key="SELECT"></core:Item>
      <core:Item text="DEVELOPER" key="DEVELOPER"></core:Item>
      <core:Item text="TESTER" key="TESTER"></core:Item>                            
      </items>
      </Select>
      </contentLeft>
      </Bar>
      </customHeader>
      <content>
      <Table items="{data>/}">
      <columns>
          <Column >
          <Text text="NAME"></Text>
          </Column>
          <Column >
          <Text text="EMAIL"></Text>
          </Column>
          <Column >
           <Text text="PHONE"></Text>
          </Column>
          <Column >
          <Text text="AGE"></Text>
          </Column>
          <Column >
          <Text text="DESIGNATION"></Text>
          </Column>
          </columns>
              <items>
                 <ColumnListItem >
                    <cells>
                      <Text text="{data>Name}"></Text>
                    </cells>
      
                    <cells>
                       <Text text="{data>Email}"></Text>
                    </cells>
      
                     <cells>
                        <Text text="{data>Phone}"></Text>
                     </cells>
      
                     <cells>
                        <Text text="{data>Age}"></Text>
                      </cells>
      
                      <cells>
                        <Text text="{data>Designation}"></Text>
                      </cells>
      
                  </ColumnListItem>
               </items>
            </Table>
      
                  
              </content>
          </Page>
         </App>
      </mvc:View>

Step 5:- The controller.

The controller name is Home.controller.js. In the controller we first get the selected key and the models. Then we use callFunction, where we need to pass the function import name, the HTTP method and the URL parameters.

onPress: function () {
    // Get the designation selected in the Select control
    var selectedKey = this.getView().byId("designationsearchid").getSelectedKey();
    // Named JSON model "data" that the table is bound to
    var oModel = this.getView().getModel("data");
    var oContext = this;
    // Default OData model of the component
    var oDataModel = this.getOwnerComponent().getModel();

    // Call the function import with the selected designation as URL parameter
    oDataModel.callFunction("/ZAP_FunctionDemo", {
        method: "GET",
        urlParameters: {
            DESIGNATION: selectedKey
        },
        success: function (oData, response) {
            // Put the result rows into the JSON model so the table bound to "data>/" refreshes
            oModel.setData(oData.results);
            oContext.getView().setModel(oModel, "data");
        },
        error: function (oError) {
            // Handle errors here, e.g. show a message toast
        }
    });
}
});

Output

Run the application in the terminal using npm start. The output view shows the filtered data when selecting Developer or Tester.
VSCODE Color Theme Support in OData CSDL Modeler

      In this blog post, we’ll introduce the VSCODE color theme support in the new OData CSDL Modeler 1.0.6 release.

      VSCODE Color Theme Support in OData CSDL Modeler

We introduced a new graphical modeler tool called OData CSDL Modeler for the VSCODE environment that provides OData developers with a graphical UI to view OData CSDL documents. With the latest release of the CSDL Modeler, version 1.0.6, the modeler now supports VSCODE color themes.

In order to test VSCODE color theme support in the OData CSDL Modeler, first open the OData CSDL or EDMX file with the modeler through the VSCODE context menu "Open With…". Then launch the color theme selection through the VSCODE menu "File" -> "Preferences" -> "Color Theme":

Now you can see the available color themes in VSCODE as below:

      And then you can switch to different color themes:

Importing an external data model using CDS Graphical Modeler for Visual Studio Code

      In this blog post, we’ll demonstrate how to import an external OData data model from an edmx file to an existing CDS model, and create a relationship to one of the OData entities from this external data model through the CDS Graphical Modeler for Visual Studio Code.

      Preparing the External Data Model

      First install the CDS Graphical Modeler for VSCODE from the VSCODE Marketplace at https://marketplace.visualstudio.com/items?itemName=SAPSE.vscode-wing-cds-editor-vsc.

      After preparing your CDS project at VSCODE, open your cds model through the CDS Graphical Modeler. Then try to create an entity by clicking the “Add Entity” toolbar button:

      Create a “Customers” entity on the entity dialog in the model:

      Click “Create” button and close the dialog, and you will see that the new “Customers” entity has been created:

      Now before importing the external data model, copy the edmx file that contains the OData model to your project:

      You can also look into the OData model through the OData CSDL Modeler which is also a VSCODE extension by installing the tool at https://marketplace.visualstudio.com/items?itemName=SAPSE.vsc-extension-odata-csdl-modeler.

      Open the edmx file using the CSDL modeler:

      Since the imported edmx file contains a complex OData model, you can locate the external model by clicking the “Search” button from the toolbar that launches the entity search dialog:

      Suppose you’re importing the “BusinessPartner” entity into your CDS model, now type “businesspartner” to locate your entity:

      Click the entity type in the list:

      The entity will be shown in the canvas. You can then check the details of this entity from both the canvas as well as the property sheet:

      Importing an External Data Model using CDS Graphical Modeler

      Let’s go back to the CDS model. Click “Import” button on the model:

      Click “External Files” menu item:

      Select the edmx file and click “Open” button to close the dialog:

      You can optionally provide an alias for the imported data model, otherwise click “OK” button to finish the import. CDS model will navigate to the imported namespace automatically:

      Click the “Home” button to go back to your main CDS namespace:

To create an association from the newly created "Customers" entity to the business partner entity, select the entity and click the "Add Relationship" context menu item:

      You will see the linkage line moving along with your mouse.

      This will launch the relationship dialog:

      In the “Target Entity Type” drop down list, type in “businesspartner” to filter the entity types:

      Select the business partner entity from the drop down list:

      Click “Create” button and close the dialog:

      Now we have successfully created a managed association from our data model to an external entity that is from an imported OData model in an edmx file.

Create a Business Service using CDS, Node.js, SQLite and an OData service combined with an SAPUI5 App

      Introduction

In this blog post you will learn how to create a basic business service using Core Data & Services (CDS), Node.js, and SQLite, by making use of the SAP Cloud Application Programming Model (CAP), combined with an SAP Fiori Worklist floorplan.

Basically, the same thing can be done using Visual Studio Code with the SAP CDS Language Support extension. I chose the Business Application Studio because it gives me the opportunity to use the layout editor, which is not yet available in Visual Studio Code.

      Motivation

I love to go out cycling or walking, and I noticed that, depending on the season, the weather or the location, some garbage bins are often overflowing while others that are more remote remain empty.

So, I thought about a business service which every community can integrate into their SAP landscape to keep their community clean.

The idea is to build an app that shows the fill level of each garbage bin or glass container. Every bin or container has a sensor (in this tutorial simulated by a Postman collection) that reports the level to the SAP app via an OData service, so the waste collector can empty the bin or container.

Thinking about the near future, this could trigger a self-driving garbage collector to empty the garbage bin or glass container on demand. To simulate such a scenario I have built a little proof of concept with a self-driving toy car, controlled by a Raspberry Pi and a Bosch connector, that reports the level of a bin or container to the SAP app. This tutorial will focus on the SAP business app though.

      Prerequisites

The prerequisite for this job is to have set up a free account on SAP BTP Trial:

      https://developers.sap.com/tutorials/hcp-create-trial-account.html

      1. Create Dev Space on SAP BTP TRIAL account

Go to SAP BTP Trial and click on (or go directly to) SAP Business Application Studio.

      Then “Create Dev Space”

Give your Dev Space a name (I called it TestDev) and choose "Full Stack Cloud Application". Click on "Create Dev Space":

      Then the Dev Space needs a minute or two to get started.

      Open your TestDev Space and the SAP Business Application Studio shows up:

      2. Deploy a data model to an SQLite database

      Add a new file to the “db” folder by right-clicking and “New” and call it data-model.cds.

      Add the following code to data-model.cds:

namespace my.communitymanager;

using { managed } from '@sap/cds/common';

entity Buckets {
  key ID       : Integer;
  title        : String;
  location     : Association to Locations;
  locationName : String;
  level        : Boolean;
}

entity Locations {
  key ID : Integer;
  name   : String;
  bucket : Association to Buckets on bucket.location = $self;
  long   : Decimal;
  lat    : Decimal;
}

entity CallsForDisposal : managed {
  key ID : UUID;
  bucket : Association to Buckets;
  level  : Boolean;
}

Add a new file to the "db" folder by right-clicking and "New" and call it csv/my.communitymanager-Buckets.csv. This creates the file in a new folder named csv.

      Add the following code to my.communitymanager-Buckets.csv:

      ID;title;location_ID;locationName;level
      101;Bucket1;201;Dietmar-Hopp-Allee;false
      102;Bucket2;202;Rudolf-Diesel-Strasse;true
      103;Glas-Container1;203;Daimlerstrasse;false
      104;Glas-Container2;204;Robert-Bosch-Strasse;true

Add a new file to the "db" folder by right-clicking and "New" and call it csv/my.communitymanager-Locations.csv. This creates the file in the csv folder.

      Add the following code to my.communitymanager-Locations.csv:

      ID;name;long;lat
      201;Dietmar-Hopp-Allee;8.642059;49.293005
      202;Rudolf-Diesel-Strasse;8.644433;49.291635
      203;Daimlerstrasse;8.645690;49.289374
      204;Robert-Bosch-Strasse;8.647599;49.291037

      Save with Ctrl + S.

If the CDS server is still running, stop it with CTRL + C in the terminal.

      Install SQLite3 packages:

      npm i sqlite3 -D

      Deploy the data model to an SQLite database:

      cds deploy --to sqlite:db/my-communitymanager.db

      Open SQLite and view the newly created database:

      sqlite3 db/my-communitymanager.db -cmd .dump

      To stop SQLite and go back to your project directory, choose CTRL+D.

      Start CDS server again:

      cds watch

      3. Build a Business Service using CAP and Node.js

      Choose “Start from template” to create a new project.

      Choose “CAP Project” and click “Start”:

      Give it a name. I called it CommunityManager and click “Finish”:

      You will see the following structure in the EXPLORER:

      Open a new Terminal by clicking Terminal -> New Terminal and run:

      npm install

      and:

      cds watch

      This will try to start the CDS server. As there’s no content in the project so far, it just keeps waiting for content.

      Add a new file to the “srv” folder by right-clicking and “New” and call it com-service.cds.

      Add the following code to com-service.cds:

using my.communitymanager as my from '../db/data-model';

service DisposalService {

  entity Buckets          @readonly   as projection on my.Buckets;
  entity Locations        @readonly   as projection on my.Locations;
  entity CallsForDisposal @insertonly as projection on my.CallsForDisposal;

}
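Once cds watch picks up the new service, these projections are served as an OData service. As a quick sanity check (a sketch; the service path /disposal and port 4004 are CAP defaults and may differ in your setup), the read-only Buckets projection can be queried directly:

GET http://localhost:4004/disposal/Buckets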

      4. Create an OData service for CRUD operations on database

      Add a new file to the “srv” folder by right-clicking and “New” and call it com-service.js.

      Add the following code to com-service.js:

// Import the CAP runtime to get access to the querying API (cds.transaction, UPDATE, ...)
const cds = require('@sap/cds')

module.exports = (srv) => {

  const Buckets = 'my.communitymanager.Buckets'

  // Set level of bucket. true -> full, false -> empty
  srv.before('CREATE', 'CallsForDisposal', async (req) => {
    const disposal = req.data
    if (disposal.level == 1) console.log('level = true')
    if (disposal.level == 0) console.log('level = false')
    const tx = cds.transaction(req)
    // Update the referenced bucket with the reported level
    const affectedRows = await tx.run(
      UPDATE(Buckets)
        .set({ level: disposal.level })
        .where({ ID: disposal.bucket_ID })
    )
  })
}
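To trigger the insert-only CallsForDisposal entity, and with it the srv.before handler above, a plain HTTP request is enough (a sketch; the service path, port and payload values are assumptions based on CAP defaults and the sample CSV data):

POST http://localhost:4004/disposal/CallsForDisposal
Content-Type: application/json

{
    "bucket_ID": 101,
    "level": true
}

This is essentially what the Postman collection used later in this post does.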

      5. Build a Fiori Worklist Floorplan

      Choose “SAP Fiori Application” to create a new project:

      Choose “SAPUI5 freestyle” and then “SAP Fiori Worklist Application”:

      Choose “Use a Local CAP Project” as data source, select your CAP project as project folder and connect the “DisposalService” as OData service:

      Configure the selected service as following:

      Configure project attributes as following:

      In the Terminal, follow the link to the App:

      Open the SAPUI5 Web Application “/community-manager/webapp/index.html”:

      And see the first draft:

      6. Modify Worklist and use event handler

      Open Worklist.view.xml:

      and add the following code to bring the table in a better shape:

      <mvc:View xmlns="sap.m" xmlns:mvc="sap.ui.core.mvc" xmlns:semantic="sap.f.semantic" controllerName="ns.communitymanager.controller.Worklist"><semantic:SemanticPage id="page" headerPinnable="false" toggleHeaderOnTitleClick="false">
      
      <semantic:titleHeading>
      <Title text="Community Manager"/>
      </semantic:titleHeading>
      
      <semantic:content>
      <Table id="table" width="auto" items="{ path: '/Buckets', sorter: { path: 'title', descending: false } }" noDataText="{worklistView>/tableNoDataText}" growing="true" growingScrollToLoad="true" updateFinished=".onUpdateFinished">
      
      <headerToolbar>
      <Toolbar>
      <Title id="tableHeader" text="Buckets"/>
      <ToolbarSpacer/>
      <SearchField id="searchField" tooltip="{i18n>worklistSearchTooltip}" search=".onSearch" width="auto">
      </SearchField>
      </Toolbar>
      </headerToolbar>
      
      <columns>
      <Column id="nameColumn">
      <Text text="Bucket ID" id="columnID"/>
      </Column>
      <Column xmlns:mvc="sap.ui.core.mvc" xmlns:semantic="sap.f.semantic" xmlns="sap.m" id="unitNumberColumn_copy3" hAlign="Left">
      <header>
      <Text xmlns="sap.m" text="Title" id="columnTitle"/>
      
      </header>
      
      </Column>
      <Column id="unitNumberColumn" hAlign="Left">
      <header>
      <Text xmlns="sap.m" text="Location" id="columnLocationID"/>
      </header>
      </Column>
      <Column xmlns:mvc="sap.ui.core.mvc" xmlns:semantic="sap.f.semantic" xmlns="sap.m" id="unitNumberColumn_copy2" hAlign="Left">
      <header>
      <Text xmlns="sap.m" text="Level" id="columnServiceCall"/>
      
      </header>
      
      </Column>
      </columns>
      
      <items><ColumnListItem type="Navigation" press=".onPress">
      <cells>
      <ObjectIdentifier title="{ID}" id="identifier0"/>
      <ObjectAttribute text="{ path: 'title', formatter: '.formatter.textUnit' }" id="attribute0"/>
      <ObjectNumber number="{ path: 'locationName', formatter: '.formatter.integerUnit' }" unit="{location}" id="number0"/>
      <ObjectStatus xmlns="sap.m" id="status0" state="{= ${level} === 'Yes' ? 'Error' : 'Success'}" text="{= ${level} === 'Yes' ? 'Full' : 'Empty'}"/>
      </cells>
      </ColumnListItem>
      </items>
      </Table>
      </semantic:content>
      
      <semantic:sendEmailAction>
      <semantic:SendEmailAction id="shareEmail" press=".onShareEmailPress"/>
      </semantic:sendEmailAction>
      
      </semantic:SemanticPage>
      </mvc:View>

      And see the changes in the app:

      Open “Worklist.controller.js”:

and add the onRefresh() call to the onUpdateFinished function, so that the function looks like this:

onUpdateFinished : function (oEvent) {
    // update the worklist's object counter after the table update
    var sTitle,
        oTable = oEvent.getSource(),
        iTotalItems = oEvent.getParameter("total");
    // refresh the binding so level changes from the backend become visible
    this.onRefresh();
    // only update the counter if the length is final and
    // the table is not empty
    if (iTotalItems && oTable.getBinding("items").isLengthFinal()) {
        sTitle = this.getResourceBundle().getText("worklistTableTitleCount", [iTotalItems]);
    } else {
        sTitle = this.getResourceBundle().getText("worklistTableTitle");
    }
    this.getModel("worklistView").setProperty("/worklistTableTitle", sTitle);
},

      Check the app once again. It should still look like this:

      Open SQLite in a second terminal window:

      sqlite3 db/my-communitymanager.db -cmd .dump

      Change the Level of Bucket 101:

      sqlite> UPDATE my_communitymanager_Buckets SET level = 1 WHERE ID = 101;

      The Level in Bucket 101 changes from Empty to Full:

      7. Run app on localhost

Now the OData service can be used by either deploying it in the cloud and exposing it, or by running it locally.

      To keep things simple, I choose to run it locally here.

Go to your Dev Space, download the sources and save them to a folder of your choice on your local machine:

      Navigate to the folder where you saved the sources, extract it, navigate inside the root folder of the project and install the CDS development kit:

      npm i -g @sap/cds-dk

      On Linux/Mac follow the steps here.

      Run Node Package Manager installation, to install the dependencies defined in package.json:

      npm install

      Install SQLite3 packages:

      npm i sqlite3 -D

      Start CDS server:

      cds watch

      Open the Web Application in your browser by following the link provided by the CDS server:

You can see the Community Manager running locally on your machine:

Download the Postman app, import postman.json as a Postman collection, and send the "Calls without UUID" request. (When the CDS server is listening on localhost:4004, just click Send; otherwise use the right port.)

See how the level in the web application changes from Full to Empty (level: false -> Empty, level: true -> Full):

Congratulations, you have created a basic business service with an OData service and an SAPUI5 app. You have learned how to use the SAP Business Application Studio, how to use and modify a template, and how to trigger a service with a REST API.
