ABAP Connectivity - ERP Q&A https://www.erpqna.com

SAP RAP Unmanaged scenario example-Simplified https://www.erpqna.com/sap-rap-unmanaged-scenario-example-simplified/ Thu, 04 Jul 2024 11:24:06 +0000

SAP RAP (ABAP RESTful Application Programming Model) has two main flavors: managed and unmanaged. Let’s focus on the unmanaged version.

Unmanaged SAP RAP refers to a development approach where developers have more control over data persistence and business logic compared to the managed approach. Here are some key aspects:

  1. Custom Logic: In unmanaged RAP, developers write their own custom logic for handling data retrieval, manipulation, and persistence. This gives more flexibility in how data is processed and stored.
  2. Direct Database Access: Developers can directly access the database tables and define their own data models using Core Data Services (CDS) views or ABAP classes.
  3. Explicit Service Definition: Unlike managed RAP, where service definitions are automatically generated based on annotations, unmanaged RAP requires developers to explicitly define service implementations and behaviors.
  4. Manual CRUD Operations: CRUD (Create, Read, Update, Delete) operations need to be implemented explicitly in unmanaged RAP, giving full control over how data is managed.
  5. Integration with Existing Systems: Unmanaged RAP is often used when integrating with existing systems or when there is a need for complex business logic that cannot be easily handled by the managed approach.
  6. Flexibility: Developers have more freedom to implement complex validation rules, authorization checks, and other custom requirements directly in the application logic.

Overall, unmanaged SAP RAP provides a more hands-on approach to application development compared to the managed approach, allowing developers to leverage their expertise in ABAP programming and database handling while building modern RESTful APIs.


In this example, we will build a simple Employee application with the RAP unmanaged flavor.

Development steps.

To summarize, the following objects will be created for the unmanaged scenario:

Table ZT01_EMPLOYEE

Base CDS View Z_I_EMPLOYEES_U

Consumption CDS view Z_C_EMPLOYEES_U

Behavior Definition


Behavior definition
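A behavior definition for this unmanaged scenario could be sketched as follows. This is a minimal sketch, not the exact definition from the example: the implementation class name zbp_i_employees_u is an assumption, and the late numbering option is inferred from the Adjust_Numbers step below (adjust_numbers is only called for late numbering).

```abap
unmanaged implementation in class zbp_i_employees_u unique;

define behavior for Z_I_EMPLOYEES_U alias Employee
late numbering
lock master
authorization master ( instance )
{
  create;
  update;
  delete;
}
```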

Implement the Create method

Implement Update Method

Implement Delete Method

Implement Adjust_Numbers method.

Implement Save method.
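The Create and Save steps above can be sketched in a condensed behavior pool as follows. This is only a sketch under assumptions: the local buffer class lcl_buffer is hypothetical, and failed/reported handling as well as the update and delete handlers are omitted for brevity.

```abap
CLASS lcl_buffer DEFINITION.
  PUBLIC SECTION.
    " Transactional buffer: instances collected by the handler, persisted by the saver
    CLASS-DATA mt_create TYPE STANDARD TABLE OF zt01_employee WITH EMPTY KEY.
ENDCLASS.

CLASS lhc_employee DEFINITION INHERITING FROM cl_abap_behavior_handler.
  PRIVATE SECTION.
    METHODS create FOR MODIFY
      IMPORTING entities FOR CREATE Employee.
ENDCLASS.

CLASS lhc_employee IMPLEMENTATION.
  METHOD create.
    " Buffer the incoming instances; no database access happens here
    LOOP AT entities INTO DATA(ls_entity).
      APPEND CORRESPONDING #( ls_entity-%data ) TO lcl_buffer=>mt_create.
    ENDLOOP.
  ENDMETHOD.
ENDCLASS.

CLASS lsc_employee DEFINITION INHERITING FROM cl_abap_behavior_saver.
  PROTECTED SECTION.
    METHODS adjust_numbers REDEFINITION.
    METHODS save REDEFINITION.
ENDCLASS.

CLASS lsc_employee IMPLEMENTATION.
  METHOD adjust_numbers.
    " With late numbering, the final keys are assigned here (e.g. from a number range)
  ENDMETHOD.

  METHOD save.
    " Persist the buffered instances to the database table
    DATA(lt_new) = lcl_buffer=>mt_create.
    INSERT zt01_employee FROM TABLE @lt_new.
    CLEAR lcl_buffer=>mt_create.
  ENDMETHOD.
ENDCLASS.
```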

Test

1. Open the application.

2. Click Create, enter the input values, and click Create.

3. A new record is created.

4. Select any row and click Edit.

5. Change a value and click Save.

6. The record is updated.

7. Select one or more rows and click Delete.

8. The records are deleted.

9. The records can also be seen in the database table.

So, all the CRUD operations are successful using the RAP unmanaged flavor.


The post SAP RAP Unmanaged scenario example-Simplified first appeared on ERP Q&A.

Lead & Lag : SQL Window Functions and a bit more … https://www.erpqna.com/lead-lag-sql-window-functions-and-a-bit-more/ Tue, 04 Jun 2024 12:28:39 +0000

In this blog, we will examine some of the SQL window functions available within ABAP-SQL and delve into their application through the use of examples.

So, here’s the problem: we have a table called ZMOVIE_SEATS with two columns, SEAT_ID (integer) and STATUS (Boolean). SEAT_ID is an auto-increment primary key (this column will have values like 1, 2, 3, and so on). Our goal is to find out which consecutive seat IDs are occupied (or unoccupied, which is a simple flip in the solution). This problem is also known as the “Consecutive Available Seat Problem”. The trick is that we need to solve it using SQL alone; that’s what caught my attention. It’s easy enough to solve using programming constructs like loops and conditionals, but doing it with SQL alone is what makes it interesting, in my opinion.

There could certainly be multiple variations of the solution, but the one I intend to present here utilizes the window functions LEAD and LAG, while also introducing the concept of window functions to readers who may not yet be acquainted with it.

Sample Input Table Data:

(Screenshot: sample input table, seats 1 to 5, with seat 2 occupied)

Expected Output: 3, 4, 5

Before we get to solving it using window functions, let’s learn what “window functions” are.

What are window functions?

Window functions in SQL are a powerful tool for performing calculations across a set of rows that are related to the current row. They provide the ability to perform aggregate-like operations (e.g., SUM, COUNT, AVG) or ranking calculations (e.g., RANK, ROW_NUMBER) without actually grouping the result set into fewer rows. This means that each row maintains its individual identity while still incorporating information from its surrounding “window” of rows.

Now, let’s look at definitions of some key terms and concepts:

  • Windows: A window is essentially a subset of rows within your result set. You define the window using the OVER clause. It can encompass the entire result set, a partition (group) of rows, or a specific range of rows relative to the current row.
  • Window Frame: The window frame is the specific set of rows within the window that are used in the calculation for a particular row. You can control the frame’s boundaries using keywords like ROWS BETWEEN, UNBOUNDED PRECEDING, and CURRENT ROW.
  • Partitioning (PARTITION BY): This clause divides the result set into distinct groups or partitions. Window functions are then applied separately within each partition.
  • Ordering (ORDER BY): This clause sorts the rows within the window or partition, which is crucial for calculations like running totals or rankings.

Types of Window Functions

  • Aggregate Window Functions: These functions (SUM, AVG, COUNT, MIN, MAX) work similarly to regular aggregate functions but without collapsing rows.
  • Ranking Window Functions: These functions (RANK, DENSE_RANK, ROW_NUMBER, NTILE) assign rankings or numerical positions to rows within their window.
  • Value Window Functions: These functions (LAG, LEAD, FIRST_VALUE, LAST_VALUE) access values from other rows within the window.

Why Use Window Functions?

  • Flexibility: You can perform complex calculations without restructuring your result set.
  • Performance: They are often more efficient than solutions involving self-joins or subqueries.
  • Expressiveness: They make your SQL queries more concise and readable.

Now, back to our problem: we first build the sample data as an internal table and then query it using LEAD and LAG.
TYPES: BEGIN OF movie_seat,
         seat_id TYPE i,
         free    TYPE abap_bool,
       END OF movie_seat,
       movie_seats TYPE STANDARD TABLE OF movie_seat WITH NON-UNIQUE KEY seat_id.

DATA(lt_movie_seats) = VALUE movie_seats( ( seat_id = 1 free = abap_true )
                                          ( seat_id = 2 free = abap_false )
                                          ( seat_id = 3 free = abap_true )
                                          ( seat_id = 4 free = abap_true )
                                          ( seat_id = 5 free = abap_true ) ).

SELECT seat_id                               AS seat_id,
       free                                  AS current_free,
       LAG( free ) OVER( ORDER BY seat_id )  AS previous_free,
       LEAD( free ) OVER( ORDER BY seat_id ) AS next_free
  FROM @lt_movie_seats AS movie_seats
  ORDER BY seat_id
  INTO TABLE @DATA(lt_tab).

SELECT seat_id FROM @lt_tab AS data1
  WHERE current_free = @abap_true
    AND ( previous_free = @abap_true OR next_free = @abap_true )
  INTO TABLE @DATA(lt_output).

cl_demo_output=>display_data( value = lt_output ).

Program Output:

(Screenshot: program output, seat IDs 3, 4, 5)

Now let’s explore what these functions, LEAD and LAG, actually do.

The window functions LEAD and LAG are designed to give you access to data from rows other than the current row you’re processing within a result set. They operate within a specified window (a subset of rows) and help you analyze data in relation to its neighboring rows.

  • LEAD: Peels back the curtain to show you data from rows that come after the current row in your window. It’s like having a sneak peek into the future.
  • LAG: Takes you back in time to reveal data from rows that precede the current row. Think of it as looking through a rearview mirror.

These have the following syntax:

LEAD|LAG( sql_exp1[, diff[, sql_exp2]] ) OVER( ... )

diff determines the number of rows to look forward (LEAD) or backward (LAG). If diff is not specified, a value of 1 is assumed implicitly. If the row determined by diff is not in the current window, the result is the null value by default, or sql_exp2 if it is specified.

OVER Clause is essential for both LEAD and LAG. It defines the window (set of rows) within which the functions operate. The ORDER BY clause within OVER is crucial to establish the sequence of rows.
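As a quick illustration of the diff and default-value parameters, the following hypothetical sketch reuses the lt_movie_seats table from the example above:

```abap
SELECT seat_id,
       free,
       " Look two rows back; return ' ' (abap_false) instead of NULL
       " when no row exists two positions before the current one
       LAG( free, 2, ' ' ) OVER( ORDER BY seat_id ) AS free_two_back
  FROM @lt_movie_seats AS seats
  ORDER BY seat_id
  INTO TABLE @DATA(lt_lag_demo).
```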

The below illustrations should be helpful in understanding the concept:

LAG:

(Illustration: LAG accessing values from preceding rows)

LEAD:

(Illustration: LEAD accessing values from following rows)

Let’s look at two more window functions …

Since we are on the topic of window functions, let’s also explore two more, namely FIRST_VALUE and LAST_VALUE, with an example:

Imagine we have a logging table with below entries:

(Screenshot: logging table entries)

Now, if we are tasked with identifying the first and the last log entry for every user per day, we can achieve it using the window functions introduced above, as shown below:

TYPES: BEGIN OF lty_entry_log,
         entry_id TYPE char5,
         time     TYPE tims,
         date     TYPE dats,
         user_id  TYPE char10,
       END OF lty_entry_log,
       ltt_entry_log TYPE STANDARD TABLE OF lty_entry_log WITH NON-UNIQUE KEY entry_id.

DATA lt_entry_log TYPE ltt_entry_log.

lt_entry_log = VALUE #( ( entry_id = '10001' time = '084534' date = '20231010' user_id = 'mark' )
                        ( entry_id = '10002' time = '093012' date = '20231010' user_id = 'mark' )
                        ( entry_id = '10003' time = '111547' date = '20231010' user_id = 'john' )
                        ( entry_id = '10004' time = '123022' date = '20231010' user_id = 'john' )
                        ( entry_id = '10005' time = '142551' date = '20231010' user_id = 'mark' )
                        ( entry_id = '10006' time = '083609' date = '20231011' user_id = 'john' )
                        ( entry_id = '10007' time = '092841' date = '20231011' user_id = 'mark' )
                        ( entry_id = '10008' time = '113012' date = '20231011' user_id = 'mark' )
                        ( entry_id = '10009' time = '121855' date = '20231011' user_id = 'mark' )
                        ( entry_id = '10010' time = '143639' date = '20231011' user_id = 'john' ) ).

SELECT DISTINCT user_id,
                date,
                FIRST_VALUE( time ) OVER( PARTITION BY user_id, date ORDER BY date ) AS first_entry,
                LAST_VALUE( time ) OVER( PARTITION BY user_id, date ORDER BY date )  AS last_entry
  FROM @lt_entry_log AS log_entry
  ORDER BY date
  INTO TABLE @DATA(lt_result).

cl_demo_output=>display_data( lt_result ).

Upon execution we get the following output, which is as expected:

(Screenshot: first and last log entry per user per day)

So what do FIRST_VALUE and LAST_VALUE essentially do?

Consider a window, which is a subset of rows from a query result. These functions offer a glimpse into values within that window:

  • FIRST_VALUE: retrieves the value from the very first row in the window.
  • LAST_VALUE: obtains the value from the very last row in the window.

The window is specified using an OVER clause; like LEAD and LAG, FIRST_VALUE and LAST_VALUE require it. It is crucial to specify the ordering of the rows within the window using ORDER BY, since that determines which row is considered the “first” or “last”.

PARTITION BY: While optional, PARTITION BY is useful for dividing data into groups.
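To round things off, here is a small hypothetical sketch of an aggregate window function combined with PARTITION BY, reusing the lt_entry_log table from above. It counts each user's log entries per day without collapsing the rows:

```abap
SELECT entry_id,
       user_id,
       date,
       " Number of entries for this user on this day, repeated on every row
       COUNT(*) OVER( PARTITION BY user_id, date ) AS entries_that_day
  FROM @lt_entry_log AS log_entries
  ORDER BY entry_id
  INTO TABLE @DATA(lt_counts).
```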


The post Lead & Lag : SQL Window Functions and a bit more … first appeared on ERP Q&A.

Creation of bgRFC and calling in a program https://www.erpqna.com/creation-of-bgrfc-and-calling-in-a-program/ Sat, 01 Apr 2023 12:53:14 +0000

1. What is bgRFC?

The bgRFC allows an application to record data that is received later by a called application. When the data is received, we must ensure that the data is transferred to the receiver either exactly once in any order (transactional) or exactly once in the order of creation (queued).

2. bgRFC Configuration?

  • Creation of supervisor Destination.

This is a mandatory step, because the bgRFC can only function if a supervisor destination has been defined for bgRFC processing. The supervisor destination is created in transaction SBGRFCCONF.

bgRFC_Configuration_1
bgRFC_Configuration_2

Prerequisite:

You need to verify the supervisor destination as an RFC destination using transaction SM59. This destination must be defined as either an ABAP connection or a logical connection.

bgRFC_Configuration_3

A user, password, and client must be entered for both connection types. Please refer to the attached screenshot.

bgRFC_Configuration_4

ABAP Connection:

No load balancing can be defined.

No system number can be entered.

No server can be entered.

bgRFC_Configuration_5

Creation of destination:

We have to create an inbound or outbound destination name based on the requirement.

Creation of Inbound Destination:

On the Define Inbound Dest. tab page in transaction SBGRFCCONF, we can maintain a separate inbound destination for each application. This is also a mandatory step for creating any inbound bgRFC.

Logon/server groups can be defined using transaction RZ12.

All the settings and activities related to transaction SBGRFCCONF are BASIS-related, so before creating or configuring any bgRFC, please consult your BASIS team.

bgRFC_Configuration_6
bgRFC_Configuration_7
bgRFC_Configuration_8

Creation of outbound destination:

We can create an outbound destination using transaction SM59. Creating an outbound destination in SM59 works like any other destination creation. Please refer to the screenshot below.

bgRFC_Configuration_9

We have to create the outbound destination under ABAP Connections. For this destination we have to maintain the necessary target IP address, system number, etc. Please refer to the screenshots below for the detailed settings on each tab.

bgRFC_Configuration_10
bgRFC_Configuration_11
bgRFC_Configuration_12

After creating the outbound destination in SM59, we have to maintain it in the Scheduler Destination tab of transaction SBGRFCCONF. Please refer to the screenshot below.

bgRFC_Configuration_13

3. bgRFC Programming

After all this configuration, let’s now talk about the programming.

  • Creation of unit:

We have to create a bgRFC unit by referencing the configured inbound/outbound destination name. Destination objects can be requested using the class methods of class CL_BGRFC_DESTINATION_OUTBOUND for outbound and class CL_BGRFC_DESTINATION_INBOUND for inbound. We have to use the CREATE method of the above-mentioned classes to create a destination object.

Please see the below example of how to create an inbound destination object.

Pass one of the configured inbound destination names in the variable shown below, and call the method shown below to create a reference to the inbound destination.

bgRFC_Programming_1

If the inbound/outbound destination is invalid, the program will terminate with a short dump.

Please refer to the below screenshot for your reference.

bgRFC_Programming_2

To handle this runtime error, we need to use the exception class CX_BGRFC_INVALID_DESTINATION. Please refer to the below screenshot for your reference.

bgRFC_Programming_3

After creating the destination object, it is time to create a bgRFC unit. A bgRFC unit can be of two types: tRFC and qRFC. We have to use method CREATE_TRFC_UNIT to create a tRFC unit and method CREATE_QRFC_UNIT to create a qRFC unit. Please refer to the below screenshot.

bgRFC_Programming_4

To create a qRFC unit we just have to call CREATE_QRFC_UNIT method instead of CREATE_TRFC_UNIT method.
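The steps above can be sketched in code as follows. This is only a sketch: the inbound destination name ZPMT_INBOUND_DEST is a placeholder for one configured in SBGRFCCONF, and real error handling would do more than just return.

```abap
DATA lo_dest TYPE REF TO if_bgrfc_destination_inbound.
DATA lo_unit TYPE REF TO if_trfc_unit_inbound.

TRY.
    " Request a destination object for the configured inbound destination
    lo_dest = cl_bgrfc_destination_inbound=>create( 'ZPMT_INBOUND_DEST' ).
  CATCH cx_bgrfc_invalid_destination.
    " Invalid destination name: handle it here instead of letting the program dump
    RETURN.
ENDTRY.

" Create a tRFC unit; call create_qrfc_unit instead for a qRFC unit
lo_unit = lo_dest->create_trfc_unit( ).
```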

Calling a function module:

After creating the unit, we have to call a function module in the background. The syntax for calling a function module for background processing is as follows:

CALL FUNCTION 'function_name'
  IN BACKGROUND UNIT unit
  EXPORTING ...

Inside the called function module, we write the logic that we want to process in the background (e.g., updating a table). Please refer to the below screenshot for the function module call.

bgRFC_Programming_5

In the above example, the function module is called in the background using the created unit. In the exporting parameters we can pass whatever we want to send to the function module; if we want to send a table, we can likewise define it in the function module interface and pass it.

bgRFC_Programming_6

The above screenshot shows complete program to call bgRFC.
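The complete call can be sketched as follows. The function module name Z_UPDATE_TABLE and its parameter iv_value are placeholders, not from the original program:

```abap
" Record the call against the unit; it executes only after COMMIT WORK
CALL FUNCTION 'Z_UPDATE_TABLE'
  IN BACKGROUND UNIT lo_unit
  EXPORTING
    iv_value = lv_value.

" Needed in a custom report; in a BAdI/user exit the standard COMMIT WORK suffices
COMMIT WORK.
```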

RFC Function module:

RFC_Fun_1

The function module must be an RFC-enabled function module.

RFC_Fun_2

One thing needs to be remembered here: the function module call only happens once the program performs a COMMIT WORK. If we create the bgRFC from a custom program such as a report, we have to issue an explicit COMMIT WORK; but if we create the bgRFC from a BAdI, user exit, or enhancement spot, we do not need to, because the bgRFC unit is created once standard SAP performs the COMMIT WORK.

4. bgRFC Monitoring:

The bgRFC monitor is typically required to check whether any unit has failed or whether there are issues in a particular unit.

We have to use transaction SBGRFCMON to monitor bgRFC units. On the selection screen we have the option to monitor inbound/outbound units as well as tRFC or qRFC units.

There are also several other options on the selection screen. Please refer to the below screenshot.

bgRFC_Mon_1

After executing the transaction, we will be able to monitor all erroneous units in the system. Remember: if a unit was created successfully and there are no data or connection issues, it will simply be executed and we will not see it in this transaction. Only erroneous units and units with warnings are displayed here.

Please refer the below screenshot.

bgRFC_Mon_2

Conclusion:

Background Remote Function Call (bgRFC) is a technology in SAP that allows for transactional (tRFC) and queued (qRFC) asynchronous communication between different systems. It is used to enable background processing in distributed systems in a secure and reliable manner.

With bgRFC, developers can create and schedule processes to run in the background of an SAP system, which helps reduce the workload on the front end and improve system performance. The technology ensures that all data is transmitted securely and can handle large volumes of data efficiently.


The post Creation of bgRFC and calling in a program first appeared on ERP Q&A.

How to expose CDS View using RAP in SAP S/4HANA https://www.erpqna.com/how-to-expose-cds-view-using-rap-in-sap-s-4hana/ Sat, 26 Nov 2022 10:11:27 +0000

SAP BTP (Business Technology Platform) ABAP Environment is an all-inclusive ABAP platform that is cloud ready.

PS: The reason I was inspired to write this blog was when I came across the term Steampunk which sounded so cool! But it has been brought to my attention that this term is now obsolete and so I am removing the references to avoid confusion.

Some clients have huge ABAP repositories, and it is understandable that they would want to continue using them in the S/4HANA environment as well. The SAP BTP ABAP environment is what they would then use for cloud-ready developments. RESTful OData services based on CDS (Core Data Services) views can be created there, too.

If someone had told me 5 years ago that we could expose a custom SAP view through a browser, I would not have believed them, and yet here I am, about to do exactly that: expose a CDS view without even explicitly creating an OData service. Below is the outcome you will reap if you follow this blog post till the end. Isn’t it cool? So, what are we waiting for? Let’s dive in…

Here is a comprehensive step-by-step guide with screenshots to create a CDS view and expose it:

1. Open Eclipse IDE (ABAP Development Tools) and log in to your system (called a project in ADT terms)

2. Create a new Data Definition

3. No transport request (TR) is selected as this is a local object. But if yours is an object that is not local, here is where you select a transport request.

4. We will be creating a view with association, so you need to select that option and click on finish. Association is an on-demand join which is executed only when the data is accessed (also called the lazy approach; prerequisite is that the association needs to be made public).

5. Below is the code.

@AbapCatalog.sqlViewName: 'ZVTESTFLIGHT'
@AbapCatalog.compiler.compareFilter: true
@AbapCatalog.preserveKey: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'CDS View for flight booking'
define view ztest_flight_cds
  as select from sbook as Booking
  association [0..1] to I_Country as _Country on $projection.country = _Country.Country
{
  key carrid     as FlightID,
  key connid     as ConnectionID,
  key fldate     as Flight_date,
  key bookid     as Booking,
      passname   as CustomerName,
      _Country.Country,
      order_date as DateOfBooking,
      fldate     as DateOfTravel,
      forcuram,
      forcurkey  as CurrencyCode,
      _Country // Make association public
}

6. Now add the UI annotations at the header level and for each field; the final code will be as follows. Copy and paste the below code into the editor.

@AbapCatalog.sqlViewName: 'ZVTESTFLIGHT'
@AbapCatalog.compiler.compareFilter: true
@AbapCatalog.preserveKey: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'CDS View for flight booking'
@Search.searchable : true
@UI:
{
 headerInfo:
  {
    typeName: 'Booking',
    typeNamePlural: 'Bookings',
    title: { type: #STANDARD, value: 'Booking' }
  }
 }
define view ztest_flight_cds 
    as select from sbook as Booking
  association [0..1] to I_Country  as _Country  on $projection.country = _Country.Country
    {
          @UI.facet: [
        {
          id:       'Booking',
          purpose:  #STANDARD,
          type:     #IDENTIFICATION_REFERENCE,
          label:    'Booking',
          position: 10 }
      ]

      @UI: {
          lineItem: [ { position: 10, importance: #HIGH, label: 'Flight ID' } ],
          identification:[ { position: 10, label: 'Flight ID' } ]
          }
     key carrid as FlightID,
           @UI: {
          lineItem: [ { position: 20, importance: #HIGH, label: 'Connection ID' } ],
          identification:[ { position: 20, label: 'Connection ID' } ]
          }
     key connid as ConnectionID,
           @UI: {
          lineItem: [ { position: 30, importance: #HIGH, label: 'Fl.Date' } ],
          identification:[ { position: 30, label: 'Fl.Date' } ]
          }
     key fldate as Flight_date,
           @UI: {
          lineItem: [ { position: 40, importance: #HIGH, label: 'Booking ID' } ],
          identification:[ { position: 40, label: 'Booking ID' } ]
          }
     key bookid as Booking,
           @UI: {
        lineItem: [ { position: 50, label: 'Customer', importance: #HIGH } ],
        identification:[ { position: 50, label: 'Customer' } ]
      }
      @Search.defaultSearchElement: true
     passname   as CustomerName,
           @UI: {
           identification:[ { position: 60, label: 'Country' } ]
       }
     _Country.Country,
           @UI: {
           identification:[ { position: 60, label: 'Booked On' } ]
       }
     order_date as DateOfBooking,
           @UI: {   identification:[ { position: 70, label: 'Traveling on' } ]    }
     fldate     as DateOfTravel,
           @UI: {
      lineItem: [ { position: 80, label: 'Cost', importance: #HIGH } ],
      identification:[ { position: 80, label: 'Cost' } ]
      }
      @Semantics.amount.currencyCode: 'CurrencyCode'
     forcuram,
           @UI: { identification:[ { position: 90, label: 'Currency' } ]     }
      @Semantics.currencyCode: true
     forcurkey  as CurrencyCode,
     _Country // Make association public
}

7. Activate the CDS View

8. We can test our CDS View with Data Preview. Right-click on the Data Definition and select Open with->Data Preview

Data is displayed as below:

Right click on any of the value, follow association, and we can see country data too

9. Now we will create service definition.

a. Right-click the Data Definition name and select ‘New Service Definition’.

b. No TR is required since this is a local object as well.

c. Click Finish.

d. Copy and paste the below code:

@EndUserText.label: 'Service definition for Flight CDS'
define service Ztest_flight_srvd {
  expose ztest_flight_cds;
  expose I_Country;
}

e. Activate the Service Definition

10. Now we will create the service binding

a. Right-click the service definition name and select ‘New Service Binding’

b. Click Finish

c. Click on Publish

The OData service URL is automatically generated, without us using the SEGW transaction

d. Click on the service URL link

The above metadata is displayed

e. Select an entity and click on the Preview option

f. Click on settings and select all columns

g. Click OK

You still don’t see anything and no need to panic 😊

h. Just click on GO

Voila!! All the entries magically appear

i. Go back to the Eclipse and select the other entity this time and click on Preview

j. Repeat steps f. through h.


The post How to expose CDS View using RAP in SAP S/4HANA first appeared on ERP Q&A.

Simple Change Request Management Using JIRA Cloud/SAP CPI/ABAP Transports https://www.erpqna.com/simple-change-request-management-using-jira-cloud-sap-cpi-abap-transports/ Sat, 28 May 2022 12:08:29 +0000

Introduction

For a whole SAP migration/implementation or even simple maintenance projects we do need a Change Request Management Tool. The best in the SAP context is to have ChaRM or Focused Build (on Top of SAP Solution Manager/ChaRM) which is natively integrated with SAP CTS/TMS.

But what if you don’t have those tools at your disposal and you would like to build your in-house change request management tool based on JIRA or Mantis, for instance?

The purpose of this blog is to build a simple solution based on JIRA WebHooks, SAP Cloud Process Integration (SAP CPI) and SAP ABAP Transports to achieve an end-to-end change request management.

Prerequisites

To be able to follow different steps you will need:

  1. A Cloud Connector configured and exposing endpoints of your on-premise SAP system
  2. A JIRA Cloud instance (a trial one works)
  3. An SAP Cloud Process Integration instance up and running
  4. Familiarity with API Management and SAP CPI

User Story

As a JIRA user, I want my transport workbench/customizing task to be released automatically once the JIRA backlog item is set to DONE.

In most cases, the project manager or the functional consultant sends reminders to developers to release their tasks so that the whole transport can be released. Developers often forget to release open tasks even though the corresponding JIRA backlog item has been closed.

So the best approach would be to release the transport task automatically once the JIRA item is set to DONE (or whatever status is wanted in the workflow).

Architecture Overview

The below diagram illustrates the different components needed to be integrated together.

Architecture Overview
  1. We will need a webhook configured at the JIRA admin UI level: a webhook acts as an event listener that triggers an action once the configured event occurs.
  2. The webhook will trigger a call to an SAP CPI flow that processes the payload and transforms it into the expected output. You need to pay attention to this part, since the goal is to build a flow that abstracts away the incoming input. For simplicity, we will focus on a JIRA incoming payload here, but it could also be a Mantis payload.
  3. The CPI flow will trigger a call to an API proxy, passing the transformed payload.
  4. In its turn, the API proxy will trigger a call to an on-premise RESTful web service via Cloud Connector. The RESTful web service will manage the remaining task of releasing the transport task.

In addition, we will assume a predefined naming convention for the transport request description, since we need a way to map a JIRA backlog item to a transport request task. For instance, as a team, we agree on putting the JIRA backlog item ID at the beginning of the transport request description. Here is an example:

FG-01: Develop ABAP Report for Data Cleanup

In the next section we will go through all the components mentioned above, but in reverse order, since that is the best way to build things up block by block.

Develop the Restful web service

The purpose of this web service is to provide an endpoint that expects a POST request with a predefined payload containing all the information needed to look up a Transport Request Task in the development system and release it.

We could either build an OData web service or just a simple REST one. I opted for the latter to keep things simple.

The following are the steps to create a simple ICF REST web service.

Create a class implementing IF_HTTP_EXTENSION interface

Go to SE24 (or SE80), create a class, and set IF_HTTP_EXTENSION as its interface. Let's call it ZCL_PMT_INT_HTTP_HANDLER. The interface IF_HTTP_EXTENSION contains one method, IF_HTTP_EXTENSION~HANDLE_REQUEST, that will handle all HTTP requests.

IF_HTTP_EXTENSION Interface Implementation

Now we will provide a simple implementation of that method to be able to test it through ICF.

IF lv_http_method EQ 'GET'.
      server->response->set_status(
        EXPORTING
          code   = 200
          reason = 'Service is Up'
      ).

      DATA: conv_out TYPE REF TO cl_abap_conv_out_ce.
      conv_out = cl_abap_conv_out_ce=>create( encoding = 'UTF-8' ).

      DATA: lv_content TYPE xstring, lv_text TYPE string.
      lv_text = 'Service is Up'.
      conv_out->convert( EXPORTING data = lv_text  IMPORTING buffer = lv_content ).


      server->response->set_data( lv_content ).

ENDIF.

Create the ICF Node

Now go to TCODE SICF and create a node under /default_host/sap like below. Let’s name it pmt

Restful Web Service Endpoint (ICF Node)

Double click on the pmt ICF Node. Click on Edit, select the Handler List Tab and set the Handler to the previously created class.

ICF Node Handler

Save and include your object in a Transport Request, or save it locally in the $TMP package.

At this stage the service is ready in its minimal version and can be tested: in the SICF TCODE, look for the pmt node, right-click it and select Activate Service. Right-click again and select Test Service.

Test Service

The corresponding URL should be <protocol>://<host>:<port>/sap/pmt

Provide the Login/Password.

You should receive a message in the browser saying Service is Up.

ABAP Code to release a Transport Request Task

Now let's concentrate on the main task: searching for the Transport Request Task and releasing it.

To achieve this, we need to analyse the payload that will be attached to the POST request. According to the JIRA documentation, the payload should look like below:

JIRA Webhook Payload

So we should expect a structure containing a component named issue, which in turn contains a component named key. This is the most pertinent information for us.
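Before diving into the ABAP implementation, a minimal Python sketch shows the extraction logic on a trimmed-down payload (the sample JSON keeps only the fields relevant here; the real webhook body contains many more):

```python
import json

# Trimmed-down illustration of a JIRA webhook payload (not the full schema)
sample_payload = """
{
  "webhookEvent": "jira:issue_updated",
  "issue": {
    "id": "10024",
    "key": "FG-01",
    "fields": { "summary": "Develop ABAP Report for Data Cleanup" }
  }
}
"""

def extract_issue_key(raw_body):
    """Defensively extract issue.key, mirroring the component checks in the ABAP handler."""
    data = json.loads(raw_body)
    issue = data.get("issue")
    if isinstance(issue, dict):
        return issue.get("key")
    return None

print(extract_issue_key(sample_payload))  # FG-01
```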

The ABAP code below allows us to extract that information and look into the E07T/E070 DB tables for the corresponding TRs.

METHOD if_http_extension~handle_request.

    DATA(lv_http_method) = server->request->get_method( ).

    IF lv_http_method EQ 'POST'.
      DATA: lv_request TYPE string.
      DATA lr_data TYPE REF TO data.
      lv_request = server->request->get_cdata( ).

      "Parse the JSON payload into a generic data reference
      lr_data = /ui2/cl_json=>generate( lv_request ).

      FIELD-SYMBOLS:
        <data>  TYPE data,
        <issue> TYPE data,
        <key>   TYPE data,
        <s_key> TYPE data.

      IF lr_data IS BOUND.
        """"""""""""""""""""""""""""""""""""""""""""
        """ Extract the KEY of the JIRA Backlog Item
        """"""""""""""""""""""""""""""""""""""""""""
        ASSIGN lr_data->* TO <data>.

        DATA(lo_structdescr) = CAST cl_abap_structdescr( cl_abap_structdescr=>describe_by_data( p_data = <data> ) ).
        DATA(components)     = lo_structdescr->get_components( ).

        IF line_exists( components[ name = 'ISSUE' ] ).
          ASSIGN COMPONENT 'ISSUE' OF STRUCTURE <data> TO <issue>.
          ASSIGN <issue>->* TO FIELD-SYMBOL(<issue_struct>).

          lo_structdescr = CAST cl_abap_structdescr( cl_abap_structdescr=>describe_by_data( p_data = <issue_struct> ) ).
          components     = lo_structdescr->get_components( ).

          IF line_exists( components[ name = 'KEY' ] ).
            ASSIGN COMPONENT 'KEY' OF STRUCTURE <issue_struct> TO <key>.
          ENDIF.

          IF <key> IS ASSIGNED AND <key> IS BOUND.
            ASSIGN <key>->* TO <s_key>.
            """"""""""""""""""""""""""""""""""""""""""""
            """ Search for the corresponding Transport
            """ Request
            """"""""""""""""""""""""""""""""""""""""""""

            DATA(lv_tr_search) = |{ <s_key> }%|.

            SELECT FROM e07t LEFT JOIN e070 ON e07t~trkorr = e070~trkorr AND e07t~langu = @sy-langu
              FIELDS DISTINCT e07t~trkorr
              WHERE as4text LIKE @lv_tr_search
              AND trfunction = 'K'
              INTO TABLE @DATA(lt_trs).

            IF lt_trs[] IS NOT INITIAL.

              IF lines( lt_trs[] ) EQ 1.
                DATA(lv_tr) = lt_trs[ 1 ]-trkorr.
                DATA lt_requests TYPE trwbo_request_headers.
                DATA lt_tasks TYPE trwbo_request_headers.
                """"""""""""""""""""""""""""""""""""""""""""
                """ Look for tasks in the found Transport
                """ Request
                """"""""""""""""""""""""""""""""""""""""""""

                CALL FUNCTION 'TR_READ_REQUEST_WITH_TASKS'
                  EXPORTING
                    iv_trkorr          = lv_tr
                  IMPORTING
                    et_request_headers = lt_requests
*                   ET_REQUESTS        =
                  EXCEPTIONS
                    invalid_input      = 1
                    OTHERS             = 2.
                IF sy-subrc = 0.
                  """"""""""""""""""""""""""""""""""""""""""""
                  """ Select only Open and classified Tasks
                  """"""""""""""""""""""""""""""""""""""""""""
                  lt_tasks = VALUE #( FOR wa IN lt_requests WHERE ( trfunction = 'S' AND trstatus = 'D' )
                                         ( trkorr = wa-trkorr )
                                       ).
                  IF lt_tasks[] IS NOT INITIAL.
                    DATA ls_return TYPE bapiret2.
                    """"""""""""""""""""""""""""""""""""""""""""
                    """ Release Tasks
                    """"""""""""""""""""""""""""""""""""""""""""
                    LOOP AT lt_tasks ASSIGNING FIELD-SYMBOL(<fs_tr>).
                      CALL FUNCTION 'BAPI_CTREQUEST_RELEASE'
                        EXPORTING
                          requestid = lv_tr
                          taskid    = <fs_tr>-trkorr
*                         COMPLETE  =
*                         BATCH_MODE       =
                        IMPORTING
                          return    = ls_return.

                      IF ls_return IS INITIAL.
                        "Task released successfully - react here if needed
                      ENDIF.
                    ENDLOOP.
                  ENDIF.
                ENDIF.


              ENDIF.
            ENDIF.

          ENDIF.
        ENDIF.
      ENDIF.

      server->response->set_status(
       EXPORTING
         code   = 201
         reason = 'Action Done'
     ).
    ENDIF.

    IF lv_http_method EQ 'GET'.
      server->response->set_status(
        EXPORTING
          code   = 200
          reason = 'Service is Up'
      ).

      DATA: conv_out TYPE REF TO cl_abap_conv_out_ce.
      conv_out = cl_abap_conv_out_ce=>create( encoding = 'UTF-8' ).

      DATA: lv_content TYPE xstring, lv_text TYPE string.
      lv_text = 'Service is Up'.
      conv_out->convert( EXPORTING data = lv_text  IMPORTING buffer = lv_content ).


      server->response->set_data( lv_content ).

    ENDIF.

  ENDMETHOD.
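Outside ABAP, the task-selection logic of the handler can be summarized in a few lines of Python. The field names mirror the transport header fields used above (trfunction 'S' for a classified task, trstatus 'D' for modifiable); the sample transport numbers are invented:

```python
def releasable_tasks(request_headers):
    """Pick the open, classified tasks from a list of transport headers.

    Mirrors the ABAP filter `trfunction = 'S' AND trstatus = 'D'`.
    """
    return [h["trkorr"] for h in request_headers
            if h["trfunction"] == "S" and h["trstatus"] == "D"]

headers = [
    {"trkorr": "DEVK900101", "trfunction": "K", "trstatus": "D"},  # the request itself
    {"trkorr": "DEVK900102", "trfunction": "S", "trstatus": "D"},  # open task -> release
    {"trkorr": "DEVK900103", "trfunction": "S", "trstatus": "R"},  # already released
]
print(releasable_tasks(headers))  # ['DEVK900102']
```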

At this stage we are done with the backend task. I assume now that your backend system is exposed via Cloud Connector and that the path pattern /sap/pmt is authorized.

In the next section we will wrap the developed Web Service with a SAP API Management Proxy.

SAP API Management Proxy

First of all we will need to create an API Provider.

Browse to your SAP Integration Suite instance and click on the API Management tile. In the left-hand side menu, click on Configure. The list of available API Providers is displayed. Click on Create and fill in all required fields as below:

  • Type: On Premise
  • Host: The Virtual Host configured in your Cloud Connector
  • Port: The Virtual Port configured in your Cloud Connector
  • Location ID: The Location ID that you set when connecting your BTP Account to Cloud Connector
API Provider Creation (1)
API Provider Creation (2)

In the left-hand side menu, click on Develop. We will create an API Proxy.

The list of APIs is displayed. Click on Create, fill in the form as below, and click the Create button.

API Proxy Creation

Save and deploy. The API is now deployed and exposed via the API Proxy URL below.

API Deployment

You can now call your on-premise RESTful Web Service via that URL by adding the suffix /pmt (remember that the on-premise URL is /sap/pmt: here we just need to add pmt, since /sap was already configured as a prefix in the API Proxy).

The next step is to create the CPI Flow that will transform the payload and consume this API Proxy.

Build SAP CPI Flow

You need to prepare a flow like below

CPI Flow

1 – HTTPS: This is the inbound HTTP POST request that will trigger the flow.

HTTPS Post Action

2 – A Log step is added to log the incoming message. You can pick the Groovy script from here.

3 – A Content Modifier to do a simple transformation.

4 – Remove Non-Supported Attributes: The JSON payload received from JIRA contains some odd fields named 24x24 and 48x48, which represent image dimensions. Those fields break the JSON-to-XML transformation and lead to a failure. Here is the Groovy script:

import com.sap.gateway.ip.core.customdev.util.Message;
import groovy.json.*;

def Message processData(Message message) {
    // Parse the incoming JSON body
    def body = message.getBody(String);
    def bodyJson = new JsonSlurper().parseText(body);

    // Blank out the avatarUrls maps: their "24x24"/"48x48" keys are not valid XML element names
    bodyJson.user.avatarUrls = [:];
    bodyJson.issue.fields.project.avatarUrls = [:];
    bodyJson.issue.fields.assignee.avatarUrls = [:];
    bodyJson.issue.fields.creator.avatarUrls = [:];
    bodyJson.issue.fields.reporter.avatarUrls = [:];

    message.setBody(new JsonBuilder(bodyJson).toString());
    return message;
}
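The same cleanup can be sketched in Python to make the problem concrete. XML element names must not start with a digit, so keys like "24x24" make the CPI JSON-to-XML converter fail; blanking the maps avoids that. The helper below is an illustration, not CPI code, and only the `user` path is shown:

```python
import json

raw = '{"user": {"name": "jdoe", "avatarUrls": {"24x24": "u1", "48x48": "u2"}}}'

def blank_avatar_urls(raw_body, paths=(("user",),)):
    """Replace every avatarUrls map found under the given paths with an empty dict."""
    data = json.loads(raw_body)
    for path in paths:
        node = data
        for part in path:
            node = node.get(part) if isinstance(node, dict) else None
            if node is None:
                break
        if isinstance(node, dict):
            node["avatarUrls"] = {}
    return data

cleaned = blank_avatar_urls(raw)
print(cleaned["user"]["avatarUrls"])  # {}
```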

5 – Convert the JSON to XML

6 – Apply an XSLT transformation to copy only the needed fields: there is no need to pass the whole payload. Here is the transformation:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
 <xsl:output omit-xml-declaration="yes" indent="yes"/>
 <xsl:strip-space elements="*"/>

 <xsl:template match="node()|@*">
     <xsl:copy>
       <xsl:apply-templates select="node()|@*"/>
     </xsl:copy>
 </xsl:template>

 <xsl:template match="node()[not(self::root or ancestor-or-self::user or ancestor-or-self::issue)]">
  <xsl:apply-templates/>
 </xsl:template>
</xsl:stylesheet>

7 – Convert the result back from XML to JSON. Don't forget to check Suppress JSON Root Element, otherwise you won't get the expected structure in the backend.

XML To JSON Converter

8 – Call the API Proxy prepared earlier: make sure to enter the URL of the API Proxy created above.

HTTP Outbound call to APIM

Make sure to set the credentials in the Manage Security Materials section.

Testing the CPI Flow

We need to make sure that everything is working well before proceeding with the WebHook configuration in the JIRA ADMIN UI.

I advise installing the Chrome extension SAP CPI Helper. It helps a lot with issue tracking during unit tests and provides an overview of and insights into failed/succeeded calls.

Once the extension is added, a set of buttons will appear in the top right corner:

CPI Helper

Click on the Info Button in the CPI Helper toolbar and pick the Endpoint URL.

Testing the Flow Endpoint

You need to configure Basic Authentication using your BTP credentials or a ClientId/ClientSecret pair (usually used for OAuth2).

Now that we have made sure that our flow and its different blocks are working correctly, it is time to plug our endpoint into the JIRA WebHook callback.

JIRA Cloud WebHook Creation

Now we will concentrate on the WebHook configuration. A WebHook is nothing more than a callback URL with a setting describing when it should be triggered. We will keep it simple and configure the WebHook callback to be triggered when an issue status is set to DONE and the issue belongs to a specific project.
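That trigger condition can be sketched as a small Python predicate. The payload shape is a simplified illustration of a JIRA issue event, and names like `should_trigger` are ours, not part of any JIRA API:

```python
def should_trigger(event, project_key="FG", done_status="Done"):
    """Decide whether the callback should fire: issue updated, right project, status DONE."""
    if event.get("webhookEvent") != "jira:issue_updated":
        return False
    issue = event.get("issue", {})
    status = issue.get("fields", {}).get("status", {}).get("name")
    return issue.get("key", "").startswith(project_key + "-") and status == done_status

evt = {
    "webhookEvent": "jira:issue_updated",
    "issue": {"key": "FG-01", "fields": {"status": {"name": "Done"}}},
}
print(should_trigger(evt))  # True
```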

Go to your JIRA Cloud and click on the Configuration wheel. Select System

JIRA Cloud Webhook configuration (1)

Scroll to the bottom of the left-hand side menu and select WebHooks.

Click on the Create a WebHook button to create a new WebHook.

Make sure to:

  • Enter the Flow Endpoint URL
  • Set the Event
  • Not exclude the body: Exclude body should be set to No, since we want to receive the body.
JIRA Cloud Webhook configuration (2)

Here the configuration should be finished… But wait!? How will JIRA Cloud be able to consume the Flow Endpoint URL without authentication?

Unfortunately, JIRA Cloud Webhooks do not support Basic Authentication or any other authentication mode right now. There is an issue requesting this feature that has been open for a while. You can check it here.

To overcome this limitation, we will change our architecture a little bit and introduce a block for anonymous access (not really a good thing, but acceptable for a POC).

We will add an API Proxy that will handle the WebHook callback and route the call to the CPI Flow Endpoint. We will define an authentication policy at the API Proxy level which will inject the credentials.

Architecture Change: Adding API Proxy Block

Go back to your Integration Suite and Go to Design, Develop, and Manage APIs.

Create a new API pointing to the CPI Flow Endpoint URL. The best approach is to create an API Provider of type Cloud Integration and to set it when creating the API Proxy.

API Provider Type Cloud Integration
Anonymous API Proxy (1)
Anonymous API Proxy (2)

After creating the new API, edit its policies and add the ones below:

  • Create an Assign Message policy at the Target Endpoint PreFlow level. Fill in the variables with the credentials.
Assign Message Policy
  • Add a new Basic Authentication policy and point the header username and password to the variables already created in the Assign Message policy step.
Basic Authentication Policy

Save and Deploy.

At this stage we have an Endpoint URL (API Proxy) providing anonymous access that can be configured at the WebHook level.

Now go back to the JIRA Cloud WebHook configuration and set the newly created API Proxy URL. Click Save and make sure that the WebHook is enabled.

Try to test the whole flow by changing a backlog item status to DONE. Make sure that you select an item belonging to the project set in the event configuration.

Put a breakpoint on the ABAP side. The breakpoint will be hit and you will be able to analyse the received payload.

Debug

And there you are: you have put in place your simple, in-house Change Request Management tool.


The post Simple Change Request Management Using JIRA Cloud/SAP CPI/ABAP Transports first appeared on ERP Q&A.

]]>
STEAMPUNK is not just sci-fi anymore! https://www.erpqna.com/steampunk-is-not-just-sci-fi-anymore/?utm_source=rss&utm_medium=rss&utm_campaign=steampunk-is-not-just-sci-fi-anymore Mon, 02 May 2022 03:50:01 +0000 https://www.erpqna.com/?p=62630 Steampunk and RAP (ABAP RESTful Application Programming Model), and yes, we are still talking about SAP! SAP finally came up with a cool name for an even cooler product. Steampunk aka RAP aka SAP BTP (Business Technology Platform) ABAP Environment is an all-inclusive ABAP platform that is cloud ready. Some clients have huge ABAP Repositories, […]

The post STEAMPUNK is not just sci-fi anymore! first appeared on ERP Q&A.

]]>
Steampunk and RAP (ABAP RESTful Application Programming Model), and yes, we are still talking about SAP! SAP finally came up with a cool name for an even cooler product. Steampunk aka RAP aka SAP BTP (Business Technology Platform) ABAP Environment is an all-inclusive ABAP platform that is cloud ready.

Some clients have huge ABAP repositories, and it is understandable that they would want to continue using them in the S/4HANA environment as well. SAP BTP ABAP Environment is what they would then use for the cloud-ready developments. RESTful OData services based on CDS (Core Data Services) views can be created there, too.

If someone had told me 5 years ago that we could expose a custom SAP view through a browser, I would not have believed them, and yet here I am, about to do exactly that: expose a CDS view without even explicitly creating an OData service. Below is the outcome you will reap if you follow this blog post to the end. Isn't it cool? So, what are we waiting for!! Let's dive in…

Here is a comprehensive step-by-step guide with screenshots to create a CDS view and expose it:

1. Open Eclipse IDE (ABAP Development Tools) and log in to your system (called a project in ADT terms)

2. Create a new Data Definition

3. No transport request (TR) is selected, as this is a local object. But if your object is not local, here is where you select a transport request.

4. We will be creating a view with an association, so you need to select that option and click on Finish. An association is an on-demand join which is executed only when the data is accessed (also called the lazy approach; the prerequisite is that the association needs to be made public).

5. Below is the code.

@AbapCatalog.sqlViewName: 'ZVTESTFLIGHT'
@AbapCatalog.compiler.compareFilter: true
@AbapCatalog.preserveKey: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'CDS View for flight booking'
define view ztest_flight_cds
  as select from sbook as Booking
  association [0..1] to I_Country as _Country on $projection.country = _Country.Country
{
key carrid as FlightID,
key connid as ConnectionID,
key fldate as Flight_date,
key bookid as Booking,
     passname   as CustomerName,
     _Country.Country,
     order_date as DateOfBooking,
     fldate     as DateOfTravel,
     forcuram,
     forcurkey  as CurrencyCode,
     _Country // Make association public
}

6. Now add the UI annotations at the header level and for each field; the final code will be as follows. Copy and paste the code below into the editor.

@AbapCatalog.sqlViewName: 'ZVTESTFLIGHT'
@AbapCatalog.compiler.compareFilter: true
@AbapCatalog.preserveKey: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'CDS View for flight booking'
@Search.searchable : true
@UI:
{
 headerInfo:
  {
    typeName: 'Booking',
    typeNamePlural: 'Bookings',
    title: { type: #STANDARD, value: 'Booking' }
  }
 }
define view ztest_flight_cds 
    as select from sbook as Booking
  association [0..1] to I_Country  as _Country  on $projection.country = _Country.Country
    {
          @UI.facet: [
        {
          id:       'Booking',
          purpose:  #STANDARD,
          type:     #IDENTIFICATION_REFERENCE,
          label:    'Booking',
          position: 10 }
      ]

      @UI: {
          lineItem: [ { position: 10, importance: #HIGH, label: 'Flight ID' } ],
          identification:[ { position: 10, label: 'Flight ID' } ]
          }
     key carrid as FlightID,
           @UI: {
          lineItem: [ { position: 20, importance: #HIGH, label: 'Connection ID' } ],
          identification:[ { position: 20, label: 'Connection ID' } ]
          }
     key connid as ConnectionID,
           @UI: {
          lineItem: [ { position: 30, importance: #HIGH, label: 'Fl.Date' } ],
          identification:[ { position: 30, label: 'Fl.Date' } ]
          }
     key fldate as Flight_date,
           @UI: {
          lineItem: [ { position: 40, importance: #HIGH, label: 'Booking ID' } ],
          identification:[ { position: 40, label: 'Booking ID' } ]
          }
     key bookid as Booking,
           @UI: {
        lineItem: [ { position: 50, label: 'Customer', importance: #HIGH } ],
        identification:[ { position: 50, label: 'Customer' } ]
      }
      @Search.defaultSearchElement: true
     passname   as CustomerName,
           @UI: {
           identification:[ { position: 60, label: 'Country' } ]
       }
     _Country.Country,
           @UI: {
           identification:[ { position: 60, label: 'Booked On' } ]
       }
     order_date as DateOfBooking,
           @UI: {   identification:[ { position: 70, label: 'Traveling on' } ]    }
     fldate     as DateOfTravel,
           @UI: {
      lineItem: [ { position: 80, label: 'Cost', importance: #HIGH } ],
      identification:[ { position: 80, label: 'Cost' } ]
      }
      @Semantics.amount.currencyCode: 'CurrencyCode'
     forcuram,
           @UI: { identification:[ { position: 90, label: 'Currency' } ]     }
      @Semantics.currencyCode: true
     forcurkey  as CurrencyCode,
     _Country // Make association public
}

7. Activate the CDS View

8. We can test our CDS View with Data Preview. Right-click on the Data Definition and select Open with->Data Preview

Data is displayed as below:

Right-click on any of the values and follow the association, and we can see the country data too

9. Now we will create service definition.

a. Right-click the Data Definition name and select ‘New Service Definition’.

b. No TR is required since this is a local object as well.

c. Click Finish.

d. Copy and paste the below code:

@EndUserText.label: 'Service definition for Flight CDS'
define service Ztest_flight_srvd {
  expose ztest_flight_cds;
  expose I_Country;
}

e. Activate the Service Definition

10. Now we will create the service binding

a. Right-click the service definition name and select ‘New Service Binding’

b. Click Finish

c. Click on Publish

The OData Service URL is automatically generated without us using the SEGW transaction

d. Click on the service URL link

The above metadata is displayed

e. Select an entity and click on the Preview option

f. Click on settings and select all columns

g. Click OK

You still don't see anything, but there is no need to panic

h. Just click on GO

Voila!! All the entries magically appear

i. Go back to the Eclipse and select the other entity this time and click on Preview

j. Repeat steps f. through h.

You will see the country entries from the Country view, as shown above.


The post STEAMPUNK is not just sci-fi anymore! first appeared on ERP Q&A.

]]>
Consuming CDS View Entities Using ODBC-Based Client Tools https://www.erpqna.com/consuming-cds-view-entities-using-odbc-based-client-tools/?utm_source=rss&utm_medium=rss&utm_campaign=consuming-cds-view-entities-using-odbc-based-client-tools Thu, 26 Aug 2021 10:11:53 +0000 https://www.erpqna.com/?p=53021 In this blog post, we would like to show you how you can access CDS view entities in an ABAP system using SQL via ODBC. Open Database Connectivity (ODBC) is a standard API for accessing databases. Why an ODBC Driver for ABAP? There are situations where you would like to have external SQL read access […]

The post Consuming CDS View Entities Using ODBC-Based Client Tools first appeared on ERP Q&A.

]]>
In this blog post, we would like to show you how you can access CDS view entities in an ABAP system using SQL via ODBC. Open Database Connectivity (ODBC) is a standard API for accessing databases.

Why an ODBC Driver for ABAP?

There are situations where you would like to have external SQL read access to CDS objects owned by the ABAP system. Direct SQL read access to the underlying SAP HANA database of an ABAP system is not a good choice; some of the problems are listed in SAP Note 2511210. Names and internal structures in the ABAP database might not be stable, because their lifecycle is managed by the ABAP system. Typecasts might not be performed as expected: for example, NUMC data types might not be correctly padded, or currency values might not be correctly shifted. Database session variables are only set correctly if a view is accessed from the ABAP system. And ABAP-level security concepts are bypassed.

All those problems go away when you treat the ABAP system itself as a database by accessing it directly using ODBC. In this case, authentication and authorization are done using an ABAP user. Full ABAP SQL semantics apply, and even application-server-level buffering can be used, as well as ABAP-level access control and read access logging.

Compared to the OData interface, the ODBC interface has the advantage that it allows unrestricted SQL access to all exposed ABAP CDS view entities. Data from different entities can be joined in an ad-hoc fashion, and data can be aggregated for analytical queries.

In its current version, the “ODBC driver for ABAP” supports the use of a technical user in the ABAP system with privileged access (no DCLs) only. Only read access to the exposed ABAP CDS objects is allowed.

Overview of Steps and Prerequisites

To access CDS view entities in an ABAP system via ODBC, these entities first need to be properly exposed in the back-end system before you can access them via the “ODBC driver for ABAP” in an ODBC-based client tool.

This blog post provides a step-by-step description containing the following steps:

  1. Create and fill some test tables
  2. Create CDS view entities for your tables
  3. Create a service definition and an SQL-typed service binding
  4. Create a communication scenario with object privileges
  5. Create a communication system, a communication user, and a communication arrangement
  6. Install the ODBC driver on Windows
  7. Create an ODBC data source
  8. Open an ODBC connection in Microsoft Excel and access data

The following prerequisites are needed when you want to follow this blog post:

  • An SAP BTP ABAP Environment system (aka “Steampunk”)
  • A Developer user in the system
  • The ABAP Development Tools (ADT) installed
  • An ABAP Cloud project configured in ADT and connected to the ABAP system
  • A 64-bit Excel version, since the “ODBC driver for ABAP” is a 64-bit driver

Create and fill some test tables

As an example, we have chosen two demo table entities ZORDERS and ZORDERITEMS. The definition in ADT looks as follows:

@EndUserText.label : 'ORDERS'
@AbapCatalog.enhancement.category : #NOT_EXTENSIBLE
@AbapCatalog.tableCategory : #TRANSPARENT
@AbapCatalog.deliveryClass : #A
@AbapCatalog.dataMaintenance : #RESTRICTED
define table zorders {
  key id       : abap.numc(10) not null;
  creationdate : abap.datn;

}
@EndUserText.label : 'ORDER ITEMS'
@AbapCatalog.enhancement.category : #NOT_EXTENSIBLE
@AbapCatalog.tableCategory : #TRANSPARENT
@AbapCatalog.deliveryClass : #A
@AbapCatalog.dataMaintenance : #RESTRICTED
define table zorderitems {
  key orderid : abap.numc(10) not null;
  key pos     : abap.int4 not null;
  item        : abap.char(100) not null;
  amount      : abap.int4 not null;

}

We have also created some test data in the tables with the following ABAP sample code:

class zcl_fill_orders definition

  public
  final
  create public.

  public section.
    interfaces if_oo_adt_classrun.
  protected section.
  private section.

endclass.

class zcl_fill_orders implementation.

  method if_oo_adt_classrun~main.

    data: lt_orders type table of zorders.
    delete from zorders.
    lt_orders = value #(
      ( id = '1' creationdate = '20210801' )
      ( id = '2' creationdate = '20210802'  )
      ( id = '3' creationdate = '20210803' )
    ).
    insert zorders from table @lt_orders.
    out->write( sy-dbcnt ).

    data: lt_orderitems type table of zorderitems.
    delete from zorderitems.
    lt_orderitems = value #(
      ( orderid = '1' pos = '1' item = 'Apple' amount = '5' )
      ( orderid = '1' pos = '2' item = 'Banana' amount = '5' )
      ( orderid = '1' pos = '3' item = 'Orange Juice' amount = '2' )

      ( orderid = '2' pos = '1' item = 'Orange' amount = '10' )
      ( orderid = '2' pos = '2' item = 'Apple' amount = '5' )

      ( orderid = '3' pos = '1' item = 'Bottle Water' amount = '5' )
    ).
    insert zorderitems from table @lt_orderitems.
    out->write( sy-dbcnt ).

  endmethod.

endclass.

Create CDS view entities for your tables

We first create a package Z_PACKAGE_SQL, where we place our new objects, and then create CDS view entities for the two tables using the ADT wizard. The new CDS view entities will be called ZORDERSVIEW and ZORDERITEMSVIEW.

To do this, right-click on the tables in the Project Explorer in ADT, then select “New Data Definition” and fill out the data requested in the following pop-up.

On the next screen, select “Define View Entity” and click “Finish”. Currently, only CDS view entities can be exposed to ODBC consumers.

After defining the new CDS view entities, the definition may look as follows. Note that we have renamed the original table columns to use mixed-case names.

@AccessControl.authorizationCheck: #NOT_REQUIRED
@EndUserText.label: 'ORDERS'
define view entity ZORDERSVIEW as select from zorders {
  key id as Id,
  creationdate as CreationDate
}

@AccessControl.authorizationCheck: #NOT_REQUIRED
@EndUserText.label: 'ORDER ITEMS'
define view entity ZORDERITEMSVIEW as select from zorderitems {
  key orderid as OrderId,
  key pos as Pos,
  item as Item,
  amount as Amount
}

Create a service definition and an SQL-typed service binding

Now we need a service definition and a corresponding SQL-typed service binding to define that the new CDS view entities are exposed in a SQL service. To define a new service definition, right-click on one of the ZORDERSVIEW or ZORDERITEMSVIEW views in the project explorer, then select “New Service Definition” and give the service definition a name, for example, Z_SERVICE_DEF_SQL.

After defining the new service definition Z_SERVICE_DEF_SQL, open it, add the second view, and add alias names for both view entities. The service definition may look as follows. Note that we have again chosen mixed-case names for the aliases.

@EndUserText.label: 'SERVICE DEF'
define service Z_SERVICE_DEF_SQL {
  expose ZORDERSVIEW as Orders;
  expose ZORDERITEMSVIEW as OrderItems;
}

The newly created service definition can now be used in a service binding. We need a service binding of type SQL1 to access the objects via ODBC later. Right-click on the service definition and select “New Service Binding”:

Select “SQL1” as binding type and activate it.

You can use a mixed-case name for the service binding. As we will see later, the service binding name will act as the schema name for external ODBC consumers.

Create a communication scenario with object privileges

Since we want to use a technical user in the ABAP system to access the service binding, we now need to create a communication scenario and a communication arrangement in the SAP BTP ABAP Environment system.

To create a communication scenario, choose New -> Other ABAP Repository Object -> search for “Communication”, then select “Communication Scenario”:

In our case, we called the new communication scenario Z_COMM_SCENARIO_SQL. Since we want to use user/password authentication in our Windows Excel test case, select “Basic” as the supported authentication method.

Now go to the “Inbound” tab. In the section “Inbound Services”, click on the “Add…” button, enter the “S_PRIVILEGED_SQL1” service, and click “Finish”. The “S_PRIVILEGED_SQL1” inbound service is a pre-configured service for privileged access to CDS view entities, that is, no DCLs are applied. (DCL stands for Data Control Language. It provides an access control mechanism to restrict the results returned by a CDS view from the database according to conditions.)

Now we need to add additional authorizations to enable access to our service binding. Go to the “Authorizations” tab. Below “Authorization Objects”, press the “Insert” button and add the “S_SQL_VIEW” authorization object:

To fill in values for SQL_SCHEMA, SQL_VIEW, and SQL_VIEWOP, first choose “OK” and then select the added authorization object and fill out the authorizations in the details:

SQL_SCHEMA -> ZORDERS

SQL_VIEW -> *

SQL_VIEWOP -> SELECT

SQL_SCHEMA must contain the name of the service binding that we want to grant access to. In this case, upper-case notation is ok. The value “*” for SQL_VIEW means that we allow access to all views in the service definition that is attached to the service binding ZORDERS.

Effectively, we grant the SELECT privilege on all views in the schema ZORDERS to the users of the communication scenario, just like we would issue a GRANT statement in a database.

After you have finished, save your entries and choose the “Publish Locally” button to publish it in the current development system. After publishing we can create a communication arrangement.

Create a Communication System, a Communication User, and a Communication Arrangement

The final preparation tasks in the ABAP system cannot be performed in ADT and need to be done in the administration UI of the system. In this UI you will find the relevant tiles in the section “Communication Management”.

We need to create a communication system first. So, click on “Communication Systems” and click on “New”:

Choose a name for the communication system, for example, SQL_ACCESS, and choose “Create”. After this, mark it as “Inbound Only”.

Then, go to “Users for Inbound Communication” and choose the “+” button.

There, select “New User”

We name the new user “SQL_CLIENT_USER” and assign a password. Remember the password and choose “Create”. Then add it to the communication system by choosing “OK” in the dialog.

The user is now listed in the section “Users for Inbound Communication”.

Choose “Save” to finish the creation of the communication system.

The last thing we need to do is to create a communication arrangement. The arrangement links the communication scenario that we created in ADT before to the communication system and user.

Go back to the administration UI launchpad home and select “Communication Arrangements”.

Choose “New” and select our communication scenario with name “Z_COMM_SCENARIO_SQL”.

To complete the link between communication scenario and communication system, we add our communication system SQL_ACCESS, and the system will automatically add the user SQL_CLIENT_USER.

At this point it is important to note down the service URL. It’s something like https://<hostname>/sap/bc/sql/sql1/sap/S_PRIVILEGED.

Finally, choose “Save”. We have now finished all preparation tasks in the ABAP system and can access our exposed objects via ODBC.

Install the ODBC driver on Windows

To install the “ODBC driver for ABAP”, you must download the ODBC DRIVER FOR ABAP 1.0 and the SAPCRYPTOLIB SAR files from the SAP Support Portal. Make sure that you choose the packages for the Windows operating system. In addition, you need SAPCAR to unpack the SAR files.

Choose a directory as your ODBC driver location (for example, C:\ODBCTEST) and unpack the SAR files. After this, you will see at least the following files in your ODBC driver directory:

ODBC_driver_for_ABAP_x64u.msi
sapcrypto.dll

Start the Windows installer file to install the ODBC driver for ABAP:

Click “Next” and accept the license agreement. On the next screen, you can choose “Typical” as installation type and complete the installation.

In the Windows registry, the installer has now created the keys for HKEY_LOCAL_MACHINE -> SOFTWARE -> ODBC -> ODBCINST.INI -> ODBC driver for ABAP. The installed files are in “<SystemDrive>:\Program Files\SAP\ODBC driver for ABAP” .

The ODBC driver is now ready to be used in ODBC data sources.

Create an ODBC data source

Launch the Windows ODBC data source administrator tool. The “ODBC Data Sources (64bit)” tool is part of a standard Windows installation.

In this tool, you can create ODBC data sources and assign a data source name (DSN) to a newly created data source. Either choose the “User DSN” or “System DSN” tab, choose “Add”, choose the ODBC driver for ABAP as ODBC driver and choose “Finish”. This will start the DSN setup dialog of the ODBC driver for ABAP.

We have chosen “MYABAP” as the DSN. The default port number for an SAP BTP ABAP Environment system is 443. The hostname and the service path can be derived from the service URL that we noted down when creating the communication arrangement.

Just for testing, we did not bother to create a PSE for the trust of the server certificate and used “TrustAll=true;” as an additional parameter instead.
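To make the relationship between the service URL and the DSN fields concrete, here is a small Python sketch that splits the URL into the values entered in the setup dialog. The service URL is hypothetical, and the parameter names in the assembled string are illustrative assumptions; only “TrustAll=true” and the alias user type come from the steps above.

```python
from urllib.parse import urlparse

# Hypothetical service URL; the real one comes from your
# communication arrangement (noted down earlier).
service_url = "https://myhost.abap.example.com/sap/bc/sql/sql1/sap/S_PRIVILEGED"
parts = urlparse(service_url)

host = parts.hostname          # hostname field of the DSN dialog
port = parts.port or 443       # default port for SAP BTP ABAP Environment
service_path = parts.path      # service path field of the DSN dialog

# Illustrative connection-string assembly; the key names are assumptions,
# and "TrustAll=true" is the test-only shortcut used instead of a PSE.
conn_str = (
    f"HOST={host};PORT={port};SERVICEPATH={service_path};"
    f"TrustAll=true;UidType=alias"
)
print(conn_str)
```

An ODBC client would then use this DSN (or connection string) to reach the SQL service endpoint.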

The user name SQL_CLIENT_USER that we created in the communication system is automatically an alias name for a generated ABAP user name. Therefore, we need to switch the user type to alias.

Open an ODBC connection in Microsoft Excel

After creating a DSN, you can use this in an ODBC client tool of your choice. For simplicity reasons, we have chosen Microsoft Excel as an example for this blog post.

Start Excel and go to “Data” -> “Get Data” -> “From Other Sources” -> “From ODBC”. In the following popup, enter our newly defined DSN “MYABAP”.

In the Database section, Excel will now ask for the user and password to log on to the ABAP system. We use the user SQL_CLIENT_USER and the password that we assigned when creating the communication system.

After choosing “Connect”, the navigator appears and shows all exposed objects in our SQL schema ZOrders. We can click on one of the CDS entities and Excel will show a preview of the data content.

After this, you can either choose “LOAD” to load the data into an Excel sheet or choose “TRANSFORM DATA” to switch to the Power Query tool. When you load the data into an Excel sheet, you can refresh it at any time.

As a last step, we just want to show that it is possible to execute a free style SQL query on the exposed entities. The simplest way to do this is to choose “Data” -> “Get Data” -> “From Other Sources” -> “From ODBC” again in Excel and then click on “Advanced Options”. A new control is opened that allows you to enter a SELECT statement directly.

In the SELECT statement, you must prefix all view names by our schema name “ZOrders”. Apart from this, you can use ANSI-like SQL syntax.
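For example, such schema-prefixed queries can be built in a script before handing them to an ODBC client library. Only the schema name ZOrders and the aliases Orders/OrderItems come from the service definition above; the join column “OrderId” is a hypothetical name for illustration.

```python
# The service binding name acts as the SQL schema for ODBC consumers.
schema = "ZOrders"

# Simple query: every view name must carry the schema prefix.
query = f"SELECT * FROM {schema}.Orders"

# ANSI-like joins work too; "OrderId" is a hypothetical key column.
join_query = (
    f"SELECT * FROM {schema}.Orders AS o "
    f"JOIN {schema}.OrderItems AS i ON o.OrderId = i.OrderId"
)
print(query)
```

A Python ODBC client such as pyodbc could then execute these strings over the DSN created earlier.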

The result set will directly show up in an Excel preview window.


The post Consuming CDS View Entities Using ODBC-Based Client Tools first appeared on ERP Q&A.

Zchat – ABAP based instant messaging (published Mon, 17 May 2021; https://www.erpqna.com/zchat-abap-based-instant-messaging/)

We usually connect to the client system using various combinations of Remote Desktop. For security reasons, the clipboard between Remote Desktop and the work computer is disabled. Also, we usually don’t have any Instant Messaging or even mail.

Lack of communication turns ordinary actions (copy, paste, send) into a whole quest – you need to call, dictate the program name, line number, etc. This greatly slows down development and disturbs concentration.

I wrote a web service that works like a chat. The data is stored in the SAP system; the client part runs in the web browser.

A public/private key pair is generated based on the login and password. The public key is stored in the database (so that other participants can encrypt messages to you); the private key (with which you decrypt your messages) never leaves the browser.
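A consequence of this design is that the same credentials always yield the same keys, so a lost password means lost history. The idea can be sketched as follows; this is a conceptual illustration using PBKDF2 as the seed derivation, not Zchat's actual algorithm.

```python
import hashlib

def derive_seed(login: str, password: str) -> bytes:
    # Deterministic seed from login and password; a key pair generated
    # from this seed is reproducible only with the same credentials.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), login.encode(), 100_000)

# Same credentials -> same seed -> the same key pair can be regenerated.
assert derive_seed("anna", "secret") == derive_seed("anna", "secret")
# A new password -> a different seed -> old messages stay unreadable.
assert derive_seed("anna", "secret") != derive_seed("anna", "new-secret")
```

This also explains the "if you forget your password, you lose your history" rule below: without the original password, the private key cannot be reconstructed.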

Here are the chat features:

  • Enter – send message
  • Shift + Enter – new line
  • If you forget your password, you lose your history
  • History – icon above
  • Leave history – exit icon
  • Set avatar – avatar icon
  • You can paste small images into the chat

Installation is done via abapGit; source: https://github.com/AntonSikidin/elitechat

The installation should mostly go smoothly; the only part that may need manual attention is the web service.

Launch transaction SICF and press F8.

Navigate to the zchat service. If it does not exist, create it.

Activate the service.

To find the chat link, right-click the zchat service and test it.

Your default browser opens the web service. Enter your SAP login and password, create a user, and you can communicate with your teammates.

As a bonus, you can communicate with yourself.

Calling custom developed Adobe Form from standard Transaction Code for Print Output in SAP ABAP (published Sat, 08 Feb 2020; https://www.erpqna.com/calling-custom-developed-adobe-form-from-standard-transaction-code-for-print-output-in-sap-abap/)

Introduction: Adobe Forms are frequently used in SAP to generate Portable Document Format (PDF) files of various business documents like invoices, order confirmations, and account statements. PDF files have the following advantages over other file formats.

  1. It is an open-standard, universal file format that ensures the document looks the same regardless of the application and platform used to create or display it.
  2. It provides security features such as digital signature validation and the ability to disable the Save, Select, and Print options.

There are many cases where we don’t want to display purchase order details in the format provided by the standard Adobe Form attached to its standard print program. So in this article we are going to learn how to call or configure a custom-developed Adobe Form for the standard T Code ME23N (Purchase Order).


To do this, let’s first find the standard print program for T Code ME23N (Purchase Order Display) with the steps below:

1. Go to T Code NACE ->select application ‘EF’ (Purchase Order) then click on ‘Output Types’.

2. Select the output Type ‘NEU’ (Purchase order) and then double click on ‘Processing routines’.

3. From the screen below, you can see that the driver program name is ‘SAPFM06P’.

4. Go to T Code SE38 -> give program name as ‘SAPFM06P’ -> click on ‘Display’ button.

5. Click on the global find button and search for ‘FP_FUNCTION_MODULE_NAME’.

6. Double-click the search result to go to the source code.

7. Find all CALL FUNCTION ls_function statements in include ‘FM06PE04’ and note down all their exporting parameters.

Note: We must pass all of the above exporting parameters as importing parameters in the interface of our custom-developed Adobe Form.

8. Create an interface ‘YTEST’ with all of the above-noted parameters as importing parameters and activate it using T Code SFP.

9. Now create the form ‘YTESTFORM’, giving the interface name as ‘YTEST’.

10. For demo purposes, I am going to show only a few header fields in the output. Drag and drop the import parameter ‘HEADER’ from ‘YTEST’ to ‘YTESTFORM’ and click the ‘Save’ button.

11. Deactivate all the fields under the ‘HEADER’ node that are giving errors as shown below, and correct any other issues if they exist.

12. Click on ‘Layout’ Tab ->click on ‘Master Pages’->click on ‘Data View’ Tab.

13. Select the field ‘EBELN’ and drag and drop it onto the main page -> check it in Preview PDF -> click the ‘Save’ icon, then the ‘Activate’ icon. For demo purposes, only one field is selected.

14. To configure the custom-developed Adobe Form ‘YTESTFORM’ for the standard print program ‘SAPFM06P’ of application ‘EF’ (Purchase Order), go to T Code NACE -> select application ‘EF’ -> click on ‘Output Types’.

15. Select the existing standard output type ‘NEU’ -> click on ‘Details’ -> click on the ‘Storage System’ tab -> give Storage Mode as ‘Print Only’ and Document Type as ‘MEOORDER’ -> click on ‘Save’.

16. Double-click on ‘Processing routines’ -> replace the existing values with the new values and save.

Note: Here we have replaced the Form routine field value ‘ENTRY_NEU’ with ‘ADOBE_ENTRY_NEU’, removed the Form field value ‘MEDRUCK’, given the PDF/SmartForm Form value as ‘YTESTFORM’, and given the Type field value as ‘PDF’.

17. Go to T Code ME23N -> Open any purchase order ->click on ‘Print Preview’

18. Select our newly created output type ‘NEU’ and click on ‘Print Preview’ (Ctrl+Shift+F1); this will call our custom-developed Adobe Form for preview only.

19. To print, click on ‘Messages’ -> click on new entries and give the details below.

20. Select the new entry, click on communication method, select the appropriate details, and go back (F3).

21. Click on ‘Further Data’, then select Dispatch Time as ‘Send Immediately’ and go back (F3).

22. Finally, select the new entry and click on the ‘Save’ button; the output will then be dispatched to the configured printer.
