BW (SAP Business Warehouse) Archives - ERP Q&A
https://www.erpqna.com/category/bw-sap-business-warehouse/

SAP BW Bridge In SAP Datasphere : Connectivity Between S/4HANA System & BW Bridge
https://www.erpqna.com/sap-bw-bridge-in-sap-datasphere-connectivity-between-s-4hana-system-bw-bridge/ (27 Mar 2024)

Introduction:

SAP BW bridge is a functional enhancement of SAP Datasphere and enables ABAP-based data extraction and staging capabilities within SAP Datasphere. It provides customers who are running SAP Business Warehouse or SAP BW/4HANA with access to the public cloud as it offers SAP BW capabilities directly in SAP Datasphere.

SAP BW Bridge for SAP Datasphere is a feature that makes certain elements from the on-premise SAP BW system available in the cloud. In order to transfer these elements to SAP Datasphere, SAP also supplies the appropriate conversion tools. SAP BW Bridge is a cost-effective and simple way to switch from an on-premise SAP BW system to SAP Datasphere.

Provisioning the SAP BW Bridge Tenant:

We need to define the SAP BW bridge storage before creating the SAP BW bridge tenant. Configure the size of your SAP BW bridge tenant in the tenant configuration:

From the side navigation, go to System => Configuration => Tenant Configuration

Choose the size & Save

Now create the SAP BW bridge instance:
Go to System -> Configuration -> SAP BW Bridge and choose Create

Provide the Instance name and description

For Development system: Select enable system for development
For Production system: Deselect enable system for development

An SAP BW bridge space has been created automatically, as given below –

We just need to add users in the BW Bridge space.

Activate BW Bridge Tenant:

When a new SAP BW bridge tenant is provisioned together with a new SAP Datasphere tenant, the system owner receives a welcome email. Click the Activate Account button to connect to the server and set your password.

When a new SAP BW bridge tenant is provisioned to an already existing SAP Datasphere tenant, the first login to the SAP BW bridge tenant must be done by the user who was system owner of the SAP Datasphere tenant when the SAP BW bridge tenant was provisioned.

The system owner needs to create the other users in the BW bridge cockpit.

After activation, the tenant is active:

Open SAP BW Bridge Cockpit:

From the side navigation, choose Data Integration Monitor -> Choose BW Bridge space

Now we can see the BW bridge cockpit link. Click Open SAP BW Bridge Cockpit.

Preparing Connectivity for ODP Source Systems in SAP BW Bridge:

Connecting an SAP on-premise system to SAP BW bridge requires a few more steps than connecting the same system to an SAP BW on SAP HANA or SAP BW/4HANA system.

We will require a Cloud Connector, which serves as the link between the on-premise source system and your SAP BW bridge tenant, which is technically based on an ABAP platform in SAP BTP. RFC is used as the protocol for data exchange between on-premise source systems and SAP BW bridge.

The system needs to be added in the Cloud Connector with the RFC protocol.

The on-premise source system must be configured as a communication system in the BW bridge cockpit.

The source system connectivity is established with the following steps:

  1. Add the SAP Datasphere subaccount in the Cloud Connector.
  2. Create the on-premise source system in the Cloud Connector.
  3. Add the relevant resources to the source system.
  4. Add a service channel to the SAP BW bridge tenant in the Cloud Connector.
  5. Create a communication system in the SAP BW bridge tenant.
  6. Create the source system in the SAP BW Modeling Tools.
  7. Optional: Select and activate preconfigured SAP BW bridge Content objects.

Add a Service Channel to the SAP BW Bridge Tenant in the Cloud Connector:

The on-premise source system must be able to call the SAP BW bridge tenant via RFC. Therefore, a service channel must be added in the Cloud Connector.

Procedure:

  1. Log in to the Cloud Connector
  2. In the left-side menu of the administration UI, select On-Premise To Cloud
  3. In the Service Channels section, click (Add) to add a new service channel.
  4. In the Add Service Channel dialog, use the following values:
    1. Type: ABAP Cloud System
    2. ABAP Cloud Tenant Host: For SAP BTP ABAP-based systems like SAP BW bridge, the tenant host is <serviceinstanceguid>.abap.<region>.hana.ondemand.com. The region is, for example, eu10 or us10.

To retrieve the host name:

  • Log on to SAP Datasphere.
  • From the side navigation, choose Space Management.
  • Select the space BW Bridge.
  • Navigate to the section Connections.
  • Mark the local connection BWBRIDGE and choose Edit.
  • Under HTTP Access copy the host name to the clipboard (without https://).
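The tenant host pattern above can be sketched as a small helper. This is an illustrative sketch only, not an SAP API; the service instance GUID used in the example is a made-up placeholder.

```python
# Sketch of how the ABAP Cloud tenant host for an SAP BW bridge system is
# composed, per the pattern <serviceinstanceguid>.abap.<region>.hana.ondemand.com.
def bw_bridge_tenant_host(service_instance_guid: str, region: str) -> str:
    """Tenant host for SAP BTP ABAP-based systems such as SAP BW bridge."""
    return f"{service_instance_guid}.abap.{region}.hana.ondemand.com"

# Example with a hypothetical GUID and the eu10 region:
host = bw_bridge_tenant_host("0a1b2c3d-4e5f-6789-abcd-ef0123456789", "eu10")
```

In practice you would not construct this host yourself but copy it from the BWBRIDGE connection as described above; the helper only makes the naming convention explicit.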

  5. In the same dialog window, define the Local Instance Number under which the SAP BW bridge system is reachable for the source system(s). The <Local Instance Number> will later be used when maintaining the RFC destination in the on-premise source system pointing to the SAP BW bridge system (i.e. in the so-called callback destination).
  6. Leave Connections set to 1.
  7. Leave Enabled selected to establish the channel immediately after choosing Finish. Unselect it if you don’t want to establish the channel immediately.
  8. Select Finish.

You can now see that the service channel is enabled.

Create a Communication System in the SAP BW Bridge Tenant:

The on-premise source system must be configured as a communication system in the SAP BW bridge tenant. A communication system is a specification of a system that represents a communication partner, together with the technical information required for inbound and outbound communication, such as the host name and user information.

Procedure:

1. Log on to the SAP BW Bridge Cockpit.
2. In the Communication Management section, select the app Communication Systems.
3. Click New to add a new Communication System.
4. In the New Communication System dialog, enter a System ID and a System Name and choose Create.

5. Under Technical Data, maintain the following values:

a. Enter the virtual host name maintained in the Cloud Connector as Host Name.
b. Fill in the port number 33<instance number> as Port, e.g. 3301 if the instance number is 01.
c. Switch on the property Cloud Connector.
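The port convention in step b can be sketched as a one-line helper (illustrative only; the function name is ours, not SAP's):

```python
# Sketch: derive the Port value from the two-digit instance number,
# following the 33<instance number> convention described in step b above.
def rfc_gateway_port(instance_number: int) -> int:
    """Port for the communication system: 33 followed by the two-digit instance number."""
    return int(f"33{instance_number:02d}")
```

For example, `rfc_gateway_port(1)` yields 3301, matching the example given for instance number 01.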

6. Under RFC Settings, maintain the following values:

a. Set Load Balancing if the on-premise source system was created with the option With load balancing (system ID and message server) in the Cloud Connector.
b. Maintain the client of the source system as Client.
c. If Load Balancing was set, maintain a logon group of the source system as Group. Enter the virtual system ID maintained in the Cloud Connector as Target System. Enter the virtual message server maintained in the Cloud Connector as Message Server.
d. If Load Balancing was not set, enter the virtual instance number maintained in the Cloud Connector as Instance Number. Enter the virtual application server maintained in the Cloud Connector as Target Host.
e. Maintain the location ID of the Cloud Connector as SCC Location ID. In case you did not specify a location ID in the Cloud Connector, you can leave SCC Location ID blank.

  7. Under Users for Inbound Communication, click (Add) to maintain the user that is used in the SAP BW bridge tenant for inbound communication. Select an existing user or click New User to create a new one.
  8. Under Users for Outbound Communication, enter the username and password of an existing user in the on-premise source system.
  9. Save the new communication system.
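The load-balancing-dependent RFC settings described above can be summarized in a small sketch. The helper and its field names (taken from the UI labels) are illustrative, not part of any SAP tool:

```python
# Sketch: which RFC Settings fields are required depending on Load Balancing,
# following the rules in the steps above.
def required_rfc_settings(load_balancing: bool) -> set:
    fields = {"Client"}  # always required; SCC Location ID may stay blank
    if load_balancing:
        # with load balancing: logon group, virtual system ID, virtual message server
        fields |= {"Group", "Target System", "Message Server"}
    else:
        # without load balancing: virtual instance number and virtual application server
        fields |= {"Instance Number", "Target Host"}
    return fields
```

The design point is simply that the two branches are mutually exclusive: you maintain either the message-server fields or the direct application-server fields, never both.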

As a result, you have created the on-premise source system as a communication system in the SAP BW bridge tenant.

Maintain Communication Arrangement:

You need to maintain a communication arrangement for the communication system that we have created.

You need to choose the predefined communication scenario SAP_COM_0692 and enter the required details, such as –

  1. Use the technical name of the source system as Arrangement Name
  2. User for Inbound Communication
  3. User for Outbound Communication

Maintain Callback Destination:

Create an RFC destination in the S/4HANA system (with the standard name given in the communication system):

  1. Log in to the on-premise source system.
  2. Call transaction SM59.
  3. Choose Create to create a new RFC connection.
  4. In the Create Destination dialog, enter the name of the Callback Destination shown in the BW Modeling Tools as the name of the destination and choose RFC connection to ABAP system as Connection Type.
  5. On the Technical Settings tab, enter the host name of the Cloud Connector (without https:// and without port number) as Target Host and the local instance number you defined in the Cloud Connector for the service channel as Instance Number.
  6. On the Logon & Security tab, enter the Language, 100 as Client and User and Password of the user you defined as User for Inbound Communication.
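The SM59 entries above can be collected in a sketch for reference. All values in the example call are hypothetical; in a real system they come from your Cloud Connector setup and your inbound communication user:

```python
# Sketch: the values entered in SM59 for the callback destination, as a dict.
def callback_destination(cc_host: str, local_instance: int, client: str, user: str) -> dict:
    return {
        "Connection Type": "RFC connection to ABAP system",
        "Target Host": cc_host,                      # Cloud Connector host, without https:// or port
        "Instance Number": f"{local_instance:02d}",  # local instance number from the service channel
        "Client": client,
        "User": user,                                # the user for inbound communication
    }

# Hypothetical example values:
dest = callback_destination("scc.internal.example", 1, "100", "ODP_INBOUND")
```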

Use the ODP_INBOUND user created in the BW bridge cockpit as the user for inbound communication.

Connection test:

Authorization test:

Create an SAP BW Bridge Project:

To test the connection, we will create a project in the BW modeling tools using the service key.

To get the service key, go to BW Bridge Space => Choose Connections and select the BW Bridge space => Choose Edit => Copy the SAP BW Service Key

Now create an SAP BW bridge project in the BW modeling tools with the system details.

The logical destination and the callback destination (which we have created in the backend S/4HANA system) should have status green, as below –

Now the connection has been established successfully between S/4HANA and BW Bridge cockpit.

With SAP BW Bridge for SAP Datasphere, SAP has created an attractive and cost-effective way for companies to move from their on-premise SAP BW system to SAP Datasphere. SAP BW Bridge is particularly interesting for companies that do not want a greenfield implementation but want to take existing content with them into the cloud.

C_TBW50H_75 Certification: How to Get Ready for It?
https://www.erpqna.com/score-high-in-the-c_tbw50h_75-exam/ (16 Oct 2023)

In the ever-evolving world of IT, staying competitive and advancing your career often requires certifications that demonstrate your expertise. One such certification that holds great significance in the IT industry is the SAP C_TBW50H_75 certification. This credential signifies your proficiency in SAP Business Warehouse (BW) powered by SAP HANA, an essential skill for data management and analytics professionals.

In this comprehensive guide, we’ll delve into the world of SAP C_TBW50H_75 certification preparation, providing valuable insights and study tips to help you ace the exam. Whether you’re new to the certification journey or looking to enhance your existing knowledge, these tips will be your roadmap to success.

What Is the C_TBW50H_75 Certification All About?

C_TBW50H_75, the SAP Certified Application Associate – Modeling and Data Acquisition with SAP BW 7.5 powered by SAP HANA certification, confirms that the candidate possesses the necessary expertise in modeling, data acquisition, and query design within the context of SAP BW 7.5 on SAP HANA, specifically tailored to the application consultant role. The certification builds on the foundational knowledge a BW consultant acquires through formal training, ideally deepened through hands-on experience within a BW project team under mentorship.

The Importance of Earning the C_TBW50H_75 Certification:

Before we dive into the study tips, let’s establish why mastering effective study techniques is crucial for passing the SAP C_TBW50H_75 certification exam.

The SAP C_TBW50H_75 certification sets you apart in the IT industry by confirming your ability to work with SAP BW powered by SAP HANA. With businesses relying on data-driven decisions more than ever, this certification is highly sought after by employers, making it a valuable addition to your resume.

To succeed, you must understand the structure of the exam. The SAP C_TBW50H_75 certification exam has multiple-choice questions and a time limit. To pass, you need to achieve a minimum score. Knowing these details will help you tailor your study plan effectively.

Follow A Structured Study Schedule for the C_TBW50H_75 Certification Exam:

Having a structured study schedule is the foundation of successful preparation. It keeps you organized, ensures you cover all necessary topics and prevents last-minute cramming. Every individual has unique learning habits and constraints. Tailor your study plan to your preferences, allowing time for focused study and breaks. This personalized approach maximizes your efficiency. Time management is crucial. Allocate more time to challenging topics and regularly assess your progress. Setting realistic goals ensures you stay motivated and stay calm.

Get the Proper Study Materials:

To prepare effectively, an aspirant needs the right study materials. These include textbooks, online resources, and practice exams. Invest in quality materials to ensure you receive accurate information. Consider reputable sources for your study materials. Online platforms, SAP’s official documentation, and study guides are excellent places to start. Don’t forget to explore practice exams and mock tests for self-assessment.

Understand the C_TBW50H_75 Exam Objectives:

Familiarize yourself with the different exam objectives and domains covered in the certification. Understanding these areas helps you allocate your study time wisely. Identify the key topics within each domain likely to be emphasized in the exam. This knowledge allows you to prioritize your studies and ensure comprehensive coverage.

Study with Active Learning Techniques:

Active learning techniques are highly effective in retaining information. Engaging in activities such as creating flashcards, taking detailed notes, and solving problems reinforces your understanding. Incorporate these techniques into your study routine. For instance, use flashcards to memorize key concepts, maintain organized notes for quick reference, and tackle practice problems to apply your knowledge.

Studying in Groups Is Essential:

Study groups provide a supportive environment for learning. Collaborating with peers allows you to discuss challenging topics, gain different perspectives, and reinforce your understanding through teaching. Seek out study partners who are equally committed to the certification. Online forums, social media groups, and professional networks are excellent places to find like-minded individuals.

Gauge Your Readiness with the C_TBW50H_75 Certification:

Practice exams are essential for gauging your readiness. They simulate the real exam environment, helping you assess your knowledge and identify areas that need improvement. Explore reputable sources for practice exams and mock tests. These resources often provide valuable insights into the exam’s format and types of questions.

Get Guidance from Experts:

Mentorship can greatly benefit your preparation journey. Reach out to experienced professionals who have already obtained the SAP C_TBW50H_75 certification. Their guidance can be invaluable. Experienced mentors can offer insights, share their study strategies, and provide emotional support during challenging times. Don’t hesitate to seek their advice.

Maintain A Healthy Lifestyle:

Amidst rigorous study sessions, pay attention to your health. A balanced diet, regular exercise, and stress management are essential for maintaining physical and mental well-being. Incorporate small wellness routines into your daily life. Dedicate time for exercise, practice mindfulness techniques, and ensure adequate sleep to stay sharp and focused.

Review and Revise Regularly with the C_TBW50H_75 Exam Study Notes:

Continuous revision is vital to retaining information. Implementing spaced repetition techniques ensures you remember and understand concepts in the long run. Regularly review your notes and practice problems. Use spaced repetition apps or methods to revisit previous topics at optimal intervals.

Stay Confident and Positive Throughout Your Preparation:

Certification exams can be daunting. It’s normal to face challenges and stressors during your preparation journey. Stay motivated by setting small milestones, visualizing your success, and practicing relaxation techniques to manage test anxiety. A positive mindset can be your greatest asset.

Concluding Thoughts:

The SAP C_TBW50H_75 certification is a significant achievement in the IT industry, opening doors to exciting career opportunities. Following the study tips outlined in this guide will prepare you to excel in the certification exam.

Remember, success in certification exams is not just about knowledge but also about effective study techniques and mindset. Implement these tips, stay dedicated, and watch yourself confidently step into a brighter future in SAP BW powered by SAP HANA.

Steps to Start Your Career as an SAP Consultant
https://www.erpqna.com/how-to-become-sap-consultant/ (30 May 2023)

Are you intrigued by the world of SAP? Do you aspire to become an SAP consultant and embark on a rewarding career in this thriving field? Look no further! In this comprehensive guide, we will walk you through the steps to become an SAP consultant and offer valuable insights on how to pass the related SAP certification exam.

What is SAP?

SAP stands for Systems, Applications, and Products in Data Processing. It is a globally recognized software company specializing in business solutions and enterprise software. SAP offers various modules and applications that cater to business operations, such as finance, human resources, supply chain management, customer relationship management, and more.

The Role of an SAP Consultant

A SAP consultant helps organizations implement, customize, and optimize SAP software solutions to meet their business needs. They act as advisors, guiding companies through the entire SAP implementation process, from requirements gathering and system configuration to user training and post-implementation support.

Step 1: Understand the SAP Ecosystem

To become a SAP consultant, it is crucial to thoroughly understand the SAP ecosystem and familiarize yourself with the different modules and applications available.

1. Exploring the Different SAP Modules

SAP offers a diverse range of modules, each targeting specific business functions. Here are some of the prominent SAP modules you should be aware of:

2. SAP ERP (Enterprise Resource Planning)

SAP ERP is the core module that integrates various business processes into a unified system, such as finance, procurement, production, sales, and distribution. It provides real-time insights, streamlines operations, and facilitates efficient decision-making.

3. SAP CRM (Customer Relationship Management)

SAP CRM helps organizations manage their customer relationships effectively. It enables businesses to streamline sales, marketing, and customer service activities, fostering stronger customer connections and satisfaction.

4. SAP BW (Business Warehouse)

SAP BW, or SAP Business Warehouse, focuses on data warehousing and analytics. It allows businesses to gather, store, and analyze data from various sources, enabling informed decision-making and strategic planning.

5. SAP HANA (High-Performance Analytic Appliance)

SAP HANA is an in-memory database platform that provides lightning-fast data processing capabilities. It empowers organizations to perform real-time analytics, predictive modeling, and complex data processing tasks, revolutionizing the speed and efficiency of data-driven operations.

6. SAP SuccessFactors

SAP SuccessFactors is a module dedicated to human capital management (HCM). It encompasses a range of HR processes, including talent acquisition, performance management, learning and development, and employee engagement. SAP SuccessFactors helps businesses optimize their HR operations and drive employee productivity.

7. SAP Ariba

SAP Ariba is a cloud-based procurement platform that enables organizations to manage their end-to-end procurement processes. It facilitates supplier management, contract management, sourcing, and procurement analytics, fostering transparency and efficiency in the procurement lifecycle.

8. Choosing the Right SAP Module for You

With the vast array of SAP modules available, choosing the right one is essential based on your interests, skills, and career goals. Take the time to explore each module in-depth and consider how it aligns with your expertise and aspirations. This will ensure that you embark on a path that truly resonates with you and maximizes your potential as a SAP consultant.

Step 2: Acquiring SAP Knowledge and Skills

Once you clearly understand the SAP ecosystem and have chosen a module to specialize in, it’s time to acquire the necessary knowledge and skills.

1. Formal Education vs. Self-Study

Formal education in a relevant field, such as information technology or business administration, can provide a solid foundation for your SAP journey. However, self-study is also a viable option, especially if you have a strong passion for SAP and are committed to continuous learning.

2. SAP Certification Programs

SAP offers a range of certification programs that validate your expertise in specific SAP modules. These certifications are highly regarded in the industry and can enhance your credibility as a SAP consultant. Consider enrolling in the certification program relevant to your chosen module and dedicate time and effort to prepare for the exam.

3. Online Training Courses

Online training courses are an excellent way to gain comprehensive knowledge and hands-on experience with SAP software. Numerous platforms offer SAP training, allowing you to learn at your own pace and convenience. Look for courses that provide practical exercises and simulations to reinforce your understanding of the concepts.

4. Practical Experience and Hands-On Projects

While theoretical knowledge is essential, practical experience is equally crucial for becoming a successful SAP consultant. Seek opportunities to work on real-world projects or internships where you can apply your skills in a realistic setting. This hands-on experience will deepen your understanding and boost your confidence in tackling real-life SAP implementation challenges.

Step 3: Preparing for the SAP Certification Exam

Once you have acquired the necessary knowledge and skills, it’s time to prepare for the SAP certification exam.

1. Understanding the SAP Exam Structure and Content

Familiarize yourself with the exam structure, duration, and the topics that will be covered. Understand the weightage of each topic and prioritize your study accordingly. Obtain study guides or syllabi provided by SAP to clarify the exam content.

2. Creating a Study Plan

Develop a study plan that outlines your study goals, timelines, and the resources you will utilize. Break down the topics into manageable sections and allocate sufficient time for each. Be realistic about your study commitments and set aside dedicated study hours each day or week.

3. Leveraging Study Resources

Use various study resources to help you prepare for the SAP certification exam. This includes textbooks, online tutorials, video courses, practice exams, and official SAP documentation. These resources will deepen your understanding of the concepts and reinforce your knowledge.

4. Joining Study Groups and Forums

Engaging with fellow SAP aspirants through study groups and online forums can be immensely helpful. These communities provide a platform to discuss concepts, share resources, ask questions, and learn from each other’s experiences. Collaborating with like-minded individuals can enhance your understanding and offer valuable insights.

5. Taking SAP Practice Test

Practice tests are an essential part of exam preparation. They familiarize you with the exam format, assess your knowledge, and identify areas that require further improvement. Take advantage of the practice exams available online or through SAP to evaluate your readiness for the certification exam. Erpprep offers trustworthy SAP practice tests to enhance your readiness.

Step 4: Sitting for the SAP Certification Exam

Now that you have diligently prepared, it’s time to sit for the SAP certification exam. Here are some tips to help you perform your best on exam day:

1. Exam-Day Tips and Strategies

Ensure you have a good night’s sleep before the exam to be mentally sharp and focused. Arrive at the exam center well in advance to avoid any last-minute rush. Carry all the necessary identification documents and any permitted study materials. Follow the instructions carefully and manage your time effectively.

2. Managing Time Effectively

The SAP certification exam is time-bound, so managing your time efficiently is crucial. Read through all the questions first and allocate time to each section based on the weightage of the topics. Answer the questions you are confident about first, then revisit the more challenging ones. Don’t spend too much time on a single question, as it may impact your progress.

3. Remaining Calm and Focused

Stay calm and composed throughout the exam. Stress and anxiety can hinder your performance. Take deep breaths, trust your preparation, and focus on each question individually. Maintain a positive mindset and approach each question with confidence.

4. Reviewing and Double-Checking Answers

Once you have completed all the questions, use any remaining time to review your answers. Double-check for any errors or omissions. Pay attention to details and ensure that your responses are accurate. Be cautious to avoid making any sudden changes that may introduce errors.

Step 5: Continuing Professional Development

Becoming a SAP consultant is not a one-time achievement; it’s a continuous journey of professional growth. Here are some ways to further develop your skills and advance in your career:

1. Staying Updated with the Latest SAP Technologies

The field of SAP is constantly evolving, with new technologies and updates being introduced regularly. Stay updated with the latest trends, innovations, and advancements in the SAP ecosystem. Follow SAP blogs, attend webinars, and participate in conferences to keep abreast of the changes.

2. Participating in SAP Community Events

Engage with the SAP community by participating in events, conferences, and user groups. These gatherings provide opportunities to network with industry professionals, exchange knowledge, and gain valuable insights. Collaborating with peers and experts can open new opportunities and enhance professional growth.

3. Building a Professional Network

Networking is a vital aspect of career development. Connect with other SAP professionals, consultants, and industry leaders through networking platforms like LinkedIn. Building relationships within the SAP community can lead to mentorship opportunities, job referrals, and a broader understanding of industry practices.

4. Exploring Further Certification Opportunities

As you gain experience and expand your skill set, consider pursuing additional SAP certifications in different modules or specialized areas. These certifications can enhance your expertise, increase market value, and open doors to new career opportunities within the SAP ecosystem.

Conclusion: Embrace the Journey of How to Become SAP Consultant

Becoming a SAP consultant is an exciting and fulfilling journey that requires dedication, continuous learning, and a passion for technology and business processes. Understanding the SAP ecosystem, acquiring the necessary knowledge and skills, preparing for the certification exam, and embracing professional development can pave the way for a successful career as a SAP consultant.

Remember, becoming a SAP consultant may have challenges, but with perseverance and a growth mindset, you can overcome them and thrive in this dynamic field. So, take the first step, explore the different SAP modules, and embark on your journey to becoming a trusted advisor, helping organizations optimize their business processes and succeed with SAP.

Get ready to dive into the world of SAP and unlock endless possibilities for personal and professional growth. Are you prepared to embrace the journey to SAP consultancy? The future awaits!

How to call a BW ABAP-Backend from SAC Analytic Application
https://www.erpqna.com/how-to-call-a-bw-abap-backend-from-sac-analytic-application/ (13 Apr 2023)

Introduction

There are several blogs showing how to organize interaction between SAC and S/4HANA using OData services. However, one cannot ignore the fact that SAC, as an analytical tool, often uses data from SAP BW or BW/4HANA. This blog provides a simple example of how an OData service can be used to exchange information between a SAC Analytic Application and BW over a live connection. It is important to note that this exchange is bi-directional.

Our scope does not include a detailed discussion of the rich features of OData, such as the use of complex types in services; only the necessary setup steps are shown. If necessary, a specialist will be able to build the much more complex services required in practical tasks by following the same scheme.

It should also be noted that the described solution can be used to transfer data from SAC to BW with subsequent processing in ABAP. This may be required when the capabilities of the SAC scripting language are insufficient for complex calculations. The solution is also useful if the SAC Analytic Application is used as a replacement for Web Dynpro applications, where it was possible to initiate the execution of actions on the application server using standard ABAP tools.

Prerequisites

CORS Settings

The described scheme only works when using a live connection between SAC and BW. When creating this connection, the BASIS team configures the CORS settings on the AS ABAP side.

Authorizations

It is assumed that the developer of the OData service has the standard set of ABAP development authorizations, including creating and editing ABAP classes and using the debugger. Additionally, the following authorizations will be required:

– transaction SEGW

– transactions /IWFND/***

– transactions /IWBEP/***

These authorizations can be described as follows:

– SEGW development authorization: authorization object /IWBEP/SB. In our example, a project (i.e. an OData service) with the technical name ZODATAV4 will be created and placed in the local ABAP package $TMP. The corresponding authorization should look as follows:

– authorizations to start the OData service. In our example, the service group ZODATAV4_GROUP will be created. For this, you will first need the following authorizations on the S_USER_STA object:

Here the choice is made from the list in the AUTHOBJNAM field. Please note that you must first select the object type in the “Type” drop-down list.

Then the required values can be selected from the list, or the “Full Authorization” button is pressed. After confirming the selection, the AUTHOBJTYP and AUTHPGMID fields are filled in automatically according to the value selected in the “Type” drop-down list.

After that, restart transaction PFCG and add the following S_START authorization in the same way as before:

S_USER_STA authorizations are required only while editing the S_START object. Once that is done, S_USER_STA can be deactivated or removed from the role profile.

Demo Scenario

  1. OData Service creation and configuration in BW. For the service, an action will be defined that “is able to” launch a BW process chain.
  2. Creation of SAC Analytic Application.
  3. An OData connection will be placed in the Application to communicate with our Service.
  4. A button will be placed in the Application. An onClick script for this button will call the action in the OData Service.

OData Service creation

Press the “Create Project” button in transaction SEGW, then enter the project’s technical name and description. Make sure that “OData 4.0 Service” is selected as the project type, and select a development package if necessary. Here and below, all example objects are created locally in the $TMP package:

You can ignore the following pop-up message:

Select the newly created Project and press the “Generate Runtime Objects” button:

All fields in this window are filled in automatically; you don’t have to change them.

The ZCL_ZODATAV4_MPC and ZCL_ZODATAV4_DPC classes contain standard code generated by the system. This code should not be changed; all custom logic needs to be implemented in the descendant classes whose names end in _EXT.

The Model Provider Class declares the actions available in the OData Service. It also describes the returns produced by the actions, and, if necessary, input parameters can be declared for them. For our example, we will create a StartProcessChain action with a ChainTechName input parameter. The technical name of an SAP BW process chain will be passed to this parameter when the action is called. The action will have a string-type return containing the ID of the chain run started during the action execution. This is an artificial example, intended only to demonstrate that the input parameter and the return of the action provide a bi-directional data exchange between SAC and BW.

To implement the described logic, we will create a new method START_CHAIN in the Model Provider Class. Also, the standard method /IWBEP/IF_V4_MP_BASIC~DEFINE should be redefined. Our action will be called directly from the SAC Analytic Application without instantiating any helper types. Therefore, it must be unbound, that is, described as an Action Import.

Let’s create the START_CHAIN method as follows:

I added code comments to clarify the logic:

METHOD start_chain.
    DATA:
      lo_action        TYPE REF TO /iwbep/if_v4_med_action,
      lo_action_import TYPE REF TO /iwbep/if_v4_med_action_imp,
      lo_parameter     TYPE REF TO /iwbep/if_v4_med_act_param,
      lo_return        TYPE REF TO /iwbep/if_v4_med_act_return.

    "Declare an Action Import to be implemented in this OData Service
    lo_action = io_model->create_action( 'STARTCHAIN' ).
    lo_action->set_edm_name( 'StartProcessChain' ).
    lo_action_import = lo_action->create_action_import( iv_action_import_name = 'STARTCHAIN' ).
    lo_action_import->set_edm_name( 'StartProcessChain' ).

    "Declare a local parameter type from standard OData types
    io_model->create_primitive_type( iv_primitive_type_name = 'STRING' )->set_edm_type( /iwbep/if_v4_med_element=>gcs_edm_data_types-string ).
    "Define an import parameter of this type for action described above
    lo_parameter = lo_action->create_parameter( 'CHAIN' ).
    lo_parameter->set_edm_name( 'ChainTechName' ).
    lo_parameter->set_primitive_type( 'STRING' ).

    "Declare that action should have a return and of which type it will be
    lo_return = lo_action->create_return( ).
    lo_return->set_primitive_type( 'STRING' ).
  ENDMETHOD.

The redefined DEFINE method first calls the ancestor class method with the same input parameters, and then transfers control to the START_CHAIN method, which is specific to our task:

METHOD /iwbep/if_v4_mp_basic~define.
    TRY.
        CALL METHOD super->/iwbep/if_v4_mp_basic~define
          EXPORTING
            io_model      = io_model
            io_model_info = io_model_info.
      CATCH /iwbep/cx_gateway .
    ENDTRY.

    start_chain( io_model ).

  ENDMETHOD.

The Data Provider Class implements the action’s return calculation logic. It is enough to redefine the method /IWBEP/IF_V4_DP_ADVANCED~EXECUTE_ACTION in the ZCL_ZODATAV4_DPC_EXT class. In our example, a standard function module is used to start the chain in BW:

METHOD /iwbep/if_v4_dp_advanced~execute_action.
    TYPES: BEGIN OF s_chain_action,
             chain TYPE string,
           END OF s_chain_action.

    DATA: ls_chain_params TYPE s_chain_action,
          l_chain         TYPE rspc_chain,
          l_logid         TYPE rspc_logid,
          l_response      TYPE string.

    DATA:
      lv_action_name        TYPE /iwbep/if_v4_med_element=>ty_e_med_internal_name,
      ls_done               TYPE /iwbep/if_v4_requ_adv_action=>ty_s_todo_process_list,
      ls_todo               TYPE /iwbep/if_v4_requ_adv_action=>ty_s_todo_list.


    "Read todo list of requested operations
    io_request->get_todos( IMPORTING es_todo_list = ls_todo ).

    "Check that action import 'STARTCHAIN' was requested
    IF ls_todo-process-action_import = abap_true.

      io_request->get_action_import( IMPORTING ev_action_import_name =   lv_action_name  ). 

      IF lv_action_name = 'STARTCHAIN'.

        io_request->get_parameter_data(
          IMPORTING
            es_parameter_data =  ls_chain_params "Structure with the action parameters
        ).

        ls_done-parameter_data = abap_true. "Parameters reading was executed

        l_chain = ls_chain_params-chain. "This is necessary for data types compatibility
        CALL FUNCTION 'RSPC_API_CHAIN_START'
          EXPORTING
            i_chain     = l_chain
            i_dont_wait = 'X'
            i_gui       = 'N'
          IMPORTING
            e_logid     = l_logid.

        l_response = l_logid.
        io_response->set_busi_data( ia_busi_data = l_response ). "logid transferred as response of the action
        ls_done-action_import = abap_true. "Action import was executed
        io_response->set_is_done( ls_done ). "We return a list of executed operations

      ENDIF.
    ENDIF.
  ENDMETHOD.

OData Service publishing

Service ZODATAV4 must be published in the system. The following steps are necessary:

  1. Create a Service Group in transaction /IWBEP/V4_ADMIN, if no previously created group is relevant for our purposes.
  2. Assign the OData Service to the Service Group.
  3. Publish the Service Group, i.e. make the services assigned to the Group available to other users with the necessary authorizations (authorization object S_START). For this, transaction /IWFND/V4_ADMIN is used.

Start transaction /IWBEP/V4_ADMIN and press the “Register Group” button. Enter the Service Group’s technical name and description:

After creating a Group, the system will prompt you to immediately assign some Service to this Group:

You can choose “Assign”. If the relevant Group was already created earlier, or if you want to assign a Service to a Group later, you can do so using the “Assign Service” button. Here you simply select a Service and a Group from the lists:

The Service Group with the services assigned to it will be listed in the transaction:

If the Service Group has been newly created, then it must be published. To do this, go to transaction /IWFND/V4_ADMIN and click the “Publish Service Groups” button:

Since we are talking about objects in the current system, select LOCAL in the System Alias field and then click “Get Service Groups”. A list of all unpublished groups will appear:

Highlight the desired group and click “Publish Service Groups”. Then return to the previous window. Now it shows the published group and the services assigned to it:

OData Service testing

The published service can now be tested. Testing allows you, if necessary, to fix issues in the classes and methods described above by setting break-points and switching to the ABAP debugger. Start testing by highlighting our service, as shown in the previous screenshot, and selecting “Service Test -> Gateway Client”:

First of all, please, note the path displayed in the Request URI field:

The starting part of this path up to the $ sign is the End-Point URL, which will be required later in the SAC settings. In our example, the End-Point URL is: /sap/opu/odata4/sap/zodatav4_group/default/sap/zodatav4/0001/

To start testing, you must first switch the HTTP method to POST. Second, in the Request URI, after the End-Point URL, you need to specify the name of the action to be called. Third, you need to pass the value of the input parameter to the action using JSON syntax. To test the service, a dummy chain ZPC_ODATA4_TEST was prepared; it simply calls a program that contains no code. The launch of the test therefore ends up looking like this:
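Putting these three points together, the request entered in the Gateway Client can be sketched as follows (reconstructed from the End-Point URL, action name, and parameter given above; the exact headers in your system may differ):

```
POST /sap/opu/odata4/sap/zodatav4_group/default/sap/zodatav4/0001/StartProcessChain
Content-Type: application/json

{
  "ChainTechName": "ZPC_ODATA4_TEST"
}
```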

Now testing can be started by pressing the “Execute” button. A successful result looks like this:

In transaction RSPC we can verify that the service actually started the execution of the chain and returned the run ID:

OData Service utilization in SAC Analytic Application

Now we can use our OData Service in the SAC Analytic Application. For the demonstration, the simplest possible application was prepared. It contains three components: an OData Service named ZODATAV4, a button PC_START_BUTTON (labeled “Start Process Chain”), and a text field CHAIN_RUN_ID, designed to display the value of the action’s return:

In the ZODATAV4 component settings, you must specify the name of the Live Connection to the BW system where the service was created (see above for the required CORS settings for this system). You also need to specify the End-Point URL; the value corresponding to our example was given earlier:

To check the connection to the Service, click the “Refresh” button to the right of the Metadata section. If SSO between SAC and BW is not configured, an additional window will appear at this moment asking for your BW credentials. If the connection to the Service is successful, the Metadata section will display the OData version used and a list of actions provided by the Service:

Now the action can be called with the onClick script for the PC_START_BUTTON button:

The first line of the script shows how to call an action of the ZODATAV4 service connected to SAC, how to specify the name of this action, and how to pass the technical name of the chain to the action’s input parameter. Then, if no errors occurred while calling the action, the return value of the action is stored in the local variable logid, which in turn is displayed in the CHAIN_RUN_ID text field.

So, before the button is clicked, the application window looks like this:

After clicking the button, the chain run ID appears:

As before, we can check the correctness of this number in transaction RSPC.

Enable hidden Fields of SAP BW DataSource https://www.erpqna.com/enable-hidden-fields-of-sap-bw-datasource/?utm_source=rss&utm_medium=rss&utm_campaign=enable-hidden-fields-of-sap-bw-datasource Thu, 26 Jan 2023 04:35:01 +0000

Fields hidden by SAP in Standard BW Data Sources

You may often have observed that a few fields are present in the extract structure of an SAP BW DataSource but are not available in RSA6 or in the DataSource replicated in BW.

This blog covers all the steps needed to enable fields present in the extract structure of the DataSource, make them visible in RSA3, and fetch the data in BW.

Let’s take an Example.

Here we take the example of the DataSource 0COMP_CODE_TEXT. The number of fields in the extract structure is 7.

The number of fields in the RSA3 output of the DataSource is 2:

Fields present in RSA6 in the source system for the data source 0COMP_CODE_TEXT

Data source in SAP BW.

The Reason for this:

For the fields that are not visible in the DataSource or RSA3, the value of the field SELECTION in table ROOSFIELD is maintained as ‘A’.

If a request for a DataSource is scheduled in the Business Information Warehouse, selection conditions are specified on certain fields. Whether a selection on a field is possible or required in BW is a property established in the DataSource in the source system.

In addition, the visibility of the field in BW can be set.

Definition of the individual values:

‘A’: Field is hidden in OLTP and BW; property cannot be changed by the customer.

‘M’: The Data Source requires a selection across this field before it is able to extract data (required field for the generation of a request); property cannot be changed by the customer.

‘X’: The Data Source can select across this field. The customer can change selections and visibility (the field is currently visible and selectable, compare with ‘P’, ‘3’)

‘1’: Pure selection field for the Data Source. The customer can change the selection, but not the visibility (the field is currently selectable, compare with ‘2’).

‘2’: Pure selection field for the Data Source. The customer can change the selection, but not the visibility (the field is currently not selectable, compare with ‘1’).

‘3’: The Data Source can select across this field. The customer can change selection and visibility (the field is currently not visible and not selectable, compare with ‘P’, ‘X’)

‘4’: The Data Source cannot select across this field. The customer can change visibility (the field is currently not visible, compare with ‘ ‘)

Summary:

Value  Attribute Description
A      Hidden by SAP in OLTP and BW
M      Selection Required, Visible
' '    No Selection Possible, Visibility Set
X      Selection Adjustable, Visibility Set
P      Selection Adjustable, Visibility Set
1      Pure Selection Field, Selection Set
2      Pure Selection Field, Selection Set
3      Selection Adjustable, Visibility Adjustable
4      No Selection Possible, Visibility Adjustable

Here are Steps to make the field visible in RSA3 and to populate data into BI.

Let’s consider the field DATETO (Valid-to date), which is to be displayed in RSA3 and in the DataSource so that the data can be extracted into BI.

STEPS to be followed:

1. Create a test program through transaction SE38 and write the below code in the program. Activate it and then execute it.
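The code referred to above appears in the original post only as a screenshot. As a hedged sketch, a program of this kind typically performs a direct update of the SELECTION value in table ROOSFIELD; the report name ZUNHIDE_DS_FIELD is hypothetical, while the DataSource and field names follow the example in this blog:

```abap
REPORT zunhide_ds_field.

* Sketch only (report name is an assumption): changes the SELECTION
* property of field DATETO of DataSource 0COMP_CODE_TEXT in table
* ROOSFIELD from 'A' (hidden by SAP) to 'P' (visible, selection
* adjustable).
DATA ls_roosfield TYPE roosfield.

SELECT SINGLE * FROM roosfield INTO ls_roosfield
  WHERE oltpsource = '0COMP_CODE_TEXT'
    AND objvers    = 'A'
    AND field      = 'DATETO'.

IF sy-subrc = 0 AND ls_roosfield-selection = 'A'.
  ls_roosfield-selection = 'P'.
  UPDATE roosfield FROM ls_roosfield.
  COMMIT WORK.
  WRITE: / 'SELECTION changed from A to P for field DATETO.'.
ELSE.
  WRITE: / 'Field not found or not hidden - nothing changed.'.
ENDIF.
```

Since this writes directly to an SAP-delivered table, run it in the development system and transport the regenerated DataSource through the landscape as usual.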

2. After execution, you will see that the value of the field SELECTION has been changed from ‘A’ to ‘P’.

3. Go to RSA6, find the DataSource 0COMP_CODE_TEXT and select Change.

4. Put a check in the Selection checkbox if you want the field to be part of the selection in RSA3 and in the InfoPackage, and generate the DataSource.

5. Go to RSA3 and you will find the new field DATETO in your output.

Note: If there were data, it would automatically be populated; in this case there isn’t any data in the new field.

6. Go to the BI system, find the DataSource and replicate it.

7. After replication, the new field will appear in the DataSource.

To summarize: following the above steps, you can un-hide/enable the fields present in the extract structure of an SAP BW DataSource.

Conversion to SAP Data Warehouse Cloud: Conversion Paths and Cloud Transformation Steps https://www.erpqna.com/conversion-to-sap-data-warehouse-cloud-conversion-paths-and-cloud-transformation-steps/?utm_source=rss&utm_medium=rss&utm_campaign=conversion-to-sap-data-warehouse-cloud-conversion-paths-and-cloud-transformation-steps Tue, 26 Apr 2022 10:09:21 +0000

Abstract

Innovate your IT landscape with SAP Data Warehouse Cloud, which is SAP’s strategic target solution for all data warehousing use cases, in line with SAP’s data-to-value portfolio strategy. This blog post gives SAP BW and SAP BW/4HANA customers an overview of how existing on-premises investments can be converted to the cloud. More importantly, it provides a Cloud Transformation Checklist that can be used when preparing the project and during the actual conversion to SAP Data Warehouse Cloud. The checklist contains the various conversion paths (Shell and Remote Conversion) and their individual steps in different tabs, including the corresponding SAP Notes. It is in no way intended to replace a dedicated project plan, but rather to support effort estimation. For initial project planning and rough structuring, it makes sense to think through all project parts one by one. The checklist is available for download as an attachment to SAP Note 3141688. Tip: Click on the figures of the individual project phases below for better readability.

Overview

SAP Data Warehouse Cloud is SAP’s strategic public cloud product for data warehousing. It improves agility, accelerates time-to-value, and unlocks innovation with unprecedented business user enablement, a new form of openness, and embedded partner offerings. Looking at conversion, SAP offers the possibility of a tool-supported move of SAP BW or SAP BW/4HANA investments to SAP Data Warehouse Cloud, SAP BW bridge (see figure 1). SAP BW bridge is a feature of SAP Data Warehouse Cloud that provides a path to the public cloud for SAP BW and SAP BW/4HANA customers. It enables connectivity and business content via proven SAP BW-based data integration (extractors) from SAP source systems. In addition, it provides the staging layers of SAP BW to manage data loads (including deltas) with partitioning, monitoring, and error handling. This allows customers to leverage their existing SAP BW skills and protect their SAP BW investments in the public cloud.

Figure 1: SAP BW to SAP Data Warehouse Cloud

In any conversion, manual interaction and re-design are required (see figure 2). The degree of these manual tasks varies from customer to customer and depends on the configuration and state of the SAP BW or SAP BW/4HANA system. SAP has developed tools to automate this renovation where possible and feasible, but they are not built or intended to fix badly designed models or clean up neglected systems. The tools transfer objects based on XML files via RFC from SAP BW releases 7.30 to 7.50 (on SAP HANA or any DB) and SAP BW/4HANA 2021 to SAP Data Warehouse Cloud, SAP BW bridge. Each transfer is based on the selection of a specific scope, i.e., a set of SAP BW objects, e.g., a data flow, that can be transferred together in a consistent way. Please note that SAP BW 7.40 and lower are already out of maintenance.

Figure 2: Conversion Paths

SAP provides a Pre-Check Tool (see SAP Note 2575059) that identifies important steps customers need to take to ensure their system is compatible with the conversion process. The tool determines which objects are compatible with SAP Data Warehouse Cloud, SAP BW bridge and which objects are not available in SAP Data Warehouse Cloud, SAP BW bridge. In addition, it checks which objects can be automatically converted, deleted, or need manual adjustments (see figure 2).

Figure 3: Conversion Overview

Regardless of the type of conversion (see figure 3), the Simplification List (currently in preparation, see SAP Note 3154420) and the Conversion Guide are suitable as a starting point. The Simplification List describes in detail, on a functional level, what happens to individual data models and objects in SAP Data Warehouse Cloud, SAP BW bridge. The individual Notes explain what needs to be done in the conversion process. The Conversion Guide is the end-to-end documentation of a conversion to SAP Data Warehouse Cloud, SAP BW bridge. Note: There is an individual Conversion Guide for the Shell Conversion (see Link) and an individual Conversion Guide for the Remote Conversion (currently in preparation; Remote Conversion is planned for release 2205).

Shell Conversion to SAP Data Warehouse Cloud

Shell Conversion: General Sequence

Figure 4: Shell Conversion – General Sequence

Shell Conversion (see figure 4) is offered by SAP to convert an SAP BW or SAP BW/4HANA system into SAP Data Warehouse Cloud. It does not include the transfer and synchronization of existing data sets. Instead, customers can choose to load data from original sources, load data from the sender SAP BW or SAP BW/4HANA system, or simply ignore historical data and start fresh. For SAP BW systems on releases from 7.30 to 7.50 (running on SAP HANA or Any-DB) and SAP BW/4HANA 2021, a shell conversion can be performed.

Shell Conversion: (T1) System Requirements

Shell Conversion: (T2) Pre-Checks

Shell Conversion: (T3) Custom Code Check

Shell Conversion: (T4) System Provisioning

Shell Conversion: (T5) Remote Conversion

Shell Conversion: (T6) Post Conversion Tasks

Shell Conversion: (T7) SAP Data Warehouse Cloud Core Tasks

Shell Conversion: (T8) Go-Live

Remote Conversion to SAP Data Warehouse Cloud

Remote Conversion: General Sequence

Figure 5: Remote Conversion – General Sequence

Remote Conversion (see figure 5) is offered by SAP to convert an SAP BW or SAP BW/4HANA system into SAP Data Warehouse Cloud. It enables customers to move whole data flows, or to transfer only selected data flows including data. Customers can decide whether they want to build a clean system, leaving old and unused objects behind. For SAP BW systems on releases from 7.30 to 7.50 (running on SAP HANA or any DB) and for SAP BW/4HANA 2021, a remote conversion is planned. Attention: The Remote Conversion is currently not available; it is planned for release with SAP BW bridge 2205.

Remote Conversion: (T1) System Requirements

Remote Conversion: (T2) Pre-Checks

Remote Conversion: (T3) Custom Code Check

Remote Conversion: (T4) System Provisioning

Remote Conversion: (T5) Remote Conversion

Remote Conversion: (T6) Post Conversion Tasks

Remote Conversion: (T7) SAP Data Warehouse Cloud Core Tasks

Remote Conversion: (T8) Go-Live

SAP Data Warehouse Cloud, SAP BW bridge: Overview and Technical Deep Dive https://www.erpqna.com/sap-data-warehouse-cloud-sap-bw-bridge-overview-and-technical-deep-dive/?utm_source=rss&utm_medium=rss&utm_campaign=sap-data-warehouse-cloud-sap-bw-bridge-overview-and-technical-deep-dive Fri, 19 Nov 2021 09:49:38 +0000

Abstract

This blog post is about a strategic feature of SAP Data Warehouse Cloud: the SAP BW bridge. Through RISE with SAP, SAP offers its customers the opportunity to move to the cloud with a business transformation as a service (BTaaS). For BW customers, this means moving from SAP BW to SAP Data Warehouse Cloud. SAP positions SAP Data Warehouse Cloud as the strategic target solution for data warehousing in the public cloud, with SAP BW/4HANA Private Cloud Edition (PCE) as an option to start the transition. In this context, SAP BW bridge offers customers the opportunity to implement their new data warehousing use cases directly in the cloud environment while protecting and retaining their existing on-premises investments. The blog provides an overview of SAP BW bridge, explains how to move from an existing SAP BW system to the cloud, and gives insights into a complete end-to-end greenfield scenario, including a system demo in SAP Data Warehouse Cloud.

Overview

SAP Data Warehouse Cloud is SAP’s offering for all data warehousing use cases. This SaaS (Software-as-a-Service) is based on SAP HANA Cloud. It combines data and analytics in a cloud solution that offers data integration, database, data warehouse, and analytics services, enabling customers to realize the full potential of a data-driven business. To improve the integration with SAP ERP systems, the SAP BW bridge enables ABAP-based data extraction and staging capabilities within SAP Data Warehouse Cloud (see figure 1).

Figure 1: SAP Data Warehouse Cloud, SAP BW bridge. Overview of data integration

In the future, a tool-based transfer of existing SAP BW and SAP BW/4HANA staging scenarios will be enabled. The SAP BW bridge will then allow a seamless transfer of existing ETL processes into a dedicated SAP BW bridge Space in SAP Data Warehouse Cloud. Here, the extensive functions of ODP extractors and ABAP code within the SAP Business Technology Platform (SAP BTP) ABAP environment can be adopted in SAP Data Warehouse Cloud using the cross-space sharing approach.

SAP BW to SAP Data Warehouse Cloud

Within SAP BW bridge, customers are able to implement data extraction and staging scenarios up to the CompositeProvider level. In other words, it is not possible to create new queries within the SAP BW bridge environment. Accordingly, within SAP BW bridge there is no support for the OLAP engine or for functionality dependent on OLAP (e.g., analysis authorizations, query as InfoProvider, query execution). Front-end tools have no possibility to access SAP BW bridge artefacts directly.

Figure 2: Future Modeling Capabilities in SAP Data Warehouse Cloud

The SAP BW bridge environment is primarily intended for ODP-based source systems, which means that connection scenarios are only available via Operational Data Provisioning (ODP). Non-SAP sources are connected directly to SAP Data Warehouse Cloud (see figure 2). Objects from the source SAP BW system(s), including the SAP BW queries, are converted to the SAP BW bridge environment using conversion tools.

To take full advantage of SAP’s data warehousing offerings, customers today need to deploy both SAP BW/4HANA and SAP Data Warehouse Cloud. In the future, the SAP BW bridge will enable customers to merge these offerings into a single data warehouse solution in the cloud. With SAP BW bridge, SAP addresses BW customers that are looking for a way forward from SAP BW NetWeaver and SAP BW/4HANA (see figure 3).

Figure 3: Future Modeling Capabilities in SAP Data Warehouse Cloud

In 2022, SAP BW customers will have the option to convert their existing on-premises investments to the cloud via remote and shell conversion. First, the SAP BW bridge conversion will be offered for SAP BW 7.4 and SAP BW 7.5 systems (initially as shell conversion, followed by remote conversion); subsequently, the conversion for SAP BW 7.3 systems (shell and remote conversion) will become available. Additionally, the conversion will be available for SAP BW/4HANA 2021 (shell and remote conversion). Regarding release coverage, please consider the details in the roadmap. Customers with lower SAP BW releases will need to upgrade their system(s) first and then convert the required scope to SAP BW bridge in SAP Data Warehouse Cloud. Please note that SAP BW 7.40 and lower are already out of maintenance (see figure 4).

Figure 4: SAP BW bridge. SAP BW to SAP Data Warehouse Cloud

The SAP BW bridge artefacts from the SAP Business Technology Platform (SAP BTP) ABAP environment are available via remote tables using SAP HANA Cloud Smart Data Access (SDA) in a dedicated SAP BW bridge Space in SAP Data Warehouse Cloud. The remote tables in the SAP BW bridge Space can then be used in the regular SAP Data Warehouse Cloud Spaces via the cross-space sharing approach.

Add-ons are not supported in SAP BW bridge; therefore, planning is not available within SAP BW bridge. In this regard, SAP positions SAP Analytics Cloud Planning as the planning application and, in the future (earliest end of 2022), SAP Data Warehouse Cloud as the planning foundation for the data. Application development is not supported in SAP BW bridge either. Any app development should be done via Business Technology Platform app building on SAP HANA, for which customers need to license and use the stand-alone version of the SAP Business Technology Platform ABAP environment.

Target Scenarios for SAP BW bridge

Greenfield with SAP Legacy Sources

Customers building a new data warehouse in the cloud with SAP legacy systems as data sources that will only be migrated to a cloud-based system in the future, expecting the same level of data integration and convenience functions as known from SAP BW/4HANA.

Conversion with SAP BW NetWeaver & SAP BW/4HANA

Customers with an SAP BW (SAP BW 7.3 and upwards, any DB) moving their data warehouse to the cloud, expecting to retain their data and their legacy data flows, while renovating their data consumption layer with SAP Analytics Cloud or 3rd-party clients on top, and expanding their data footprint to cloud and non-SAP sources.

Hybrid with SAP BW/4HANA

Customers with an on-premise SAP BW/4HANA looking for a path into the cloud for their data warehouse workload. They start with hybrid scenarios for consumption to combine SAP BW/4HANA data and SAP Data Warehouse Cloud data, then move more and more of the SAP BW/4HANA data flows to the cloud and successively transition them to modern SAP Data Warehouse Cloud data ingestion approaches.

End-To-End Greenfield Scenario

As an example, the greenfield approach is demonstrated in the following use case (see figure 5), in which the customer operates SAP Data Warehouse Cloud together with SAP BW bridge to connect SAP on-premise systems. This provides proven SAP BW-based data integration technology for ABAP-based SAP systems and enables the rich feature set of extractors in SAP Data Warehouse Cloud.

Figure 5: Architecture of End-To-End Greenfield Scenario

The SAP BW bridge data and processes are administered and managed via an SAP UI5 environment called the SAP BW bridge Cockpit. The implementation of new objects is done within Eclipse via the SAP BW Modeling Tools and ABAP Development Tools. Within SAP Data Warehouse Cloud, the SAP BW bridge artefacts are available via remote tables and can be used via the cross-space sharing approach. An SAP GUI is not required to access the SAP BW bridge environment.

SAP BW bridge: Development-Environment

The following data flow (see figure 6) shows that the Eclipse environment with the SAP BW Modeling Tools and ABAP Development Tools is used. The well-known SAP flight data model is the foundation for this use case.

Figure 6: Eclipse Environment for SAP BW bridge

In this scenario, the tables “Flight” for transaction data and “Airline carrier” for master data are considered. The left branch of the data model handles transaction data, which is loaded from an SAP ECC system via an S-API extractor. The right branch handles master data, which is loaded from a CDS view. As you can see, there is still the subdivision into master data texts and master data attributes. Within the data flow, transformations and data transfer processes are used to load data into the advanced DataStore object and the master-data-bearing InfoObject. Within the CompositeProvider, the data is then combined with a join.

The SAP BW bridge component is primarily intended for ODP-based source systems. In this regard, customers have the option to create source systems in the context of ODP. This means that ODP-BW, ODP-HANA, ODP-SAP, ODP-CDS, and ODP-SLT based source systems can be connected. This offers the additional benefits of the Operational Data Provisioning framework, such as "extract once, deploy many", data compression, and more.

Figure 7: SAP UI5 Environment for SAP BW bridge

Dedicated process chains are created for both branches in order to load data into the InfoProviders. The process chains are modelled in the SAP UI5 environment for SAP BW bridge called SAP BW bridge Cockpit (see figure 7).

SAP BW bridge Space in SAP Data Warehouse Cloud

If a customer wants to use SAP BW bridge, a provisioning process is triggered by SAP. This process generates a dedicated Space for SAP BW bridge in the SAP Data Warehouse Cloud tenant itself. This Space has the specific type “SAP BW bridge” (see figure 8).

Figure 8: Space Type: SAP BW bridge

In the generated SAP BW bridge Space, a connection (see figure 9a) to the SAP BW bridge environment within the SAP Business Technology Platform will be generated that contains an SAP HANA Cloud Smart Data Access endpoint and an HTTP ABAP endpoint. Only one SAP BW bridge system can be connected to an SAP Data Warehouse Cloud tenant.

Figure 9a: Connection to SAP BW bridge

The SAP HANA Cloud Smart Data Access endpoint is used to connect to the external schema of SAP BW bridge's SAP HANA Cloud database, which contains the read-only views on the data tables, for moving the data over. The HTTP ABAP endpoint is used to call monitor UIs via single sign-on with a named user and to retrieve metadata, e.g. for a value help or an import of SAP BW bridge objects (see figure 9b).

Figure 9b: Connection to SAP BW bridge

The new connection type cannot be edited by a user in the SAP BW bridge Space, as this connection will be generated by the SAP BW bridge provisioning process automatically. The credentials for the SAP HANA Cloud Smart Data Access connection are provided when the connection is generated. The data tables of the Business Technology Platform environment for SAP BW bridge are exposed as remote tables in the SAP BW bridge Space.

Important: The SAP BW Service Key should be copied, as this needs to be entered when an SAP BW bridge project is set up in Eclipse.

Figure 9c: Connection to SAP BW bridge

Inside the SAP BW bridge Space, the Create-Button for new connections is disabled, as this Space is restricted to the SAP BW bridge only. The Real-Time Replication Status is inactive for this connection, as it only allows Remote Tables (see figure 9c).

SAP BW bridge Space: Data Builder for importing remote tables

The main purpose of the Data Builder in the SAP BW bridge Space (see figure 10) is to import and share the remote tables with other Spaces, using the cross-space sharing approach of SAP Data Warehouse Cloud. Unlike in the regular Spaces of SAP Data Warehouse Cloud, it is not possible to create tables, graphical views, SQL views, entity relationship models, or data flows within the Data Builder of an SAP BW bridge Space. SAP Data Warehouse Cloud artefacts using the SAP BW bridge remote tables can only be created in other Spaces, based on the shared remote tables.

Figure 10: SAP BW bridge Space: Data Builder

Using the Import button in the Data Builder and then choosing “Import Remote Tables”, the tables of the SAP BW bridge InfoProviders can be accessed via the underlying connection (see figure 11).

Figure 11: Import Remote Tables

In the “Import Remote Tables” wizard, only one connection is available: the connection to the SAP BW bridge system (see figure 12). By selecting the defined connection, the connection is validated. If the validation is successful, the next step becomes available.

Figure 12: Connection to SAP BW bridge

The wizard for SAP BW bridge InfoProviders contains the following data tables, which are then available as remote tables in the SAP BW bridge Space in SAP Data Warehouse Cloud itself.

  • Advanced DataStore Object (Reporting View)
  • Composite Provider
  • Master Data Tables
    • Attributes
    • Texts

The data tables are displayed by InfoArea (see figure 13). Multi-selection of tables is possible to support a mass takeover. It is also possible to select an entire InfoArea, which selects all the tables underneath its objects; afterwards, individual tables can be deselected (the InfoArea is then deselected as well).

Figure 13: Select SAP BW bridge InfoProvider

The last step displays the list of objects that are ready for import. There is one more section for remote tables that are already in the repository of SAP Data Warehouse Cloud. The user can also change the technical name and the business name of the appropriate object. Via “Import and Deploy”, the remote tables are generated with the semantic usage Relational Dataset. The master data text tables can be generated as either Dimension or Text (see figure 14).

Figure 14: Import and Deploy Remote Tables

Next, the remote tables located in the SAP BW bridge Space in SAP Data Warehouse Cloud need to be shared with the regular SAP Data Warehouse Cloud Spaces. As outlined before, the main functionality here is to import and share the remote tables with other Spaces. SAP Data Warehouse Cloud artefacts using the remote tables can only be created in other Spaces, based on the shared remote tables.

In my example, I have created a standard SAP Data Warehouse Cloud Space “DENIZBWBRIDGE”, which consumes the artefacts, and allows further implementation within the SAP Data Warehouse Cloud (see figure 15).

Figure 15: Share Remote Tables

SAP Data Warehouse Cloud Space: Consuming Shared SAP BW bridge Artefacts

Within standard SAP Data Warehouse Cloud Spaces, the shared SAP BW bridge Remote Tables can be accessed, and other SAP Data Warehouse Cloud functionality can be applied accordingly (see figure 16).

Figure 16: Graphical View in SAP Data Warehouse Cloud

The following SQL code of the previous graphical view (see figure 17) shows that the view within the standard SAP Data Warehouse Cloud Space accesses the remote tables of the SAP BW bridge Space in SAP Data Warehouse Cloud.

SELECT *
FROM (("BWBRIDGEDEMO.ZDOFLIGHTREPORTING" AS "ZDOFLIGHTREPORTING"
INNER JOIN "BWBRIDGEDEMO.ZDO_AIRLATTRIBUTES" AS "ZDO_AIRLATTRIBUTES" 
ON "ZDOFLIGHTREPORTING"."CARRID" = "ZDO_AIRLATTRIBUTES"."ZDO_AIRL") 
INNER JOIN "BWBRIDGEDEMO.ZDO_AIRLTEXT" AS "ZDO_AIRLTEXT" 
ON "ZDO_AIRLATTRIBUTES"."ZDO_AIRL" = "ZDO_AIRLTEXT"."ZDO_AIRL");

Figure 17: SQL-Code of Graphical View

SAP Analytics Cloud: Story based on SAP BW bridge data

Finally, based on the analytical dataset in SAP Data Warehouse Cloud, which in this case processes SAP BW bridge data, the data can be visualised via SAP Analytics Cloud (see figure 18) or any other third-party front-end solution.

Figure 18: Story in SAP Analytics Cloud

As with other data models in SAP Data Warehouse Cloud, you can use the SAP Data Warehouse Cloud live data connection of SAP Analytics Cloud. However, SAP Analytics Cloud generally has certain limitations with the SAP Data Warehouse Cloud live data connection; detailed information is available in SAP Note 2832606.

SAP BW bridge Space: Data Integration Monitor

Figure 19: Data Integration Monitor for SAP BW bridge Space

Within the SAP BW bridge Space, only remote tables are available. For the Data Integration Monitor of the SAP BW bridge Space this means that the View Persistency Monitor and the Data Flow Monitor are not visible (see figure 19). The available functionalities here are the Remote Table Monitor and the Remote Query Monitor. In addition, access to the SAP BW bridge Cockpit is possible via the Data Integration Monitor of the SAP BW bridge Space.

Rating: 5 / 5 (1 votes)

The post SAP Data Warehouse Cloud, SAP BW bridge: Overview and Technical Deep Dive appeared first on ERP Q&A.

Using SAP BW authorization relevant InfoObject in SAP Profitability and Performance Management reporting https://www.erpqna.com/using-sap-bw-authorization-relevant-infoobject-in-sap-profitability-and-performance-management-reporting/?utm_source=rss&utm_medium=rss&utm_campaign=using-sap-bw-authorization-relevant-infoobject-in-sap-profitability-and-performance-management-reporting Sat, 17 Apr 2021 05:31:46 +0000 https://www.erpqna.com/?p=46497 Let me discuss and explain to you the infoObject’s analytical authorizations. Let’ begin. What is analytical authorization? To be quick in explanation let’s have an example which will visualize this thing. You are chief of controlling department of company X. This company is parent company of 3 other child companies: A, B and C. In […]

The post Using SAP BW authorization relevant InfoObject in SAP Profitability and Performance Management reporting appeared first on ERP Q&A.

Let me discuss and explain the InfoObject's analytical authorizations. Let's begin.

What is an analytical authorization? To explain quickly, let's use an example that visualizes the concept.

You are the chief of the controlling department of company X. This company is the parent company of three child companies: A, B, and C. Each of those companies also has a chief of controlling, but he or she is responsible only for his or her own company. You (as the higher position in the company hierarchy) would like to see all information, e.g. the net result of company X; however, you don't want the child companies to see each other's results. The controlling department of company A should only see data from company A, and so on, so the net result report should be shown for every chief as on the screen below:

Fig. 1 Example of results, which will be displayed for different companies.

So do you need to create four reports, one for each company, each restricted by company code for that specific company? No, there is a better solution: you can use an authorization-relevant InfoObject and create only one report; the SAP BW backend will restrict the data according to your authorizations, and everything happens in one report only. In this blog I will show you how to achieve this and how to combine this feature with the SAP Profitability and Performance Management Query function.

So we need to create a company code characteristic for this purpose:

1. Company code characteristic:

Fig. 2 Authorization-Relevant checkbox on HANA Studio creation view.

With given master data:

Fig. 3 HANA Studio Master data maintenance screen.

Now we can log into BW client and go to Transaction RSECADMIN.

Fig. 4 RSECADMIN screen in SAP GUI.

Individual maintenance and create new authorization named CC_A:

Fig. 5 RSECADMIN: Maintaining authorization CC_A.

Now you can press the button in the green square as on the screenshot above. It fills the table with the three InfoObjects that are necessary for an analytic authorization to work. Once those are in the table, you can add the particular authorization-relevant InfoObject, in our case YBW_COMP. Double-click this InfoObject.

A new screen pops up. Add a new line and set the indicator in the edit row to I (include). Then you can select a list of single values to be included in the analytical authorization, as on the screenshot below:

Fig. 6 RSECADMIN: Maintaining authorization CC_A: Selecting single values.

Or you can select specific range of given values:

Fig. 7 RSECADMIN: Maintaining authorization CC_A: Selecting range.

The third and last option is to select contains pattern and it can look like this:

Fig. 8 RSECADMIN: Maintaining authorization CC_A: Selecting by pattern.

For this particular scenario we will select it as equal to “CC_A”:

Fig. 9 RSECADMIN: Maintaining authorization CC_A – Saved and activated authorization.

After that, you can save and activate the authorization object. Next, create authorization objects for companies B and C, and then one more for all companies: CC_ALL.

Fig. 10 RSECADMIN: Maintaining authorization CC_ALL.

The next step is to assign the authorization object to a particular user, in our case user DBW_ATH_PAPM. To do that, go to transaction code RSECADMIN, select the User tab, and choose individual assignment:

Fig. 11 RSECADMIN: Assign authorizations for user.

Enter the name of the particular authorization and click “Insert”.

Fig. 12 RSECADMIN: Selecting authorizations for user.

In this scenario I will add CC_A and CC_B authorizations to this user.

Alternatively, you can add those authorizations to a specific role:

Fig. 13A Applying authorizations for role.
Fig. 13B Applying authorizations for role.

If you do so, those values are applied in the “Role Based” tab.

Fig. 14 RSECADMIN authorizations for user – Role-Based tab.

Now, on the SAP Profitability and Performance Management side, let's create a Model Table function that contains this characteristic and the key figure Net result. Then fill the table with the data given below:

Company | Net result
CC_A    | 1.500.000
CC_B    | 2.000.000
CC_C    | 2.500.000

Next, create a simple Query function with this Model Table as an input and the query source set to Environment. The definition of the query should look like this: Net result in the columns and Company code in the rows. The last step is to restrict the Company code characteristic with an authorization variable, as in the screenshot below:

Fig. 15 SAP Profitability and Performance Management Query function definition.

And activate the query.

For a user with no restrictions defined and full authorization, this is how the result would look:

Fig. 16 Query result visible for user with all authorizations.

As you can see, all results are available. Now, how will this query look for the specific user for whom we maintained authorizations? The answer is in the screenshot below:

Fig. 17 Query result visible for user with restrictions via authorization objects.

As you can clearly see, analytic authorization works for the SAP Profitability and Performance Management Query function. It is relevant when you would like to restrict the results of your calculations for a specific user. I encourage you to experiment with the authorizations created earlier.

Rating: 5 / 5 (1 votes)


ABAP in BW https://www.erpqna.com/abap-in-bw/?utm_source=rss&utm_medium=rss&utm_campaign=abap-in-bw Thu, 15 Apr 2021 10:55:27 +0000 https://www.erpqna.com/?p=46330 What Every BW/BI Developer Needs to Know About ABAP in BW Over time every BW developer needs to write ABAP in SAP NetWeaver BW to be able to meet user requirements. ABAP might be required in a start routine, end routine, expert routine, InfoPackage, Data Transfer Process, Analysis Process Designer, query, or field. Get basic […]

The post ABAP in BW appeared first on ERP Q&A.

What Every BW/BI Developer Needs to Know About ABAP in BW

Over time every BW developer needs to write ABAP in SAP NetWeaver BW to be able to meet user requirements. ABAP might be required in a start routine, end routine, expert routine, InfoPackage, Data Transfer Process, Analysis Process Designer, query, or field. Get basic knowledge on how to write ABAP code and the differences between each routine method.

Key Concept

Start routine is a routine in a transformation that is executed before transformation is executed.

End routine is a routine in a transformation that is executed after transformation is executed.

Expert routine is a routine in a transformation that is itself the transformation; in other words, it replaces all three parts: the start routine, the transformation rules, and the end routine.

SOURCE_PACKAGE is a structure that contains the inbound fields of the routine.

RESULT_PACKAGE is a structure that contains the outbound fields of the routine.

APD is a workbench with a graphical user interface (UI) for creating, executing, and monitoring analysis processes.

SAP NetWeaver BW provides user exits throughout the system to take advantage of custom coding add-ons and to be able to meet user requirements. After reading this article, BW developers should be able to develop an idea of when to use ABAP code using different routine methods. I show the differences between the major methods for ABAP in the following:

  1. Start routine
  2. End routine
  3. Expert routine
  4. InfoPackage
  5. Data Transfer Process (DTP)
  6. Analysis Process Designer (APD)
  7. Query variable
  8. Transformation fields
  9. Start, end, and expert routines and the differences between them

Note

This article assumes:

  • You have basic knowledge of SAP NetWeaver BW.
  • You know how to create a DTP, transformations, InfoPackages, DataStore Objects (DSOs), InfoCubes, MultiProviders, and SAP ERP Central Component (ECC) database views.
  • You have ABAP knowledge, which is required to understand the coding parts.

ABAP in Start Routine

A start routine is used in a transformation when it is necessary to fetch information from other BW objects, or when you perform calculations or other data transformations and store the results in a global data structure or table before the transformation is triggered. The start routine is run for each data package at the start of the transformation.

For example, consider sales order headers and line items. Standard SAP objects have two transformations, one for the header and one for the line items. When the two DTPs for these two transformations are executed, two different records are created in the InfoCube. A start routine is used to merge them into one record.

This is how it is done.

Step 1. Edit the transformation. If no start routine has been created, there is a Start Routine button to create a new one as shown in Figure 1.

If a start routine is already configured, a different Start Routine button is displayed to change the codes as shown in Figure 2.

Figure 2 Change a start routine

Step 2. Developers should declare TYPES and DATA in Figure 3 in the section labeled begin of 2nd part global.

Figure 3 Global section, which is visible throughout the transformation.

Step 3. Developers should place the body of the ABAP code under the beginning of the routine section (Figure 4).

Figure 4 Start routine

Note

Start routine works with SOURCE_PACKAGE, which is an SAP built-in variable to the start routine.
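To illustrate, here is a minimal start routine sketch that cleans up the incoming data package before the transformation rules run. The routine frame (method signature, structure _ty_s_SC_1) is generated by BW; the field name DOCTYPE and the value 'TA' are purely illustrative assumptions:

```abap
* Body of a generated start_routine method (frame omitted).
* SOURCE_PACKAGE is the changing table parameter provided by SAP;
* the field DOCTYPE and value 'TA' are hypothetical examples.

" Drop all records that are not standard sales orders
DELETE SOURCE_PACKAGE WHERE doctype <> 'TA'.

" Remove exact duplicate records
SORT SOURCE_PACKAGE.
DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE COMPARING ALL FIELDS.
```

Because the start routine runs once per data package, such deletions reduce the number of records before any transformation rule is executed.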

ABAP in End Routine

An end routine is executed after the transformation. In other words, an end routine is used for post-processing of data after the transformation, on a package-by-package basis. An end routine is also used when you want to perform calculations on fields and assign fields from the start routine.

When an end routine is opened, the codes for TYPES and DATA that were entered in the global part are visible, as shown in Figure 5.

Figure 5 Global part

Figure 6 shows where you enter custom ABAP codes.

Figure 6 End routine section

To continue my example, I now populate fields from the internal table I populated in the start routine, as shown in Figure 7.

Figure 7 End routine

Note

End routine works with RESULT_PACKAGE, which is SAP's built-in variable for an end routine.
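As a sketch of the pattern described above, the end routine body below enriches each outgoing record from a global internal table (here called gt_items) that was filled in the start routine; all object and field names are hypothetical:

```abap
* Body of a generated end_routine method (frame omitted).
* RESULT_PACKAGE is the changing table parameter provided by SAP.
* gt_items and ty_item are assumed to be declared in the global
* section and filled in the start routine.

FIELD-SYMBOLS: <result_fields> TYPE _ty_s_tg_1.
DATA: ls_item TYPE ty_item.   " hypothetical line type of gt_items

LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
  READ TABLE gt_items INTO ls_item
       WITH KEY doc_number = <result_fields>-doc_number.
  IF sy-subrc = 0.
    " Merge the buffered item attribute into the outgoing record
    <result_fields>-item_categ = ls_item-item_categ.
  ENDIF.
ENDLOOP.
```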

Start and End Routine Summary

You can use a start routine without using an end routine and vice versa. It depends on the requirements. In a nutshell, a start routine without an end routine is used when you need to sort the records and delete duplicates or delete any rows with specific criteria.

An end routine without a start routine is used when you want to do some calculations on outgoing records before they are saved to an InfoProvider. Both start and end routines are used when you need to read other objects from BW and merge them with data read from the source system.

ABAP in Expert Routine

Expert routine is a special case routine, which means you should use it in special situations where there is no SAP built-in rule type available.

  • Use this option to program the transformation without using available rule types.
  • Use this option when all fields in the transformation are to be affected.
  • Note: if you have already created transformation rules, the system deletes them when you create an expert routine.

An expert routine is created when you are in the transformation in edit mode, via the Edit button in the menu. Figure 8 shows an expert routine.

Figure 8 Expert routine menu

Expert routine is used when there are not sufficient functions to perform a transformation.

Refer to Figure 9 for the routine sections.

Figure 9 Expert routine parts

Note

Since expert routine is both a start and end routine, it uses SOURCE_PACKAGE and RESULT_PACKAGE. As you know SOURCE_PACKAGE holds data packages coming in and RESULT_PACKAGE holds data packages going out of transformation.

When you declare an expert routine and save it, the transformation looks something like Figure 10.

Figure 10 Expert routine transformation

You see the new Expert Routine button on the transformation (Figure 11).

Figure 11 Expert Routine button

See appendix A for sample ABAP code for an expert routine.

ABAP in InfoPackage

You use ABAP in an InfoPackage when you want to program the InfoPackage to extract data from the source using criteria that delta cannot handle. We all know delta gets only records that have changed since the last extract. Let's say for some reason you need to extract data starting from the last seven days every time. You can do this using ABAP.

To enable an ABAP routine, edit the InfoPackage and under the Data Selection tab, click the pop-up icon shown in Figure 12.

Figure 12 InfoPackage routine ABAP selection

Then click ABAP Routine (Figure 13).

Give the routine a name (Figure 14).

Figure 14 InfoPackage routine name

Figure 15 shows the sections of the routine and where you need to add code.

Figure 15 InfoPackage routine sections

Figure 16 shows sample ABAP to select yesterday’s date.

Figure 16 ABAP in InfoPackage
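The seven-day selection mentioned earlier can be sketched as follows. This is only the body of the InfoPackage selection routine; l_t_range and p_subrc come from the generated frame, and the DataSource field CALDAY is assumed for illustration:

```abap
* InfoPackage data selection routine (body only, frame generated by BW).
* Restricts the CALDAY selection to the last seven days.
DATA: l_idx  LIKE sy-tabix,
      l_date TYPE sy-datum.

READ TABLE l_t_range WITH KEY fieldname = 'CALDAY'.
l_idx = sy-tabix.

l_date = sy-datum - 7.          " seven days ago

l_t_range-sign   = 'I'.
l_t_range-option = 'BT'.        " between low and high
l_t_range-low    = l_date.
l_t_range-high   = sy-datum.

IF l_idx <> 0.
  MODIFY l_t_range INDEX l_idx.
ELSE.
  APPEND l_t_range.
ENDIF.
p_subrc = 0.                    " signal success to the framework
```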

ABAP in DTP

As with ABAP in an InfoPackage, you use ABAP in a DTP when you want to program the DTP to extract data from the source using criteria that delta cannot handle. Let's say for some reason you need to extract data starting from the last seven days every time. This can be done using ABAP.

To implement ABAP in DTP do the following.

Step 1. Edit the DTP and click the Filter button (Figure 17).

Figure 17 Filter in DTP

Step 2. Click the last icon (routine icon) to the right by the field you want to program (Figure 18).

Figure 18 ABAP icon in DTP

Step 3. Give the DTP a name (Figure 19).

Figure 19 DTP ABAP name

Figure 20 shows selections of the routine and where to enter ABAP code.

Figure 20 DTP ABAP sections

In Figure 21, all data from seven days ago up to today is extracted by DTP for the CALDAY field.

Figure 21 ABAP in DTP
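A sketch of such a DTP filter routine for CALDAY is shown below; l_t_range is the range table of the generated filter routine frame, and the seven-day window mirrors the example above:

```abap
* DTP filter routine (body only, frame generated by BW).
* Selects CALDAY values from seven days ago up to today.
DATA: l_date TYPE sy-datum.

l_date = sy-datum - 7.

CLEAR l_t_range.
l_t_range-fieldname = 'CALDAY'.
l_t_range-sign      = 'I'.
l_t_range-option    = 'BT'.
l_t_range-low       = l_date.
l_t_range-high      = sy-datum.
APPEND l_t_range.
```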

ABAP in APD

You might be familiar with APD but, in a nutshell, APD is a workbench with a graphical user interface (UI) for creating, executing, and monitoring analysis processes. Data can be merged from a number of sources, can go through different transformations, and can be saved to different data targets. You can access APD by using transaction code RSANWB, or by clicking the Edit button in transaction code RSA1 and then clicking Analysis Process Designer (Figure 22).

Figure 22 APD Menu option

Note

How to use APD is out of scope of this article.

The following example shows how to transform and change data coming from a BEx report and then save it to a DSO. Here the report is the source: the result of the report goes through an ABAP routine, where you filter the data, and is finally saved into a DSO. Figure 23 shows the various options you have.

Figure 23 APD available options

Once you select the query from the Data Sources options and the ABAP routine from the Transformations options, right-click Routine 1 and then click the Properties line (not shown).

In the General tab shown in Figure 24, you give the routine a name.

Figure 24 Routine name in APD

Go to the Source fields (Source Flds) tab. From the Field List on the right panel, choose the fields you want to extract and move them to the Source Fields on the left panel (Figure 25).

Figure 25 Source Fields selection in APD

The system then automatically fills the TargetFlds tab (Figure 26).

Figure 26 TargetFlds tab in APD

Click the Routine tab in Figure 27 and enter the ABAP code.

Figure 27 APD routine sections

Enter the code after the LOOP statement and before MOVE-CORRESPONDING as shown in Figure 27.

In my example below, I want to change the content of the field Plant. If the value coming from the source system is 1200, I want it to become A2001018; if the value coming from the source system is 1300 or 1600, I want it to become something different. For that, I would use the ABAP code shown in Figure 28.

Note

In the routine tab between LOOP AT and MOVE-CORRESPONDING, you assign all fields from the source to the destination.

Figure 28 ABAP in APD
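The plant remapping described above can be sketched inside the generated LOOP ... ENDLOOP frame like this. it_source, et_target, ls_source, and ls_target follow the generated APD routine frame; the replacement value A2001020 for plants 1300 and 1600 is a hypothetical placeholder:

```abap
* APD routine (generated LOOP frame shown for context).
LOOP AT it_source INTO ls_source.

  " Custom code goes between LOOP and MOVE-CORRESPONDING:
  CASE ls_source-plant.
    WHEN '1200'.
      ls_source-plant = 'A2001018'.
    WHEN '1300' OR '1600'.
      ls_source-plant = 'A2001020'.   " hypothetical replacement value
  ENDCASE.

  MOVE-CORRESPONDING ls_source TO ls_target.
  APPEND ls_target TO et_target.
ENDLOOP.
```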

After checking for errors, you can then continue with the rest of APD.

ABAP in Query Variable

A query customer exit variable created in a BEx query provides the means to do a complex calculation in ABAP that is not possible with BEx alone.

BW query data can be restricted automatically by creating a customer exit variable. This is a two-step process: first, create a variable in Query Designer that is processed by a customer exit; second, do some ABAP coding on the BW side in transaction code CMOD.

For example, you want to create a variable on Posting Date. This new variable extracts data from the system using the system date with the “offset” you choose.

In the Query Designer, locate Posting Date, right-click it, and choose the option Characteristic value variables. Then click the New Variable button.

Figure 29 Create a new variable

Give the new variable a technical name that starts with Z and a description. Then choose the Customer exit option from the drop-down under Processing By, as shown in Figure 30.

Figure 30 Customer exit option for variable

Save your variable by clicking on the save icon.

To write ABAP codes for the new variable just created, use transaction code CMOD in BW, which takes you to the screen shown in Figure 31.

Figure 31 Transaction code CMOD screen

Step 1. Enter the Project name. Click the Components radio button. Click the Display button.

Step 2. To access the screen where you are able to start coding, click the green checkmark shown in Figure 32.

Figure 32 EXIT_SAPLRRS0_001 screen

Then the system takes you to the screen shown in Figure 33.

Figure 33 Include ZXRSRU01 screen

Step 3. Double-click ZXRSRU01 as shown in Figure 33.

When you get to program ZXRSRU01, go to edit mode by clicking the pencil icon on the top left of the screen, and implement the ABAP codes.

Another way to get to this program is to use transaction code SE38. Enter ZXRSRU01 for the program name, and then click Edit.

The most important information here is that the customer exit program receives the variable name of the BW query in the ABAP variable i_vnam, which is defined as an import parameter of the function module EXIT_SAPLRRS0_001. The BW query's variable value is handled in the table E_T_RANGE. You can then write your ABAP code after CASE I_VNAM, starting with the keyword WHEN. After the keyword WHEN, enter the technical name of the variable you created in Query Designer. Figure 34 shows ABAP code that starts with WHEN 'ZDAT' and ends with ENDIF. The variable ZDAT gets the value of the system date.

Figure 34 ABAP in transaction code CMOD
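A minimal sketch of such a customer exit in include ZXRSRU01 could look like the following; the line type RRRANGESID matches the standard E_T_RANGE interface, and the variable name ZDAT follows the example above:

```abap
* Customer exit for BEx variables (include ZXRSRU01, sketch only).
* Defaults the variable ZDAT to the current system date before the
* variable screen is shown (I_STEP = 1).
DATA: ls_range TYPE rrrangesid.

CASE i_vnam.
  WHEN 'ZDAT'.
    IF i_step = 1.               " call before variable entry
      ls_range-sign = 'I'.
      ls_range-opt  = 'EQ'.
      ls_range-low  = sy-datum.
      APPEND ls_range TO e_t_range.
    ENDIF.
ENDCASE.
```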

Step 4. Check your work for errors and click the save icon.

Step 5. Click the activate icon to activate your codes.

Note that there are four types of SAP built-in parameter I_STEPs:

I_STEP = 1 – Call directly before the variable is entered

I_STEP = 2 – Call directly after the variable is entered. This step is activated only if the same variable could not be filled for I_STEP = 1 and it is not ready for input.

I_STEP = 3 – In this call, you can check the values of the variables. Activating an exception (RAISE) causes the variable screen to appear again. I_STEP = 2 is also then executed once more.

I_STEP = 0 – The enhancement is not called from the variable screen. The call can come from the authorization check or from the monitor.

Now I am going to set the variable offset for my new variable to as many previous days as I need. For example, in the Query Designer, I set the offset to -1500 as shown in Figure 35.

Figure 35 Variable offset

When the report runs, the system fetches and displays the data from 1,500 days ago up to today.

ABAP in Transformation Fields

You use ABAP in transformation fields when you want to perform a calculation on a specific single field. You can define the routine as a transformation rule for a key figure or a characteristic. For example, in a transformation, I want to change the contents of 0DOC_TYPE.

In the transformation, right-click 0DOC_TYPE in Figure 36 and then click the Rule Details option shown in Figure 37.

Figure 36 Transformation fields

Figure 37 Rule Details

A screen pops up (Figure 38). Choose Routine from the Rule Type drop-down options.

Figure 38 Routine Rule Type

Enter ABAP codes after this section as shown in Figure 39.

Figure 39 ABAP in transformation field

Note

A transformation field routine works with RESULT and SOURCE_FIELDS. RESULT holds the value to be assigned to the transformation field (here 0DOC_TYPE), and SOURCE_FIELDS holds the values from the source system.
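As a small sketch of a field routine on 0DOC_TYPE, the body between the generated comment markers could look like this; the document type values 'ZOR' and 'TA' are purely illustrative:

```abap
* Field routine body for 0DOC_TYPE (frame generated by BW).
* SOURCE_FIELDS and RESULT are provided by the routine frame;
* the mapped values are hypothetical.
IF source_fields-doc_type = 'ZOR'.
  RESULT = 'TA'.                 " remap custom order type
ELSE.
  RESULT = source_fields-doc_type.
ENDIF.
```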

Differences Between Each Routine

Table 1 shows a summary of when to use each routine. Also the table shows which SAP-delivered variable is being used for each routine.

  • Start routine – fetch information from other BW objects, perform calculations, or delete records BEFORE the transformation is executed; works on SOURCE_PACKAGE.
  • End routine – perform calculations and/or assign fields used in the start routine AFTER the transformation is executed; works on RESULT_PACKAGE.
  • Expert routine – special cases where no suitable rule type is available; applied to ALL fields; works on SOURCE_PACKAGE and RESULT_PACKAGE.
  • InfoPackage routine – program the InfoPackage to extract data from the source using criteria that delta handling cannot cover; works on the selected field name.
  • DTP routine – program the DTP to extract data from the source using criteria that delta handling cannot cover; works on the selected field name.
  • APD routine – change the result fetched from a BW InfoProvider; works on LS_SOURCE and LS_TARGET.
  • Query (variable exit) routine – complex calculations, or restricting data by default; works with I_VNAM and I_STEP.
  • Transformation field routine – perform a calculation on a specific single field in the transformation; works on RESULT.

Table 1 Overview of routines

Sample Expert Routine ABAP Code

Appendix A

Figure A shows sample ABAP code in an expert routine. As you can see, the custom code is written between *$*$ begin of routine - insert your code only below this line *-* and *$*$ end of routine - insert your code only before this line *-*.

Also, you can see that <SOURCE_FIELDS> and RESULT_FIELDS are declared automatically by the system and cannot be changed.

The code below loops over SOURCE_PACKAGE, performs the calculations and transformations, and then populates RESULT_PACKAGE, whose contents are in turn saved to the InfoProvider.

METHOD expert_routine.
*=== Segments ===

FIELD-SYMBOLS:
<SOURCE_FIELDS>    TYPE _ty_s_SC_1.

DATA:
RESULT_FIELDS      TYPE _ty_s_TG_1.

*$*$ begin of routine - insert your code only below this line        *-*
... "insert your code here
Data: gw_result_package TYPE _ty_s_SC_1.
Data: gw_result_package1  TYPE _ty_s_TG_1.
DATA: wa TYPE /BIC/AZSD_DFC00.
DATA:
m10          TYPE /BIC/OIZSFC_MN10,
m11          TYPE /BIC/OIZSFC_MN11,
m12          TYPE /BIC/OIZSFC_MN12,
m1           TYPE /BIC/OIZSFC_MON1,
m2           TYPE /BIC/OIZSFC_MON2,
m3           TYPE /BIC/OIZSFC_MON3,
m4           TYPE /BIC/OIZSFC_MON4,
m5           TYPE /BIC/OIZSFC_MON5,
m6           TYPE /BIC/OIZSFC_MON6,
m7           TYPE /BIC/OIZSFC_MON7,
m8           TYPE /BIC/OIZSFC_MON8,
m9           TYPE /BIC/OIZSFC_MON9,
version      TYPE  C LENGTH 6,
fiscyear     TYPE /BI0/OIFISCVARNT,
fiscalyear   TYPE /BI0/OIFISCYEAR,
plant        TYPE /BI0/OIPLANT,
material     TYPE /BI0/OIMATERIAL.

***************************************
data: versiona     TYPE  C LENGTH 6,
year         TYPE C length 4,
plantA       type /BI0/OIPLANT,
materialA    TYPE /BI0/OIMATERIAL.
* First pass: for each source record, derive the previous version and
* fiscal year so that the previous version's data can be carried over.
LOOP at SOURCE_PACKAGE into gw_result_package.
version = gw_result_package-/BIC/ZVERSION.
plantA = gw_result_package-plant.
MaterialA = gw_result_package-Material.
year = gw_result_package-fiscyear.
versiona = VERSION - 1.
if version = '01'.
year = year - 1.
versionA = '10'.
endif.
if version = '02'.
year = year - 1.
versionA = '11'.
endif.
if version = '03'.
year = year - 1.
versionA = '12'.
endif.
**************************************
*Load previous version data************************
select * from /BIC/AZSD_DFC00 into wa where
/BIC/ZVERSION = VERSIONa and
plant = plantA and
FISCYEAR = year and
Material = MaterialA.
if sy-subrc = 0.
gw_result_package1-plant = wa-plant.
gw_result_package1-material = wa-material.
gw_result_package1-FISCVARNT = 'K4'.
gw_result_package1-fiscyear = wa-fiscyear.
gw_result_package1-/BIC/ZVERSION = version.
gw_result_package1-/BIC/ZSFC_MoN1 = wa-/BIC/ZSFC_MoN1.
gw_result_package1-/BIC/ZSFC_MoN2 = wa-/BIC/ZSFC_MoN2.
gw_result_package1-/BIC/ZSFC_MoN3 = wa-/BIC/ZSFC_MoN3.
gw_result_package1-/BIC/ZSFC_MoN4 = wa-/BIC/ZSFC_MoN4.
gw_result_package1-/BIC/ZSFC_MoN5 = wa-/BIC/ZSFC_MoN5.
gw_result_package1-/BIC/ZSFC_MoN6 = wa-/BIC/ZSFC_MoN6.
gw_result_package1-/BIC/ZSFC_MoN7 = wa-/BIC/ZSFC_MoN7.
gw_result_package1-/BIC/ZSFC_MoN8 = wa-/BIC/ZSFC_MoN8.
gw_result_package1-/BIC/ZSFC_MoN9 = wa-/BIC/ZSFC_MoN9.
gw_result_package1-/BIC/ZSFC_MN10 = wa-/BIC/ZSFC_MN10.
gw_result_package1-/BIC/ZSFC_MN11 = wa-/BIC/ZSFC_MN11.
gw_result_package1-/BIC/ZSFC_MN12 = wa-/BIC/ZSFC_MN12.

append gw_RESULT_PACKAGE1 to RESULT_PACKAGE.
endif.
ENDSELECT.
clear gw_RESULT_PACKAGE1.
endloop.

***************************************************
*Load current File

* Second pass: take over the values from the current file; empty months
* are filled from the previous version further below.
LOOP at SOURCE_PACKAGE into gw_result_package.
m10 = gw_result_package-/BIC/ZSFC_MN10.
m11 = gw_result_package-/BIC/ZSFC_MN11.
m12 = gw_result_package-/BIC/ZSFC_MN12.
m1 = gw_result_package-/BIC/ZSFC_MoN1.
m2 = gw_result_package-/BIC/ZSFC_MoN2.
m3 = gw_result_package-/BIC/ZSFC_MoN3.
m4 = gw_result_package-/BIC/ZSFC_MoN4.
m5 = gw_result_package-/BIC/ZSFC_MoN5.
m6 = gw_result_package-/BIC/ZSFC_MoN6.
m7 = gw_result_package-/BIC/ZSFC_MoN7.
m8 = gw_result_package-/BIC/ZSFC_MoN8.
m9 = gw_result_package-/BIC/ZSFC_MoN9.
version = gw_result_package-/BIC/ZVERSION.
fiscalyear = gw_result_package-FISCYEAR.
plant = gw_result_package-PLANT.
material = gw_result_package-MATERIAL.

gw_result_package1-material = material.
gw_result_package1-plant = plant.
gw_result_package1-FISCVARNT = 'K4'.
gw_result_package1-fiscyear = fiscalyear.
gw_result_package1-/BIC/ZVERSION = version.
*
if m1 ne 0.
gw_result_package1-/BIC/ZSFC_MoN1 = m1.
endif.
if m2 ne 0.
gw_result_package1-/BIC/ZSFC_MoN2 = m2.
endif.
if m3 ne 0.
gw_result_package1-/BIC/ZSFC_MoN3 = m3.
endif.
if m4 ne 0.
gw_result_package1-/BIC/ZSFC_MoN4 = m4.
endif.
if m5 ne 0.
gw_result_package1-/BIC/ZSFC_MoN5 = m5.
endif.
if m6 ne 0.
gw_result_package1-/BIC/ZSFC_MoN6 = m6.
endif.
if m7 ne 0.
gw_result_package1-/BIC/ZSFC_MoN7 = m7.
endif.
if m8 ne 0.
gw_result_package1-/BIC/ZSFC_MoN8 = m8.
endif.
if m9 ne 0.
gw_result_package1-/BIC/ZSFC_MoN9 = m9.
endif.
if m10 ne 0.
gw_result_package1-/BIC/ZSFC_MN10 = m10.
endif.
if m11 ne 0.
gw_result_package1-/BIC/ZSFC_MN11 = m11.
endif.
if m12 ne 0.
gw_result_package1-/BIC/ZSFC_MN12 = m12.
endif.

***Get previous values if month is blank in the file*******
select * from /BIC/AZSD_DFC00 into wa where
plant = gw_result_package-PLANT and
material = gw_result_package-MATERIAL and
fiscyear = gw_result_package-FISCYEAR and
/BIC/ZVERSION = VERSIONa.

if m1 = 0.
gw_result_package1-/BIC/ZSFC_MoN1 = wa-/BIC/ZSFC_MoN1.
endif.
if m2 = 0.
gw_result_package1-/BIC/ZSFC_MoN2 = wa-/BIC/ZSFC_MoN2.
endif.
if m3 = 0.
gw_result_package1-/BIC/ZSFC_MoN3 = wa-/BIC/ZSFC_MoN3.
endif.
if m4 = 0.
gw_result_package1-/BIC/ZSFC_MoN4 = wa-/BIC/ZSFC_MoN4.
endif.
if m5 = 0.
gw_result_package1-/BIC/ZSFC_MoN5 = wa-/BIC/ZSFC_MoN5.
endif.
if m6 = 0.
gw_result_package1-/BIC/ZSFC_MoN6 = wa-/BIC/ZSFC_MoN6.
endif.
if m7 = 0.
gw_result_package1-/BIC/ZSFC_MoN7 = wa-/BIC/ZSFC_MoN7.
endif.
if m8 = 0.
gw_result_package1-/BIC/ZSFC_MoN8 = wa-/BIC/ZSFC_MoN8.
endif.
if m9 = 0.
gw_result_package1-/BIC/ZSFC_MoN9 = wa-/BIC/ZSFC_MoN9.
endif.
if m10 = 0.
gw_result_package1-/BIC/ZSFC_MN10 = wa-/BIC/ZSFC_MN10.
endif.
if m11 = 0.
gw_result_package1-/BIC/ZSFC_MN11 = wa-/BIC/ZSFC_MN11.
endif.
if m12 = 0.
gw_result_package1-/BIC/ZSFC_MN12 = wa-/BIC/ZSFC_MN12.
endif.
ENDSELECT.
**********************************************************************
append gw_RESULT_PACKAGE1 to RESULT_PACKAGE.

clear m1.
clear m2.
clear m3.
clear m4.
clear m5.
clear m6.
clear m7.
clear m8.
clear m9.
clear m10.
clear m11.
clear m12.
CLEAR gw_RESULT_PACKAGE1.

endloop.

*$*$ end of routine - insert your code only before this line         *-*
ENDMETHOD.                    "expert_routine

Figure A Sample ABAP code in an expert routine


The post ABAP in BW appeared first on ERP Q&A.

ADSO Remodeling: Cases after implementation of notes 3006437 and 3019867
https://www.erpqna.com/adso-remodeling-cases-after-implementation-of-notes-3006437-and-3019867/
Fri, 12 Feb 2021 10:28:16 +0000

The post ADSO Remodeling: Cases after implementation of notes 3006437 and 3019867 appeared first on ERP Q&A.

The scope of this blog post is to delve into the details of the following use cases:

  • Re-folder the InfoObject grouping
  • Change the InfoObject sequence
  • Insert a new compounding child InfoObject
  • Insert a new compounding InfoObject (parent and child)
  • Maintain RSADMIN parameter RSO_ADSO_ONLINE_CONV_THRESHOLD

As explained in the blog posts by Frank Riesner, note 3006437 brings a new RSADMIN parameter, RSO_ADSO_ONLINE_CONV_THRESHOLD, which allows us to control the record-count threshold beyond which the system triggers a remodeling request. However, in large enterprises, ADSOs with more than 100 million records are very common. Thanks to note 3019867, this limitation has been addressed.
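RSADMIN parameters are stored as OBJECT/VALUE pairs in table RSADMIN and are typically maintained with the SAP-delivered report SAP_RSADMIN_MAINTAIN. As a sketch, the current threshold value can be checked like this:

```abap
* Check whether RSO_ADSO_ONLINE_CONV_THRESHOLD is maintained.
DATA lv_value TYPE rsadmin-value.

SELECT SINGLE value FROM rsadmin INTO lv_value
  WHERE object = 'RSO_ADSO_ONLINE_CONV_THRESHOLD'.
IF sy-subrc = 0.
  WRITE: / 'Threshold:', lv_value.
ELSE.
  WRITE: / 'Parameter not maintained - the default limit applies.'.
ENDIF.
```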

For the scope of this blog post, the following scenarios were tested. For space reasons, scenarios 1-4 and 7 are part of this post; scenarios 5 and 6 showed the same results as the other regrouping scenarios. The scenarios were checked both in an SAP BW/4HANA 2.0 SP06 system and in an SAP BW 7.5 SP15 system. The screenshots are from the SAP BW/4HANA system.

Scenarios

Let’s see the details now:

Implement Note 3006437

Scenario 1:

Conditions for scenario:

  • Add a compounding InfoObject into the dataflow
  • No RSADMIN parameter is maintained in both source and target system.
  • The ADSO is of type “Standard”
  • ADSO contains less than 50,000 records in the Development system.
  • The InfoObject is compounded with two parent InfoObjects, “Source System” and “Controlling Area”.
  • Parent InfoObjects are already part of the ADSO model:

Observations:

  • No remodeling request is generated.
  • The system checks whether the RSADMIN parameter is maintained.
  • If it is not, the default threshold of 100,000,000 records is used for the check.
  • The new InfoObjects are added across the entire dataflow: CompositeProvider, source ADSO, target ADSO.
  • Transport to the target system does not generate a remodeling request, and all objects are activated during the transport process.
  • The number of records of the ADSO in the target system is less than 100 million.

Scenario 2:

  • Re-foldering of the ADSO,
  • Change of the InfoObject sequence,
  • No maintenance of the RSADMIN parameter RSO_ADSO_ONLINE_CONV_THRESHOLD in either the source or target system.

The ADSO is of type “Standard”, and the scope of the test case is to validate that the system does not generate a remodeling request when you change folders or change the InfoObject sequence of the “Data Fields” section.

Sub-scenario 2.1:

Conditions for scenario:

  • In the Development system: number of records in the ADSO = 0
  • Change the InfoObject sequence in the “Data Fields” area:

Observations:

  • No remodeling request is generated
  • No impact on the dependent CompositeProvider: it remains active. However, the transformation and DTP become inactive: you need to activate and collect them in your transport.
  • SE11 table structure: as there is no data, the table structure appears to be changed directly.

Sub-scenario 2.2:

Condition for scenario:

  • We take the same ADSO as in sub-scenario 2.1, but this time with data.
  • In the Development system: the number of records in the ADSO is approx. 3.5 million.
  • Change the sequence of two InfoObjects.

Observations:

  • No remodeling request is generated:
  • The transformation and DTP become inactive: you need to collect them in your transport.

Sub-scenario 2.3:

Conditions for Scenario:

  • Continue with the same ADSO as in sub-scenario 2.2.

Check 1: Add a new group in the ADSO and regroup the InfoObjects:

Observation:

  • No remodeling request is generated:

Check 2: Add one more Group and move all the key figures:

Observations:

  • No remodeling request is generated, but interestingly, the table structure as per SE11 is not adjusted either!

SE11: no changes in the table structure; however, the real picture of the table can only be obtained from HANA.

Check 3: Moving the changes to the target system, we see in the transport log that no remodeling request is generated and all dependent objects are activated (1 CompositeProvider, 2 ADSOs, 2 transformations, 2 DTPs). The target ADSO has 517,772 entries.

SE11 structure of the ADSO before importing the change:

Changes imported to Target system without any errors, and the new groups are created:

SE11 structure of ADSO after the change is imported into the target system:

Scenario 3:

Condition for scenario:

  • Re-foldering of ADSO
  • Change the InfoObject sequence, regrouping,
  • Maintenance of RSO_ADSO_ONLINE_CONV_THRESHOLD, in the target system only, with a limit lower than the number of records in the ADSO.

Development system:

ADSO: No data, RSO_ADSO_ONLINE_CONV_THRESHOLD is not maintained.

Create an additional group and move the change to the target system. No remodeling request is generated.

Transport to target system:

Observation in Dev System:

No remodeling request is generated, all objects are activated (1 Composite Provider, 2 ADSOs, 2 Transformations, 2 DTPs).

Target system:

The RSADMIN parameter is maintained with a value of 50,000 (less than the number of entries in the target ADSO, which is 517,772):

SE11: No changes in the data structure:

Please take a look at the table structure through HANA Studio: the only change happens due to the addition of the child compounding InfoObjects. All other changes (regrouping, InfoObject sequence) do not affect the table structure.

Observation in Target System:

  • No remodeling request is generated, all objects are activated

Scenario 4:

Conditions for scenario:

  • Add a new compounding child InfoObject in the ADSO.
  • Maintenance of RSO_ADSO_ONLINE_CONV_THRESHOLD, in the target system only, with a limit lower than the number of records of the ADSO.

In Development system:

  • ADSO: No data, RSO_ADSO_ONLINE_CONV_THRESHOLD is not maintained
  • Add a new compounding InfoObject (child only)

Observation in Dev System:

The ADSO is activated and no remodeling request is generated; however, there is a warning about potential remodeling. (Do not be confused: it is just a warning!)

In Target system:

  • RSADMIN parameter RSO_ADSO_ONLINE_CONV_THRESHOLD value set to 50,000.
  • Transport log of the target system:

Observations in Target System:

  • Remodeling request generated.

It is very important to check the log: we are informed that a remodeling request is generated for the ADSO and, on top of that, all dependent objects are SKIPPED. The dependent objects are meant to be taken care of by the remodeling request.

As we can see, the remodeling request is hidden in the transport log. In most cases, developers or transport managers ignore the warning messages generated during a transport.

As a result, developers recognize the issue only during testing, and you can understand the risks this entails!

Remodeling Request: (transaction RSMONITOR)

Execution of the remodeling request was successful: the ADSO is adjusted and all dependent objects are activated.

Scenario 7 – Implement note 3019867

After the implementation of note 3019867, SAP removes the fixed 100 million limit, and it is the customer's responsibility to define the limit through the RSADMIN parameter.

The check is performed by method IS_ONLINE_CONV_POSSIBLE of class CL_RSCNV_ADSO_HDB. The default value of 100 million is still valid; however, the RSADMIN parameter value controls the ADSO activation process.
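Conceptually, the check described above can be sketched as follows. This is a simplified illustration of the decision logic, not SAP's actual implementation in CL_RSCNV_ADSO_HDB:

```abap
* Simplified illustration of the threshold decision - not SAP's code.
CONSTANTS lc_default TYPE p VALUE 100000000.   " 100 million default
DATA: lv_value     TYPE rsadmin-value,
      lv_threshold TYPE p,
      lv_records   TYPE p.

SELECT SINGLE value FROM rsadmin INTO lv_value
  WHERE object = 'RSO_ADSO_ONLINE_CONV_THRESHOLD'.
IF sy-subrc = 0.
  lv_threshold = lv_value.     " customer-defined limit
ELSE.
  lv_threshold = lc_default.   " fall back to the default
ENDIF.

* lv_records = number of records in the ADSO's active table
IF lv_records > lv_threshold.
  " structural change requires a remodeling request
ELSE.
  " online conversion: the ADSO is adjusted during activation
ENDIF.
```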

So, for our scenario, I maintained the RSADMIN parameter RSO_ADSO_ONLINE_CONV_THRESHOLD with a value of 2 billion in my target system and enhanced an existing ADSO of type “DataMart” with a child compounding InfoObject.

My ADSO holds 1.7 billion records in the target system. The total number of columns (characteristics and key figures) is 90.

ADSO active table view from HANA Studio:

Add the child compounding InfoObject in the Development system:

As expected, no remodeling request is generated; however, there are warning messages about potential remodeling if the ADSO is not empty.

Target system:

RSADMIN parameter is maintained as 2 billion.

The changes are moved to the target system; here is the transport log:

  • The transport takes approximately 10 minutes.
  • All dependent objects are activated during the transport.
  • As expected, no remodeling request is generated.