SAP Datasphere Archives - ERP Q&A

Create a SQL user with access to multiple spaces in SAP Datasphere

Suppose you want to access two SAP Datasphere spaces using the same SQL user. How do you create such a user? That's what we will go through in this blog post.

As an example, we have two spaces as the below screenshot shows. We will create a SQL user that can access both of these spaces. For this we take the following steps.

  1. Create a Database User Group including an admin user;
  2. Create an SQL user with the new admin user;
  3. Create a Database User for each space;
  4. Grant read access to the views that you want the SQL user to have access to.
Figure 1: Two spaces for which we will create one SQL user with read access to both spaces

Step 1: Create a Database User Group including an admin user

In the side navigation area of SAP Datasphere, choose System > Configuration > Database Access > Database User Groups and create a Database User Group. This immediately provides you with administrator credentials. If you need documentation on the creation process, go to this help page; it also links to the SAP HANA Cloud documentation that describes the Database User Group concept.

Figure 2: Dialog for creating a Database User Group

Step 2: Create an SQL user with the new admin user

After the Database User Group is created, go to your preferred database client tool to log on with that user. In this blog post we use the SAP Database Explorer. With the admin user, we create a new SQL user. This will be the user getting read access to the two SAP Datasphere spaces.

CREATE USER DWCDBGROUP#READ_SPACES#BOBBY PASSWORD "<A1a etc.>" SET USERGROUP DWCDBGROUP#READ_SPACES

Now log on with the newly created SQL user. Tip: in DB Explorer, you can right-click the existing instance and choose "Add Database with Different User" – this adds a new entry while re-using the instance details, which saves you some time.

Figure 3: Add new Database Instance entry with the newly created SQL user
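
Before granting anything, you can double-check that you are logged on as the intended user (DUMMY is the standard SAP HANA dummy table):

SELECT CURRENT_USER FROM DUMMY;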

When we check the views in the two spaces that this user has access to, we see that currently no views are accessible.

Figure 4: The newly created SQL user has no privileges yet on either of the two spaces

Step 3: Create a Database User for each space

To grant read access from the two spaces to this SQL user, you first have to create a Database User in each of the spaces. You do this under Space Management > Database Access > Database Users for each of the spaces. Make sure you tick the checkboxes for Enable Read Access (SQL) and With Grant Option. Deploy the space to activate the user and fetch the credentials.

Figure 5: Dialog to create a Database User for a given space

Step 4: Grant read access to the views that you want the SQL user to have access to

Using the database users created in step 3 – one for each space – we go through the following for each of the spaces.

Check to which views the database user has access. The following statement gives you a list of the views that are exposed for consumption. If you want to add more views, you’ll have to expose them for consumption in the SAP Datasphere view editor.

SELECT OBJECT_NAME
FROM EFFECTIVE_PRIVILEGES
WHERE SCHEMA_NAME IN ('SEFAN_SALES', 'SEFAN_HIERARCH2')
  AND GRANTEE_TYPE = 'USER'
  AND IS_GRANTABLE = 'TRUE'
  AND USER_NAME = CURRENT_USER;

As you can see, this gives us the list of views exposed for consumption in our space.

Figure 6: List of the views that the space database user has access to

The space database user also has grant select privileges, because we made that setting when creating the user. Therefore, we can fire GRANT SELECT statements and provide our SQL user with read access to the views. You cannot grant access on the entire schema, only on individual objects. SAP Datasphere does not provide you with any user that has full schema access with a grant option. Below we grant the SQL user access to one view.

GRANT SELECT ON SEFAN_SALES."vBusinessPartner" TO DWCDBGROUP#READ_SPACES#BOBBY

When we now check the list of views again with the SQL user, we can see that this view shows up. The screenshot below was taken after we repeated all of the above steps for our other space as well.

Figure 7: Views in the SAP Datasphere space that the SQL user has read access to

Build a procedure to grant access to all consumable views of a space at once

Granting read access to all individual views in a space can be a bit cumbersome, but of course you can automate this by coding a procedure that loops over all consumable views of a space and grants read access to each view. The code below creates such a procedure. Make sure to create the procedure with the Database User Group admin user, and then grant execute access on that procedure to the database users of the spaces, which can then run it.

--execute with ADM user
CREATE OR REPLACE PROCEDURE grant_select_on_objects (
    IN IP_SPACE NVARCHAR(256),
    IN IP_SQL_USER NVARCHAR(256)
)
LANGUAGE SQLSCRIPT
SQL SECURITY INVOKER
AS
BEGIN
    -- Declare a cursor for the SELECT statement
    DECLARE CURSOR C FOR 
        SELECT SCHEMA_NAME, OBJECT_NAME 
        FROM effective_privileges 
        WHERE SCHEMA_NAME = :IP_SPACE
          AND GRANTEE_TYPE = 'USER'
          AND IS_GRANTABLE = 'TRUE'
          AND USER_NAME = CURRENT_USER;

    -- Variables to hold the fetched data
    DECLARE v_schema_name NVARCHAR(256);
    DECLARE v_object_name NVARCHAR(256);
    DECLARE v_sql NVARCHAR(5000);

    -- Loop through the result set
    FOR cur_row AS C DO
        v_schema_name = cur_row.SCHEMA_NAME;
        v_object_name = cur_row.OBJECT_NAME;

        -- Construct and execute the GRANT statement
        v_sql = 'GRANT SELECT ON "' || v_schema_name || '"."' || v_object_name || '" TO "' || :IP_SQL_USER || '"';
        EXECUTE IMMEDIATE :v_sql;
            
    END FOR;

END;

GRANT EXECUTE ON grant_select_on_objects TO SEFAN_SALES#READ_SPACES;
GRANT EXECUTE ON grant_select_on_objects TO SEFAN_HIERARCH2#READ_SPACES;

When the procedure is created, execute it with the space database users as follows:

CALL DWCDBGROUP#READ_SPACES#ADM.grant_select_on_objects('SEFAN_HIERARCH2', 'DWCDBGROUP#READ_SPACES#BOBBY');
CALL DWCDBGROUP#READ_SPACES#ADM.grant_select_on_objects('SEFAN_SALES', 'DWCDBGROUP#READ_SPACES#BOBBY');

As you can see below, the SQL user now has access to the views from both spaces.

Figure 8: Views in the SAP Datasphere space that the SQL user has read access to

Because access is granted on individual objects, you will have to run the procedure or the individual GRANT statements again whenever views have been newly created or exposed since the last run.

Accessing SharePoint files from Datasphere using BTP Open Connectors

Our customer had a requirement to combine data from SharePoint files with other data in an SAP Analytics Cloud (SAC) dashboard. Since there is no native SharePoint connector for SAP Datasphere, we created a connection using SAP BTP’s open connector, which we then utilize from Datasphere.

1. Register an app for SharePoint API Access in Azure Active Directory

  • Log on to your Azure Portal using your SharePoint Online credentials
  • Navigate to Azure Active Directory and select App Registrations
  • Click New Registration to create an OAuth application

  • In the application registration prompt, enter an application name e.g. SharePointOAuthApp
  • Select the supported account types

Enter the redirect URL for SAP Cloud Platform Open Connectors:  https://auth.cloudelements.io/oauth

2. Configure the registered application’s SharePoint API permissions

By default, the registered application only has the User.Read permission from the Microsoft Graph APIs, so you need to add permissions to access the SharePoint REST APIs.

    • Select API permissions tab and then click on Add a permission to add permissions for SharePoint REST APIs.

    • Select SharePoint to add in the API permissions for SharePoint

    In SAP Cloud Platform Open Connectors, access to the API is via the signed-in user.

    • Select Delegated Permissions for accessing APIs as signed-in user

    • Select permissions shown below, then click Add permissions

    Some of the selected permissions require administrator consent 

    • After the permissions are selected, click on Grant admin consent.

    The permissions may take some time to update, as shown in the warning, so wait a few minutes before selecting the Grant admin consent option.

    • Select Yes if you are prompted to confirm the administrator consent

    When successful, the status will change to Granted for your user.

    3. Generate certificates and secrets for your registered app

    For connecting to your SharePoint Online account from SAP Cloud Platform Open Connectors, an OAuth secret and client ID are required.

      • Select Certificates & secrets tab, click on New client secret.

      • Enter a description for your OAuth secret and click Add

      • Note! Copy and save the generated client secret. You need to provide the secret in order to create the SharePoint connector instance from SAP Open Connectors, and it cannot be retrieved later.

      • To get your OAuth client ID, select the Overview tab and copy the Application (client) ID value.

      4. Create a SharePoint Open Connector instance in SAP BTP Integration Suite

      • In SAP BTP, navigate to the Integration Suite
      • Select Extend Non-SAP Connectivity. If this option is not visible, click Manage Capabilities and enable the Open Connectors capability.

      • Select the Connectors tab
      • Hover over the SharePoint connector and select Authenticate to connect to your own SharePoint account.

      • In the connection wizard, enter a name for your connector instance
      • Enter your SharePoint Site Address in the format {your_sharepoint_domain}.sharepoint.com
      • In API Key enter your copied OAuth Client ID
      • In the API Secret dialog enter your copied OAuth secret
      • Select Show Optional Fields.

      • Enable graph authentication to prompt user authentication (this corresponds to the delegated scopes defined in Azure)
      • Select Create Instance

      You may be prompted to enter your SharePoint user credentials if you are not already logged in to your SharePoint account.

      • Trust your app

      After successfully creating your authenticated connection to your SharePoint account, you can test it.

      • Choose Test in the API docs

      • Select GET /files to read files from your SharePoint sites

      • Click on Try it Out

      • Insert the file path for a valid file in the folder (no sub-folders) and choose Execute

      (Note: if your site name contains spaces, enter the site name without spaces in the Subsite field.)

      Once the test has run successfully, the updated file should be available for download.

      5. Establish a connection to your SharePoint Open Connector in Datasphere

      • Enter your SAP BTP Subaccount Region

      The BTP region can be found within the Account Explorer page in the BTP Cockpit

        • Enter your Organization Secret
        • Enter your User Secret

        To find the Organization and User secrets, open the connector instance you created and make any type of API request, for example a GET.

          The authorization string shown there will contain the necessary details as well.
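
          For reference, Open Connectors (which is based on Cloud Elements) typically expects an authorization header of the form User <user-secret>, Organization <organization-secret>, Element <element-instance-token>; the exact string for your instance is displayed in the API docs when you try out a request, so the Organization and User secrets can be copied from there.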

          6. In Datasphere, create a dataflow to read your SharePoint directory

          • Create a dataflow to read your SharePoint directory and post the data into a local table
          • Create a fact view to transform your data as required
          • Create an analytical model for external consumption

          7. In SAC create your story using your SharePoint-based Datasphere model

Datasphere to Power BI via an ODBC Connection

            Introduction

            Welcome to this blog, where I will provide an overview of the following key aspects of creating an ODBC connection between Datasphere and Power BI.

            • Create a Database user in Datasphere
            • Install an ODBC Driver for Third-party BI Client Access
            • Add your IP address to IP allowlist
            • Ensure entities in SAP Datasphere are consumable
            • Connect Microsoft PowerBI to SAP Datasphere
            • Refreshing the Data
            • FAQs

            Create a Database user in Datasphere

            To connect SAP Datasphere with PowerBI, the first step involves creating a dedicated database user in Datasphere. This user will serve as the bridge for data access between Datasphere and your BI tool. Below, we outline the process to set up a database user, ensuring you have the necessary credentials for a smooth integration.

            1. In Datasphere, go to Space Management then go to Database Access.

            2. Click on Create Database User

            3. Give your Database User a name. You will get the following details:

            Take note of the Database User Name, Host Name, Port and Password:

            User Name: TEST_SPACE#TEST_SPACE
            Host Name: ****
            Port: ****
            Password: ****

            Installing and Configuring the ODBC Driver

            To enable your third-party BI tool to connect with SAP Datasphere, you need to set up an ODBC (Open Database Connectivity) driver on your Windows system. This driver acts as a bridge between your BI tool and the HANA database within Datasphere. In this section, we’ll guide you through the steps to install and configure the ODBC driver, ensuring that your data connection is properly established and ready for use. Follow the steps below to download, install, and set up the ODBC driver, allowing seamless integration with your BI tools.

            1. Download the ODBC Driver

              • Begin by searching for “ODBC” in the Windows search bar. Every Windows system comes with an ODBC Manager pre-installed. Depending on the version of the BI tool you are using, choose between the 32-bit or 64-bit ODBC Manager.

              • Navigate to the HANA developer tools website here to find and download the ODBC driver suitable for connecting to the HANA database. Be sure to save the download to your local laptop drive for easy access.

              2. Install the ODBC Driver

              • Once the download is complete, locate and run the hdbsetup.exe file from your local drive.

              • Follow the installation wizard to complete the setup. This will install the necessary ODBC client on your system.

              3. Configure the ODBC Data Source

              • Open the ODBC Data Sources application on your system. You should now see a new entry labelled “HDBODBC.”

              • Go to the “User DSN” tab and click on “Add…”

              • Select the HDBODBC (HANA Database ODBC) driver from the list and click “Finish.”

              • Enter a data source name, description, and the host address. You can find the host address in Datasphere by navigating to Space Management, selecting your space, going to Database Access, and checking the details of your database user.

              4. Finalize the Setup

              • Click “OK” to save your configuration. You have now successfully created an ODBC data source on your Windows system
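
              If you want to verify the connection details outside the GUI, the SAP HANA ODBC driver also accepts a DSN-less connection string. A minimal sketch, assuming the host, port, and database user from the previous section (the property names should be checked against the SAP HANA client documentation for your driver version; encryption is mandatory for the underlying SAP HANA Cloud database):

              DRIVER=HDBODBC;SERVERNODE=<host>:<port>;UID=TEST_SPACE#TEST_SPACE;PWD=<password>;ENCRYPT=TRUE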

              Add your IP address to IP allowlist

              To allow an external BI client, like PowerBI, in your local network to connect to the database of Datasphere, you need to add the external (public) IPv4 address of the client to an allowlist.

              1. Open Command prompt and enter:

              curl ifcfg.me

                2. Copy the external IPV4 address

                3. Within Datasphere, go to Configuration >> IP Allowlist >> click on Add and enter the IPv4 address of your system to add it to the allowlist.

                  Ensure entities in SAP Datasphere are consumable

                  Before you can utilize data from SAP Datasphere in your BI tool, it’s essential to ensure that the data entities and models you wish to access are properly configured for external use. This involves adjusting settings within Datasphere to make your data available for consumption.

                  1. Go to Datasphere
                   2. When you create a model or data entity, make sure 'Expose for Consumption' is turned on (a quick way to check which views are exposed is shown below).
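
                   One way to double-check which views the space database user can actually see is to query the SAP HANA EFFECTIVE_PRIVILEGES system view while logged on as that database user. A minimal sketch, assuming the space (and therefore the schema) is called TEST_SPACE as in the screenshots above:

                   SELECT OBJECT_NAME
                   FROM EFFECTIVE_PRIVILEGES
                   WHERE SCHEMA_NAME = 'TEST_SPACE'
                     AND GRANTEE_TYPE = 'USER'
                     AND USER_NAME = CURRENT_USER;

                   Only views that have been deployed with 'Expose for Consumption' turned on should show up here and, consequently, in Power BI.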

                   Connect Microsoft Power BI to SAP Datasphere

                  To leverage the power of SAP Datasphere data within Microsoft Power BI, you need to establish a connection using the ODBC driver you previously set up. This process allows you to seamlessly import and visualize your Datasphere data in Power BI for in-depth analysis and reporting. Follow these steps to connect Power BI to your SAP Datasphere data:

                   1. Open Microsoft Power BI and click on the Get Data icon.

                    2. Search for and select ODBC

                      3. From the dropdown, select the data source you created

                      4. Once you select OK, you'll be asked to enter the username and password. These can be found back in Datasphere (Space Management >> Database Access >> Database Users >> find your user and select the info icon)

                      5. Once the username and password are accepted, you'll get a list of the data models and entities within your selected test space. In this scenario, we are going to use 'TELCO_CUSTOMERCHURN_VIEW'

                        6. Select the view you’d like to use in Power BI and click Load.

                        7. You can now create a dashboard based on the data from DSP.

                          Refreshing the Data

                          There are 2 ways to refresh your data in PowerBI:

                          • Manually
                          • Scheduled

                           1. As a first step towards refreshing your data, you need to publish your dashboard.

                            2. Once you publish your dashboard, it will bring you to the PowerBI browser. Click on the “My Workspace” app on the left-hand side.

                              Manually

                              To manually refresh your data, simply navigate to ‘My Workspace’ and click the refresh now icon attached to your model.

                              Scheduled

                              1. You then need to click on the “Schedule refresh” icon on your Semantic model.

                                2. You’ll be presented with a window like this

                                 3. Scroll down to "Gateway connections". If you have no gateways installed, you must select "Install now" to install a data gateway.

                                4. Accept the T&Cs and click “Install”

                                5. Once the gateway finishes installing, enter an email address and click “Sign in”

                                6. Once you have signed in, click on Close.

                                 7. Expand the 'Gateway and cloud connections' header and you'll be able to see your personal gateway

                                 8. Expand the 'Refresh' header. Here you can now schedule a data refresh.

                                9. When you go back to your workspace, you can see when the next refresh is scheduled to occur.

                                Conclusions

                                In summary, connecting SAP Datasphere to Power BI through an ODBC connection enables powerful data integration and visualization capabilities. By following the outlined steps, you can seamlessly import and analyse your Datasphere data within Power BI. Additionally, the ability to refresh your data either manually or on a scheduled basis ensures that your reports and dashboards reflect the most current information.

How to Provision a SAP Datasphere Free-Tier Instance

                                  Introduction

                                  In this blog post, we’ll guide you step-by-step through the process of provisioning a SAP Datasphere Free-Tier instance. Whether you’re a data enthusiast, a business analyst, or an IT professional, by following these steps, you’ll have your free-tier instance up and running in no time.

                                  Prerequisites

                                  To create your SAP Datasphere service instance in SAP BTP, you need the following prerequisites:

                                   • An SAP BTP global account
                                   • Your global account has a commercial entitlement, either via cloud credits (in the case of a consumption-based contract) or via a subscription-based contract.
                                   • You are using Google Chrome, so that popups in SAP BTP are displayed properly.

                                  Step 1: Sign in to SAP BTP Cockpit

                                  1. Navigate to the SAP BTP Cockpit

                                  Open your web browser and go to the SAP BTP Cockpit website: SAP BTP Cockpit

                                  2. Log in to your account

                                  Enter your SAP credentials to log in.

                                  Step 2: Set Up a Subaccount

                                  1. Create a Subaccount

                                  In the SAP BTP Cockpit, go to your Global Account and click on “Subaccounts”.
                                  Click on “Create” to add a new subaccount.

                                    Fill in the required details such as “Name”, “Environment” (choose Cloud Foundry), and “Region”.

                                    2. Confirm the Subaccount

                                    Click “Create” and wait for the subaccount to be created.

                                      Step 3: Enable Cloud Foundry

                                      Enable Cloud Foundry

                                      In your subaccount, navigate to “Cloud Foundry” and click on “Enable Cloud Foundry”.

                                      Fill in the required details such as “Plan”, “Landscape”, and “Org Name”.

                                      Step 4: Create a Space

                                      Add a Space

                                      Within the Cloud Foundry section, click on “Spaces” and then “Create Space”.
                                      Provide a name for your space (e.g., “development”).

                                      Fill in the required details for “Space Name” and set the flags for “Space Developer” and “Space Manager”.

                                      Step 5: Assign Entitlements

                                      Manage Entitlements

                                      Go to your subaccount, select “Entitlements”, and click on “Edit”.

                                      Click on “Add Service Plans”.

                                      Add the required entitlements for SAP Datasphere and click on “Add 1 Service Plan”.

                                      Click on “Save”.

                                      Step 6: Subscribe to SAP Datasphere

                                      Add SAP Datasphere Subscription

                                      In the subaccount, navigate to “Instances and Subscriptions”. Click on “Create”.

                                      Fill in the required details for “Service”, “Plan”, “Runtime Environment”, “Space” and “Instance Name” as shown below and afterwards click on “Next” and then on “Create”

                                      Fill in the mandatory details for “First Name”, “Last Name” and “Email” (This is the Email where the SAP Datasphere link will be sent to).

                                      Step 7: Access SAP Datasphere

                                       After the creation of SAP Datasphere has finished, there are two ways to access the SAP Datasphere tenant: through the SAP BTP Cockpit, or via the SAP Datasphere link that is sent to you by email.

                                      1. Launch SAP Datasphere via SAP BTP Cockpit

                                      Click on “Instances and Subscriptions”, scroll down to “Instances” and then click on the link to launch SAP Datasphere.

                                          2. Launch SAP Datasphere via the link from Email

                                          After provisioning is finished, an email is sent to the email address that was used during the creation of the SAP Datasphere tenant. You can find the email by searching for “Welcome to SAP Datasphere” in your email client. There you will find a link to your SAP Datasphere tenant.

Importance of Currency Conversion in SAP Datasphere

                                          Accurate currency conversion ensures that financial statements reflect true value, enabling better decision-making and compliance with regulatory requirements. Without effective currency conversion, businesses may face issues such as financial discrepancies, inaccurate forecasting, and challenges in performance evaluation across different regions.

                                          “Twenty-four-hour global nature of currency markets, exchange rates are constantly shifting from day to day and even from minute to minute, sometimes in small increments and sometimes quite dramatically” – Harvard Business Services.

                                          In an interconnected world where businesses operate across borders, efficient currency conversion is essential. Currency conversion significantly impacts reporting and business analytics in the following ways:

                                          Translation of Financial Statements:

                                          Multinational corporations translate foreign incomes, expenses, assets, and liabilities into their reporting currency using relevant exchange rates.

                                          Variances in Reported Financials:

                                          Exchange rate fluctuations can result in significant variances in reported financials.

                                          Currency conversion in SAP involves converting monetary values from one currency to another based on predefined exchange rates. This is important for multinational companies and businesses engaged in cross-border transactions.

                                          Exchange rates in SAP define how one currency is converted to another. Let’s explore some significant business use cases where reporting with currency conversion is extensively required:

                                          Case 1: Global Operations and Multinational Businesses

                                          Companies operating in multiple countries need to manage different currencies. Currency conversion allows them to integrate financial data across various locations and ensure accurate financial reporting.

                                          Case 2: Consolidated Financial Statements

                                          Currency conversion in SAP enables the creation of consolidated financial reports in a single group currency.

                                          Case 3: Budgeting and Forecasting

                                          Companies often need to budget and forecast in a specific currency while dealing with costs and revenues in other currencies. Currency conversion allows for accurate planning and forecasting, providing a unified view of the organization’s financial health.

                                          Currency conversion is used when posting financial information where the reporting currency differs from the transaction currency.

                                          Currency Conversion Methods In SAP Datasphere

                                          In an increasingly globalized business environment, companies often deal with transactions and data from multiple countries, involving various currencies. This complexity makes accurate currency conversion a critical aspect of financial reporting, budgeting, and analytics. SAP Datasphere, with its robust currency conversion capabilities, ensures businesses maintain financial accuracy and consistency across their operations.

                                          Steps to Create Currency Conversion

                                          Use Case

                                          Step 01:

                                          The client needs a report in Indian currency, but we have data in Datasphere in USD. So, using the currency conversion, we are converting from USD to INR.

                                          Step 02:

                                          Check whether the connection of the source is in an active state.

                                          • Confirm access to Data flows, Tables, and Views, because running the data flow loads data into the local table before it becomes available in the views.
                                          • To perform currency translation in SAP Datasphere, the following tables must be available in your space:
                                          1. TCURV – Exchange rate types
                                          2. TCURW – Exchange rate type text
                                          3. TCURX – Decimal places in currencies
                                          4. TCURN – Quotations
                                          5. TCURR – Exchange rates
                                          6. TCURF – Conversion factors
                                          7. TCURC – Currency codes
                                          8. TCURT – Currency text
                                          • DataFlows

                                          • Tables

                                          • Views

                                          Step 03:

                                           • In this case we don't have data in the target table, so we have to run the data flow to load the data into the local table.

                                          Step 04:

                                          • Here, we are converting the currency from USD to INR.
                                          • We have filtered “From Currency” as “USD” and “To Currency” as “INR” to obtain the exchange rate type.
                                           • After that, obtain exchange rate types M and P for the scenario (see the sample query below).
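
                                           As a cross-check on the database side, the replicated rates can be inspected directly in the TCURR table. This is a minimal sketch following the standard SAP column layout (KURST = exchange rate type, FCURR/TCURR = from/to currency, GDATU = validity date in inverted form, UKURS = rate); adjust the names if your space uses different ones:

                                           SELECT KURST, FCURR, TCURR, GDATU, UKURS
                                           FROM TCURR
                                           WHERE KURST IN ('M', 'P')
                                             AND FCURR = 'USD'
                                             AND TCURR = 'INR';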

                                          Step 05:

                                           • For this scenario we have selected "Gross Amount" as the source measure, so we change its "Semantic Type" to "Amount with Currency"; the "Unit Column" can then be selected accordingly.

                                          Step 06:

                                          • We have selected the billing document date as the transaction date because we are using the billing document fact model.

                                          Step 07:

                                          • The conversion from USD to INR is now complete.
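
                                           For comparison, the same conversion can also be written directly in SQL on SAP HANA using the built-in CONVERT_CURRENCY function, which reads the TCUR* tables listed above. This is only an illustrative sketch and not the graphical approach used in this post; the parameter names should be verified against the SAP HANA documentation for your release, and the schema, table, and column names ("SALES", "BillingDocuments", "GrossAmount", "BillingDate") are assumptions:

                                           SELECT CONVERT_CURRENCY(
                                                    "AMOUNT"          => "GrossAmount",
                                                    "SOURCE_UNIT"     => 'USD',
                                                    "TARGET_UNIT"     => 'INR',
                                                    "CONVERSION_TYPE" => 'M',
                                                    "REFERENCE_DATE"  => "BillingDate",
                                                    "CLIENT"          => '100',
                                                    "SCHEMA"          => 'SALES'
                                                  ) AS "GrossAmountINR"
                                           FROM "SALES"."BillingDocuments";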

                                          Best Practices for Currency Conversion

                                          • Address rounding and precision issues:

                                          Rounding Rules: Apply consistent rounding rules to avoid discrepancies in financial reports.

                                           Precision: Ensure that rounding practices maintain accuracy, especially with large datasets.

                                          • Maintain consistency and accuracy:

                                          Accurate Data Entry: Ensure accurate entry of financial data and exchange rates to minimize errors during currency conversion.

                                          Data Quality Checks: Regularly perform data quality checks to identify and rectify inaccuracies in exchange rates and financial data.

                                          • Regular updates and monitoring of exchange rates:

                                          Periodic Reviews: Conduct regular reviews of your currency conversion processes and update procedures as necessary to adapt to changing financial environments.

Quick & Easy Datasphere – When to use Data Flow, Transformation Flow, SQL View?

                                          Introduction

                                          In SAP Datasphere we can “enhance” a data set using different development artifacts like data flow, transformation flow or SQL view (Query/Table Function). In this blog article, I’ll share my point of view and decision process for development artefact selection.

                                          Datasphere Technical Limitations/Design Considerations

                                           Let's first cover the most important factor: persistent data tables perform better than views. (For the sake of clarity, persistent data is a pre-computed data set derived from a query or a data flow and stored in a table.) So, if we face SQL view performance problems, we should consider writing the results to a table.

                                           1- The choice between "SQL View – SQL Query" and "SQL View – SQLScript (Table Function)" comes down to two factors: the complexity of the requirement and the readability of the code. If a requirement can be coded with one SELECT statement in a readable manner, "SQL View – SQL Query" should be the one.

                                           2- In native HANA developments, graphical views' execution plans are optimized automatically. So, theoretically, graphical views should perform better than SQLScript views. I couldn't find a noticeable difference myself…

                                          3- Data Flow Limitation 1: The script operator doesn’t allow us to read other tables within the code. We can only use the data of the input of the operator.

                                           4- Data Flow Limitation 2: The script operator (Python) has unexpectedly poor performance. In my experience it's 40-50 times slower than local Jupyter notebooks and even slower than the SQLScript/SQL view option.

                                           5- Data Flow Limitation 3: Local tables with delta capture capability are currently unavailable as the target of a data flow. Also, surprisingly, we can only read the active data table of these tables as a source.

                                          6- Data Flow Limitation 4: We cannot use views with a parameter as source.

                                           7- Local Table Limitation: we cannot turn the delta capture capability of a local table on or off.

                                           8- Local Table Property: The delta capture fields (Change_Type, Change_Date) are updated automatically only if data is updated via Excel upload or via the Data Editor. For data flow or transformation flow updates, these fields have to be updated within the data/transformation flow itself.

                                           9- Even if the script operator in a data flow has poor performance, in cases where we are not concerned about long-running data loads, we can use a data flow for tasks which are much easier to do in Python, like NLP, data cleaning, time series analysis, and so on.

                                          So, my “decision tree” for choosing one of these artifacts is as seen below. Bear in mind that the code that we write in the node 3 or 4 will be used in a data flow or a transformation flow as source. So, it’s not lost time…

                                          Example Scenario

                                           The idea of writing this blog article came to me with this question: "What should I use for a material master self-transformation?". We wanted to flag materials in the material master data table (a table with delta capture capabilities) which have not been moved within the last 365 days, to identify obsolete materials. For this example we'll use two tables: Material Movement Data (IT_TR_MM_DOC_DEMO) and Material Master (IT_MD_MATERIAL_DEMO). So, here we are!

                                          Steps/Explanations

                                           1- You can find the material master data and material movements table data screenshots below. As it's pretty straightforward, I'll not walk through the data loads. You can load the CSV files for testing by following the official documentation.

                                           2- For our scenario, we'll read the material movement data table for each material in the material master and change the value of the material master "Obsolete" field to 'X' if there's no entry for the material in question. According to the screenshot above, materials 10000001 and 10000002 are not obsolete but 10000003 is.

                                          3- Go to Data Builder -> Select the space -> Click on “New Transformation Flow” -> Click on “View Transformation” and click on SQL View Transformation icon or the button.

                                           4- Select "SQLScript (Table Function)" and copy-paste the code in the appendix (an illustrative sketch of such a table function is shown below).
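
                                           The appendix code itself is not reproduced in this archive. As an illustration only, a table function for this scenario could look roughly like the sketch below; the table names come from the scenario (IT_MD_MATERIAL_DEMO, IT_TR_MM_DOC_DEMO), while the column names (MATERIAL, POSTING_DATE, OBSOLETE) are assumptions:

                                           -- Sketch: flag materials without a movement in the last 365 days as obsolete.
                                           RETURN
                                             SELECT md."MATERIAL",
                                                    CASE WHEN mm."MATERIAL" IS NULL THEN 'X' ELSE '' END AS "OBSOLETE"
                                             FROM "IT_MD_MATERIAL_DEMO" AS md
                                             LEFT OUTER JOIN (
                                                    -- materials that did move within the last 365 days
                                                    SELECT DISTINCT "MATERIAL"
                                                    FROM "IT_TR_MM_DOC_DEMO"
                                                    WHERE "POSTING_DATE" >= ADD_DAYS(CURRENT_DATE, -365)
                                                  ) AS mm
                                               ON md."MATERIAL" = mm."MATERIAL";

                                           If the target is a local table with delta capture, the Change_Type and Change_Date fields mentioned in point 8 above would also need to be filled in the transformation flow.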

                                          5- Click on the Columns button and create the output table.

                                          6- Validate the code using the button at the right top of the SQL View Editor, see that the SQL Code is valid and go back.

                                          7- Search for the target table name, drag and drop it on the target box

                                          8- Do the mapping of source and target fields by dragging and dropping them in the Properties Panel.

                                          9- Click on an empty space of the canvas, rename the transformation flow, deploy it and wait for the deployment finished notification.

                                          10- Run the transformation flow, refresh the run status until the status turns to “Completed”.

                                           11- Go to the Data Builder, open the target table, click on View Data, filter the data for the 3 materials we had in the material movements table, and validate that none of the "Obsolete" values is null and that 10000003 is marked as obsolete, because its latest movement was on 16.01.2022, more than 1 year before the time of writing this article (22.04.2024).

                                          Conclusion

                                           In SAP Datasphere, creating a self-transformation with a transformation flow is very easy and pretty straightforward for BW/4HANA developers experienced in AMDP transformations (or anyone who codes SQLScript), but this method should only be considered for scenarios where we cannot achieve the desired result with an SQL view (Query or Table Function) due to complexity or performance reasons.

Securing Your Data with Data Access Controls in SAP Datasphere

                                          Datasphere, a comprehensive data management platform, offers a suite of tools and features to help organizations safeguard their data assets and ensure compliance with security regulations. In this blog post, I will demonstrate how to implement data access controls in Datasphere to enhance security and protect your valuable data.

                                          Understanding Data Access Controls

                                          Data access controls in Datasphere typically refer to the mechanisms and policies put in place to manage and restrict access to data within the Datasphere platform. These controls ensure that only authorized users or systems can access specific datasets or resources, helping to maintain data security, privacy, and compliance with regulations.

                                          Business Proposal:

                                          Business User Perspective

                                          Anna is the Sales Manager, she understands the business needs and the sensitivity of sales data.

                                          John is the Data Analyst from the Irish Sales Department, he uses sales data for analysis and reporting.

                                           Anna and John have a meeting to discuss the upcoming implementation of SAP Datasphere. Anna is aware of the importance of data security and compliance with data governance policies. They decide they need to set up a permissions table to define who can access what data; this table can then be shared with the IT team so they can implement these controls in Datasphere.

                                          Proposal for Permissions Access:

                                          Sales Executives: Full access to all sales data.

                                          Regional Sales Managers: Access restricted to their respective regions.

                                          Sales Analysts: Access to anonymized or aggregated data for analysis purposes.

                                          Technical User Perspective

                                          Sarah is the IT & Security Administrator: She specializes in data modelling and security and is responsible for setting up and managing Datasphere.

                                          Sarah receives the permissions table from Anna and John. She begins planning how to implement these controls in SAP Datasphere.

                                          Creating Data Access Controls in Datasphere

                                          Prerequisites

                                          • Before creating your data access control, you must have prepared a permissions entity with the following columns:
                                            • User ID column – Containing user ids in the format required by your identity provider (email addresses, logon names, or other identifiers). This column must be selected as the Identifier Column in your data access control.
                                            • One or more columns containing filter criteria (country or region names or ids, department names or ids, or any other criteria to control how your data is exposed to users). These columns must be selected as Criteria columns in your data access control.
                                           • Only users with the DW Space Administrator role (or equivalent privileges) can create data access controls in which criteria are defined as simple values. The organization's IT & Security Administrator, Sarah, should do this.

                                          Click on the Security Icon on the left-hand bar. Click Roles.

                                          Click on the DW Space Administrator role.

                                          Click on the users Icon. Here the security administrator can assign or remove Users to the role.

                                          Step 1: Setting up your tables.

                                           Business users Anna and John oversee the management of the Permissions table, which is subsequently shared with the IT department. They have set this table up in the Permissions Space.

                                          Permissions Table Set up:

                                           The IT & Security admin Sarah, who is the technical user, uses the shared Permissions table as the basis for the Data Access Controls.

                                          Sales Products Table:

                                          Step 3: Set up your Data Access Control in the IT Space.

                                          Sarah logs into SAP Datasphere and navigates to the Data Access Control section.

                                          In the side navigation area Sarah will click Data Access Controls, select the IT space, and click New Data Access Control to open the editor.

                                          Select the input help and add the Permissions table to the ‘Permissions Entity’.

                                          The permissions entity should include at least two columns:

                                          One or more columns with filter criteria (e.g., Sales Org, Product) that determine how data is presented to users. These columns are known as Criteria columns in your data access control.

                                          One column containing user IDs in the required format (such as email address). This column serves as the Identifier Column in your data access control.
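
                                           As an illustration, the permissions entity for this scenario could contain rows like the following (the column names and values are assumptions; each row entitles the named user to the records matching that criteria value):

                                           USER_ID               SALES_ORG
                                           anna@company.com      IE01
                                           anna@company.com      UK01
                                           john@company.com      IE01

                                           Anna, as Sales Manager, would get one row per sales organization she may see, while John would be limited to the Irish sales organization.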

                                          Now Sarah can save and deploy the Data Access Control.

                                          Step 4: Applying your DACs to a Graphical View

                                          Sarah will create an analytical model in Datasphere, ensuring it respects the defined access controls.

                                          Open the desired Graphical view. Below she has added the Sales Products Data table into a graphical view.

                                          Click on the View Node. In the Model properties click on the plus icon to add your Data Access Control.

                                           In the Join section, map the columns from the Sales Org view to the Data Access Control in order to filter the data appropriately.

                                           Change the Semantic Usage to Fact, turn Expose for Consumption on, and add at least one Measure. This will allow you to create your Analytic Model, which then makes the object available for consumption in SAP Analytics Cloud.

                                          Now Sarah can Save and Deploy the View.

                                          Now she can share this view with the Sales Space to be viewed by the Sales Team.

                                          In the Sales Space users can create Analytical Models using the Graphical View as the source and users will be able to see data specific to their user ID.

                                          Step 5: Consume your Model in SAP Analytic Cloud

                                          Sarah ensures the analytical models are accessible in SAC.

                                          Anna and John are pleased with the setup. They log into SAC and find the Sales Performance Dashboard tailored to their access levels. John sees aggregated sales data for the Irish Sales Department, while Anna has a comprehensive view of all sales data. The system ensures that sensitive data is secure and only accessible by authorized personnel.

                                           John's View in SAC:

                                           Anna's View in SAC:

                                          Conclusion

                                           In conclusion, setting up DACs in Datasphere is crucial to protect your sensitive information, ensure regulatory compliance, mitigate data breach risks, and improve operational efficiency. By following the outlined steps, users can ensure that their data assets are securely managed and accessed only by authorized personnel, thereby minimizing the potential for security incidents and ensuring the integrity and confidentiality of their data throughout its lifecycle.

SAP Datasphere – "The Next Big Step in Unified Analytics"

                                          Overview:

                                          In today’s data-centric business landscape, organizations are seeking comprehensive solutions that enable them to harness the power of their data. SAP Datasphere, an advanced data management and analytics platform, emerges as a game-changer in helping enterprises leverage their data for informed decision-making. In this blog, we will explore the key benefits of SAP Datasphere and how it empowers businesses to unlock valuable insights and drive digital transformation.

                                          Unified Data Integration and Management:

                                          SAP Datasphere offers a unified data integration and management platform, allowing organizations to seamlessly connect and consolidate data from diverse sources. It eliminates data silos and provides a single, comprehensive view of data across the entire organization. By harmonizing data from various systems, SAP Datasphere enables faster and more accurate decision-making, while ensuring data integrity and consistency.

                                          Data Quality and Governance:

                                          Data quality is critical for deriving reliable insights. SAP Datasphere includes robust data quality and governance features, ensuring that data is accurate, complete, and consistent. It enables data profiling, cleansing, and enrichment, thereby improving the overall data quality. With built-in data governance capabilities, organizations can establish data policies, enforce compliance, and monitor data lineage, ensuring regulatory adherence and data security.

                                          Advanced Analytics and Machine Learning:

                                          SAP Datasphere leverages advanced analytics and machine learning capabilities to unlock hidden patterns and trends in data. It provides a wide range of analytical tools and algorithms, empowering organizations to perform complex data analysis, predictive modeling, and anomaly detection. By harnessing the power of machine learning, businesses can gain valuable insights, make data-driven decisions, and drive innovation.

                                          Cloud-Native Architecture and Scalability:

                                          Built on a cloud-native architecture, SAP Datasphere offers scalability and flexibility to meet the evolving needs of businesses. With its elastic infrastructure, organizations can seamlessly scale up or down based on their data processing requirements, ensuring optimal performance and cost-efficiency. The cloud-based nature of SAP Datasphere also enables easy integration with other SAP solutions and third-party applications, creating a unified and connected ecosystem.

                                          Collaboration and Self-Service Analytics:

                                          SAP Datasphere promotes collaboration and self-service analytics, empowering business users to explore data and gain insights without heavy reliance on IT teams. It provides intuitive data exploration and visualization tools, enabling users to create interactive dashboards, reports, and visualizations with ease. By fostering collaboration and data-driven decision-making across teams, organizations can accelerate innovation and drive business growth.

                                          Accelerated Time-to-Value:

                                          One of the significant advantages of SAP Datasphere is its ability to accelerate time-to-value. With pre-built data connectors, templates, and industry-specific content, organizations can quickly onboard and start deriving value from their data. The platform’s user-friendly interface and intuitive workflows further contribute to rapid adoption and faster deployment. By minimizing implementation time and effort, SAP Datasphere enables businesses to realize the benefits of data analytics sooner.

                                          Conclusion:

SAP Datasphere empowers organizations to unlock the full potential of their data, enabling intelligent decision-making, improving operational efficiency, and driving digital transformation. With its unified data integration, data quality and governance, advanced analytics, cloud-native architecture, collaboration capabilities, and accelerated time-to-value, SAP Datasphere serves as a comprehensive data management and analytics solution for the modern enterprise. Embrace the power of SAP Datasphere to harness the value hidden within your data and gain a competitive edge in today’s data-driven world.

Connect & Visualize: SAP Datasphere with Qlik Sense https://www.erpqna.com/connect-visualize-sap-datasphere-with-qlik-sense/ Mon, 01 Apr 2024 08:51:33 +0000
In this blog, we’ll explore how to consume data from SAP Datasphere through ODBC (Open Database Connectivity) and visualize it in Qlik Sense, one of the leading data visualization tools.

                                          Why SAP Datasphere over others?

SAP Datasphere allows seamless connectivity with a wide range of data sources, including on-premises and cloud-based systems. It is designed to handle large volumes of data efficiently, making it suitable for organizations of all sizes, from small businesses to large enterprises, and its scalable architecture ensures optimal performance even as data volumes grow over time. It offers graphical low-code/no-code tools to support self-service modeling for business users, as well as powerful built-in SQL and data flow editors for sophisticated modeling and data transformation needs.

It also provides graphical impact and lineage analysis to visualize data movements, transformations, and other dependencies.

Steps to connect SAP Datasphere with Qlik Sense

In SAP Datasphere, go to the Space Management tab and click Edit on the appropriate space.

                                          Make sure to turn on Expose for Consumption, so that the data can be retrieved from Datasphere and consumed in other tools.

Now click on Create in the Database Users section and give the user an appropriate name.

Make sure to deploy the space so that you can access the user credentials.

Make sure to copy the username, host name, and password from the database user details.

Go to System -> Configuration -> IP Allowlist -> Trusted IPs.

                                          Click on Add IP

Get the public IPv4 address from which your client machine reaches the internet (the address assigned by your Internet Service Provider) and make sure to add that IP address to the IP allowlist.
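If you are not sure which public IPv4 address to allowlist, a quick way to check from the client machine is to ask a public echo service. Below is a minimal sketch in Python; the use of the third-party ipify service is an assumption, and any comparable “what is my IP” service works just as well.

import requests  # pip install requests

# Ask a public echo service which IPv4 address our traffic appears to come from.
# This is the address that has to be added to the Datasphere IP allowlist.
public_ip = requests.get("https://api.ipify.org", timeout=10).text
print("Add this IPv4 address to the IP allowlist:", public_ip)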

Below is the view I am going to consume in Qlik Sense to visualize the data.
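Optionally, before setting up ODBC, you can verify that the exposed view is reachable from outside Datasphere with the new database user. The following is a minimal sketch using SAP’s hdbcli Python driver; the host name, user, password, space (schema), and view names are placeholders for the values copied from the database user details, not values from this walkthrough.

from hdbcli import dbapi  # pip install hdbcli

# Placeholder connection details copied from the Datasphere database user dialog.
conn = dbapi.connect(
    address="<host name from the database user details>",
    port=443,                        # SAP HANA Cloud SQL port
    user="<space>#<database user>",  # hypothetical database user name
    password="<password>",
    encrypt=True,
    sslValidateCertificate=True,
)

cursor = conn.cursor()
# The schema is the space name; the view must be deployed with "Expose for Consumption" turned on.
cursor.execute('SELECT TOP 5 * FROM "<SPACE_NAME>"."<VIEW_NAME>"')
for row in cursor.fetchall():
    print(row)
conn.close()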

                                          ODBC Part

First, download and install the SAP HANA ODBC driver (HDBODBC, part of the SAP HANA client) on your system. It can be downloaded from SAP Development Tools (ondemand.com).

Then open the ODBC Data Source Administrator.

Click on Add.

Select the HDBODBC driver.

• Give a meaningful data source name and description.
• Database type: SAP HANA Cloud or SAP HANA single tenant.
• Paste the host URL that you copied earlier from the database user details in the Datasphere space.

• Click Test Connection.
• Enter the database username and password that you copied earlier.

                                          Now the connection is successful.
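With the DSN in place, any ODBC-capable client can use it, not only Qlik Sense. As a quick sanity check, the sketch below connects through the new data source with Python’s pyodbc package; the DSN name, credentials, schema, and view name are assumptions and should be replaced with the values you configured above.

import pyodbc  # pip install pyodbc

# Connect through the ODBC data source created above (the DSN name is an assumption).
conn = pyodbc.connect("DSN=DATASPHERE_QLIK;UID=<database user>;PWD=<password>")

cursor = conn.cursor()
# Preview a few rows of the exposed view (schema = space name, both placeholders).
cursor.execute('SELECT TOP 5 * FROM "<SPACE_NAME>"."<VIEW_NAME>"')
for row in cursor.fetchall():
    print(row)
conn.close()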

                                          Qlik Sense Part

In Qlik Sense, click on Create new app.

Then click on Add data.

Several data connection types are offered; among them, choose the ODBC connection.

Now select the connection that was created in the ODBC part and enter the username and password.

Now click on Owner and select the space name in which the data (view) is available.

Now click on the view that needs to be visualized in Qlik Sense; the data preview will appear.

The data is now loaded successfully, which confirms that the connection between SAP Datasphere and Qlik Sense works via the ODBC connection.

Here is a sample dashboard created using the fact view consumed from SAP Datasphere.

Access SAP HANA Cloud Underneath of SAP Datasphere https://www.erpqna.com/access-sap-hana-cloud-underneath-of-sap-datasphere/ Fri, 01 Mar 2024 09:48:22 +0000
We can leverage SAP HANA Cloud as a source in Datasphere. The HANA Cloud instance on which Datasphere is hosted can also be accessed for HDI (HANA Deployment Infrastructure) developments.

                                          Pre-requisite

• You have an SAP BTP pay-as-you-go account
• You have created and added a Datasphere instance in SAP BTP
• You have a space in Datasphere with a database user created with read/write access and HDI consumption enabled

I have enabled the free Datasphere subscription for now. The Datasphere instance is running in space “dev” under subaccount “free” in SAP Business Technology Platform (BTP).

If you have subscribed to the free plan, also subscribe to Business Application Studio and make sure that HANA Schemas and HDI Containers is part of the service plan.

                                          Mapping Datasphere and BTP space

In Datasphere, navigate to the HDI Containers section in Space Management. Raise an SAP ticket, mentioning the details below:

                                          • BTP Org Id – <<Fetch from overview page of BTP subaccount section>>
                                          • Space Id – <<After logging into Space, get space id from the URL>>
                                          • Datasphere tenant Id – <<From Datasphere About section>>

This enables HDI container mapping in SAP Datasphere, i.e. mapping the BTP space to the Datasphere space. I am using the “Best Run Bikes” space.

Once the ticket is resolved (after the instance mapping is done by SAP), we can use Business Application Studio (BAS) to add HDI containers in the HANA Cloud instance on which Datasphere is hosted. The created HDI containers can then be consumed in Datasphere.

                                          HDI Development in BAS

To enable bi-directional access between the Datasphere space and the HDI container, follow the steps below. This makes objects in Datasphere accessible in the HDI container and, in turn, HDI containers accessible in the Datasphere space.

STEP 1 – Create a User-Provided Service in the BTP space

• Log in to the BTP cockpit and navigate to the space where Datasphere is running.
• Create a user-provided service (UPS) in the space. The details needed to create the UPS can be fetched from the Datasphere space: in the Database Users section within Space Management, click on the information icon and copy the JSON below the HDI Consumption section.
• The same credentials can be used to log in to the Open SQL schema of the space.
{
  "user": "<<Database user name>>",
  "password": "<<Database user password>>",
  "schema": "<<Space name>>",
  "tags": [
    "hana"
  ]
}

UPSNEW is the user-provided service we will be leveraging in the steps below.

STEP 2 – HDI Development in Business Application Studio

• Create a dev space in BAS and start it. I used the project from the Git repository https://github.com/coolvivz/DWC2HDI, which has all the files and folders needed to start development.

• Clone the project from Git and adjust it based on the UPS we have created.
• Bind the cross-container-service-2 to the UPS created in the previous step.
• Bind the hdi_dbshare service to the default service instance.

• Every default database user in Datasphere is created with a “#” symbol in its name, but the “#” symbol is not accepted directly in the VCAP_SERVICES environment variable. So define a separate variable for the user and reference it inside VCAP_SERVICES in the .env file:
USER1="<<Datasphere Database User Name>>"
# Towards the end of the VCAP_SERVICES value, replace the database user name with the variable defined above:
VCAP_SERVICES=... "user":"${USER1}" ...
• Replace the UPS name in the hdbsynonymconfig and hdbgrants files.
• Add synonyms for the respective Datasphere objects that are to be accessed in the HDI container.
• Now deploy the cfg folder followed by the src folder. The HDI container name is CROSSBTP; if required, this schema name can be changed in the yaml file.

• Now, in Datasphere, you will be able to see the HDI containers.

• By leveraging the Git repository we get a ready-to-use project. We can then add the required objects from the Datasphere space as synonyms, based on our requirements, and build calculation views, procedures, or flowgraphs.
• These can also be integrated with data from other sources. This kind of integration, utilizing the HANA Cloud underlying Datasphere, supports complex implementations.

STEP 3 – Objects in Datasphere that are exposed to the HDI container

• I have mapped the standard space “Best Run Bikes” to the BTP space. The following tables are available in the Datasphere space: Material, Vendor, and Material Document.

• To expose the tables in the Open SQL schema, and in turn leverage them in HDI development, one-to-one views are created and exposed for consumption.

• In the project cloned from Git, the following changes are required:
  • Add these views as synonyms
  • Include them in the hdbgrants file
  • The default_access_role.hdbrole file should have an entry for the new views. The “SELECT CDS METADATA” privilege is necessary to view the columns of the views.
  • Create a calculation view or procedure leveraging the objects from Datasphere. In this case, cv_matdoc is created by joining two views. It can also be integrated or blended with data from other sources.

• After deployment, the newly added objects are accessible from Datasphere. The cv_matdoc calculation view can be accessed in Datasphere, further enriched, and exposed for reporting.
• Below is a screenshot from the Data Builder of the Datasphere “Best Run Bikes” space, leveraging the calculation view from the CROSSBTP HDI container.

                                          Summary

We have now successfully leveraged the HANA Cloud underneath Datasphere for HDI development. We can use Datasphere as a service in BTP and map a Datasphere space to a BTP space, which enables complex developments such as blending different data sets in HDI containers. The results can be accessed back in Datasphere and exposed for reporting, or even exposed as an OData service for consumption.
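As an illustration of that last point, a downstream consumer could call such an OData service over HTTP. The sketch below assumes a hypothetical OData V4 endpoint and entity set exposed on top of the Datasphere/HDI objects and uses basic authentication; the URL, entity set name, and credentials are placeholders, not values from this walkthrough.

import requests  # pip install requests

# Hypothetical OData endpoint exposed on top of the Datasphere/HDI objects.
BASE_URL = "https://<your-service>.cfapps.eu10.hana.ondemand.com/odata/v4/matdoc"

# Request the first five records of a hypothetical entity set as JSON.
response = requests.get(
    f"{BASE_URL}/MaterialDocuments",
    params={"$top": 5},
    auth=("<user>", "<password>"),
    timeout=30,
)
response.raise_for_status()

# OData V4 returns the records in the "value" array of the JSON payload.
for record in response.json().get("value", []):
    print(record)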
