SAP Analytics Cloud, SAP BTP, ABAP Environment, SAP Datasphere, SAP S/4HANA

SAP Datasphere Connectivity with SAP S/4HANA and SAP Analytics Cloud: Technical Configuration

Introduction:

SAP Datasphere is the next generation of SAP Data Warehouse Cloud and delivers a unified experience for data integration, data cataloging, semantic modeling, data warehousing, data federation, and data virtualization. It is an end-to-end warehouse in the cloud that combines data management processes with advanced analytics. It is built on SAP HANA Cloud and is, together with SAP Analytics Cloud, part of SAP Business Technology Platform.

In this blog we will walk through the technical configuration of SAP Datasphere connectivity with an SAP S/4HANA system and enable the following functionalities:

  1. Remote tables
  2. Data flows
  3. Replication flows
  4. Model import

After establishing connectivity with SAP S/4HANA, we will create a live data connection between SAP Datasphere and SAP Analytics Cloud (SAC), so that we can build stories in SAC directly against the data modeled in SAP Datasphere.

Prerequisites:

  1. Create an SAP Datasphere service instance in SAP BTP
  2. Install the Data Provisioning Agent
  3. Install the Cloud Connector on Windows or Linux

Creating Spaces and Allocating Storage in SAP Datasphere:

  1. Go to Space Management, create a space, and add users to it.

Register a user in SAP Datasphere's own subaccount:

Whenever an SAP Datasphere tenant is provisioned, a new subaccount is created that is owned and managed by SAP. Its UUID is visible in the "About" dialog as the tenant ID. For the Cloud Connector setup you should use the SAP Datasphere tenant ID that you find under

System -> Administration -> Data Source Configuration -> SAP BTP Core Account

Connect and Configure the Data Provisioning Agent:

Procedure:

  1. Go to System -> Configuration -> Data Integration and create an On-Premise Agent
  2. After the agent has been created, the Agent Settings dialog opens and provides you with the information required to configure the Data Provisioning Agent on your local host:
  • Agent name
  • HANA server (host name)
  • HANA port
  • HANA username for agent messaging
  • HANA user password for agent messaging

Now at the command line, connect the agent to SAP HANA using JDBC. Perform the following steps:

Run the following commands on the server where the DP Agent is installed (a short command-line sketch follows the numbered steps):

  1. Go to /usr/sap/dataprovagent/bin and run "./agentcli.sh --configAgent".
  2. Choose SAP HANA Connection.
  3. Choose Connect to SAP Datasphere via JDBC.
  4. Enter the name of the agent registration (agent name).
  5. Enter true to use an encrypted connection over JDBC.
  6. Enter the host name (HANA server) and port number (HANA port) for the SAP Datasphere instance.
  7. Enter the credentials for the HANA user for agent messaging.
  8. Stop and restart the Data Provisioning Agent.
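
For orientation, here is a minimal sketch of that command-line session, assuming the default installation path used above and a Linux host (the exact menu entries can vary between DP Agent versions):

cd /usr/sap/dataprovagent/bin
./agentcli.sh --configAgent     # opens the interactive configuration menu
# choose "SAP HANA Connection", then "Connect to SAP Datasphere via JDBC"
# enter the agent name, "true" for encryption, the HANA host/port, and the agent messaging user
# afterwards use the "Start or Stop Agent" option to stop and start the agent again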

Now the public IP of the server where the DP Agent is installed needs to be added to the SAP Datasphere IP allowlist; after that, the SAP Datasphere on-premise agent will connect to the DP Agent.
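
If you are unsure which public IP the DP Agent host uses for outbound traffic, one simple way to check is to call an external echo service from that server (ifconfig.me is just one example; behind NAT or a proxy, the outbound address can differ from the host's own interface IP):

curl https://ifconfig.me     # prints the public IP as seen from outside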

Once the agent shows as connected, go to edit and enable the required adapters.

Configure Cloud Connector in SAP Datasphere:

Add SAP Datasphere's subaccount in the Cloud Connector with a location ID (the same location ID will be used later in the connection properties).

The same location ID must be added in SAP Datasphere under System -> Administration -> Data Source Configuration.

Add the on-premise system in the Cloud Connector (using the HTTPS and RFC protocols, system type ABAP System) and add the necessary resources to each mapping:

In the RFC connection, add the following resources:

  1. For accessing data using CDS view extraction:
    DHAMB_ – Prefix
    DHAPE_ – Prefix
    RFC_FUNCTION_SEARCH
  2. For accessing data based on tables with SAP LT Replication Server:
    LTAMB_ – Prefix
    LTAPE_ – Prefix
    RFC_FUNCTION_SEARCH
  3. For accessing data using ODP connectivity (for legacy systems that do not have the ABAP Pipeline Engine extension or DMIS Addon installed):
    /SAPDS/ – Prefix
    RFC_FUNCTION_SEARCH
    RODPS_REPL_ – Prefix

Prepare Connectivity to SAP S/4HANA On-Premise:

Create a TCP/IP RFC connection in the source system (transaction SM59) using the program ID IM_HANA_ABAPADAPTER, register the IM_HANA_ABAPADAPTER program at the OS level, and test the RFC destination.

Register the IM_HANA_ABAPADAPTER program at the OS level with the following command:
rfcexec -a IM_HANA_ABAPADAPTER -g <hostname> -x sapgw<instance no.> -s <SID> &
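
For illustration only, with hypothetical values (gateway host s4hhost, instance number 00, system ID S4H), the same command would look like this:

rfcexec -a IM_HANA_ABAPADAPTER -g s4hhost -x sapgw00 -s S4H &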

Now perform a connection test of the RFC destination.

Create SAP S/4HANA Live Data Connection of Type Tunnel:

To enable model import, we need to create an SAP S/4HANA live data connection of type Tunnel.

  1. Implement SAP Note 3283282 in the SAP S/4HANA system.
  2. Run the report ESH_CSN_CDS_TO_CSN to prepare the CDS Views for the import.
  3. Activate the following InA services in transaction SICF:
  • /sap/bw/ina/GetCatalog
  • /sap/bw/ina/GetResponse
  • /sap/bw/ina/GetServerInfo
  • /sap/bw/ina/ValueHelp
  • /sap/bw/ina/BatchProcessing
  • /sap/bw/ina/Logoff
  4. Activate the OData service "ESH_SEARCH_SRV".
  5. For the live data connection, make sure the host is reachable via HTTPS in the Cloud Connector (a quick check follows this list).
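
As an optional sanity check that the InA services are active and reachable, you can call one of them directly over HTTPS from a host that can reach the SAP S/4HANA system. This is only a sketch with placeholder values; a JSON response indicates the service is active (add -k if the system uses a self-signed certificate):

curl -u <user>:<password> "https://<s4hana-host>:<https-port>/sap/bw/ina/GetServerInfo?sap-client=<client>"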

Now, to create the live data connection, go to System -> Configuration -> Data Integration -> Live Data Connections (Tunnel).

Enter the location ID (the same as maintained in the Cloud Connector), host name, HTTPS port, and client number.

Enter the username/password and save.

Create a connection:

Go to the space that was created earlier in SAP Datasphere and create a connection as follows:

  1. Choose SAP S/4HANA On-Premise

Enter the application server details that were added in the Cloud Connector: system number (instance number), client, and system ID (SID).

Now enter the Cloud Connector details: location ID, virtual host, and port.

Enter a user name and password that exist in the backend SAP S/4HANA system.

Data Provisioning Agent – choose the on-premise agent that was created earlier
Streaming Read – On
Gateway Host/Port – add the host and port that were added in the Cloud Connector
RFC Destination – enter the RFC destination that was created in the backend SAP S/4HANA system

Under the live data connection, choose the connection that was created earlier.

Click Next

Save the connection

Now validate the connection.

After validating the connection, we can see that all the functionalities are enabled and the connection with the SAP S/4HANA system has been established.

Create Live Data Connections between SAP Datasphere & SAC:

We can create live data connections to SAP Datasphere systems and access any SAP Datasphere Analytical Dataset, Analytic Model, or Perspective within SAP Analytics Cloud.
Before creating the live connection in SAP Analytics Cloud, we first need to add the URL of SAP Analytics Cloud as a trusted origin in SAP Datasphere.

  1. In SAP Datasphere go to System -> Administration -> App Integration
  2. Go to the Trusted Origins section and click Add a Trusted Origin.
  3. Add the URL of the SAP Analytics Cloud system.

Click Save.

Procedure:

  1. In SAC go to Connections -> Add Connection
  2. The Select a data source dialog appears.
  3. Expand Connect to Live Data and select SAP Datasphere.
  4. In the dialog, enter a name and description for your connection.
  5. Add your SAP Datasphere host name and enter 443 as the HTTPS port.
  6. Enter a valid username and password for SAP Datasphere and click OK to set up the connection.

In this way, we can create a live data connection between SAP Datasphere and SAC.

Test the connection:

We can test the connection by creating a new Story in SAC.

1. In SAC, from the side navigation, choose Stories.
2. Under Create New, choose one of the story types.
3. Click Add Data in the left panel.
4. The Select a data source dialog appears.
5. Choose Data from an existing dataset or model.
6. In the Select Model dialog choose the SAP Datasphere Analytical Dataset option.
7. Select an SAP Datasphere connection
8. If you’re asked to log in to SAP Datasphere, enter your credentials.
9. Now you can see the space which you created in SAP Datasphere.

You can select the space, load the data, and use it to create stories in SAC.

In this way we can test the connection between SAP Datasphere and SAP Analytics Cloud.

By creating a live connection from the SAP Datasphere tenant to SAC, we can build stories directly against the data that is modeled in SAP Datasphere. Live connections between SAP Analytics Cloud and SAP Datasphere can be established across tenants and data centers.