Step-by-Step Guide: Building an Analytical Dashboard in SAP Data Sphere Using Sales Order Data (ERP Q&A, 5 Jul 2025)

Introduction:

In today’s data-driven world, having real-time insights at your fingertips is crucial for making informed business decisions. SAP Data Sphere is a cloud-based solution designed to integrate, model, and analyze data from various sources. In this blog, I’ll show you how to set up a trial account for SAP Data Sphere and create a dashboard using SAP’s sample sales order data. The dashboard will include visualizations like charts and tables, along with predictive analysis. Let’s get started!

Step 1: Getting a Trial Account for SAP Data Sphere

Before we begin, you’ll need access to SAP Data Sphere. Here’s how to sign up for a free trial:

1. Visit the SAP Data Sphere Trial Page:

  • Go to SAP Data Sphere Trial.
  • Note: You will need an SAP account, which you can create for free if you don’t have one.

2. Sign Up for the Trial:

  • Click on the “Start your free trial” button. You’ll be prompted to log in or create an account. Once logged in, you can follow the instructions to activate your trial.

3. Activate the Trial:

After creating your account, confirm your email address and follow the instructions to activate your trial account. Once your trial is active, you’ll be able to access the SAP Data Sphere interface.

4. Access Your SAP Data Sphere Workspace:

Once the trial is set up, navigate to the SAP Data Sphere dashboard. This is where you’ll create your data models, visualizations, and analytics.


Step 2: Importing Sample Sales Order Data

SAP provides sample datasets to help you get started. Follow these steps to import the Sales Order sample data:

1. Navigate to Data Builder:

On the SAP Data Sphere main dashboard, look for the “Data Builder” tab on the left-hand menu and click on it. The Data Builder allows you to import, transform, and model your data.


2. Import Sample Sales Order Data:

In the Data Builder, click on the “Import” button at the top-right corner. This will open a menu of options. From the available sample datasets, select the “Sales Order Sample” dataset.


3. Review and Confirm Import:

Once selected, you will see a preview of the data. Confirm that this is the dataset you want to import by clicking “Next,” and then complete the process by clicking “Import”. The Sales Order data will now appear in your workspace under the “Data Sources” section.

Step 3: Building the Data Model

Before creating visualizations, we need to create a data model that organizes the sales order data into useful fields. Follow these steps:

1. Create a New Space:

In SAP Data Sphere, data is organized into spaces. Go back to the main dashboard and click on “Spaces”. Here, create a new space by clicking “Create Space” and give it a relevant name like “Sales Order Analysis”.


2. Assign Your Data Source to the Space:

In the newly created space, go to the “Data Builder” tab. You’ll see the Sales Order data that you imported. Drag and drop this dataset into your space.

3. Modeling the Data:

Now, click on “New Graphical View” in the Data Builder. Select the Sales Order dataset as your input and choose the fields you want to work with, such as:

  • Sales Order ID (SALESORDERID) – The unique identifier for each sales order.
  • Partner ID (PARTNERID) – The customer or partner associated with the sales order.
  • Gross Amount (GROSSAMOUNT) – The total amount of the sales order, including taxes.
  • Net Amount (NETAMOUNT) – The amount after applying discounts and excluding taxes.
  • Tax Amount (TAXAMOUNT) – The total tax applied to the order.
  • Delivery Status (DELIVERYSTATUS) – The current status of the delivery process.
  • Billing Status (BILLINGSTATUS) – The status of the billing process for the sales order.

These fields will help you create visualizations that show sales trends, financial summaries, customer distribution, and order status information.
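Note how the amount fields relate: the gross amount is the net amount plus tax. A quick consistency check over invented sample rows (the helper below is illustrative, not part of the dataset):

```python
# Hypothetical sample rows mirroring the Sales Order fields described above.
orders = [
    {"SALESORDERID": "0500000001", "PARTNERID": "0100000010",
     "NETAMOUNT": 100.0, "TAXAMOUNT": 19.0, "GROSSAMOUNT": 119.0,
     "DELIVERYSTATUS": "D", "BILLINGSTATUS": "C"},
    {"SALESORDERID": "0500000002", "PARTNERID": "0100000011",
     "NETAMOUNT": 250.0, "TAXAMOUNT": 47.5, "GROSSAMOUNT": 297.5,
     "DELIVERYSTATUS": "", "BILLINGSTATUS": ""},
]

def inconsistent_orders(rows, tolerance=0.01):
    """Return IDs of orders where gross != net + tax -- a quick data-quality check."""
    return [r["SALESORDERID"] for r in rows
            if abs(r["NETAMOUNT"] + r["TAXAMOUNT"] - r["GROSSAMOUNT"]) > tolerance]

print(inconsistent_orders(orders))  # [] when the amounts reconcile
```

Running a check like this before modeling helps catch import problems early.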


Step 4: Creating Visualizations

With the data model in place, it’s time to create visualizations. Here’s how you can create charts, graphs, and tables:

1. Go to SAP Analytics Cloud:

To start building visualizations, navigate to the “Story Builder” tab in the left-hand menu. This is where you’ll create interactive charts and tables.


Create a Bar Chart for Sales by Partner ID:

  1. Click on “Create New Story” and choose a bar chart as the visualization type.
  2. In the data source section, select the data model you created.
  3. Set the X-axis to “Partner ID (PARTNERID)” – this will represent different customers or partners.
  4. Set the Y-axis to “Gross Amount (GROSSAMOUNT)” – this will show the total sales for each partner.
  5. Customize the chart by adding labels, titles, and colors to make it more informative. For example, label the chart as “Sales by Partner” and add a currency format for the Y-axis.

Create a Line Graph to Show Sales Trends Over Time:

  1. Add another visualization by selecting “Add Chart” and choosing a line graph.
  2. Set the X-axis to “Created Date (CREATEDAT)” – this will plot the sales over time.
  3. Set the Y-axis to “Net Amount (NETAMOUNT)” – this will display the net sales amount over time.
  4. Group the data by months or quarters to visualize trends over time, allowing users to identify sales patterns, seasonal fluctuations, or growth over specific periods.
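The monthly grouping in step 4 amounts to summing NETAMOUNT per calendar month. A small sketch with invented rows (in the story itself, SAC does this grouping for you):

```python
from collections import defaultdict
from datetime import date

# Hypothetical (CREATEDAT, NETAMOUNT) pairs.
rows = [
    (date(2024, 1, 15), 120.0),
    (date(2024, 1, 28), 80.0),
    (date(2024, 2, 3), 200.0),
]

def net_by_month(rows):
    """Sum NETAMOUNT per calendar month, as the line graph plots it."""
    totals = defaultdict(float)
    for created_at, net in rows:
        totals[created_at.strftime("%Y-%m")] += net
    return dict(sorted(totals.items()))

print(net_by_month(rows))  # {'2024-01': 200.0, '2024-02': 200.0}
```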

Create a Pie Chart for Market Share by Sales Organization:

  1. Add a new chart and select “Pie Chart” as the visualization type.
  2. Set the category to “Sales Organization (SALESORG)” – this will divide the sales data by different sales organizations.
  3. Set the value to “Gross Amount (GROSSAMOUNT)” – the pie chart will reflect the sales contribution of each organization.
  4. This chart will give you a clear view of the sales distribution across different sales organizations.

Create a Table for Detailed Sales Data:

  1. Add a table to display more detailed information about individual sales orders.
  2. In the table, select fields such as:
    • Sales Order ID (SALESORDERID) – to uniquely identify each order.
    • Partner ID (PARTNERID) – to show which customer made the order.
    • Gross Amount (GROSSAMOUNT) – to display the total value of each order.
    • Created Date (CREATEDAT) – to display when the order was created.
    • Billing Status (BILLINGSTATUS) – to track the billing progress of the order.
    • Delivery Status (DELIVERYSTATUS) – to track the status of deliveries.
  3. Add sorting and filtering options for users to search for specific orders or filter data by criteria like date range, billing status, or delivery status.

4. Our complete story now looks like this:


Step 5: Adding Filters for User Interactivity

To make the dashboard more interactive, SAP Data Sphere allows you to add filters that help users explore the data in more detail. Here's how to add relevant filters based on the data:

Add a Date Range Filter:

  1. In Story Builder, click the “Filter” icon at the top of the page.
  2. Choose “Date Range” as the filter type.
  3. Connect the filter to the “Created Date (CREATEDAT)” field. This will allow users to select a custom date range and view sales orders created within that timeframe.
  4. This filter will help users analyze trends over specific periods, such as viewing sales in a particular quarter or year.

Add a Sales Organization Filter:

  1. Next, add a filter for Sales Organization so users can focus on sales from specific regions or sales divisions.
  2. Link this filter to the “Sales Organization (SALESORG)” field.
  3. By applying this filter, users can narrow down the data to only view sales from certain sales organizations, helping them understand performance at a regional level.

Add a Partner ID (Customer) Filter:

  1. Lastly, add a filter for Partner ID. This filter will allow users to isolate sales data for specific customers or partners.
  2. Connect this filter to the “Partner ID (PARTNERID)” field from your dataset.
  3. This filter will be useful for users who want to focus on the sales performance of specific customers, making it easier to drill down into customer-specific data and insights.

How Filters Enhance the User Experience:

By adding these filters, we give users the flexibility to explore the data in ways that matter to them. They can:

  • Focus on sales from specific time periods (e.g., the last quarter or fiscal year).
  • Analyze sales performance by region or sales organization.
  • Drill down into data by specific customers (Partner ID), which is useful for customer analysis and segment-specific performance.

Step 6: Implementing Predictive Analytics

SAP Data Sphere allows us to use historical data to predict future trends and identify anomalies. Let's implement predictive analytics to forecast future sales and detect unusual patterns in the sales data.

Go to Predictive Scenario:

  1. Navigate to the “Predictive Scenario” section from the left-hand menu in SAP Data Sphere.
  2. This is where you can create forecasting models using historical sales data to generate predictive insights.

Create a Sales Forecast:

  1. Select “Time Series Forecast” as the type of predictive model.
  2. Choose “Net Amount (NETAMOUNT)” as the target variable for the forecast. This will allow us to predict future net sales based on past trends.
  3. Use “Created Date (CREATEDAT)” as the time field to predict future sales trends over time.
  4. After configuring the model (e.g., defining time intervals such as monthly or quarterly forecasts), SAP Data Sphere will generate a sales forecast based on our historical data.
  5. We can then display this forecast as a line graph in our dashboard to visualize future sales projections.
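As a rough illustration of what a time-series forecast does, here is a plain linear-trend extrapolation over hypothetical monthly totals. This is not the model the Predictive Scenario actually fits; it is only a stand-in to show the idea of projecting future values from history:

```python
def linear_forecast(history, periods_ahead=3):
    """Fit y = a + b*t by least squares and extrapolate forward."""
    n = len(history)
    t = list(range(n))
    mean_t = sum(t) / n
    mean_y = sum(history) / n
    b = sum((ti - mean_t) * (yi - mean_y) for ti, yi in zip(t, history)) \
        / sum((ti - mean_t) ** 2 for ti in t)
    a = mean_y - b * mean_t
    return [a + b * (n + k) for k in range(periods_ahead)]

# Hypothetical monthly NETAMOUNT totals
monthly_net = [100.0, 110.0, 120.0, 130.0]
print(linear_forecast(monthly_net, 2))  # [140.0, 150.0]
```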

Anomaly Detection:

  1. In addition to forecasting, you can enable Anomaly Detection in the Predictive Scenario section.
  2. Choose “Net Amount (NETAMOUNT)” or “Gross Amount (GROSSAMOUNT)” as the target variable, depending on the type of anomalies you want to detect (e.g., unusually high or low sales amounts).
  3. SAP Data Sphere will analyze your historical sales data and automatically flag any anomalies, such as:
    • Sudden drops in sales.
    • Unusual spikes in demand for specific customers or regions.
  4. You can visualize these anomalies directly on the dashboard, helping your team investigate potential issues or capitalize on unusual growth patterns.
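Conceptually, anomaly detection flags values that deviate strongly from the norm. A simple z-score stand-in (not SAP's actual algorithm) over invented daily totals:

```python
import statistics

def flag_anomalies(values, z_threshold=2.0):
    """Flag indices whose value is more than z_threshold population standard
    deviations from the mean -- a toy stand-in for built-in anomaly detection."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > z_threshold]

# Hypothetical daily GROSSAMOUNT totals with one sudden spike
daily_gross = [100, 105, 98, 102, 400, 99, 101]
print(flag_anomalies(daily_gross))  # [4]
```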

How Predictive Analytics Adds Value:

By using SAP Data Sphere’s predictive analytics capabilities, you can:

  • Plan more effectively by forecasting future sales and identifying seasonal patterns.
  • Detect and respond to anomalies in sales performance, allowing you to address issues like unexpected sales drops or inventory problems.
  • Visualize predictions easily in the dashboard, enabling stakeholders to make data-driven decisions based on future trends.

Step 7: Sharing and Publishing the Dashboard

Once the dashboard is ready, you can share it with others or publish it for your team to access.

1. Publish the Dashboard:

  • In the Story Builder, click on the “Publish” button at the top-right corner.
  • Choose the audience you want to share it with, such as specific colleagues or departments, and set appropriate permission levels to ensure data security.

2. Sharing Links or Embedding the Dashboard:

  • SAP Data Sphere also allows you to share the dashboard via a link or embed it in your company’s internal portal.
  • To share via link, simply click “Get Shareable Link” and send it to your team.

Conclusion:

Congratulations! You’ve successfully created an interactive analytical dashboard in SAP Data Sphere using the sample sales order data. With visualizations, filters, and predictive analytics, you’ve transformed raw data into actionable insights. SAP Data Sphere’s flexibility and powerful tools make it easy to create meaningful dashboards for any business use case.

Integration Between SAP Datasphere and SAP Analytics Cloud (ERP Q&A, 6 Jun 2024)
In this blog, I want to provide the steps we followed to create an OData Service connection to send data from Datasphere to SAC. By integrating Datasphere actuals into SAC, we lay the foundation for strategic decision-making, driving business growth and agility.

Sending Data from Datasphere to SAC via OData Service

Our goal for this task is to consume actuals from Datasphere into SAC for reporting and planning purposes. We will combine the actuals from Datasphere with planning data in SAC.

Here is the list of steps we took to import Datasphere actuals data into SAC:

1. Identify the OData Source (Data Service URL) in Datasphere

To identify the Data Service URL, we went to the following link. Within this URL, you can see all the assets available within the spaces in your Datasphere tenant.

Within this URL, we copied the assetRelationalDataUrl for the dataset we required. This is your Data Service URL. We then looked up our chosen assetRelationalDataUrl using the URL in the figure below. This gets us the metadata from the OData Service.

NOTE: To view the data from your chosen asset, add the dataset name again at the end of the URL.
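Picking the Data Service URL out of the catalog response can also be scripted. A minimal sketch, assuming a hypothetical catalog payload shape (the space names, asset names, and URL paths below are invented; only the assetRelationalDataUrl property name comes from the steps above, and your tenant's response may differ):

```python
import json

# Hypothetical excerpt of a Datasphere catalog response.
catalog_json = json.dumps({
    "value": [
        {"name": "SalesOrders", "spaceName": "SALES",
         "assetRelationalDataUrl": "https://tenant.example/api/v1/dwc/consumption/relational/SALES/SalesOrders"},
        {"name": "Plan", "spaceName": "FIN",
         "assetRelationalDataUrl": "https://tenant.example/api/v1/dwc/consumption/relational/FIN/Plan"},
    ]
})

def data_service_url(payload, asset_name):
    """Pick the assetRelationalDataUrl for one asset out of the catalog list."""
    for asset in json.loads(payload)["value"]:
        if asset["name"] == asset_name:
            return asset["assetRelationalDataUrl"]
    raise KeyError(asset_name)

print(data_service_url(catalog_json, "SalesOrders"))
```

Per the note above, appending the dataset name once more to the returned URL would give the data endpoint itself.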

2. Look Up Your Redirect URI in SAC

The redirect URI in SAC is a specific endpoint URL. It is the destination to which the authentication server redirects the user’s browser after authentication has been successfully completed.

We added an OData Services connection in our SAC tenant. We then changed the Authentication Type to OAuth 2.0 Authorization Code. At the bottom of this window, we can see the Redirect URI.

3. Create the OAuth Client in Datasphere

We now want to create an OAuth client in Datasphere, using the Redirect URI from the previous step. We do this by going to System >> Administration >> App Integration and clicking ‘Add a New OAuth Client’. Once you paste the Redirect URI and click Add, you’ll get the following information, which you need to take note of:

• OAuth Client ID
• Secret
• Authorization URL
• Token URL

4. Create OData Connection in SAC

Within SAC, we go to Connections >> Add a Connection and choose OData Services. You now need to add the following connection parameters:

• Data Service URL = assetRelationalDataUrl
• OAuth Client ID
• Secret
• Token URL
• Authorization URL

5. Import from OData Connection

We now need to create a model in SAC. We chose OData Services as the data source and clicked the new connection we just made. We then created a new query, selecting the dimensions and measures we wanted when building that query.

You can see that we have now consumed actuals data from Datasphere into SAC! At this stage you can clean the data and fix any issues. You have successfully imported the actuals data from the OData connection!

6. Create Story and Test

As you can see here, we created a story and added a table to it. We chose the query created above as the data source for the table. We added ‘Category’ as a column, and using the version management functionality we can choose both the Actual and Plan categories. This shows Actual data from Datasphere and Plan data within SAC.

Enhanced Data Analysis of Fitness Data using HANA Vector Engine, Datasphere and SAP Analytics Cloud (ERP Q&A, 20 Apr 2024)
Digging into further analysis of the data, using some newer technologies and datasets, I want to focus on a few areas:

1. Finding Similar Activities using the HANA Vector Engine
2. Using External Datasets to enrich data, and in my case find areas of concern using the HANA Spatial Engine (not new, but always good to use newer datasets).
3. Restricting data between users using SAP Data Access Controls.

Additional SAP Analytics Cloud examples developed will also be demoed as part of a webinar that will be linked to this blog.

In a bit more detail:

1. Finding Similar Events using the HANA Vector Engine

Using a dataset of about 2000 different activities, the goal was to find similar efforts, irrespective of location. Here is a sample of a few records to get an idea of the dataset:

Sample Activity Dataset

Using the columns Elapsed Time, Distance, Relative Effort and Elevation Gain, I created vectors in HANA Cloud based on these values:

ALTER TABLE LOCAL_ACTIVITIES ADD (EMBEDDING REAL_VECTOR);

UPDATE LOCAL_ACTIVITIES SET EMBEDDING = TO_REAL_VECTOR('[
'||to_varchar("Elapsed Time")||',
'||to_varchar("Distance")||',
'||to_varchar("Relative Effort")||',
'||to_varchar("Elevation Gain")||']')
WHERE "Elapsed Time" IS NOT NULL
AND "Distance" IS NOT NULL
AND "Relative Effort" IS NOT NULL
AND "Elevation Gain" IS NOT NULL;

Then creating a view on this, which can be used as a view / SQL query in Datasphere:

CREATE VIEW RELATED_ACTIVITIES_VECTOR_V AS
SELECT
A."Activity Date" AS "SourceActivityDate", B."Activity Date" AS "TargetActivityDate",
A."Activity Name" AS "SourceActivityName", B."Activity Name" AS "TargetActivityName",
A."Distance" AS "SourceDistance", B."Distance" AS "TargetDistance",
A."Relative Effort" AS "SourceRelativeEffort", B."Relative Effort" AS "TargetRelativeEffort",
A."Elevation Gain" AS "SourceElevationGain", B."Elevation Gain" AS "TargetElevationGain",
cosine_similarity(A.EMBEDDING, B.EMBEDDING) COS,
l2distance(A.EMBEDDING, B.EMBEDDING) L2
FROM LOCAL_ACTIVITIES A, LOCAL_ACTIVITIES B
WHERE A."Activity ID" > B."Activity ID"
AND cosine_similarity(A.EMBEDDING, B.EMBEDDING) > .999999
AND l2distance(A.EMBEDDING, B.EMBEDDING) < 100;
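The view's two measures can be reproduced off-platform to sanity-check the thresholds. A small Python sketch of the same cosine similarity and L2 distance over hypothetical activity vectors (the numbers are invented; the second run is a slightly scaled copy of the first, so the direction is identical and the cosine is ~1):

```python
import math

def cosine_similarity(a, b):
    """Same measure as HANA's cosine_similarity, for illustration."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def l2_distance(a, b):
    """Same measure as HANA's l2distance."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical activity vectors: [elapsed time, distance, relative effort, elevation gain]
run_a = [3600.0, 10000.0, 55.0, 120.0]
run_b = [x * 1.005 for x in run_a]  # a slightly scaled copy of run_a

similar = cosine_similarity(run_a, run_b) > 0.999999 and l2_distance(run_a, run_b) < 100
print(similar)  # True
```

The very tight cosine threshold paired with an absolute L2 cutoff mirrors the WHERE clause above: direction must match almost exactly, and the raw magnitudes must also be close.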

Then making the query available in Datasphere and presenting the results in SAP Analytics Cloud:

HANA Vector Query Output – in SAP Analytics Cloud

2. Using External Datasets to enrich data, and in my case find areas of concern using the HANA Spatial Engine

In a separate Datasphere space, we have a range of Global Crisis Datasets. For me and my fitness data, I thought the simplest would be determining if I ran near any fires as part of my run. Yes, this data is hindsight – but either way an interesting use case. We could easily build an SAP Build app on BTP that warns the user proactively, but I’ll leave that for someone else in my team to build.

So, picking Queensland and New South Wales fire data and joining them together:

Linking QLD and NSW Fire datasets

In a relatively simple query, I can perform a geospatial analysis based on location and time:

SELECT DISTINCT "L_ROW", "Activity ID", "ACTIVITY_DATE", "Activity Type", "SOURCE", "DESCRIPTION",
    TO_INTEGER("GEO".ST_TRANSFORM( 3857 ).ST_Distance("ST_POINT_3857", 'kilometer')) "Distance_KM",
    "A"."LATITUDE" "ACTIVITY_LATITUDE",
    "A"."LONGITUDE" "ACTIVITY_LONGITUDE",
    "C"."LATITUDE" "CRISIS_LATITUDE",
    "C"."LONGITUDE" "CRISIS_LONGITUDE"
FROM "INITIAL" "A", "BTP_CRISIS.Crisis_Dataset" "C"
WHERE "GEO".ST_TRANSFORM( 3857 ).ST_Distance("ST_POINT_3857", 'kilometer') < 2
AND "ACTIVITY_DATE" BETWEEN add_days("PUBLISHED", -1) AND add_days("PUBLISHED", 1)

This query finds all the fire warnings that were published within one day of the activity and within 2 km of the run’s start point. I could also have made these values parameters, so that the user can dynamically choose the timescale and the distance, or based the query on any point along the run. Maybe as a V2.
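The same proximity-and-time logic can be sketched outside HANA. The coordinates and dates below are invented, and a haversine great-circle distance stands in for the ST_Distance call used in the SQL:

```python
import math
from datetime import date

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def near_warning(activity, warning, max_km=2.0, window_days=1):
    """Mirror the SQL predicate: within max_km and +/- window_days of publication."""
    close = haversine_km(activity["lat"], activity["lon"],
                         warning["lat"], warning["lon"]) < max_km
    in_window = abs((activity["date"] - warning["published"]).days) <= window_days
    return close and in_window

# Hypothetical run start point and a fire warning roughly 1 km away, one day earlier
run = {"lat": -27.4698, "lon": 153.0251, "date": date(2024, 1, 10)}
fire = {"lat": -27.4600, "lon": 153.0300, "published": date(2024, 1, 9)}
print(near_warning(run, fire))  # True
```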

Once I enable this data to be viewed in SAC, I can quickly get the output:

Local Concerns from my Activity location

3. Restricting data between users using SAP Data Access Controls

The data I am loading into the SAP Datasphere instance is multi-user, and there is a requirement to keep it isolated across users. To ensure data privacy, we use Datasphere Data Access Controls.

The datasets are unique per user via the Strava ID, so we can simply apply Data Access Controls to limit each user to their own data. For this, a reference table / view is needed, and we create a Data Access Control based on the requirements:

Creating Data Access Controls in Datasphere

Applying the Data Access Controls to the appropriate views:

Apply to the View
Linking the Data Access Control

Now we have a platform that allows us to perform some rich data analysis using modern tools that suit the individual – using their own datasets. Some examples that have been used are Jupyter Notebooks, DBeaver analysis, and using text input with the Vector Engine to match against the existing datasets. More to come in future blogs and webinars.

Personally, I think it’s so much more interesting using your own datasets – in this case a personal fitness dataset, or in the business world your own systems’ data – which can easily be fed into this technology to gain some invaluable insights.

SAP BW Bridge in SAP Datasphere: Connectivity Between S/4HANA System & BW Bridge (ERP Q&A, 27 Mar 2024)
Introduction:

SAP BW bridge is a functional enhancement of SAP Datasphere that enables ABAP-based data extraction and staging capabilities within SAP Datasphere. It provides customers who are running SAP Business Warehouse or SAP BW/4HANA with access to the public cloud, as it offers SAP BW capabilities directly in SAP Datasphere.

SAP BW bridge for SAP Datasphere is a feature that makes certain elements from the on-premise SAP BW system available in the cloud. To transfer these elements to SAP Datasphere, SAP also supplies the appropriate conversion tools. SAP BW bridge is a cost-effective and simple way to switch from an on-premise SAP BW system to SAP Datasphere.

Provisioning the SAP BW Bridge Tenant:

We need to define SAP BW bridge storage before creating the SAP BW bridge tenant. Configure the size of your SAP BW bridge tenant in the Tenant Configuration:

From the side navigation, go to System -> Configuration -> Tenant Configuration.

Choose the size & Save.

Now create the SAP BW bridge instance:
Go to System -> Configuration -> SAP BW Bridge and create.

Provide the instance name and description.

For a development system: select Enable system for development.
For a production system: deselect Enable system for development.

An SAP BW bridge space is created automatically, as given below –

We just need to add users in the BW bridge space.

Activate BW Bridge Tenant:

When a new SAP BW bridge tenant is provisioned together with a new SAP Datasphere tenant, the system owner receives a welcome email. Click the Activate Account button to connect to the server and set your password.

When a new SAP BW bridge tenant is provisioned to an already existing SAP Datasphere tenant, the first login to the SAP BW bridge tenant must be done by the user who was system owner of the SAP Datasphere tenant when the SAP BW bridge tenant was provisioned.

The system owner needs to create the other users in the BW Bridge Cockpit.

After activation, the tenant shows as activated:

Open SAP BW Bridge Cockpit:

From the side navigation, choose Data Integration Monitor -> choose the BW bridge space.

Now we can see the BW bridge cockpit. Click Open SAP BW Bridge Cockpit.

Preparing Connectivity for ODP Source Systems in SAP BW Bridge:

Connecting an SAP on-premise system to SAP BW bridge requires a few more steps than connecting the same system to an SAP BW on SAP HANA or SAP BW/4HANA system.

We will require a Cloud Connector, which serves as a link between the on-premise source system and your SAP BW bridge tenant, which is technically based on an ABAP Platform in SAP BTP. RFC is used as the protocol for data exchange between on-premise source systems and SAP BW bridge.

The system needs to be added in the Cloud Connector with the RFC protocol.

The on-premise source system must be configured as a communication system in the BW Bridge Cockpit.

The source system connectivity is established with the following procedure:

1. Add the SAP Datasphere subaccount in the Cloud Connector.
2. Create the on-premise source system in the Cloud Connector.
3. Add the relevant resources to the source system.
4. Add a service channel to the SAP BW bridge tenant in the Cloud Connector.
5. Create a communication system in the SAP BW bridge tenant.
6. Create the source system in the SAP BW Modeling Tools.
7. Optional: Select and activate preconfigured SAP BW bridge Content objects.

Add a Service Channel to the SAP BW Bridge Tenant in the Cloud Connector:

The on-premise source system must be able to call the SAP BW bridge tenant via RFC. Therefore, a service channel must be added in the Cloud Connector.

Procedure:

1. Log in to the Cloud Connector.
2. In the left-side menu of the administration UI, select On-Premise To Cloud.
3. In the Service Channels section, click Add to add a new service channel.
4. In the Add Service Channel dialog, use the following values:
  a. Type: ABAP Cloud System
  b. ABAP Cloud Tenant Host: For SAP BTP ABAP-based systems like SAP BW bridge, the tenant host is <serviceinstanceguid>.abap.<region>.hana.ondemand.com. The region is, for example, eu10 or us10.

To retrieve the host name:

• Log on to SAP Datasphere.
• From the side navigation, choose Space Management.
• Select the space BW Bridge.
• Navigate to the section Connections.
• Mark the local connection BWBRIDGE and choose Edit.
• Under HTTP Access, copy the host name to the clipboard (without https://).

5. In the same dialog window, define the Local Instance Number under which the SAP BW bridge system is reachable for the source system(s). The <Local Instance Number> will later be used when maintaining the RFC destination in the on-premise source system pointing to the SAP BW bridge system (i.e. in the so-called callback destination).
6. Leave Connections set to 1.
7. Leave Enabled selected to establish the channel immediately after choosing Finish. Unselect it if you don’t want to establish the channel immediately.
8. Select Finish.

You can now see that the service channel is enabled.

        Create a Communication System in the SAP BW Bridge Tenant:

        The on-premise source system must be configured as communication system in the SAP BW bridge tenant. A communication system is a specification of a system that represents a communication partner and the technical information required for the communication (inbound/outbound), such as the host name and user information (inbound/outbound).

        Procedure:

        1. Log on to the SAP BW Bridge Cockpit.
        2. In the Communication Management section, select the app Communication Systems.
        3. Click New to add a new Communication System.
        4. In the New Communication System dialog, enter a System ID and a System Name and choose Create.

        5. Under Technical Data, maintain the following values:

        a. Enter the virtual host name maintained in the Cloud Connector as Host Name.
        b. Fill in the port number 33<instance number> as Port, e.g. 3301 if the instance number is 01.
        c. Switch on the property Cloud Connector.

        6. Under RFC Settings, maintain the following values:

        a. Set Load Balancing if the on-premise source system was created with the option With load balancing (system ID and message server) in the Cloud Connector.
        b. Maintain the client of the source system as Client.
        c. If Load Balancing was set, maintain a logon group of the source system as Group. Enter the virtual system ID maintained in the Cloud Connector as Target System. Enter the virtual message server maintained in the Cloud Connector as Message Server.
        d. If Load Balancing was not set, enter the virtual instance number maintained in the Cloud Connector as Instance Number. Enter the virtual application server maintained in the Cloud Connector as Target Host.
        e. Maintain the location ID of the Cloud Connector as SCC Location ID. In case you did not specify a location ID in the Cloud Connector, you can leave SCC Location ID blank.

        7. Under Users for Inbound Communication, click (Add) to maintain the user that is used in the SAP BW bridge tenant for inbound communication. Select an existing user or click New User to create a new one.
        8. Under Users for Outbound Communication, enter the username and password of an existing user in the on-premise source system.
        9. Save the new communication system.

        As a result, you have created the on-premise source system as a communication system in the SAP BW bridge tenant.
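The port maintained under Technical Data is simply 33 followed by the two-digit local instance number of the service channel, which can be sketched as follows (the instance number is a placeholder):

```shell
# Placeholder: the local instance number defined for the service
# channel in the Cloud Connector.
INSTANCE_NO="01"

# The inbound port for the communication system is 33<instance number>:
PORT="33${INSTANCE_NO}"
echo "$PORT"   # -> 3301
```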

        Maintain Communication Arrangement:

        You need to maintain a communication arrangement in the BW Modeling tools for the communication system that we have created.

        Choose the predefined communication scenario SAP_COM_0692 and enter the required details:

        1. Use the technical name of the source system as Arrangement Name
        2. User for Inbound Communication
        3. User for Outbound Communication

        Maintain Callback Destination:

        Create an RFC destination in the S/4HANA system (using the standard name given in Communication Systems):

        1. Log in to the on-premise source system.
        2. Call transaction SM59.
        3. Choose Create to create a new RFC connection.
        4. In the Create Destination dialog, enter the name of the Callback Destination shown in the BW Modeling Tools as name of the Destination and choose RFC connection to ABAP system as Connection Type.
        5. On the Technical Settings tab, enter the host name of the Cloud Connector (without https:// and without port number) as Target Host, and the local instance number you defined in the Cloud Connector for the service channel as Instance Number.
        6. On the Logon & Security tab, enter the Language, 100 as Client, and the User and Password of the user you defined as User for Inbound Communication.

        Use the ODP_INBOUND user, which was created in the SAP BW Bridge Cockpit, as the user for inbound communication.
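As a recap, the SM59 entries above amount to the following field values. All values here are placeholders for illustration; only the field meanings come from the steps above:

```shell
# Placeholder values -- replace with your own environment's data.
DEST_NAME="BW_BRIDGE_CALLBACK"     # callback destination name shown in the BW Modeling Tools
CONN_TYPE="3"                      # SM59 type 3 = RFC connection to ABAP system
TARGET_HOST="scc.example.com"      # Cloud Connector host, without https:// and without port
INSTANCE_NO="01"                   # local instance number of the service channel
CLIENT="100"
USER="ODP_INBOUND"                 # user for inbound communication

echo "dest=$DEST_NAME type=$CONN_TYPE host=$TARGET_HOST sysnr=$INSTANCE_NO client=$CLIENT user=$USER"
```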

        Run the connection test and the authorization test to verify the destination.

        Create an SAP BW Bridge Project:

        To test the connection, we will create a project in the BW Modeling tools using the service key.

        To get the service key, go to the BW Bridge space: choose Connections, select the BW Bridge space, choose Edit, and copy the SAP BW Service Key.

        Now create an SAP BW Bridge project in the BW Modeling tools with the system details.

        The logical destination and the callback destination (which we created in the backend S/4HANA system) should both have status green.

        The connection between S/4HANA and the SAP BW bridge has now been established successfully.

        With SAP BW Bridge for SAP Datasphere, SAP has created an attractive and cost-effective way for companies to move from their on-premise SAP BW system to SAP Datasphere. SAP BW Bridge is particularly interesting for companies that do not want a greenfield implementation but want to take their existing content with them into the cloud.


        The post SAP BW Bridge In SAP Datasphere : Connectivity Between S/4HANA System & BW Bridge appeared first on ERP Q&A.

        SAP Datasphere Connectivity With S/4 HANA System & SAP Analytics Cloud: Technical Configuration (15 Mar 2024)

        Introduction:

        SAP Datasphere is the next generation of SAP Data Warehouse Cloud and delivers a unified experience for data integration, data cataloging, semantic modeling, data warehousing, data federation, and data virtualization. It is an end-to-end warehouse in the cloud that combines data management processes with advanced analytics. It is built on SAP HANA Cloud and is, together with SAP Analytics Cloud, part of SAP Business Technology Platform.

        In this blog, we will walk through the technical configuration of SAP Datasphere connectivity with an S/4HANA system and enable the following functionalities:

        1. Remote table
        2. Data flows
        3. Replication flow
        4. Model import

        After establishing connectivity with S/4HANA, we will create live data connections between SAP Datasphere and SAP Analytics Cloud. With this, we can build stories in SAC directly against the data that is modeled in SAP Datasphere.

        Prerequisites:

        1. Create an SAP Datasphere service instance in SAP BTP
        2. Install the Data Provisioning Agent
        3. Install the Cloud Connector on Windows/Linux

        Creating Spaces and Allocating Storage in SAP Datasphere:

        1. Go to Space Management, create a space, and add users to it.

        Register a user in Datasphere’s own subaccount:

        During every Datasphere tenant provisioning, a new subaccount is created, which is owned and managed by SAP. Its UUID is visible in the “About” dialog as the tenant ID. For the Cloud Connector setup, use the Datasphere tenant ID that you find under

        System -> Administration -> Data Source Configuration -> SAP BTP Core Account

        Connect and Configure the Data Provisioning Agent:

        Procedure:

        1. Go to System -> Configuration -> Data Integration and create an On-Premise Agent.
        2. After creation, the Agent Settings dialog opens and provides you with the information required to configure the Data Provisioning Agent on your local host:
        • Agent name
        • HANA server (host name)
        • HANA port
        • HANA username for agent messaging
        • HANA user password for agent messaging

        Now at the command line, connect the agent to SAP HANA using JDBC. Perform the following steps:

        Run the following commands on the server where the DP Agent is installed:

        1. Go to /usr/sap/dataprovagent/bin and run “./agentcli.sh --configAgent”
        2. Choose SAP HANA Connection.
        3. Choose Connect to SAP Datasphere via JDBC.
        4. Enter the name of the agent registration (agent name).
        5. Enter true to use an encrypted connection over JDBC.
        6. Enter the host name (HANA server) and port number (HANA port) for the SAP Datasphere instance.
        7. Enter the credentials for the HANA user for agent messaging.
        8. Now Stop and restart the Data Provisioning Agent.
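The command-line steps above can be sketched as follows. The install path is the default Linux location mentioned earlier, and the interactive menu choices and restart commands are shown as comments; verify the exact agentcli options against your agent version's help output:

```shell
# Default Linux install path of the Data Provisioning Agent (adjust if
# your agent is installed elsewhere).
DPAGENT_HOME="/usr/sap/dataprovagent"

# 1. Start the configuration tool (interactive):
#      "$DPAGENT_HOME/bin/agentcli.sh" --configAgent
# 2. In the menu, choose: SAP HANA Connection
#      -> Connect to SAP Datasphere via JDBC
# 3. Enter the agent name, "true" for an encrypted JDBC connection,
#    the HANA host and port from the Agent Settings dialog, and the
#    credentials of the HANA user for agent messaging.
# 4. Restart the agent so the new configuration takes effect:
#      "$DPAGENT_HOME/bin/agentcli.sh" --stopAgent
#      "$DPAGENT_HOME/bin/agentcli.sh" --startAgent
echo "DP Agent home: $DPAGENT_HOME"
```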

        The public IP of the server where the DP Agent is installed must be added to the SAP Datasphere IP allowlist; after that, the Datasphere on-premise agent entry will connect to the DP Agent.

        Once the agent shows as connected, choose Edit and enable the required adapters.

        Configure Cloud Connector in SAP Datasphere:

        Add SAP Datasphere’s subaccount in the Cloud Connector with a location ID (the same location ID will be used in the connection properties later).

        The same location ID must also be added in SAP Datasphere under System -> Administration -> Data Source Configuration.

        Add the on-premise system in the Cloud Connector (using the HTTPS and RFC protocols, system type ABAP) and add the necessary resources to each connection:

        In the RFC type connection add the below resources –

        1. For accessing data using CDS view extraction:
          DHAMB_ – Prefix
          DHAPE_ – Prefix
          RFC_FUNCTION_SEARCH
        2. For accessing data based on tables with SAP LT Replication Server:
          LTAMB_ – Prefix
          LTAPE_ – Prefix
          RFC_FUNCTION_SEARCH
        3. For accessing data using ODP connectivity (for legacy systems that do not have the ABAP Pipeline Engine extension or DMIS Addon installed):
          /SAPDS/ – Prefix
          RFC_FUNCTION_SEARCH
          RODPS_REPL_ – Prefix

        Prepare Connectivity to SAP S/4HANA On-Premise:

        Create a TCP/IP RFC connection in the source system using the program ID IM_HANA_ABAPADAPTER, register the IM_HANA_ABAPADAPTER program at OS level, and test the RFC.

        Register the IM_HANA_ABAPADAPTER program at OS level with the following command:
        rfcexec -a IM_HANA_ABAPADAPTER -g <Hostname> -x SAPGW<Instance no.> -s <SID> &

        Now run a connection test.
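The registration command can be parameterized like this. The gateway host, instance number, and SID below are placeholders; the rfcexec invocation itself is left commented out since it must run on a host that can reach the S/4HANA gateway:

```shell
# Placeholder values for the S/4HANA gateway -- replace with your own.
GW_HOST="s4hhost"
INSTANCE_NO="00"
SID="S4H"

# Register the program ID IM_HANA_ABAPADAPTER at the SAP gateway and
# keep it running in the background:
CMD="rfcexec -a IM_HANA_ABAPADAPTER -g $GW_HOST -x sapgw$INSTANCE_NO -s $SID &"
echo "$CMD"
# To actually register the program, run the printed command, e.g.:
#   rfcexec -a IM_HANA_ABAPADAPTER -g s4hhost -x sapgw00 -s S4H &
```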

        Create SAP S/4HANA Live Data Connection of Type Tunnel:

        To enable model import, we need to create an SAP S/4HANA live data connection of type Tunnel.

        1. Implement SAP Note 3283282 in the SAP S/4HANA system.
        2. Run the report ESH_CSN_CDS_TO_CSN to prepare the CDS Views for the import.
        3. Activate the following INA services in transaction SICF:
        • /sap/bw/ina/GetCatalog
        • /sap/bw/ina/GetResponse
        • /sap/bw/ina/GetServerInfo
        • /sap/bw/ina/ValueHelp
        • /sap/bw/ina/BatchProcessing
        • /sap/bw/ina/Logoff
        4. Activate the OData service “ESH_SEARCH_SRV”.
        5. For the live data connection, make sure HTTPS is reachable in the Cloud Connector.

        Now, to create the live data connection, go to System -> Configuration -> Data Integration -> Live Data Connections (Tunnel).

        Enter the location ID (the same as maintained in the Cloud Connector), the host name, the HTTPS port, and the client number.

        Enter the username/password and save.

        Create a connection:

        Go to the space that was created for SAP Datasphere and create a connection as follows:

        1. Choose SAP S/4HANA On-Premise

        Enter the application server details that were added in the Cloud Connector, the system number (instance number), the client, and the system ID (SID).

        Now enter the Cloud Connector details: location ID, virtual host, and port.

        Enter the username and password which exist in backend SAP S/4HANA system.

        Data Provisioning Agent – choose the on-premise agent that was created earlier
        Streaming Read – On
        Gateway Host/Port – add the host and port that were added in the Cloud Connector
        RFC Destination – enter the RFC destination that was created in the backend SAP S/4HANA system

        In Live Data Connection, choose the connection that was created earlier.

        Click Next

        Save the connection

        Now validate the connection.

        After validating the connection, we can see that all the functionalities are enabled and the connection with the SAP S/4HANA system has been established.

        Create Live Data Connections between SAP Datasphere & SAC:

        We can create live data connections to SAP Datasphere systems and access any SAP Datasphere Analytical Dataset, Analytic Model, or Perspective within SAP Analytics Cloud.

        Before creating the live connection in SAP Analytics Cloud, we first need to add the URL of SAP Analytics Cloud as a trusted origin in SAP Datasphere.

        1. In SAP Datasphere go to System -> Administration -> App Integration
        2. Go to the Trusted Origins section and click Add a Trusted Origin.
        3. Add the URL of the SAP Analytics Cloud system.
        4. Click Save.

        Procedure:

        1. In SAC go to Connections -> Add Connection
        2. The Select a data source dialog appears.
        3. Expand Connect to Live Data and select SAP Datasphere.
        4. In the dialog, enter a name and description for your connection.
        5. Add your SAP Datasphere host name and enter 443 as the HTTPS port.
        6. Enter a valid username and password for SAP Datasphere and click OK to set up the connection.

        In this way we can create Live data connection between SAP Datasphere and SAC.

        Test the connection:

        We can test the connection by creating a new Story in SAC.

        1. In SAC, from the side navigation, choose Stories.
        2. Under Create New, choose one of the story types.
        3. Click Add Data in the left panel.
        4. The Select a data source dialog appears.
        5. Choose Data from an existing dataset or model.
        6. In the Select Model dialog choose the SAP Datasphere Analytical Dataset option.
        7. Select an SAP Datasphere connection.
        8. If you’re asked to log in to SAP Datasphere, enter your credentials.
        9. Now you can see the Datasphere space that you created in SAP Datasphere.

        Select the space, load the data, and use it for creating stories in SAC.

        In this way we can test the connection between SAP Datasphere and SAP Analytics Cloud.

        By creating a live connection between the SAP Datasphere tenant and SAC, we can build stories directly against the data that is modeled in SAP Datasphere. Live connections between SAP Analytics Cloud and SAP Datasphere can be established across tenants and data centers.


        The post SAP Datasphere Connectivity With S/4 HANA System & SAP Analytics Cloud: Technical Configuration appeared first on ERP Q&A.
