SAP Analytics Cloud Archives – ERP Q&A
https://www.erpqna.com/category/sap-analytics-cloud/

SAP Business Data Cloud (BDC) Integration with SAP Apps and 3rd Party Apps
https://www.erpqna.com/sap-business-data-cloud-bdc-integration-with-sap-apps-and-3rd-party-apps/ (Thu, 03 Apr 2025)

Overview:

SAP Business Data Cloud (BDC) is a fully managed SaaS solution designed to unify and govern all SAP data, while also enabling seamless integration with third-party data. By combining the capabilities of SAP HANA Cloud, SAP Datasphere, and SAP Analytics Cloud, BDC empowers businesses to make real-time, data-driven decisions—supported by secure connectivity and a trusted AI foundation.

I’m excited about BDC and the value it brings. Having experienced firsthand the daily challenges of transferring data from SAP systems to data repositories—such as outdated data, transfer failures, and the lack of end-to-end ownership when issues arise—I understand the critical need for a solution that ensures real-time, up-to-date data availability and, above all, enables AI-driven insights.

Background:

In “Understanding the Technology Behind SAP Business Data Cloud (BDC) – Out of the Box ‘Insight App’”, I provided an overview of BDC and its business benefits. To help you better understand the underlying technologies and concepts, I’ve broken things down into two key scenarios:

  1. Using “Out-of-the-Box” SAP Insights App with SAP S/4HANA and SAP SuccessFactors as source systems—exploring a step-by-step walkthrough of how the process works.
  2. BDC Integration with SAP Applications, Third-Party Applications, and SAP BW/4HANA Cloud Private Edition.

This blog dives deeper into Scenario 2.

The architecture below provides a detailed summary of how SAP Business Data Cloud (BDC) connects and harmonizes data across SAP and non-SAP applications using a unified data fabric to deliver real-time insights. The key components include data sources, the Business Data Fabric layer, and the consumption layer—enabled through Insight Apps.

Detailed steps:

Step 1: Browse available data products in SAP BDC Cockpit

To extend out-of-the-box Data Products in SAP Business Data Cloud with third-party data, first browse the available Data Products in the SAP Business Data Cloud Cockpit. Select and install the one that best meets your requirements before proceeding with integration.

Step 2: Data Product Installation and Transformation Process

When the Data Product installation begins, the bundled business data from your application is accessed, replicated, and processed through various steps within the Foundation Services of SAP Business Data Cloud, transforming it into a ready-to-use Data Product.

Step 3: Data is now available in SAP Datasphere.

Step 4: Integrating Third-Party Data in SAP Datasphere

You can now build the required data model artifacts directly on top of your Data Product in SAP Datasphere. This is where third-party data integration becomes essential. Leveraging SAP Datasphere’s advanced integration and data modeling capabilities, you can seamlessly enrich the out-of-the-box data model with third-party data.

Step 5: Integrating SAP BW/4HANA Objects into SAP Business Data Cloud

You can also integrate your existing SAP BW/4HANA objects into SAP Business Data Cloud. Using a dedicated data provisioning tool, you can onboard SAP BW objects into SAP Datasphere and seamlessly combine them with the existing data model.

Step 6: Reports with AI-Powered Insights

SAP Business Data Cloud offers a variety of consumption options for your data model. One example is the AI-powered Just Ask feature in SAP Analytics Cloud. Leveraging natural language processing, it allows you to retrieve insights simply by typing a question about your data. The system instantly responds, presenting the information in the most suitable format; alternatively, you can use the Insight App dashboards.

Conclusion:

SAP Business Data Cloud enables businesses to seamlessly integrate, harmonize, and analyze data from multiple sources, transforming raw data into actionable insights. By leveraging AI-driven analytics and automation, organizations can make data-driven decisions with confidence. With its out-of-the-box Insight Apps and seamless integration with SAP solutions, it provides a trusted and unified platform for enterprise data management.

Understanding the Technology Behind SAP Business Data Cloud (BDC) – Out of the Box ‘Insight App’
https://www.erpqna.com/understanding-the-technology-behind-sap-business-data-cloud-bdc-out-of-the-box-insight-app/ (Mon, 31 Mar 2025)

SAP Business Data Cloud is a fully managed SaaS solution that unifies and governs all SAP data while seamlessly integrating third-party data. By combining SAP HANA Cloud, SAP Datasphere, and SAP Analytics Cloud, it empowers businesses to make real-time, data-driven decisions with seamless connectivity and a trustworthy AI foundation.

I’m excited about BDC and the value it brings. Having experienced firsthand the daily challenges of transferring data from SAP systems to data repositories—such as outdated data, transfer failures, and the lack of end-to-end ownership when issues arise—I understand the critical need for a solution that ensures real-time, up-to-date data availability and, above all, enables AI-driven insights.

To explain the underlying technology, I’ll break the process into two scenarios for easier understanding:

  1. Use the out-of-the-box SAP Insights App with SAP S/4HANA and SAP SuccessFactors as source systems, and understand step by step how the process works.
  2. BDC integration with SAP apps, third-party apps, and SAP BW/4HANA Cloud Private Edition.

Scenario 1: Use the out-of-the-box SAP Insights App with SAP S/4HANA and SAP SuccessFactors as source systems, and understand step by step how the process works.

Below is the high-level architecture of SAP Business Data Cloud using the SAP standard out-of-the-box Insight Apps with SAP apps as source systems.

Steps:

Step 1: Selecting the Right Insight App

The first step in leveraging SAP Business Data Cloud is to identify the right Insight App for your business scenario.

  • In the SAP Business Data Cloud Cockpit, you can search for available Insight Apps and install the one that best meets your requirements.
  • Each Insight App is designed to provide tailored insights into specific business functions.

Step 2: Data Access and Replication

Once the Insight App installation is initiated:

  • Business Data Bundles – The relevant business data that has already been structured into a bundle within your business application (e.g., SAP S/4HANA and SAP SuccessFactors) is accessed.
  • Data Replication – This business data is then replicated to the Foundation Services of SAP Business Data Cloud for further processing.

Step 3: Data Harmonization and Transformation

Inside the Foundation Services of SAP Business Data Cloud:

  • Harmonization – The replicated business data is harmonized with other bundled data from different business applications (such as SAP SuccessFactors).
  • Transformation & Enrichment – The data undergoes processing and enrichment to become a structured Data Product.
  • Readiness for Analysis – At this stage, the Data Product is fully prepared for analysis, ensuring consistency and reliability across business functions.

Step 4: Generating Insights with SAP Datasphere

Once the Data Product is created:

  • In SAP Datasphere, all necessary artifacts for analysis are automatically generated.
  • Visual insights are automatically generated and provided in various formats, including analytical dashboards.

Step 5: Enhancing Insights with SAP Databricks

You can enhance your dashboard insights using AI and Machine Learning. Utilize the SAP Business Data Cloud Cockpit to share the relevant Data Product with SAP Databricks, where advanced AI and Machine Learning techniques generate refined results. These results are then integrated back into the data model, enriching your dashboard with AI-driven insights.

Conclusion:

SAP Business Data Cloud enables businesses to seamlessly integrate, harmonize, and analyze data from multiple sources, transforming raw data into actionable insights. By leveraging AI-driven analytics and automation, organizations can make data-driven decisions with confidence. With its out-of-the-box Insight Apps and seamless integration with SAP solutions, it provides a trusted and unified platform for enterprise data management.

How to Nail the C_SAC_2501 Analytics Cloud Data Analyst Exam Without Stress
https://www.erpqna.com/how-to-nail-the-c_sac_2501-data-analyst-analytics-cloud-exam/ (Mon, 17 Mar 2025)

Want to breeze through the C_SAC_2501 Analytics Cloud Data Analyst exam? In this guide, we’ll break down some practical tips to make studying simple and effective. Get ready to tackle the exam with confidence and set yourself up for success in the world of cloud analytics!

Key Details About the C_SAC_2501 Exam:

Here are the key details for the C_SAC_2501 SAP Certified Associate – Data Analyst – SAP Analytics Cloud exam.

  • Delivery Methods: SAP Certification
  • Number of Questions: 60
  • Passing Score: 70%
  • Duration: 120 minutes
  • Languages: English

Overview of the C_SAC_2501 Certification:

The C_SAC_2501 certification validates fundamental and essential knowledge for the SAP Analytics Cloud Data Analyst role.

It confirms that candidates have a general understanding and the technical expertise needed to contribute as part of a project team.

The beginner-level Learning Journey, Exploring SAP Analytics Cloud, is recommended for acquiring the basic skills necessary to pass the certification.

Topic Areas Covered in the C_SAC_2501 Exam:

Below are the topics likely to be included in the certification exam. Note that SAP may update the exam content at any time:

  • Story Design (21% – 30%)
  • Planning (21% – 30%)
  • Data modeling, analysis, and integration (21% – 30%)
  • Connections and data preparation (11% – 20%)
  • Performance, troubleshooting, and security management (≤10%)

Study Tips for the C_SAC_2501 Analytics Cloud Data Analyst Certification:

Have A Good Grasp Of the C_SAC_2501 Exam Blueprint:

Before jumping into study mode, take a few minutes to check out the official C_SAC_2501 exam blueprint. This gives you a clear roadmap of what to expect on exam day. Focus on key topics like data modeling, analytics, and the SAP Cloud interface. Knowing where to spend your energy saves you from feeling overwhelmed and helps you make the most of your study time.

Study with SAP Learning Hub:

The SAP Learning Hub is packed with official resources made by SAP experts. It’s like having access to the best tutors in the world. You’ll find video lessons, hands-on labs, and deep dives into everything you need for the C_SAC_2501 exam. Plus, it’s super helpful for understanding those tricky topics that might trip you up during the test.

Break Up Your C_SAC_2501 Study Plan in Manageable Chunks:

A solid study plan can be your best friend. Break your prep into small, manageable chunks and stick to a weekly schedule. This helps you stay on track and avoid cramming everything the night before. Be sure to set aside time for review and hands-on practice as well—both are key to nailing this exam.

Take the Help of C_SAC_2501 Mock Exams:

One of the best ways to know if you’re ready is by taking C_SAC_2501 practice tests. Mock exams help you get used to the exam format and show you which areas need a little more love. Plus, they can give you a real confidence boost when you see your progress.

Tap Into the SAP Community:

The SAP Community is full of people who’ve already passed the C_SAC_2501 exam. Join the conversation to get advice, ask questions, or just learn from their experiences. It’s like having a whole team cheering you on and offering tips as you study.

Having Practical Experience Is A Must:

You can’t just rely on theory for the C_SAC_2501 exam—you’ll need real-world experience, too. Use the SAP Analytics Cloud demo environment to play around with data modeling, reporting, and visualization. The more you practice, the better you’ll understand the software and workflows.

Visual Learning Works for the C_SAC_2501 Exam Preparation:

Don’t underestimate the power of a good diagram or mind map! Sketching out workflows or creating visual notes can help you better remember key concepts. If you’re a visual learner, this can be a game-changer in breaking down complicated topics.

Read SAP Docs and Whitepapers:

The official SAP docs and whitepapers are your secret weapon. These are detailed, authoritative resources that dive deep into the nuts and bolts of SAP Analytics Cloud. They might take a little extra time to read, but they’re worth it to fully understand the exam material.

Find A Study Buddy for the C_SAC_2501 Exam:

Sometimes studying with others makes all the difference. Join a study group or find a study buddy who’s also working on the C_SAC_2501 exam. You can compare notes, quiz each other, and share resources—it helps to stay motivated when you’re in it together.

Stay Up-to-Date with SAP News:

SAP is always rolling out new updates, especially in the cloud world. Make sure you’re keeping up with the latest news and features for SAP Analytics Cloud. This helps ensure you’re not studying outdated material and can give you an edge on exam day.

Exam Day Strategies for C_SAC_2501 Analytics Cloud Data Analyst Certification:

Show Up Early, Stay Calm for the C_SAC_2501 Exam:

Whether you’re taking the test online or in person, be ready ahead of time. Arriving early gives you a few minutes to collect your thoughts and review any last-minute notes. Plus, it helps keep your nerves in check!

Read Everything Carefully:

Don’t rush through the instructions. Take a moment to carefully read each question and understand what’s being asked. This simple step can save you from making avoidable mistakes.

Tackle the Easy Questions First During the C_SAC_2501 Exam:

Not every question is going to be tough. Knock out the easier ones first to build up your confidence and save more time for the trickier questions later on.

Keep an Eye on the Clock:

It’s easy to lose track of time during the C_SAC_2501 exam. Make sure you’re pacing yourself so that you don’t run out of time before answering all the questions. Leave a little time at the end to review your answers, too.

Solve the Hard Questions Later:

Everyone hits a tough question now and then. Don’t let it throw you off. Stay focused, keep moving forward, and come back to the hard ones later. A positive attitude can help you push through when the going gets tough.

Challenges in Preparing for the C_SAC_2501 SAP SAC Certification Exam:

1. Wide Range of Topics:

The C_SAC_2501 exam covers a lot of ground, from data modeling to reporting tools. It can feel like there’s a ton of material to study, and it’s easy to get overwhelmed. Start early and focus on breaking it down into smaller sections.

2. Balancing Theory with Practical Knowledge:

Finding the right balance between studying the theory and actually using the tools is tricky. While it’s tempting to dive into books, remember the C_SAC_2501 certification demands hands-on knowledge too. Make sure you’re spending time in SAP Analytics Cloud, not just reading about it.

3. Manage Your Time with C_SAC_2501 Study Plan:

Life gets busy, and finding time to study isn’t always easy. Try to create a realistic study plan that fits into your routine, whether it’s an hour a day or a full day on weekends. Sticking to a schedule is key!

4. Understanding the Jargon:

If you’re new to SAP Analytics Cloud, the technical terms can feel like learning a new language. Make a list of all the key terms and definitions to review regularly—it’ll help you avoid confusion during the exam.

5. Set Small Achievable Goals for the C_SAC_2501 Exam:

Studying can be a long process, and it’s easy to lose steam. Set small, achievable goals, and reward yourself when you hit them. Little victories help you stay motivated as you work toward the big exam day.

Concluding Thoughts:

Preparing for the C_SAC_2501 Analytics Cloud Data Analyst certification might seem like a challenge, but with the right strategies, it’s absolutely doable! By using official resources, practicing hands-on, and sticking to a solid study plan, you’ll be well on your way to passing the exam. Remember to stay positive, take breaks, and reach out to the SAP community for extra support. You’ve got this!

Step-by-Step Guide: Building an Analytical Dashboard in SAP Data Sphere Using Sales Order Data
https://www.erpqna.com/step-by-step-guide-building-an-analytical-dashboard-in-sap-data-sphere-using-sales-order-data/ (Thu, 17 Oct 2024)

Introduction:

In today’s data-driven world, having real-time insights at your fingertips is crucial for making informed business decisions. SAP Data Sphere is a cloud-based solution designed to integrate, model, and analyze data from various sources. In this blog, I’ll show you how to set up a trial account for SAP Data Sphere and create a dashboard using SAP’s sample sales order data. The dashboard will include visualizations like charts and tables, along with predictive analysis. Let’s get started!

Step 1: Getting a Trial Account for SAP Data Sphere

Before we begin, you’ll need access to SAP Data Sphere. Here’s how to sign up for a free trial:

1. Visit the SAP Data Sphere Trial Page:

  • Go to SAP Data Sphere Trial.
  • Note: You will need an SAP account, which you can create for free if you don’t have one.

2. Sign Up for the Trial:

  • Click on the “Start your free trial” button. You’ll be prompted to log in or create an account. Once logged in, you can follow the instructions to activate your trial.

3. Activate the Trial:

After creating your account, confirm your email address and follow the instructions to activate your trial account. Once your trial is active, you’ll be able to access the SAP Data Sphere interface.

4. Access Your SAP Data Sphere Workspace:

Once the trial is set up, navigate to the SAP Data Sphere dashboard. This is where you’ll create your data models, visualizations, and analytics.


Step 2: Importing Sample Sales Order Data

SAP provides sample datasets to help you get started. Follow these steps to import the Sales Order sample data:

1. Navigate to Data Builder:

On the SAP Data Sphere main dashboard, look for the “Data Builder” tab on the left-hand menu and click on it. The Data Builder allows you to import, transform, and model your data.


2. Import Sample Sales Order Data:

In the Data Builder, click on the “Import” button at the top-right corner. This will open a menu of options. From the available sample datasets, select the “Sales Order Sample” dataset.


3. Review and Confirm Import:

Once selected, you will see a preview of the data. Confirm that this is the dataset you want to import by clicking “Next,” and then complete the process by clicking “Import”. The Sales Order data will now appear in your workspace under the “Data Sources” section.

Step 3: Building the Data Model

Before creating visualizations, we need to create a data model that organizes the sales order data into useful fields. Follow these steps:

1. Create a New Space:

In SAP Data Sphere, data is organized into spaces. Go back to the main dashboard and click on “Spaces”. Here, create a new space by clicking “Create Space” and give it a relevant name like “Sales Order Analysis”.


2. Assign Your Data Source to the Space:

In the newly created space, go to the “Data Builder” tab. You’ll see the Sales Order data that you imported. Drag and drop this dataset into your space.

3. Modeling the Data:

Now, click on “New Graphical View” in the Data Builder. Select the Sales Order dataset as your input and choose the fields you want to work with, such as:

  • Sales Order ID (SALESORDERID) – The unique identifier for each sales order.
  • Partner ID (PARTNERID) – The customer or partner associated with the sales order.
  • Gross Amount (GROSSAMOUNT) – The total amount of the sales order, including taxes.
  • Net Amount (NETAMOUNT) – The amount after applying discounts and excluding taxes.
  • Tax Amount (TAXAMOUNT) – The total tax applied to the order.
  • Delivery Status (DELIVERYSTATUS) – The current status of the delivery process.
  • Billing Status (BILLINGSTATUS) – The status of the billing process for the sales order.

These fields will help you create visualizations that show sales trends, financial summaries, customer distribution, and order status information.
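
Before moving on, it can help to see what the charts in the next step will actually aggregate. Purely as an illustration, here is a small pandas sketch that reproduces the “sales by partner” and “monthly sales trend” views; the CSV file name is hypothetical and stands in for an export of the sample dataset:

import pandas as pd

# Hypothetical export of the sales order sample data with the fields above
orders = pd.read_csv("sales_orders_sample.csv", parse_dates=["CREATEDAT"])

# Equivalent of the "Sales by Partner" bar chart: gross amount per partner
sales_by_partner = orders.groupby("PARTNERID")["GROSSAMOUNT"].sum()

# Equivalent of the sales-trend line graph: net amount summed per month
sales_by_month = orders.set_index("CREATEDAT")["NETAMOUNT"].resample("M").sum()

print(sales_by_partner.head())
print(sales_by_month.head())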


Step 4: Creating Visualizations

With the data model in place, it’s time to create visualizations. Here’s how you can create charts, graphs, and tables:

1. Go to SAP Analytics Cloud:

To start building visualizations, navigate to the “Story Builder” tab in the left-hand menu. This is where you’ll create interactive charts and tables.


Create a Bar Chart for Sales by Partner ID:

  1. Click on “Create New Story” and choose a bar chart as the visualization type.
  2. In the data source section, select the data model you created.
  3. Set the X-axis to “Partner ID (PARTNERID)” – this will represent different customers or partners.
  4. Set the Y-axis to “Gross Amount (GROSSAMOUNT)” – this will show the total sales for each partner.
  5. Customize the chart by adding labels, titles, and colors to make it more informative. For example, label the chart as “Sales by Partner” and add a currency format for the Y-axis.

Create a Line Graph to Show Sales Trends Over Time:

  1. Add another visualization by selecting “Add Chart” and choosing a line graph.
  2. Set the X-axis to “Created Date (CREATEDAT)” – this will plot the sales over time.
  3. Set the Y-axis to “Net Amount (NETAMOUNT)” – this will display the net sales amount over time.
  4. Group the data by months or quarters to visualize trends over time, allowing users to identify sales patterns, seasonal fluctuations, or growth over specific periods.

Create a Pie Chart for Market Share by Sales Organization:

  1. Add a new chart and select “Pie Chart” as the visualization type.
  2. Set the category to “Sales Organization (SALESORG)” – this will divide the sales data by different sales organizations.
  3. Set the value to “Gross Amount (GROSSAMOUNT)” – the pie chart will reflect the sales contribution of each organization.
  4. This chart will give you a clear view of the sales distribution across different sales organizations.

Create a Table for Detailed Sales Data:

  1. Add a table to display more detailed information about individual sales orders.
  2. In the table, select fields such as:
    • Sales Order ID (SALESORDERID) – to uniquely identify each order.
    • Partner ID (PARTNERID) – to show which customer made the order.
    • Gross Amount (GROSSAMOUNT) – to display the total value of each order.
    • Created Date (CREATEDAT) – to display when the order was created.
    • Billing Status (BILLINGSTATUS) – to track the billing progress of the order.
    • Delivery Status (DELIVERYSTATUS) – to track the status of deliveries.
  3. Add sorting and filtering options for users to search for specific orders or filter data by criteria like date range, billing status, or delivery status.

4. Our complete story now looks like this:


Step 5: Adding Filters for User Interactivity

To make the dashboard more interactive, SAP Data Sphere allows you to add filters that help users explore the data in more detail. Here’s how we can add relevant filters based on your data:

Add a Date Range Filter:

  1. In Story Builder, click the “Filter” icon at the top of the page.
  2. Choose “Date Range” as the filter type.
  3. Connect the filter to the “Created Date (CREATEDAT)” field. This will allow users to select a custom date range and view sales orders created within that timeframe.
  4. This filter will help users analyze trends over specific periods, such as viewing sales in a particular quarter or year.

Add a Sales Organization Filter:

  1. Next, add a filter for Sales Organization so users can focus on sales from specific regions or sales divisions.
  2. Link this filter to the “Sales Organization (SALESORG)” field.
  3. By applying this filter, users can narrow down the data to only view sales from certain sales organizations, helping them understand performance at a regional level.

Add a Partner ID (Customer) Filter:

  1. Lastly, add a filter for Partner ID. This filter will allow users to isolate sales data for specific customers or partners.
  2. Connect this filter to the “Partner ID (PARTNERID)” field from your dataset.
  3. This filter will be useful for users who want to focus on the sales performance of specific customers, making it easier to drill down into customer-specific data and insights.

How Filters Enhance the User Experience:

By adding these filters, we give users the flexibility to explore the data in ways that matter to them. They can:

  • Focus on sales from specific time periods (e.g., the last quarter or fiscal year).
  • Analyze sales performance by region or sales organization.
  • Drill down into data by specific customers (Partner ID), which is useful for customer analysis and segment-specific performance.

Step 6: Implementing Predictive Analytics

SAP Data Sphere allows us to use historical data to predict future trends and identify anomalies. Let’s implement predictive analytics to forecast future sales and detect unusual patterns in your sales data.

Go to Predictive Scenario:

  1. Navigate to the “Predictive Scenario” section from the left-hand menu in SAP Data Sphere.
  2. This is where you can create forecasting models using historical sales data to generate predictive insights.

Create a Sales Forecast:

  1. Select “Time Series Forecast” as the type of predictive model.
  2. Choose “Net Amount (NETAMOUNT)” as the target variable for the forecast. This will allow us to predict future net sales based on past trends.
  3. Use “Created Date (CREATEDAT)” as the time field to predict future sales trends over time.
  4. After configuring the model (e.g., defining time intervals such as monthly or quarterly forecasts), SAP Data Sphere will generate a sales forecast based on our historical data.
  5. We can then display this forecast as a line graph in our dashboard to visualize future sales projections.
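
SAC’s Predictive Scenario handles the forecasting internally, so the following is purely a toy illustration of the idea of projecting net amount over time (it is not SAC’s algorithm). It reuses the hypothetical CSV export from earlier and applies a naive forecast:

import pandas as pd

# Toy illustration only – SAC's Predictive Scenario uses its own models.
orders = pd.read_csv("sales_orders_sample.csv", parse_dates=["CREATEDAT"])
monthly = orders.set_index("CREATEDAT")["NETAMOUNT"].resample("M").sum()

# Naive forecast: project the average of the last three observed months
forecast_value = monthly.tail(3).mean()
future_index = pd.date_range(monthly.index[-1], periods=4, freq="M")[1:]
forecast = pd.Series(forecast_value, index=future_index)

print(pd.concat([monthly.tail(3), forecast]))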

Anomaly Detection:

  1. In addition to forecasting, you can enable Anomaly Detection in the Predictive Scenario section.
  2. Choose “Net Amount (NETAMOUNT)” or “Gross Amount (GROSSAMOUNT)” as the target variable, depending on the type of anomalies we want to detect (e.g., unusually high or low sales amounts).
  3. SAP Data Sphere will analyze your historical sales data and automatically flag any anomalies, such as:
    • Sudden drops in sales.
    • Unusual spikes in demand for specific customers or regions.
  4. We can visualize these anomalies directly on your dashboard, helping your team investigate potential issues or capitalize on unusual growth patterns.

How Predictive Analytics Adds Value:

By using SAP Data Sphere’s predictive analytics capabilities, we can:

  • Plan more effectively by forecasting future sales and identifying seasonal patterns.
  • Detect and respond to anomalies in our sales performance, allowing you to address issues like unexpected sales drops or inventory problems.
  • Visualize predictions easily in the dashboard, enabling your stakeholders to make data-driven decisions based on future trends.

Step 7: Sharing and Publishing the Dashboard

Once our dashboard is ready, we can share it with others or publish it for the team to access.

1. Publish the Dashboard:

  • In the Story Builder, click on the “Publish” button at the top-right corner.
  • Choose the audience you want to share it with, such as specific colleagues or departments, and set appropriate permission levels to ensure data security.

2. Sharing Links or Embedding the Dashboard:

  • SAP Data Sphere also allows you to share the dashboard via a link or embed it on your company’s internal portal.
  • To share via a link, simply click “Get Shareable Link” and send it to your team.

Conclusion:

Congratulations! We’ve successfully created an interactive analytical dashboard in SAP Data Sphere using the sample sales order data. With visualizations, filters, and predictive analytics, we’ve transformed raw data into actionable insights. SAP Data Sphere’s flexibility and powerful tools make it easy to create meaningful dashboards for any business use case.

Composites in SAP Analytics Cloud
https://www.erpqna.com/composites-in-sap-analytics-cloud/ (Wed, 31 Jul 2024)

    Introduction

    A composite is a set of widgets that can be used again and again throughout optimized stories, together with scripting elements and customized data. It might be a single corporate header or footer, a page navigation panel, a part that folds up and displays graphics, or a chart with an option to change the table view or chart type. Story designers and developers save time when composites are ready for use in stories. They can also perform further actions and changes on the composite as a whole, like adding scripts and altering styling.

    A composite can be created, imported, and used immediately in your optimized story, just like other built-in SAP Analytics Cloud widgets.

    Permissions and Roles –

Here are the composite-related permissions:

    • To create a composite, ensure that you have the Create permission for the object type Composite.
    • To import and use a composite in stories, ensure that you have the Read permission.
    • To modify a composite, ensure that you have the Update permission.
    • To delete a composite, ensure that you have the Delete permission.
    • To share a composite, ensure that you have the Share permission.

    Here’s an overview of the default composite permissions available to different roles:

    NOTE –

Currently, composites are only supported on canvas pages in the Optimized Story Experience.

    Steps to create Composites –

  1. From the side navigation, select Story > Composite. You can also create a composite from Files by selecting (Create) > Composite.

  2. Select the Composite tab.

  3. Under Create New, choose Composite.

You’re now in the composite design environment.

  4. From Assets in the left side panel, drag and drop a widget onto the page.

  5. For example, select any widget (e.g., a Text widget) and change its color to pink from the Styling panel.

  6. Save the composite and choose where to save it in the file repository.

NEXT STEP

You have now created and saved a composite. Story designers and developers can import it into stories and use it like other built-in widgets.

  7. Create a story with a canvas page in Optimized Design Mode.

  8. To import a composite, hover over Composites in the Assets panel and choose (Import Composite). You can also import a composite from Insert in the toolbar by selecting (Add) > Composite > Import Composite…. A dialog appears where you can select the composite.

  9. Select a composite from the file repository.

Once the composite is imported to the story, it’s listed under Composites in the Assets panel like other available widgets.

To add a composite to your story, drag and drop it from Assets onto the page.

Note – If you’d like to import multiple composites, import them one by one. Imported composites only belong to the current story.

Once a composite is added, you can act on it as on other widgets in your story: edit its ID, find references, enable or disable mouse actions to resize or reposition it, show or hide it, style it, and edit its scripts.

Note – Only edit mode is supported in composites; view mode isn’t yet available. If a composite is modified while actively used in stories, the stories are updated when reloaded.

Connecting SAP Analytics Cloud to Power BI Using Azure Data Factory
https://www.erpqna.com/connecting-sap-analytics-cloud-to-power-bi-using-azure-data-factory/ (Fri, 21 Jun 2024)

      Introduction

      Many companies are using SAP Analytics Cloud (SAC) for planning alongside Power BI for reporting. However, users often request to utilize SAC planning data within Power BI. This integration can be challenging due to differences in data handling and capabilities. If you are facing similar demands, our blog provides a practical solution to this issue.

In this blog post, we will guide you through creating an export service for SAC that integrates seamlessly with Power BI using Azure Data Factory (ADF). This straightforward setup leverages the SAC Data Export Service API, ADF, and the Multi-Action API step to ensure minimal compute usage and cost efficiency.

      To demonstrate how this setup works in practice, we’ve created a demo video. Watch the video below to see how updating data in an SAC story seamlessly updates the data in Power BI. This will give you a clear understanding of the real-time integration between SAC and Power BI using ADF.

      Prerequisites

Before we begin, ensure you meet the following prerequisites:

      • SAC Tenant, including Planning Model and related story
      • Azure Data Factory Account
      • Azure Storage Account in ADLS
      • Power BI

      Step-by-Step Guide

      1. Creating the Data Export Pipeline in ADF
      2. Triggering ADF Pipeline through Multi-Action API Step
      3. Connecting Power BI with Azure Data Lake Storage
      4. Running the entire process

      Creating the Data Export Pipeline

      The pipeline consists of four main parts:

      1. REST Service: Create a REST service for the desired model.
      2. Copy Table Names: Identifies all endpoints for the specified model.
      3. Lookup Tables: Outputs the names obtained from “Copy Table Names”.
      4. ForEach Loop: Uses the SAC REST API to retrieve data from the endpoints and copy it to ADLS.

      To help you visualize the above-mentioned steps, we’ve created a short GIF that quickly demonstrates the process.

      Setting up REST Service

      1. Create a REST Service:

      1.1 Set up a REST Service with base URL:
      {your SAC tenant}/api/v1/dataexport/providers/sac/{model ID}

1.2 Choose “OAuth 2.0 Client Credential” as the authentication type. We will name this service “REST_EXPORT_API_DEMO”.

      2. Generate OAuth Credentials:

      2.1 In your SAC tenant, navigate to ‘System > Administration > App Integration’ to find your token endpoint and create a client to get the client ID and client secret.
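
Before building the pipeline, you can verify the service and credentials with a quick standalone check. The sketch below is a minimal Python example, assuming the client ID/secret created in App Integration above; the tenant host, token URL, and model ID are placeholders to replace with your own values, and it assumes the service root answers with an OData JSON service document:

import requests

# Placeholders – replace with your own tenant, token endpoint, and model ID
TOKEN_URL = "https://<your-token-endpoint>/oauth/token"
BASE_URL = "https://<your-sac-tenant>/api/v1/dataexport/providers/sac/<model-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"

# OAuth 2.0 client-credentials flow, mirroring ADF's "OAuth 2.0 Client Credential" setting
token = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials"},
    auth=(CLIENT_ID, CLIENT_SECRET),
).json()["access_token"]

# The service root lists the available endpoints (tables) for the model,
# i.e. the same names the "Copy Table Names" activity collects below
resp = requests.get(
    BASE_URL,
    headers={"Authorization": f"Bearer {token}", "Accept": "application/json"},
)
resp.raise_for_status()
for entity in resp.json().get("value", []):
    print(entity.get("name"), entity.get("url"))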

      Setting up “Copy Table Names”

      1. Create a REST Dataset:

      1. Set up a REST dataset using the linked service “REST_EXPORT_API_DEMO”.
        Name this table “Tables”.

      2. Data Source Configuration:

      1. In “Copy Table Names” select the REST dataset you just created.

      2. Create a CSV file to store the table names. The file path should be:
      {Storage account}/{model ID}/Table_names.csv
      This will be used as the sink for “Copy Table Names”.

3. Use the newly created CSV file as the sink dataset.

      4. Ensure that the mapping includes only the table names.

      Setting up “Lookup Tables”

      Use the sink dataset from the “Copy Table Names” step as the source for “Lookup Tables”.

      Configuring the “ForEach” Loop

Under settings, set “Items” to: @activity('Lookup tables').output.value

      This configuration ensures the loop iterates over all table names, using them in the subsequent copy data activities within the loop.

      Configuring “Copy Data” within the “ForEach” loop

      1. Set up a new REST Dataset: This will be called “REST_API_ForEach”. We will use the “REST_EXPORT_API_DEMO” as linked service, but will now add a dynamic relative URL.

2. Setting up Source: We will use the above dataset as the source, along with the item setting configured under “Configuring the ‘ForEach’ Loop”. This way we will access the endpoint for each table in the given model.

      3. Creating a dynamic JSON sink dataset: The JSON sink dataset is created such that we will have one unique JSON file for each table in the model, which will inherit its name from the model.

Setting the sink in “Copy Data”: We now simply use the dataset created above along with the “Items” configured under “Configuring the ‘ForEach’ Loop”.

      The ADF pipeline is now fully configured and should export the model from SAC into the Azure blob storage, under the path defined in the first step of “Creating a dynamic JSON sink dataset”.

      Triggering ADF Pipeline through Multi-Action

      Register an App in Azure

      1. Go to ‘Home > App Registration > Endpoint‘ and copy the Token URL:
        “OAuth 2.0 token endpoint (v2)”.
      2. Go to ‘Home > App Registration‘ and click ‘+ New Registration‘.
      3. Name your app and select ‘Accounts in this organizational directory only (Single tenant)’.
      4. Copy the ‘Application (client) ID‘, this is your “OAuth Client ID”.

      Configure OAuth Credentials

      1. Go to ‘Certificates & Secrets’ and create a new client secret. Copy the ‘Value‘ as your client secret.

      2. Ensure that the created App, “SAC”, has the permission ‘user_impersonation‘.

      3. Assign the app the role of ‘Data Factory Contributor‘:
      Azure Portal > Data factories > Your Data Factory > Access Control (IAM) > Add Role Assignment‘.

      Create the Connection in SAC

      Create a connection using the OAuth Client ID, Secret and OAuth 2.0 token endpoint (v2) retrieved in the previous steps.

      The OAuth client ID is a unique, autogenerated ID that identifies your Azure application to SAP Cloud Platform and verifies its authorization to access requested resources. The specific OAuth client ID used above is given as the “Application ID” retrieved in the last step of “Register an App in Azure”.

      The secret is a confidential password that is associated with the client application. It is used by SAC to prove its identity to the Azure platform. It is the ‘Value’ retrieved in the first step of “Configure OAuth Credentials”.

      The “Token URL” is the endpoint that SAC will use to request an access token. The access token is a temporary authorization that is used by the client application to access the requested data or services.

      Creating the Multi-Action

      • Add a data action: to publish changes made to the story before exporting.

      Configure the export API step: To do this, choose the connection made earlier and use the API URL:

      The URL is given by:
      https://management.azure.com/subscriptions/{Subscription ID}/resourceGroups/{Resource Group Name}/providers/Microsoft.DataFactory/factories/{Data Factory ID}/pipelines/{Pipeline ID}/createRun?api-version=2018-06-01

      Secondly, choose the request method “Synchronous Return”.

• Add the Multi-Action to a story which uses the same model as the one in ADF.
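
For debugging, the same createRun request that the Multi-Action API step issues can be reproduced outside SAC. This is a hedged Python sketch using the app registration from the previous section; all IDs and names are placeholders:

import requests

# Placeholders – replace with your Azure AD tenant and app registration values
TENANT_ID = "<azure-ad-tenant-id>"
CLIENT_ID = "<application-client-id>"
CLIENT_SECRET = "<client-secret-value>"

# Token request against the v2.0 endpoint copied from 'App Registration > Endpoints'
token = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://management.azure.com/.default",
    },
).json()["access_token"]

# Same URL pattern as in the Multi-Action API step above
run_url = (
    "https://management.azure.com/subscriptions/<subscription-id>"
    "/resourceGroups/<resource-group>/providers/Microsoft.DataFactory"
    "/factories/<data-factory-name>/pipelines/<pipeline-name>"
    "/createRun?api-version=2018-06-01"
)
resp = requests.post(run_url, headers={"Authorization": f"Bearer {token}"}, json={})
resp.raise_for_status()
print("Pipeline run ID:", resp.json()["runId"])  # returned by the createRun API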

      Connecting Power BI with ADLS

      Retrieve the Endpoint of the ADLS

      • Go to your storage account in Azure where the tables are saved and search for endpoints.
      • Copy the Data Lake Storage endpoint (not the resource ID).

      Configure Power BI

      • In Power BI, click ‘Get Data‘ and select ‘Azure Data Lake Storage Gen 2‘.
      • Paste the endpoint and add the path to the folder:
        https://{storageaccount}.dfs.core.windows.net/{Blob container}/{model_id}

      Unpack Data

Unpack the data for each table, repeating the process for all tables.

      Running the entire process

      1. Modify the Story in SAC: Make changes and trigger the multi-action.
      2. Monitor ADF: Go to ADF and ensure the pipeline is triggered and completed successfully.
      3. Refresh Power BI Data: Refresh the data in Power BI to reflect the changes.
NDC Financial Consolidation
https://www.erpqna.com/ndc-financial-consolidation/ (Wed, 19 Jun 2024)

      Ideal financial consolidation extension for SAP Analytics Cloud and a migration path from SAP Financial Consolidation (SAP BO FC) and SAP BPC.

      While SAP Group Reporting remains the preferred legal consolidation solution for companies utilizing SAP S/4HANA as their principal ERP solution, many companies are seeking a more flexible option that either fits their hybrid ERP landscape, follows a holistic EPM (Enterprise Performance Management) approach, or they simply favour a stand-alone EPM solution, decoupled from the ERP systems.

      In this post, I would like to introduce NDC Financial Consolidation, a solution developed by NDC Group Swiss AG to address this need.

      NDC Financial Consolidation – Overview

      NDC Financial Consolidation provides end-to-end legal consolidation functions, ranging from data collection to external and internal reporting of the consolidated results (thus supporting the entire close-to-disclose process).

      Besides the legal (statutory) consolidation functions, the solution also provides support for managerial consolidation for internal stakeholders (based on the internal accounting rules), as well as consolidation of plans and non-financial indicators (e.g. data for ESG reporting).

      The tool is very easy to set up and manage thanks to its no-code comprehensive user interface exposing all the needed functions directly to the financial teams, reducing the dependency on IT or implementation partners in the long run.

      Functionality-wise, NDC Financial Consolidation provides a similar concept known from SAP FC (SAP BusinessObjects Financial Consolidation) and it supports even the advanced consolidation requirements such as transaction-level intercompany eliminations, journal-based adjustments, complex currency conversions, or support of ongoing scope changes during the accounting period.

      Figure 1: Solution Overview

      Migration from SAP FC and SAP BPC

      Besides the well-known consolidation concept, NDC Financial Consolidation offers partially automated migration mechanisms from SAP FC (SAP BusinessObjects Financial Consolidation) and SAP BPC.

      Combination of these makes the solution very suitable for current SAP FC and SAP BPC users as a simple and hassle-free cloud migration option, unlocking the potential of the other SAP cloud tools and solutions, while retaining a similar and known consolidation logic if required.

      This approach not only shortens the implementation time (typically ranging between 6-12 months) but also allows easy replication of the currently operational consolidation process with the elimination of re-engineering and significant unwanted redesign (option to fully update and redesign the consolidation processes as part of the migration of course still stays available).

A similar option is also available for companies currently using SEM-BCS, where the migration mechanism can typically convert the used dimensions and data-model setup, chart of accounts and other mappings, journal carry-forwards, and basic presets of consolidation rules.

      Standalone or as a SAP Analytics Cloud extension

      The solution is built on top of SAP Business Technology Platform thanks to which it can not only be deployed as a standalone consolidation tool, but also as an SAP Analytics Cloud extension granting a seamless user experience and data integration.

The combination with SAP Analytics Cloud therefore extends the standard functionalities of SAP Analytics Cloud, offering the option to implement a holistic EPM (Enterprise Performance Management) platform supporting planning, forecasting, analysis, reporting, and consolidation functions and processes.

      Figure 2: Planning & Consolidation Environment

      Open data integration and data submission concept

The open and flexible design philosophy of the solution makes it independent of the ERP landscape, and it thus suits companies with mixed SAP and non-SAP ERP systems very well (as well as a mixture of on-premise and cloud ERP systems such as SAP R/3 and SAP S/4HANA).

      Following this philosophy, the solution offers three main ways of data submission:

      1. Automated interfaces to SAP and non-SAP systems,
      2. Excel imports/exports based on the automatically generated templates for each consolidation statement,
      3. Web input forms (masks) which allow direct entry through the solution’s user interface.

      These data submission options can be individually combined and tailored for each reporting unit (company’s entity) to address the individual needs and abilities.

      Key advantages

      Compared to the other consolidation solutions in the market, NDC Financial Consolidation therefore offers the following advantages:

• End-to-end support of the consolidation process, incl. advanced consolidation functions such as invoice-based intercompany elimination, ongoing scope changes, or custom FX rates,
      • Consolidation of budgets, plans, and forecasts (on top of consolidation of actuals), enhancing the existing SAC consolidation functionality by additional controls and detailed processing logic,
      • Incorporation of detailed disclosures and notes, providing further details and explanations of the consolidated statements.
      • Open integration concept suitable for companies with mixed ERP landscape (SAP and non-SAP),
      • User-friendliness and no-code implementation and maintenance (requiring no coding or scripting skills),
      • Seamless extension with reporting, planning, and analytics thanks to SAC seamless integration (user interface AND data integration),
      • An unlimited number of custom dimensions and data attributes (editable through the user interface).

      Positioning

      As introduced at the beginning of this blog, SAP Group Reporting remains the preferred technology for the groups with a strong S/4HANA ERP landscape, which can utilize the Group Reporting’s tight integration to S/4HANA and all advantages it offers such as real-time availability of the restatements and possibility to drill-down from consolidated statements up to the level of individual transactions.

While both solutions offer a vast range of benefits and functionalities (such as automation functions, full consolidation overview and workflows, complex validations, etc.), the positioning of the two solutions can therefore be summed up as:

Group Reporting

      • Preferred consolidation tool for S/4HANA-based companies,
      • Tight integration to S/4HANA data model with all the advantages,
      • Near real-time data collection and propagation of changes.

      NDC Financial Consolidation

      • Open integration and data model concept, making it suitable for companies with hybrid ERP landscape,
      • Easy consolidation extension with additional notes, disclosures, and non-financial data (e.g. ESG indicators),
      • Lightweight and simple user interface, making it easy to set up and manage directly by the financial teams,
• Suitable for companies looking to implement a holistic EPM platform.

      Supported by global partners

NDC Financial Consolidation is currently supported and provided by more than 10 global SAP partners, who offer a range of services, including full configuration and deployment work and ongoing support and maintenance.

      This makes the solution not only independent of NDC Group but also allows everyone to select the preferred partner for the delivery and associated services of the solution.

      Conclusion

      NDC Financial Consolidation is a welcome addition to the SAP-based solution portfolio, highly suitable for companies seeking a simple, open, yet powerful consolidation solution, which can be additionally embedded into SAP Analytics Cloud to provide a holistic unified planning, forecasting, analysis, reporting and consolidation platform.

      Utilising the SAP Business Technology platform makes the solution not only fast and robust but also easily maintainable, extensible, and connectible to other SAP Cloud tools and solutions.

Pass URL-Filter on BW Live Hierarchy Nodes from Story to Story
https://www.erpqna.com/pass-url-filter-on-bw-live-hierarchy-nodes-from-story-to-story/ (Mon, 10 Jun 2024)

      Introduction:

Currently it is not possible to pass a URL filter on hierarchy nodes for BW Live models with the standard options.

In this blog I’ll show how this can be accomplished with some simple scripting and story setup.

      Use Case:

There is one story which contains input controls (or story filters) for specific dimensions with hierarchies, and we would like to pass the selected members of the hierarchy (be it nodes or leaves) to another story, where these values should be applied as the respective filters again.

      Solution:

The jump to the other story is bound to a specific action, e.g., a click on a button. This event will be used to derive the selected members and pass them as URL parameters.

      The event contains the following script (example for two relevant dimensions on input controls):

      var dim1_string = "";
      //only retrieve selected members if not all selected
      if(IC_Dim1.getInputControlDataSource().isAllMembersSelected() === false){
      	var mem1 = IC_Dim1.getInputControlDataSource().getActiveSelectedMembers(10000);
      	for(var i=0;i<mem1.length;i++){
      		if(mem1[i].id.startsWith("PSEUDO")){ //really selected members
      			if(mem1[i].id.endsWith("-")){ //is a node
      				dim1_string = dim1_string.concat("/0HIER_NODE!".concat(mem1[i].displayId));
      			}
      			else{ //is a leaf
      				dim1_string = dim1_string.concat("/!".concat(mem1[i].displayId));
      			}
      		}
      	}
      }
      	var p_dim1 = UrlParameter.create("p_dim1", dim1_string);
      
      var dim2_string = "";
      //only retrieve selected members if not all selected
      if(IC_Dim2.getInputControlDataSource().isAllMembersSelected() === false){
      	var mem2 = IC_Dim2.getInputControlDataSource().getActiveSelectedMembers(100000);
      	for(var j=0;j<mem2.length;j++){
      		if(mem2[j].id.startsWith("PSEUDO")){ //really selected members
      			if(mem2[j].id.endsWith("-")){ //is a node
      				dim2_string = dim2_string.concat("/0HIER_NODE!".concat(mem2[j].displayId));
      			}
      			else{ //is a leaf
      				dim2_string = dim2_string.concat("/!".concat(mem2[j].displayId));
      			}
      		}
      	}
      }
      	var p_dim2 = UrlParameter.create("p_dim2", dim2_string);
      
      	NavigationUtils.openStory("STORY_ID", "PAGE_ID", [p_dim1, p_dim2]);

Important points to note:

• The implicitly selected members (i.e. the leaves below a node) are also retrieved by the "getActiveSelectedMembers" API. Hence, the allowed number of members needs to be increased for most hierarchies (the default is 100); if multiple extremely large hierarchies are involved, performance and response time should be checked.
• The "really" selected members have an "id" starting with "PSEUDO", and nodes end with "-". This is used to find the relevant IDs and decide which prefix is needed to fit the expected BW syntax: either "0HIER_NODE!" for nodes or just "!" for leaves.
• All members are concatenated into a single string separated by "/" (other characters are also possible, as long as they do not interfere with the expected member IDs). For example, selecting one node and one leaf (hypothetical member IDs) might yield "/0HIER_NODE!NODE_EUROPE/!COSTCTR_1000".
• These strings are handed over as URL parameters and used in the NavigationUtils.openStory API.

If the same should be done for story filters, the "Application.getFileDataSource.getDimensionFilters" API can be used instead; it returns only the chosen nodes/leaves, already in the correct BW syntax, so they can be concatenated into a string directly (see the sketch below).
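As a minimal sketch of that variant (the return shape of the API is an assumption here; the property carrying the BW-syntax member string should be verified against the SAC scripting reference):

// Hedged sketch: build the URL parameter from story filters instead of input controls.
// "Dim1" and the .value property are illustrative assumptions, not confirmed API details.
var dim1_string = "";
var filters = Application.getFileDataSource().getDimensionFilters("Dim1");
for (var i = 0; i < filters.length; i++) {
	// Each entry is expected to already carry BW syntax, e.g. "0HIER_NODE!NODE_A" or "!LEAF_B"
	dim1_string = dim1_string.concat("/".concat(filters[i].value));
}
var p_dim1 = UrlParameter.create("p_dim1", dim1_string);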

In the story on the receiving end, these parameters are maintained as script variables.

In the onInit script, for example, the parameters can be split into the individual members and applied to an input control (shown for the first dimension):

if (dim1.length > 0) {
	var dim1_param = dim1.slice(1); // remove the first character, which is "/"
	var dim1_arr = dim1_param.split("/");
	IC_Dim1.getInputControlDataSource().setSelectedMembers(dim1_arr);
}

With this, the correct hierarchy filters from Story 1 are passed to Story 2.

Integration Between SAP Datasphere and SAP Analytics Cloud
https://www.erpqna.com/integration-between-sap-datasphere-and-sap-analytics-cloud/ (Thu, 06 Jun 2024)

In this blog, I want to provide the steps we followed to create an OData Service connection to send data from Datasphere to SAC. By integrating Datasphere actuals into SAC, we lay the foundation for strategic decision-making, driving business growth and agility.

      Sending Data from Datasphere to SAC via OData Service

      Our goal for this task is to consume actuals from Datasphere into SAC for reporting and planning purposes. We will combine the actuals from Datasphere with planning data in SAC.

      Here is a list of steps we took to import Datasphere actual data into SAC:

      1. Identify the OData Source (Data Service URL) in Datasphere

To identify the Data Service URL, we opened the catalog URL of our Datasphere tenant. Within this URL, you can see all the assets available in the spaces of your tenant.

Within this URL, we copied the assetRelationalDataUrl for the dataset we required; this is your Data Service URL. We then looked up our chosen assetRelationalDataUrl using the URL in the figure below, which returns the metadata of the OData service.

        NOTE: To view the data from your chosen asset, add the dataset name again at the end of the URL.
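For illustration, here is a minimal sketch of this lookup done outside the browser (Node.js 18+ with built-in fetch, run as an ES module). The tenant host and the /api/v1/dwc/catalog path are assumptions modelled on our tenant; copy the exact catalog URL from your own Datasphere tenant instead:

// Hedged sketch: list catalog assets and their OData Data Service URLs.
// TENANT and the catalog path below are illustrative assumptions, not guaranteed endpoints.
const TENANT = "https://mytenant.eu10.hcs.cloud.sap"; // hypothetical tenant URL
const response = await fetch(TENANT + "/api/v1/dwc/catalog", {
	headers: { Authorization: "Bearer " + process.env.DSP_TOKEN }, // OAuth access token assumed
});
const catalog = await response.json();
// Each asset is expected to expose an assetRelationalDataUrl, which is the Data Service URL.
for (const asset of catalog.value ?? []) {
	console.log(asset.name, asset.assetRelationalDataUrl);
}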

2. Look up your Redirect URI in SAC

The Redirect URI in SAC is a specific endpoint URL: the destination to which the authorization server redirects the user's browser after authentication has completed successfully.

We added an OData Services connection in our SAC tenant and changed the Authentication Type to OAuth 2.0 Authorization Code. At the bottom of this window, you can see the Redirect URI.

          3. Create the OAuth Client in Datasphere

We now create an OAuth client in Datasphere, using the Redirect URI from the previous step. Go to System >> Administration >> App Integration and click 'Add a New OAuth Client'. Once you paste the Redirect URI and click Add, you'll receive the following information, which you need to take note of (the sketch after this list shows how these values fit together):

          • OAuth Client ID
          • Secret
          • Authorization URL
          • Token URL
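These four values drive a standard OAuth 2.0 authorization-code flow. SAC performs the exchange automatically once the connection is saved; the sketch below, with placeholder values only, illustrates how the pieces fit together:

// Hedged sketch: exchange an authorization code for an access token (OAuth 2.0 authorization-code flow).
// Every constant is a placeholder for the values collected in steps 2 and 3.
const TOKEN_URL = "https://mytenant.eu10.hcs.cloud.sap/oauth/token"; // Token URL (hypothetical)
const REDIRECT_URI = "https://mysac.eu10.sapanalytics.cloud/oauth2/callback"; // Redirect URI from step 2 (hypothetical)
const tokenResponse = await fetch(TOKEN_URL, {
	method: "POST",
	headers: { "Content-Type": "application/x-www-form-urlencoded" },
	body: new URLSearchParams({
		grant_type: "authorization_code",
		code: process.env.AUTH_CODE,             // code returned to the Redirect URI after login
		client_id: process.env.OAUTH_CLIENT_ID,  // OAuth Client ID from step 3
		client_secret: process.env.OAUTH_SECRET, // Secret from step 3
		redirect_uri: REDIRECT_URI,
	}),
});
const { access_token } = await tokenResponse.json();
// The access token is then sent as "Authorization: Bearer <token>" when calling the Data Service URL.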

          4. Create OData Connection in SAC

Within SAC, we go to Connections >> Add a Connection and choose OData Services. You now need to add the following connection parameters (their origins are sketched after this list):

          • Data Service URL = assetRelationalDataUrl
          • OAuth Client ID
          • Secret
          • Token URL
• Authorization URL
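To make the mapping explicit, here are the same inputs as a plain object; the field names are ours (hypothetical), but each entry corresponds one-to-one to a field in the SAC connection dialog:

// Hedged sketch: where each SAC connection dialog value comes from.
const odataConnectionInputs = {
	dataServiceUrl: "<assetRelationalDataUrl from step 1>",
	oauthClientId: "<OAuth Client ID from step 3>",
	secret: "<Secret from step 3>",
	tokenUrl: "<Token URL from step 3>",
	authorizationUrl: "<Authorization URL from step 3>",
};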

          5. Import from OData Connection

We now need to create a model in SAC. Choose OData Services as the data source and select the connection we just created. Then create a new query and select the dimensions and measures you want while building it.

We have now consumed actuals data from Datasphere into SAC. At this stage you can clean the data and fix any issues. You have successfully imported the actuals data via the OData connection!

          6. Create Story and Test

We created a story and added a table to it, choosing the query created above as the table's data source. We added 'Category' as a column, and using the version management functionality we can select both the Actual and Plan categories. This shows Actual data from Datasphere alongside Plan data within SAC.

Consolidation Extension for SAP Analytics Cloud – Automated Eliminations and Adjustments (Part 1)
https://www.erpqna.com/consolidation-extension-for-sap-analytics-cloud-automated-eliminations-and-adjustments-part-1/ (Tue, 21 May 2024)

In this blog, I will describe the automated eliminations and adjustments that happen once the user triggers the consolidation logic. Due to the size of the content, I will break the Automated Eliminations and Adjustments topic into a few smaller, more bite-sized blogs.

          The Consolidation extension for SAP Analytics Cloud comes preconfigured with a set of consolidation methods and elimination rules that should cover over 80% of the IFRS consolidation requirements; this is based on 14 years of experience in developing consolidation solutions for holding companies with subsidiaries and affiliates in multiple industries.

It is worth noting from the beginning that all the eliminations and automated adjustments below are based on data action triggers linked to advanced formulas and logic that run purely on member attributes (except for one piece of logic related to net income for the period). So no coding is required to adjust and adopt the solution, as long as the requirements fall within the set of defined rules. Assigning specific attributes to accounts, entities, scopes, and intercompany members in their respective dimensions is sufficient to have the rules applied. This can be done in a guided manner using the Administrator Analytic Application.

Additional automated elimination rules that are not listed can easily be incorporated into another version of the data action trigger.

In this blog, we will cover automated eliminations based on the different consolidation methods, where each elimination has a specific sequence of logic. From a technical perspective, we selected a set of these eliminations to showcase the methods of sequential data manipulation used to generate eliminated values. In some cases, the values are picked from the intercompany breakdown of the entity and posted on the entity itself. In other cases, the logic inverts the entity with its intercompany and the intercompany with its entity, and in yet other cases the intercompany breakdown is not used in the calculations at all. Furthermore, some logic will knock off a set of accounts and reflect their residual value, while other logic produces a three-legged entry. Technically, the combination of all these data manipulation methods for eliminating and adjusting can be reused to create any array of eliminations and adjustments. Since the eliminations and adjustments work on attribute values, not fixed members, the logic can be composed as desired by adding new attributes and reusing the formulas to target them.

          In the above screenshot, we highlight three properties in the account dimension (as an example).

The account A1810 has the property value CO_INV under Elimination Code, the clearing account A189B, and the Elimination Audit member ELM_INVESTMENT. By assigning these property values to another account, the rules are applied to it as well; the elimination and adjustment rules do not need to be changed when applying the logic to your own chart of accounts. We can also add new audit members in the audit ID dimension and specify in the Elimination Audit Member column which audit member will host the elimination, without changing the rules; the logic writes back to the destination audit member automatically, based on the assignment in the account dimension. There are other properties in the intercompany, entity, scope, flow, currency, and other dimensions that behave in the same manner, ensuring zero additional coding when adopting the solution and applying your own master data. The sketch below restates these three account properties as data.
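A minimal sketch of the attribute mechanism, restating only the property values named above (the object shape and field names are ours, purely illustrative):

// Hedged sketch: the three account properties from the example above, expressed as data.
// Assigning the same property values to any other account applies the same rule to it.
const accountProperties = {
	A1810: {
		eliminationCode: "CO_INV",                // marks the account for the investment elimination rule
		clearingAccount: "A189B",                 // residual differences post here
		eliminationAuditMember: "ELM_INVESTMENT", // audit member that hosts the elimination
	},
};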

By simply maintaining the property values of the master data, you will have configured over 90% of the solution. A few calculations are based on fixed accounts (such as retained earnings – E1610, or net income (loss) attributable to noncontrolling interest – P8000) and can easily be changed in the advanced formulas.

These are even highlighted with the text "Change Account ID here".

The Consolidation extension for SAP Analytics Cloud comes with an extensive list of advanced formulas and JavaScript embedded in analytic applications that rely on each other. Each advanced formula is composed of multiple sequential steps. For the same logic, you can have alternative calculation methods by selecting from the list. For example, you can run currency conversion based on global rates only, or based on global and entity-specific rates; you can run consolidation based on direct and indirect ownership without staging, or run the staging method of consolidation. All of this advanced formula and JavaScript preparatory logic relies on the same architecture and member attributes, and will work consistently based on the assignment of property values.

          Consolidation Methods

The Consolidation extension for SAP Analytics Cloud includes the following consolidation methods, as they are the most commonly used and adopted:

          • Holding
          • Full
          • Equity
          • Proportionate

The methods are entered through the ownership manager interface and dictate the set of rules used when the consolidation process is executed. The Managing Ownership Structure blog discusses this in further detail, but for now the screenshot below shows Scope 1 as a parent scope to Scope 2 (this is not a hierarchy/parent-child relationship). Within Scope 1, Holding Germany 1 is the holding company and the rest are subsidiaries (Full consolidation method), except for France 2, which is an affiliate (Equity consolidation method). The ownership and consolidation rates have been entered for each entity within Scope 1.

• The Holding method is assigned to the holding company within a scope; this is the investor company.
• The Full method is assigned to the subsidiaries within the scope, i.e. the investees the holding company has majority control over.
• The Equity method is assigned to the affiliate companies within the scope, i.e. the investees the holding company does not have majority control over. Only the equity pickup rule and investment revaluation are applied to companies assigned the Equity method.

After assigning the methods through the ownership manager and clicking Run Consolidation, the system runs the full sequence of logic to generate a set of fully consolidated financial statements based on the relationships between the entities within a scope.

The set of eliminations used in this demonstrative blog does not include the elimination of hedging reserves, dividends paid, or share in earnings. The values entered onto the balance sheet and P&L, with their intercompany and investment breakdowns, were not chosen to uphold financial validity; the purpose is to show how the eliminations are done and what happens to differences between investment and equity accounts, and between intercompany transactions.
