Ariba Analytics using SAP Analytics Cloud, Data Intelligence Cloud and HANA DocStore – Part 3

This is Part Three of a blog series on Ariba Analytics using SAP Analytics Cloud, Data Intelligence Cloud and HANA DocStore. If you would like to start with Part One, please click here

Recap

SAP Analytics Cloud makes it easy for businesses to understand their data through its stories, dashboards and analytical applications. Our worked example uses SAP Ariba Data to create an SAC Story that shows how much spend has been approved within Ariba Requisitions created in the last thirty days

A simple SAC Story tracking Approved Requisitions

In Part One of our blog series we discussed how we can retrieve data from SAP Ariba’s APIs using SAP Data Intelligence Cloud. We stored this data as JSON Documents in the SAP HANA Document Store

In Part Two of our blog series we built SQL and Calculation Views on top of our JSON Document Collection

In this blog post we’ll use the Calculation View in SAP Analytics Cloud as a Live Data Model, which will provide the data to our SAP Analytics Cloud Story

Viewing our HANA DocStore Collection data in an SAP Analytics Cloud Story

Accessing our HDI Container from SAP Analytics Cloud

Before we can consume our Calculation View in SAP Analytics Cloud, we’ll need to connect SAC to our HDI Container. We can do this from within SAC itself

Click on Connections
Click on Add Connection
Select SAP HANA under Connect to Live Data

Next, we’ll have to enter the host and credentials for our HDI Container. If you’re not sure where to find these, refer to Part One of this blog series where we retrieved these for Data Intelligence Cloud (under the heading Creating Our Connections in Data Intelligence)

Choose HANA Cloud as the Connection Type, enter HDI details then click OK

Our HANA Cloud Connection has been created, and now we're ready to create our Live Data Model in SAC

Creating a Live Data Model

Within SAP Analytics Cloud we’re going to use a Live Data Model to access our Calculation View in real time. This means that the data in our Document Store Collection will be available immediately after our Data Intelligence Pipeline updates it

Another benefit of using this Live Data Model compared to creating a Model on Acquired Data is that data doesn’t need to be copied to SAP Analytics Cloud for use

Click on Modeler, then Live Data Model
Select SAP HANA as the System Type, our AribaHDI Connection then click on the Input Help for Data Source
Click on our Calculation View
Click OK

Now we’re looking at the Live Data Model in the SAP Analytics Cloud Modeler. We can see our Calculated Measure, ReportingAmount

Viewing the Measures for our Live Model

We can also check the Live Model’s Dimensions

Click on All Dimensions
Our Dimensions are all here
Click on Save
Enter a Name and Description then click Save

Now that we’ve got our Live Data Model, we’re ready to create our Story and visualize our Ariba Data

Creating an SAC Story

Stories within SAP Analytics Cloud let us visualize data in a number of ways, including charts and images

Click on Stories

Within a Story there are a number of different Page Types available. For our example we’re going to add a Responsive Page. A Responsive Page allows you to create layouts that resize and adapt when viewed on different screen sizes

Click on Responsive Page
Leave Optimized Design Experience selected and click Create

First we’re going to give our Page a title – for example: Approved Requisitions (Past 30 Days)

Double Click then give your Page a title

Next, we’re going to attach the Live Data Model we created to our Story

Click on Add New Data
Click on Data from an existing dataset or model
Click on Select other model
Click on our Live Data Model

Now we’re able to use the data from our Live Data Model in our Story

Click on Chart in the Insert Toolbar
Click on our Chart, then on the Chart Type dropdown and select the Numeric Point

Now it’s time to give our chart some data

Click on Add Measure under Primary Values
Click the checkbox next to ReportingAmount

Now we can see a sum of all of our Approved Requisitions. However, we may (or rather we probably will) have Requisitions in more than one currency. To separate these Requisitions we’ll need to use Chart Filters

Click on our Numeric Point again to exit the Measure Selection
Click on Add Filters
Select the Column ReportingCurrency

From here we’ll be able to select which values of ReportingCurrency we’d like to see reflected in our ReportingAmount total. Given that it doesn’t make sense to sum totals in different currencies without first converting them, we’re going to select only a single currency

The data in your system may have different currencies than mine, so feel free to adjust accordingly

Select your currency, unselect Allow viewer to modify selections and click OK
We now have our total ReportingAmount for our first currency

While it’s great to have our total, the average end user is not going to know what ReportingAmount means. It’s time to give our Numeric Point a meaningful label

Click on Primary Value Labels under Show/Hide to turn off the lower ReportingAmount label
Double click on the Chart Title to edit
Type your Chart Title and use the Styling panel as desired
We can adjust the size of our Chart using the arrow in the corner

At this point, we have our Numeric Point set up and ready to go

Our finished Numeric Point

If your system has only one currency, you can leave it at that. If your system has more than one currency, you can duplicate the Numeric Point, then change the Chart Filter and Chart Title using the same steps we just followed

Click on Copy, then click Duplicate

Once we’ve finished with our currencies, it’s time to save

Click on Save
Enter a Name and Description then click on OK

Our Story is now finished and ready to be viewed

Sharing our SAP Analytics Cloud Story

Now we’ve created our Story, but we don’t want it to just sit inside our Files on SAP Analytics Cloud – we want people to use it. Let’s share our story with our colleagues

Click on Files
Click on the checkbox next to our Story, then click on Share under the Share menu
Click on Add Users or Teams
Select users you’d like to share the Story with then click OK
Click the checkbox to notify the users by email, then click Share
If we’d like to change the Access Type, we can do that here

Now we’ve shared our Story with users, and decided what kind of access we’d like them to have. This isn’t the only type of sharing available in SAP Analytics Cloud – for example you can learn about publishing to the Analytics Catalog here, and read about your other options in the Help Documentation

The Analytics Catalog is a single access point for SAP Analytics Cloud content vetted by your own Content Creators

Scheduling the Data Intelligence Pipeline

Now that our setup work is done and the users can view our SAP Analytics Cloud Story, we want to make sure the underlying data in our SAP HANA Document Store Collection is kept up to date on a regular basis

Since our SAP Data Intelligence Pipeline is responsible for truncating the data and supplying a new batch of data, we want to schedule it to run automatically. We can do this from Data Intelligence Cloud itself

Click on Monitoring
Click on Create Schedule
Write a description for our Schedule then choose our Pipeline under Graph Name
Choose how often our Pipeline will be run
Our Schedule has been created

When we create our first Schedule, we’ll see My Scheduler Status: Inactive. We don’t need to worry – our Scheduler’s Status is actually Active. To see it, we can click on Refresh

Click Refresh
Our Scheduler Status is Active

An Important Note on Scheduling Ariba APIs

You may remember back in our first Blog Post that our Pipeline waits twenty seconds between each call of the Ariba APIs. This is because each of Ariba’s APIs has rate limiting. These rate limits are cumulative

What that means for us is that for each Realm and API (for example MyCorp-Test Realm and Operational Reporting for Procurement – Synchronous API) we have a shared rate limit, no matter how it’s called

The Pipeline we provided in Part One of this blog series is optimized for performance – i.e. it makes calls as fast as Ariba’s rate limiting will allow

If there’s more than one instance of this Pipeline running at once, both will receive a rate limit error and no data will be uploaded to our Document Store Collections

Please keep this in mind when you plan for the scheduling of these pipelines, and refer to the Ariba API Documentation for up-to-date information on Ariba Rate Limits as well as how many Records are returned each time the API is called

Wrap-Up

Throughout this blog series we’ve shown how we can set up a pipeline that will get data from Ariba and persist it in HANA Cloud Document Store, as well as how we can schedule it to run periodically

We’ve also shown how we can create a Calculation View on top of these JSON Documents, and finally how we can create a Story in SAP Analytics Cloud that will let us visualise our Ariba Data

The data and models in these blog posts have been kept simple by design; in a productive scenario we will likely want much more complex combinations and views.

Ariba Analytics using SAP Analytics Cloud, Data Intelligence Cloud and HANA DocStore – Part 2

This is Part Two of a blog series on Ariba Analytics using SAP Analytics Cloud, Data Intelligence Cloud and HANA DocStore. If you would like to start with Part One, please click here

Recap

SAP Analytics Cloud makes it easy for businesses to understand their data through its stories, dashboards and analytical applications. Our worked example uses SAP Ariba Data to create an SAC Story that shows how much spend has been approved within Ariba Requisitions created in the last thirty days

A simple SAC Story tracking Approved Requisitions

In Part One of our blog series we discussed how we can retrieve data from SAP Ariba’s APIs using SAP Data Intelligence Cloud. We stored this data as JSON Documents in the SAP HANA Document Store

In this blog post, we’re going to build SQL and Calculation Views on top of our JSON Document Collection

Viewing our HANA DocStore Collection data in an SAP Analytics Cloud Story

Design-Time Artifacts in Business Application Studio

As we discussed in our last blog post, objects within HANA usually have both a design-time and runtime artifact. Design-time artifacts are useful because they fully describe the object and can be deployed consistently across multiple HDI Containers or even HANA instances

When we deploy our design-time artifacts, they will be created as runtime artifacts inside our HDI Container

Our JSON Document Collection has already been created, and is already storing our Ariba JSON Documents. From here, it’s time to model our other artifacts

Creating our SQL View

JSON Documents are useful in a variety of situations where you don’t have strict, predefined schemas. When we retrieve our data from the Ariba APIs, we may retrieve data that doesn’t map cleanly to a table schema (for example, data that is nested). Putting this data in the HANA DocStore Collection allows us to store the complete document, ensuring nothing is lost

In order for us to use this data for analytics, we’ll need to map it to some sort of schema. We can create a logical schema on top of our Collection using a SQL View. This allows us to access a predefined subset of our data for analytics while leaving the full data untouched in our Collection

We’ll create the SQL View in Business Application Studio

Click on View, then Find Command or press Ctrl+Shift+P
Use Find Command to find Create SAP HANA Database Artifact, then click on it
Select SQL View as the artifact type, and enter the artifact name then click on Create

SQL Views use the following format:

VIEW "aribaRequisitionSQLView"
AS SELECT "UniqueName", "Name", [...]
FROM "aribaRequisition"

If you’re familiar with SQL, you may recognise this as the same syntax that you would use to create a standard SQL View, just missing the word “CREATE”

The SQL View doesn’t duplicate any data, just provides a schema that we can use to access the underlying data

The data in our JSON Documents is stored as key-value pairs:

"Status":"Complete"

To retrieve the value “Complete”, we would SELECT “Status”

JSON Documents may also have nested data

"Content":{"ItemId":"3728754507"}

To retrieve the value “3728754507”, we would SELECT “Content”.”ItemId”, with the full stop marking nested keys

Our example will use the following SQL View:

VIEW "aribaRequisitionSQLView"
AS SELECT "UniqueName", "Name", 
"TotalCost"."AmountInReportingCurrency" AS "AmountInReportingCurrency", 
"ReportingCurrency"."UniqueName" AS "ReportingCurrency", 
"ApprovedState", "Preparer"."UniqueName" AS "Preparer",
"Requester"."UniqueName" AS "Requester", "StatusString", 
"CreateDate", "SubmitDate", "ApprovedDate", "LastModified", 
"ProcurementUnit"."UniqueName" AS "ProcurementUnit"
FROM "aribaRequisition"

The fields we’re using are only a fraction of the fields available in the Documents within our Collection – if we want to customize the scenario later, there are plenty more to choose from

We want to make sure this SQL View is deployed and ready for use, so click on the Deploy rocket

We can deploy our SQL View under SAP HANA Projects on the left

Creating our Calculation View

While we’re in Business Application Studio, we’re going to create our Calculation View. This Calculation View is what we’ll be consuming in SAP Analytics Cloud

As before, we’re using View->Find Command then Create SAP HANA Database Artifact

Choose Calculation View, enter a Name then click on Create

Business Application Studio has an inbuilt editor for Calculation Views, which we’ll use to create ours

Click on Aggregation, then click the Plus symbol
Search for our SQL View, select it, then click Finish

Now that our SQL View is available as a Data Source, we want to make sure its columns end up in our Calculation View

Click on Aggregation, then click on Expand Details
Click on our SQL View on the left then drag and drop to Output Columns on the right
Our SQL view columns will now be available in our Calculation View

Because this is a Calculation View of type Cube (rather than Dimension), we’ll need to make sure it includes at least one Measure

The columns in our SQL View all have the default data type NVARCHAR(5000). If we try to mark the AmountInReportingCurrency column as a Measure directly, it will be treated as a string – giving us only the Aggregation options COUNT, MIN and MAX

We want to treat this column as the number it is – as a workaround, we’ll need to create a Calculated Column

Creating our Calculated Column

A Calculated Column is an output column that we create within the Calculation View itself. Rather than being persisted, the values are calculated at runtime based on the result of an expression

For our example, we’re using a very simple expression. First, we have to make our way to the Expression Editor

Click on Calculated Columns
Create a Calculated Column using the Plus symbol, then Calculated Column
Click on the Arrow

Next we’re going to give our Calculated Column a name and data type. Because the granularity of our example is the Requisition-level and not the item-level, the decimal points won’t meaningfully change the results. Given that, we’re going to use the Integer data type

Give the Calculated Column a Name, and choose the Data Type Integer
Choose Measure as the Column Type
Click on Expression Editor

The Expression Editor is where we’ll define how the column is calculated. Select our AmountInReportingCurrency Column

Select our Column from the left
Our Column is in the Expression

Our Calculated Column will take the value of AmountInReportingCurrency and convert it to an Integer
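
As a point of reference, and assuming the Expression Editor is set to the SQL expression language, the expression could also make the conversion explicit – this is only a sketch, since the Integer data type we chose already performs the conversion for us:

-- Explicitly convert the NVARCHAR column to an integer value
TO_INTEGER("AmountInReportingCurrency")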

Now we want to validate the syntax of our Expression

Click on Validate Syntax
Our Expression is valid

We have one last thing to do inside our Calculation View – we want to filter the data to only include Approved Requisitions. If we want to use the Value Help to set our Filter, we’ll need to Deploy the Calculation View

Deploy our Calculation View
Click on Filter Expression
Click on ApprovedState under Columns
Add an Equals Sign (=) then click on the Value Help
Select Approved then click OK
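
Assuming the status value in your system is also Approved, the resulting Filter Expression should look similar to this:

"ApprovedState" = 'Approved'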

Now we can check the syntax of our Filter

Click on Validate Syntax
Our Filter is valid

Before we Deploy our Calculation View, we want to make sure that we're only sending our integer Calculated Column and not the string version. To do this, we go to the Semantics Node

Click on Semantics, then Columns
Check Hidden for our AmountInReportingCurrency Column to exclude it from our Calculation View

All of the Columns we need, including our new Calculated Column, are available within the Calculation View. Now we're ready to Deploy it one last time

Once again, click on the Deploy Rocket under SAP HANA Projects

Checking our Runtime Artifacts

Now that we’ve finished deploying our Design-time artifacts, we’ll have the corresponding Runtime artifacts inside of our HDI Container. We can check these by going to SAP HANA Database Explorer from within Business Application Studio

Click on Open HDI Container on the left under SAP HANA Projects

In the Database Explorer, we want to first check on our SQL View

Click Views on the left, then click on our SQL View
Our SQL View

We can see all of the Columns in our created SQL View. If we want to check out some of the data returned by our SQL View, we can click on Open Data

Click on Open Data
Data from our SQL View is displayed
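
If we prefer, we can also sanity-check the View with an ad-hoc query in the SQL Console. This is only a sketch using the view name from our example – adjust it to match your own artifact:

-- Count Requisitions per approval state using our SQL View
SELECT "ApprovedState", COUNT(*) AS "RequisitionCount"
FROM "aribaRequisitionSQLView"
GROUP BY "ApprovedState";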

Next it’s time to check on our Calculation View

Click Column Views on the left, then click on our Calculation View
Our Calculation View
Click on Open Data

Database Explorer will open our Calculation View for Analysis. We’re going to do our analysis in SAP Analytics Cloud, so for now we just want to verify the Raw Data

Click on Raw Data
Data from our Calculation View is displayed
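
The deployed Calculation View can also be queried like any other view in the SQL Console. As a rough sketch – the view name below stands in for whatever name you entered when creating the Calculation View, possibly prefixed by a namespace – we can preview the aggregation we'll later build in SAP Analytics Cloud:

-- Sum the ReportingAmount measure per currency for our Approved Requisitions
SELECT "ReportingCurrency", SUM("ReportingAmount") AS "TotalApprovedSpend"
FROM "aribaRequisitionCalculationView"
GROUP BY "ReportingCurrency";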

Wrap-Up

During this blog post we’ve built a SQL View and Calculation View on top of our HANA DocStore Collection. We’ve also made sure that our Calculation View only contains Approved Requisitions

In the third and final blog post we’ll consume our Calculation View as a Live Data Model before visualizing it in an SAP Analytics Cloud Story. We’ll also schedule the Data Intelligence Pipeline we created in our first blog post so that the data in our HANA DocStore Collection is updated on a regular basis automatically

Ariba Analytics using SAP Analytics Cloud, Data Intelligence Cloud and HANA DocStore – Part 1

Introduction

SAP Analytics Cloud makes it easy for businesses to understand their data through its stories, dashboards and analytical applications. However, sometimes we might not be sure how we can leverage SAC to create these based on data from other applications

For this worked example, we’re going to make use of SAP Data Intelligence Cloud to retrieve data from SAP Ariba through its APIs, before storing it in SAP HANA Cloud’s JSON Document Store

In further blog posts, we will build a model on top of this stored data and show how this can be consumed in an SAP Analytics Cloud Story. The focus of this series of blog posts is to show a technical approach to this need, not to provide a turnkey ready-to-run SAC story. After following this series you should have an understanding of how you can prepare your own stories using this approach

Ariba Analytics in SAP Analytics Cloud

For this example, we’re going to create a simple story that lets you know how much spend has been approved within Ariba Requisitions created in the last thirty days

A simple SAC Story tracking Approved Requisitions

A Requisition is the approvable document created when a request is made to purchase goods or services. Our approach will let us view only the Approved Requisitions, excluding those still awaiting approval

For those feeling more adventurous, this setup can be repeated with different document types, and those combined to create more in-depth SAP Analytics Cloud Stories. This is outside the scope of our blog series

Solution Overview

Our finished solution will need SAP HANA runtime artifacts such as Document Store Collections, SQL Views and Calculation Views. We will define these as design-time artifacts in Business Application Studio, then deploy them to an HDI Container within our SAP HANA Cloud instance

Deploying our Design-time artifacts into SAP HANA Cloud

Using a scheduled SAP Data Intelligence Cloud Pipeline, we’ll query SAP Ariba’s APIs and place the data within our HANA Cloud Document Store Collection

Scheduled replication of Ariba Data

Our SQL View gives us a relational view on top of the data within our JSON Documents. Creating a Calculation View on top of one or many SQL Views will let us expose the data to SAP Analytics Cloud

Viewing the data in SAP Analytics Cloud

SAP Analytics Cloud can use HANA Cloud Calculation Views as the source for Live Data Models. With Live Data Models, data is stored in HANA Cloud and isn’t copied to SAP Analytics Cloud

This gives us two main benefits: We avoid unnecessarily duplicating the data, and ensure changes in the source data are available immediately (provided no structural changes are made)

Finally, we use the Live Data Model to create a Story within SAP Analytics Cloud. Once we’ve got everything set up, we can use this story to check our data at any time, with the Data Intelligence Pipeline refreshing the data in the background on a predefined schedule

Creating an Ariba Application

In order to access the APIs provided by Ariba, we’ll need to have what’s known as an Ariba Application. We do this through the SAP Ariba Developer Portal

For our use case we will be requesting access to the Operational Reporting for Procurement API

From the Ariba Developer Portal, click on Create Application
Click on the Plus Symbol
Enter an Application Name and Description then click on Submit

Once the Application has been created, we’ll need to request API access for the Application

Click on Actions, then Request API Access
Select the Operational Reporting for Procurement API, then select your Realm and click on Submit

Once the API Access Request has been approved by Ariba, your admin will be able to generate the OAuth Secret for our application

Your Ariba admin can click on Actions, then Generate OAuth Secret

This will generate our OAuth Secret, which is required to use the API. The secret will only be displayed once, so the admin should (securely) store this and provide it to you for use in the application

If the OAuth Secret is lost, the admin can regenerate it, at which point the old secret will stop working and you will have to use the newly generated secret

Ariba API

When we call the Ariba API, we have a number of things to consider. For our example, we’re using the Synchronous API to retrieve data, but there’s also a set of Asynchronous APIs you should consider when retrieving bulk data

Documentation is available online

In addition, when retrieving data sets, you have to specify an Ariba View that you wish to retrieve. These are similar to reporting facts in the Ariba solution, such as Requisition or Invoice. Views will specify which fields are returned, and may also specify filters you should provide when calling them

To simplify our example we’re going to use a System View, which is predefined in Ariba. You are also able to work with Custom Views using the View Management API to better match your requirements but this falls outside the scope of this blog series

To explore these at your own pace, you can visit developer.ariba.com

Enabling Document Store in HANA Cloud

The Document Store is SAP HANA Cloud‘s solution for storing JSON Documents. While the Column and Row Stores use Tables to store their data, the Document Store stores data inside Collections
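
To make that a little more concrete, here is a minimal, purely illustrative sketch of how a Collection behaves in SQL once the Document Store is enabled (the collection name and document below are hypothetical, not the artifacts we'll deploy later):

-- Create a Collection - no columns are defined, as Documents are schemaless JSON
CREATE COLLECTION "demoRequisitions";

-- Insert a JSON Document, including nested values
INSERT INTO "demoRequisitions" VALUES ('{"UniqueName": "PR100", "StatusString": "Approved", "TotalCost": {"AmountInReportingCurrency": "150"}}');

-- Query the Document, reaching into the nested structure with dot notation
SELECT "UniqueName", "TotalCost"."AmountInReportingCurrency" FROM "demoRequisitions";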

Before we activate the Document Store in HANA Cloud, just a word about resources. Like the Script Server, the Document Store is an additional feature that can be enabled; however, we should consider our HANA Cloud instance's current resourcing before enabling it.

When we’re ready to enable, we’ll need to navigate to SAP HANA Cloud Central.

From the BTP Control centre, we select our Global Account and click Open in Cockpit
From here we see our Subaccounts – we choose the Subaccount where our HANA instance resides
From our Subaccount, we click on Spaces
From the Spaces page, we select the Space that contains our HANA instance
Click on SAP HANA Cloud
Click on Actions, then Open In SAP HANA Cloud Central

From HANA Cloud Central, we can then activate the Document Store

Click on the dots, then choose Manage Configurations
Click on Edit
Go to Advanced Settings, select Document Store then click on Save

Once our HANA Cloud instance has restarted, we’ll be able to use the Document Store

Creating a DocStore Collection in Business Application Studio

While we can create a Collection directly using SQL through Database Explorer, we want to make sure we also have a design-time artifact for our DocStore Collection

To do this, we’ll use the Business Application Studio. For those unfamiliar with Business Application Studio, you can follow this Learning Journey Lesson to set up a Workspace – we’ll assume this is already in place

It’s time to set up our SAP HANA Database Project, and create the HDI Container where our runtime objects will reside

Creating our Project
Select SAP HANA Database Project

Next we’ll need to provide some information for our project

Give our Project a name and click Next
Leave the Module name as is and click Next

Double check the Database Version and Binding settings then click Next

Setting our Database Information

Next we have to bind the project to a HANA Cloud instance within Cloud Foundry. The Endpoint should be automatically filled, but we have to provide our Email and Password before we can perform the binding

Binding our Cloud Foundry Account

For this example we’re going to create a new HDI Container

If our Cloud Foundry space has more than one HANA Cloud instance, we may want to disable the default selection and manually choose the HANA Cloud instance where our container will reside

Creating our HDI Container

Now that we have our HDI Container and SAP HANA Project set up, it’s time to create our design-time objects. First, we login to Cloud Foundry

Click on View, then Find Command or press Ctrl+Shift+P
Search and select CF: Login to Cloud Foundry, then follow the instructions before selecting the Space with our HANA Cloud instance

Next, we’ll create our DocStore Collection

Use Find Command again to find Create SAP HANA Database Artifact, then click on it
Ensure that the artifact type is Document Store Collection, name is aribaRequisition and that the artifact will be created within the src folder of a HANA Project, then click on Create
Finally, we want to find our SAP HANA Project on the Explorer on the left, and click on the rocket icon to Deploy

After the deployment is successful, we have both our design-time .hdbcollection artifact, as well as the runtime DocStore collection which has been created in our HDI Container
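
For reference, the design-time artifact itself is tiny. Based on the naming used in this series, the aribaRequisition.hdbcollection file should contain a single statement along these lines (worth double-checking against the file that was generated for you):

COLLECTION "aribaRequisition"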

Creating our Connections in Data Intelligence

So far we’ve gained access to Ariba APIs and enabled the Document Store in our HANA Cloud Instance. Next, we’ll be setting up two Connections in Data Intelligence Cloud

The first Connection will allow our Data Intelligence Pipeline to query the Ariba APIs to retrieve our data, and the second will allow us to store this data in the Document Store within our HDI Container

First, we use the DI Connection Manager to create a new Connection, selecting OPENAPI as the Connection Type

Create a new Connection in the Connection Manager

Our OpenAPI Connection will be used to send the request to Ariba. We’re going to set the connection up as below, using the credentials we received when we created our Ariba Application

Using our Ariba Application OAuth Credentials to create the OpenAPI Connection

Next, we’re going to create a HANA Connection that will let us work with the HDI Container we created earlier. To get the credentials, we have to go to the BTP Cockpit

Select our HDI Container from the SAP BTP Cockpit
Click on View Credentials
Click on Form View

We’ll want to keep this window open as we create our HANA DB Connection, as it has the details we need. Within Data Intelligence Cloud, create a new connection of type HANA_DB and fill it out as below using the credentials

Enter the credentials to create our HDI Connection

While we have the credentials open, take note of the Schema name. We’ll need this to set up our pipeline

Pipeline Overview

The source code for our pipeline is provided as a JSON file. Copy the contents of this JSON into a new Graph within the Data Intelligence Modeler. If you're not familiar with how to do this, you can refer to the README

When the pipeline starts, a GET request is made to the Ariba API. If there are more records to be fetched, the pipeline will make further requests until it has all available data. To avoid breaching Ariba’s rate limiting, there is a delay of 20 seconds between each call

Fetching data from Ariba

Once all of the records have been fetched, the Document Store Collection is truncated to remove outdated results, and the most up-to-date data is inserted into our Collection (a SQL sketch of this pattern follows the steps below)

Updating records
  1. A copy of the data is stored as a flat file in the DI Data Lake as reference
  2. The HANA Document Store Collection is truncated, and Documents are added to the Collection one at a time
  3. Once all records have been added to the Collection, the Graph will be terminated after a configurable buffer time (1 minute by default)
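
Conceptually, the truncate-and-reload step boils down to SQL like the following against our Collection – this is only an illustration, as in the actual Graph the pipeline's operators take care of it:

-- Remove the previous batch of Documents from the Collection
DELETE FROM "aribaRequisition";

-- Insert each freshly fetched Requisition Document (one INSERT per record)
INSERT INTO "aribaRequisition" VALUES ('{"UniqueName": "PR100", "ApprovedState": "Approved", "StatusString": "Complete"}');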

Configuring our Pipeline

In order to run this pipeline, you will have to make a few changes:

In the Format API Request Javascript Operator, you should set your own values for openapi.header_params.apiKey and openapi.query_params.realm

You can edit this code from within the Script View of the Format API Request Operator

If your Connection names are different to ARIBA_PROCUREMENT and ARIBA_HDI, then you will want to select those under Connection for the OpenAPI Client and SAP HANA Client respectively

Changing the Connection for the OpenAPI Client
Changing the Connection for the HANA Client

Check the Path values for the operators “Write Payload Flat File” and “Write Error Log”. These are where the pipeline will write the Flat File and API Error Logs respectively. If you’d like them saved elsewhere, edit the paths here

Setting the log paths

Finally, we’ll want to set the Document Collection Schema name in the DocStoreComposer Operator. This is the Schema we noted earlier while setting up the Connections

View the Script for our DocStoreComposer Operator
Add the Schema to the DocStoreComposer Operator

Testing our Pipeline

Now we’re ready to test our pipeline. Click on Save, then Run

Testing our Pipeline

Once our pipeline has completed successfully, we’ll be able to see that our JSON Documents are stored within our Collection by checking in the Database Explorer. We can access this easily through Business Application Studio by clicking the icon next to our SAP HANA Project

Getting to the Database Explorer from Business Application Studio

We can see that our pipeline has been successful, and that 206 JSON Documents have been stored in our Collection

Our Collection contains 206 Documents

Wrap-Up

In this blog post we’ve walked through how we can use SAP Data Intelligence Cloud to extract data from SAP Ariba, before storing it in a collection in SAP HANA Cloud’s Document Store
