In this blog I will describe the steps needed to use Live Smart Predict (aka Live Predict), which is now available with SAP Analytics Cloud (SAC). It allows Smart Predict to be used with live (remote) datasets that reside on-premise in SAP HANA. The data is not imported into SAC; it stays on-premise.
For this HANA connection, we require the SAP Cloud Connector (SCC), an SAC data repository, and an SAC (live) dataset.
Components Needed
- SAP HANA (on-premise) with the Automated Predictive Library (APL)
- SAP Cloud Connector (we will install this)
- SAP Cloud Platform Account
- SAP Analytics Cloud CF Tenant
Steps Required
- Link SAC Tenant to SAP Cloud Platform Account
- Verify SAP HANA (on-premise) has the Automated Predictive Library installed
- Install the SAP Cloud Connector
- Configure the SAP Cloud Connector
- Add SAC Data Repository
- Create SAC Dataset
- Create Predictive Scenario
1. Link SAC Tenant to SAP Cloud Platform Account
In the SAC environment we need to link the tenant to an SAP Cloud Platform user account.
This is done by simply setting your username (email) under System > Administration > Datasource Configuration.

Once complete, you will see a Subaccount and region host; these will be entered into the Cloud Connector configuration later.

2. Verify SAP HANA (on-premise) has the Automated Predictive Library installed
Within HANA, we use the Automated Predictive Library (APL) version 4, release 1906 or higher; this can be checked with the SQL call statement below.
If the APL is not installed, you are missing permissions, or the version is too old, fix that before continuing; the hdbsql sketch after the call statement also shows a possible role grant.
call "SAP_PA_APL"."sap.pa.apl.base::PING"(?)

3. Install the SAP Cloud Connector
For simplicity I installed the Linux version of the SAP Cloud Connector (SCC) on the HANA box.
The Cloud Connector requires a JVM; to make things easier, download the SAP JVM.
https://tools.hana.ondemand.com/additional/sapjvm-8.1.059-linux-x64.zip
https://tools.hana.ondemand.com/additional/sapcc-2.12.1.1-linux-x64.zip
We see the two files downloaded onto the Linux box.

Unzip the JVM by extracting it to the target directory.
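Something like the following does the job; the /opt/sap target directory is just my choice, adjust to taste.
## Extract the SAP JVM (target directory is arbitrary)
unzip sapjvm-8.1.059-linux-x64.zip -d /opt/sap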

Check if there is a JAVA_HOME already set
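Echoing the variable shows whether it is set; if nothing comes back, point it at the JVM we just extracted (assuming the archive unpacked into sapjvm_8 under /opt/sap).
## Show the current JAVA_HOME (empty output means it is not set)
echo $JAVA_HOME
## If unset, point it at the extracted SAP JVM
export JAVA_HOME=/opt/sap/sapjvm_8
export PATH=$JAVA_HOME/bin:$PATH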

Install the Cloud Connector package
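The archive contains the RPM package; extracting and installing it looks roughly like this.
## Extract the Cloud Connector archive and install the RPM inside it
unzip sapcc-2.12.1.1-linux-x64.zip
rpm -ivh com.sap.scc-ui-2.12.1-5.x86_64.rpm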

If you see any errors during installation, it is easiest to correct them by uninstalling and re-installing with rpm commands similar to the following.
## Query for scc package
rpm -qa |grep scc
## Erase / Uninstall
rpm -e com.sap.scc-ui-2.12.1-5.x86_64
## Install Package
rpm -ivh ./com.sap.scc-ui-2.12.1-5.x86_64.rpm
The SAP Cloud Connector daemon (scc_daemon) can be controlled with the usual systemctl commands.
systemctl status scc_daemon
systemctl restart scc_daemon
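To make sure the daemon also comes back after a reboot, enable it at boot.
## Start the Cloud Connector automatically at boot
systemctl enable scc_daemon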

4. Configure the SAP Cloud Connector
We need to open the Cloud Connector homepage; you may see some warnings because the SSL certificate is self-signed. I had to use Safari in private browsing mode to get it to open and accept the insecure (untrusted) SSL certificate. The certificate can be replaced with a signed one to avoid this issue.
https://<hostname>:8443
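A quick way to confirm the administration UI is being served, before fighting the browser, is curl; the -k flag skips verification of the self-signed certificate.
## Check the Cloud Connector UI responds (-k tolerates the self-signed cert)
curl -k https://<hostname>:8443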

The default credentials are Administrator / manage; you will be prompted to change the password on first login.
We connect to the SAC tenant Subaccount from step 1.

Once the Subaccount is defined, we add a “Cloud To On-Premise” connection

The wizard guides us through mapping the internal physical host to a virtual host, as in the example below.
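For example (hostnames here are hypothetical), the internal physical host hana01.internal.corp on SQL port 30015 could be exposed as the virtual host hanadb:30015; SAC only ever sees the virtual name, never the physical one.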




We see the mapping is added and the host is reachable.

5. Add SAC Data Repository
We use this Cloud Connector configuration to specify the remote data repository, entering the virtual host and port defined in step 4.

Verify the connection works

6. Create SAC Dataset

Choose datasource

Connect to Live Data Repository

Choose our data repository

Select the desired table or SQL view

This becomes our dataset

We get a data preview of our remote dataset

7. Create Predictive Scenario
In just a few clicks we can create a predictive scenario using our live dataset.

Select the required dataset

We set the model parameters, selecting the target variable and the input fields.

Once the model has been trained, we get some useful output.

From here we can choose to keep the model, modify the model parameters, add a new model to our scenario, or apply the existing model to a separate dataset.