In this blog, I will cover getting BW Business Content data into Databricks using the following 4 steps:
- Enable the BW Data Product Generator in the SAP BW (7.5 or BW/4HANA) instance
- Run the BW Data Product Generator on the BW Content
- Create the Data Product based on the BW data
- Delta Share the Data Product to Enterprise Databricks
Step 1 – Enable the BW Data Product Generator in the SAP BW (7.5 or BW/4HANA) instance.
Installation documentation is available via SAP Note 3590400: https://me.sap.com/notes/3590400/E
To install the BW Data Product Generator, the following prerequisite currently applies (speak to your SAP team about potential options):
You have an SAP Business Warehouse, private cloud edition system (in SAP Business Data Cloud) in place and the object store in SAP Datasphere is enabled. Now, you want to connect your SAP BW system to your existing SAP Business Data Cloud tenant, and the connection is not yet configured.
In addition to following the installation guide, make sure the Note Analyzer for the Data Product Generator is run successfully.
The good news with the SAP_BW_BDC_CONFIGURATION task is that it is now re-runnable, and you can reuse the same target space (BWDPG in my case). I won’t go through the details of running the installation – it is relatively straightforward using the documentation – just make sure you have the certificates correctly assigned in the BWPCE STRUST.
Once the BW DPG is successfully installed, you can access the Data Subscriptions tile (in BW/4HANA):

Associated Details:

Also available via the SAP GUI, for use in BW 7.5:

Step 2 – Run the BW Data Product Generator on the BW Content
In the next steps, I will use the BW Cockpit version (the process is similar using the GUI).
First, create and select a source:

The following BW InfoProviders can be used in a subscription for the BW DPG:
- Base Providers: InfoCubes, DataStore Objects (classic and advanced), InfoObjects (master data)
- CompositeProviders, MultiProviders
- Queries: Query-as-InfoProvider

Then select the BW Content that has been activated.
Once saved, the subscription still needs to be activated, as per the message below:

Also, as part of the Data Subscription:
- Settings: allows choosing Full or Delta execution mode, applying filters to the data extraction, and adding Process Chain Variants to the subscription.
- Projections: allows simplifying the output columns.


Once you have configured any settings, you can activate the Data Subscription:

Activating the Subscription will also create the Datasphere Local Table in the BWDPG space (accessible from the link in the BW DPG or via Datasphere):

Now the target table exists in the object store, which can also be seen in the BW DPG.
Side note: I originally deployed the BW DPG using an earlier release, and the target objects had a Semantic Usage type of Local Table (Relational Dataset). After applying some recent notes, it is now Local Table (Fact).
As the Data Subscription is now ready, it can be run directly from the Subscription screen:

Viewing the executions:

Now the data from BW exists in the Datasphere object store and, if selected, can be delta updated as required.
I have shown the BW/4HANA Cockpit for the above, but the same functionality is also fully available via the GUI:

Viewing the BW Data from Datasphere:

Step 3 – Create the Data Product based on the BW Data
I won’t cover every step in this process, as there are other blogs that do, but essentially the new Data Product will use the BWDPG space:

Then add the two BW-generated objects – Purchase Order Header and Line Items:

Then change the status of the Data Product to Listed. Once the Data Product is Listed, it can be Delta Shared in the BDC Cockpit.

Step 4 – Delta Share the Data Product to Enterprise Databricks.
Within the Business Data Cloud Cockpit, the new BW Data Product is available:

Sharing the Data Product to Enterprise Databricks is done via the Share button. I have already added the BDC Partner Connector for the Databricks instance (covered separately from this blog).

The sharing to Enterprise Databricks is quick and easy (no ETL effort required):


Within the Databricks environment, access the Catalog Explorer and the BDC Connect provider:

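As a side note, the provider and its shares can also be inspected from a notebook rather than the Catalog Explorer. This is a minimal sketch using standard Databricks SQL from PySpark; the provider name below is a placeholder, not the actual name created by the BDC Partner Connector in your workspace.

```python
# spark is the SparkSession available by default in a Databricks notebook.

# List the Delta Sharing providers registered in this workspace.
# The BDC Connect provider should appear here once the partner connection is set up.
spark.sql("SHOW PROVIDERS").show(truncate=False)

# List the shares exposed by a specific provider.
# 'bdc_connect_provider' is a placeholder -- use the provider name shown above.
spark.sql("SHOW SHARES IN PROVIDER `bdc_connect_provider`").show(truncate=False)
```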
The difference with Delta Sharing from BDC to Enterprise Databricks (as compared to SAP Databricks) is that you need to select a mount point for the Delta share:

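If you prefer a notebook to the Catalog Explorer UI, the share can also be mounted as a catalog with standard Unity Catalog SQL. This is a minimal sketch; the provider, share, and catalog names are placeholders for whatever your BDC Connect provider actually exposes.

```python
# Mount the Delta share from the BDC Connect provider as a Unity Catalog catalog.
# All three names below are placeholders -- replace them with the provider, share,
# and catalog (mount point) names relevant to your environment.
spark.sql("""
    CREATE CATALOG IF NOT EXISTS sap_bw_purchase_orders
    USING SHARE `bdc_connect_provider`.`purchase_order_data_product`
""")
```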
Once mounted, the SAP Data Product can be accessed directly in the Catalog:

From here, users can apply all the advanced features of Databricks to the SAP BW data.
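For example, the shared tables can be queried with PySpark like any other Unity Catalog table. This is a minimal sketch; the catalog, schema, table, and column names below are hypothetical, so substitute the names shown in your Catalog Explorer after mounting the share.

```python
from pyspark.sql import functions as F

# spark is the SparkSession provided by the Databricks notebook environment.
# Table names are hypothetical -- replace with the mounted catalog/schema/table names.
po_header = spark.table("sap_bw_purchase_orders.bwdpg.purchase_order_header")
po_items = spark.table("sap_bw_purchase_orders.bwdpg.purchase_order_items")

# Example: total net order value per purchasing organisation
# (join key and measure/attribute column names are also hypothetical).
result = (
    po_items.join(po_header, on="purchase_order", how="inner")
            .groupBy("purchasing_org")
            .agg(F.sum("net_order_value").alias("total_order_value"))
            .orderBy(F.desc("total_order_value"))
)

result.show()
```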
This has been a basic end-to-end showcase – hopefully I will have time to add tips and tricks as we build out more customer use cases.