Predicting the next move of the virus COVID-19 with SAP Analytics Cloud (ERP Q&A, 31 March 2020)
The past few weeks have been straight out of an unpleasant movie. We have witnessed how one microscopic organism crept out of a seafood market and, inch by inch, human by human, charted an unprecedented journey across the globe. In one swift move, nature has thrown down the gauntlet, and the only armour we really have is our own immune system – and information. As it turns out, the information at least is publicly available.

So it is, that in this blog post we will track the illustrious career of this seemingly ubiquitous virus as it goes about redefining the new “normal” of our times. We also attempt to foretell its next move. What we need is a tool that can crawl around the numbers and prod the patterns out to make a reliable prediction. SAP Analytics Cloud lends itself to the task, in what I find to be the easiest and by far the fastest way, to get a predictive forecast up and running. It uses a nifty feature named much humbly and without fanfare as – you guessed it – “Forecast”. Without further ado, let’s roll our sleeves up and dive into this rabbit hole!

Step 1: Get the data

There are three key sources providing regular updates of COVID-19 cases and deaths globally and by country: Johns Hopkins University, the World Health Organization (WHO) and the European Centre for Disease Prevention and Control (ECDC). All three sources are summarised pretty well here: https://ourworldindata.org/covid-sources-comparison. On this page there are two CSV files – one for confirmed cases and one for deaths. I use confirmed cases in this blog post, but you may just as well use the one for deaths and perform the same analysis. Download the CSV file to your local machine.

A snippet of the data looks like this.

You can see the numbers from all three sources: Johns Hopkins, WHO & ECDC. I will use Johns Hopkins in this blog post.
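For illustration, here is how such a multi-source file can be read in Python. The column names and figures below are assumptions for the sketch, not the actual layout of the downloaded file:

```python
import csv
import io

# Toy snippet in the spirit of the comparison file: cumulative confirmed
# cases per day, one column per source (names and numbers are illustrative).
snippet = '''date,who_confirmed,ecdc_confirmed,jhu_confirmed
"Mar 19, 2020",209839,213254,242708
"Mar 20, 2020",234073,242488,272166
'''

rows = list(csv.DictReader(io.StringIO(snippet)))
jhu = [int(r["jhu_confirmed"]) for r in rows]

print(rows[0]["date"])  # Mar 19, 2020
print(jhu)              # [242708, 272166]
```

The quoted dates hint at why the "MMM DD, YYYY" format needs special handling later: the comma inside the value would otherwise be read as a field separator.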

Step 2: Upload the data to SAP Analytics Cloud

We begin by creating a model.

Our file is on the local system, so the choice is straightforward.

You will be prompted to select your file from your location.

Allow SAP Analytics Cloud to detect the file type and headers. Hit Import.

Your data is in. For our time series forecast to work correctly, we need the Date format to be recognised correctly. This file has data in MMM DD, YYYY, an unusual date format. We will juggle the fields around to bring it to a form SAP Analytics Cloud can recognise. First, we separate the date, month and year. Subsequently we will concatenate the fields to resemble MMM-DD-YYYY (which is one of the many accepted date formats). Click on extract everything before the “,”. Subsequently click on extract everything after the “,”. This should give you 2 columns.

Click on Split on “ “ to separate the MMM and DD fields.

Double click on the headers to rename the fields. Once renamed to YYYY, MMM & DD, click on the fx in the ribbon to create a new column in desired format.

Enter the below formula. And click on preview to take a look at the formula sample results.

I delete all other columns to keep things simple.

Lastly, I click on the Date field and update the Type to Date. The date format is auto selected. You can click on the Date Format to review the other accepted date formats. Once done, click on create model at bottom right.
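The split-and-concatenate sequence above can be mirrored in plain Python, which also lets us sanity-check that the result parses as a date:

```python
from datetime import datetime

raw = "Mar 19, 2020"  # the unusual MMM DD, YYYY format in the file

# Step 1: extract everything before and after the comma
left, year = (part.strip() for part in raw.split(","))
# Step 2: split on the space to separate MMM and DD
month, day = left.split(" ")
# Step 3: concatenate into MMM-DD-YYYY, one of the accepted date formats
reformatted = f"{month}-{day}-{year}"
print(reformatted)  # Mar-19-2020

# Sanity check: the reformatted string round-trips through a date parser
parsed = datetime.strptime(reformatted, "%b-%d-%Y")
print(parsed.date())  # 2020-03-19
```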

After validating the data and checking for errors, you will be prompted to confirm that you wish to create the model. Hit create.

Save your file in a folder location of your choice, on the Cloud tenant. Hit ok.

And you’re done! But wait. The numbers in the raw file are cumulative. This essentially means that Day 1 has x cases, Day 2 has x+y cases, Day 3 has x + y + z cases and so on. To meaningfully add up all the numbers, I need each entry to be the new cases of that day. That is Day 1 to be x, Day 2 to be y and Day 3 to be z and so on. Thankfully, a few clicks can fix this in SAP Analytics Cloud. Click on Model on top right and click on Account.

Click on Johns Hopkins (or whichever source you wish to use for forecasting) and on the Member Details tab on the right, scroll down to Exception Aggregation Type. This basically informs SAP Analytics Cloud that the aggregation is not a straightforward sum. It will need to select the latest value of the cases by date and use that for aggregation.

Select Last in the Exception Aggregation Type.

Select the Date field as the Exception Aggregation Dimension.

Hit save on the model to finish data preparation.
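Why a plain sum goes wrong on cumulative data, and what the LAST exception aggregation does instead, can be sketched in a few lines (illustrative numbers):

```python
# Cumulative confirmed cases per day: Day 1 = x, Day 2 = x + y, Day 3 = x + y + z
cumulative = {"Mar 01": 100, "Mar 02": 150, "Mar 03": 230}

# A plain SUM across the Date dimension double-counts earlier cases
naive_sum = sum(cumulative.values())
print(naive_sum)  # 480 -- meaningless as a "total cases" figure

# Exception aggregation LAST: take the latest value along the Date dimension
last = list(cumulative.values())[-1]
print(last)  # 230 -- the actual running total

# Equivalently, daily new cases are the differences of the running total
values = list(cumulative.values())
daily_new = [values[0]] + [b - a for a, b in zip(values, values[1:])]
print(daily_new)  # [100, 50, 80]
```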

Step 3: Create your forecast

We start, by creating a story. Go to Home and click on create a story.

On this page choose from the readily available templates or build your own story from scratch based on your needs. I choose canvas, but you can really choose any starting point.

I will add a chart to start.

I will be prompted to select a data model. I will select the one I just created.

In my newly created chart, I will select the chart type as line.

I will add Confirmed cases as the measure for the chart.

And date as the dimension.

By default, fields with data type date are created as hierarchies. SAP Analytics Cloud recognises the layers of Date to Month to Quarter to Year and refers to this as “levels”. I select level 4 as my display choice. You can play with the other options to understand this better.

I can see my trend for global cases show up in the chart. I am keen to understand the trend in March, so I drill down further.

At this point, adding a forecast is a few simple clicks away!

Click on the 3 dots at the top right of the chart to reveal the chart options. Click on Add > Forecast > Automatic Forecast.

Clicking on a future date, shows me the forecasted numbers for the corresponding date.

Click on the forecast link in green on the chart to understand this forecast a little better. It has automatically chosen 3 time periods to learn from the 3-week period available in this month.

Click on the confirmed cases, to uncover that SAP Analytics Cloud has high confidence in the predicted numbers.

You can play around with the 2 other options, Linear regression and Triple Exponential Smoothing to see the difference in results.

I notice the narrowest confidence bounds on the triple exponential smoothing and settle for this prediction.
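To build some intuition for these options, here is a minimal pure-Python sketch of a linear-trend forecast. It illustrates the idea behind the Linear Regression option only; it is not SAP Analytics Cloud's actual implementation, and the daily figures are made up:

```python
# Ordinary least-squares fit of a straight line through a daily series,
# then extrapolation 'horizon' steps ahead.
def linear_forecast(series, horizon):
    n = len(series)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(series) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, series))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return [intercept + slope * (n + h) for h in range(horizon)]

daily_cases = [10, 14, 19, 25, 29, 36, 40]  # illustrative daily new cases
forecast = linear_forecast(daily_cases, 3)
print([round(v, 1) for v in forecast])  # [45.3, 50.4, 55.6]
```

Triple exponential smoothing additionally models level, trend and seasonality with exponentially decaying weights, which is why it can produce tighter confidence bounds on data with a weekly rhythm.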

I am now curious to see if this global prediction can be extended to my home country, Singapore. Click on input control at the top to add a page filter.

Select Entity in the dropdown, as this is the Country field.

Click on all members to allow all countries to display in filter selection. Hit ok.

Click on Entity and search for Singapore (or any country of your interest).

Triple exponential smoothing predicts 345 cases on Mar 19, 385 on Mar 20 and 425 cases on Mar 21. At the time of this blog post, I am aware that the true cases reported in Singapore on these dates were 345, 371 and 432 respectively. The first prediction is eerily spot on; the others are close.

For this example, we have used the chart type for line trend. You can change the UI & other options slightly by selecting the chart type for time series. Try repeating the example with time series charts to see the difference.

Coronavirus/COVID-19: How to model & analyse data with SAP Analytics Cloud (Part 2) (ERP Q&A, 30 March 2020)
This is the second part of my mini blog post series, where I show how to model and consume data in SAP Analytics Cloud. In the first part, I showed how to build the first version of the model for Confirmed cases. In this blog post I will show you how to:

  1. Resolve data quality issues (see Part 1, where I got some rejected rows).
  2. Add new measures and data sets associated with them.
  3. Import/upload new data

Please also note that there may be some redundant steps in this procedure, but I wanted to show you how you could adjust the process in case something unpredictable happens. After all, this is what happened to me. Furthermore, there is always room for improvement, which I leave for the final thoughts of this blog post.

Read More: SAP Analytics Cloud Certification Preparation Guide

The images in this blog post are screenshots that I have made. The data sources I have used were available online at the time of writing this blog post.

AD1: Resolve data quality issues

Just to refresh the memory: in the previous blog post some rows were rejected during the data upload. As you can see, “Diamond Princess” (the same goes for “Grand Princess”) uses two different countries – “US” and “Cruise Ship”. This would not be a problem if we used “Country/Region” as a separate dimension, but in our case we are using it (actually a calculation based on it) as the parent in our GEO dimension. That way we can aggregate & drill down from the country level into Province/State.

Had we detected this issue during the data preparation phase, we could have adjusted the values of the parent before loading into the model, but it was not detected, so we need to go down a different path.

Let’s upload the same file again into the SAP Analytics Cloud and enter into the data preparation mode by clicking on it:

We are going to work with the “Update” mode of SAP Analytics Cloud (since this is our first slice of data that we have uploaded into the model, we could also use “Clean and replace”).

So, please repeat all the steps that I have listed below from the blog post part 1:

  1. Transpose the date from column header
  2. Create “CALC_Province”
  3. Create “Country” column

This is how your card layout should look:

Now, let’s do the mapping. Double click the “Measures” card and drag and drop “Value” into “Confirmed cases”.

Move “Key”(our date field) into the “Date” dimension:

Disconnect current mapping of “CALC_Province” dimension by clicking X of the card and then double click it to open it:

All items inside should be empty, meaning not linked to any existing columns of the data set.

Now drag and drop fields from the data set as indicated:

CALC_Province -> Dimension ID

Country -> Country

After you did this, click on the “Location” card and adjust the mapping in the right side bar as shown below:

The result should look like this:

Now we are going to deal with the “Princess” problems described at the beginning. Please switch to the grid/column view and click on the column “Country/Region”. Then click into the formula bar and select “REPLACE” transformation:

You will have to do the same process twice, for both ships: “Grand Princess” and “Diamond Princess”.

When you create the replace transformation, just press “Enter” and it will be activated. You should receive two messages below (the one missing in my screenshot was referring to the calculations update).

When done, you can also check all transformations in the transformation log; in case there is a mistake, just delete the transformation and perform it again, this time correctly. :)

When done, press “Finish mapping” and finalise the whole data loading process. You should have zero rejected rows:
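The effect of the two REPLACE transformations can be sketched in Python. The target country value below is an assumption for the sketch – use whatever label fits your model:

```python
# Cruise-ship rows carry inconsistent Country/Region values in the raw data
SHIPS = {"Diamond Princess", "Grand Princess"}

def replace_country(row, target="US"):
    # Normalise the country for both ships to a single label (assumed "US")
    if row["Province/State"] in SHIPS:
        row = {**row, "Country/Region": target}
    return row

rows = [
    {"Province/State": "Diamond Princess", "Country/Region": "Cruise Ship"},
    {"Province/State": "Grand Princess", "Country/Region": "US"},
    {"Province/State": "Hubei", "Country/Region": "China"},
]
cleaned = [replace_country(r) for r in rows]
print(sorted({r["Country/Region"] for r in cleaned}))  # ['China', 'US']
```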

AD2: Add new measures and data sets associated with them

We are going to extend our model with two additional data sets that have the same structure but additional measures. Each data set contains only one (new) measure: one for deaths and one for recovered statistics.

First open our model and add two new measures as indicated below (don’t forget to setup the exception aggregation and decimal places) and save changes in the model.

We are going to load both data files into SAP Analytics Cloud as indicated below:

You need to repeat all the steps from above twice – remember we have two data sets – Deaths and Recovered, with one exception, the measure mapping:

  1. Transpose the date from column header
  2. Create “CALC_Province”
  3. Create “Country” column
  4. Mapping of date column
  5. Mapping of CALC_Province, Country, Location (don’t forget to do this otherwise your maps will be half empty)
  6. Perform 2x Replace transformation (“Grand/Diamond” Princess”)
  7. Measure mapping: map VALUE of corresponding data set to corresponding measure:
    • Deaths.VALUE -> Deaths
    • Recovered.VALUE ->Recovered

Please make sure that you select the UPDATE import method (it should be there by default, but still).

This is the list of transformations that I’ve done:

After you've done this, the data sets loaded in your model and their row counts should be as indicated in the picture below:

Another thing that I warmly recommend is to do a simple data check by building a simple story just to see if the data was imported & mapped correctly. It is easy to spot anomalies/data quality issues in a simple bar/column chart.

In our case you should have data loaded for three measures until 16.3.2020 (we are using files that contain data until 16.3.2020).

AD3: Import/upload new data

OK, so here we are: at the time of writing this blog post (25.3.2020), many things related to the COVID-19 data sets have changed, so I'll try to show how we can manage that. In this section we are going to upload all the data available until yesterday, which means until 24.3.2020.

The most important change is that the previously used data sets are not updated any more, and the new data sets contain only aggregated numbers for the US Federal States. We could simply start using those, but since this is a blog post where I would like to show you other options for data integration, we will make this a bit more complicated – but not too complicated. Another important notice is that the “Recovered” data set is not going to be updated any more, so we'll ignore it during loading and also when performing analysis. Feel free to flag it as hidden in the modeller.

We will combine it with more detailed data for the US Federal States.

Let’s first work on the DEPRECATED “Confirmed cases” data set so that we get the latest available data that includes US Federal States details as well.

First we plan to import this data set and then delete data where:

  • Date =”23-03-2020″ (it’s just the copy of “22-03-2020”) AND Country = “US”

In order to do this, please repeat all steps from the AD2: Add new measures and data sets associated with them of this blog post:

  1. Transpose the date from column header
  2. Create “CALC_Province”
  3. Create “Country” column
  4. Mapping of date column
  5. Mapping of CALC_Province, Country, Location (don’t forget to do this otherwise your maps will be half empty)
  6. Perform 2x Replace transformation (“Grand/Diamond” Princess”)
  7. Measure mapping: map VALUE of corresponding data set to corresponding measure:
    • Deaths.VALUE -> Deaths
    • ConfirmedCases.VALUE ->Confirmed Cases

You can also see the steps in this video here:

You should have this information in the import section of the model:

In the next step, we are going to delete the data for:

  • Date =”23-03-2020″ (it’s just the copy of “22-03-2020”) AND Country = “US”

Enter the model section and jump into delete section:

Select Country US and Date as “23-03-2020” and proceed with data deletion.
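The same selective deletion, expressed as a row filter (figures are illustrative):

```python
rows = [
    {"Date": "23-03-2020", "Country": "US", "Confirmed": 33272},
    {"Date": "23-03-2020", "Country": "Italy", "Confirmed": 63927},
    {"Date": "22-03-2020", "Country": "US", "Confirmed": 33272},
]

# Keep everything EXCEPT the duplicated US slice for 23-03-2020
kept = [r for r in rows
        if not (r["Date"] == "23-03-2020" and r["Country"] == "US")]
print(len(kept))  # 2
```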

The data has been deleted. Now we are going to upload the latest data set, available up to and including 24-03-2020. Please download the files from here and, before you upload them into SAP Analytics Cloud, DELETE the cumulative row for the US from the CSV file. We are going to upload the US data from another source.

Please repeat steps indicated above with import method UPDATE or just replicate them from the video above:

  1. Transpose the date from column header
  2. Create “CALC_Province”
  3. Create “Country” column
  4. Mapping of date column
  5. Mapping of CALC_Province, Country, Location (don’t forget to do this otherwise your maps will be half empty)
  6. Perform 2x Replace transformation (“Grand/Diamond” Princess”)
  7. Measure mapping: map VALUE of corresponding data set to corresponding measure:
    • Deaths.VALUE -> Deaths
    • ConfirmedCases.VALUE ->Confirmed Cases

Ok, so by now you should have this situation in model and in our “control” story:

Remember, we still need to upload the US data for the last two days, which I downloaded from a different source: Worldometers. I copied the data from a table into a spreadsheet and cleaned it a bit. As I mentioned at the beginning of this blog post, the files are available here.

So this is the final chapter of this blog post and this is how it goes.

THE “COMBINE” CHAPTER:

Upload both files that contain US data:

Open the one starting with “20200324….” and work with it. Because the data set doesn't contain LAT & LONG info, we are going to use another data preparation functionality called “Combine data”, where we combine it with the data set that contains the US State, LAT and LONG columns.

TIP: In order to get a “perfect” merge, I created a simple table in our SAP Analytics Cloud story and then exported the data in XLSX format, so that I could reuse it in the “Combine data” action.

You should also use the FLAT hierarchy display for it. Afterwards I cleaned up the file a bit so that I could use it here.

Here are the steps that we’ll perform before going into the “combine data” part:

  1. Create “CALC_Province”
  2. Create “Country” column
  3. Map newly created columns “CALC_Province” and “Country” to the ID and Country members of “CALC_Province” dimension.
  4. Check if the measures are mapped correctly (data set contains both relevant measures)

Ok, so now we are going to combine this data set with the US LAT & LONG data set so that we can visualise the data on a map.

Click “Combine Data”

and select the LAT & LONG data set:

Click on both “Province” columns that we'll use in our “JOIN”; you can leave the other options as is (of course, feel free to explore them) and press “Combine” at the bottom right.

NOTE: Please also make sure that “All primary data” at the top is selected (it should be by default), because we don't want to lose any “fact data”.
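Under the hood, “Combine data” with “All primary data” behaves like a left join: every fact row is kept, and matched lookup columns are attached where possible. A sketch with assumed column names and illustrative figures:

```python
facts = [
    {"Province": "New York", "Confirmed": 25665},
    {"Province": "Washington", "Confirmed": 2221},
    {"Province": "Puerto Rico", "Confirmed": 31},  # no match in the lookup
]
coords = {
    "New York": (42.17, -74.95),
    "Washington": (47.40, -121.49),
}

combined = []
for row in facts:
    lat, long_ = coords.get(row["Province"], (None, None))
    # Left join: unmatched fact rows survive, just without coordinates
    combined.append({**row, "LAT": lat, "LONG": long_})

print(combined[0]["LAT"])  # 42.17
print(combined[2]["LAT"])  # None
```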

LAT and LONG are added as additional columns. Please move them into the measures as indicated below:

Even more importantly, make sure to perform the GEO mapping: double-click on the CALC_Province card to open all its members and then do it:

Finish the mapping and import the data into the model. Please repeat all steps from THE “COMBINE” CHAPTER for the second data set of US-only data that we have uploaded into SAP Analytics Cloud.

Ok, with this task we have concluded the exercise. Your model and story should look something like this:

Here is BEFORE the 24-3-2020 upload:

…and AFTER the upload.

As you can see, SAP Analytics Cloud offers a lot of flexibility to work with your data even before it reaches the dashboard/story. Of course, there are other ideas for how you could improve the whole process, but I wanted to show you how somebody like a DATA ANALYST could work with the data and perform some steps to clean and structure it properly.

FINAL THOUGHTS

Here are some additional ideas on how to simplify this process, though not necessarily make it cheaper:

  1. Integrate COVID-19 data into a data mart – like SAP Data Warehouse Cloud and consume it from there with SAP Analytics Cloud
  2. Set up scheduled file imports from a file system (location)
    • for this you'll need to set up the SAP Cloud Connector and the SAP Analytics Cloud Agent
    • the good thing is that all those transformations that we did manually will be applied during every import
    • of course, you need to make sure you have a stable data source :)
  3. Use a different data source like Microsoft OneDrive or Google Sheets and perform the import into SAP Analytics Cloud
    • the connectivity is available
    • scheduling is available
    • the transformations will be respected during data import
    • you need to make sure you have a stable data source

The solutions mentioned above, with their advantages and disadvantages, could be a discussion worthy of a standalone blog post.

OK, that's all for this part; stay tuned for the next one, where I will show how to visualise & analyse the data.

Stay COVID19 free!!!

#StayAtHome

Coronavirus/COVID-19: How to model & analyse data with SAP Analytics Cloud (Part 1) (ERP Q&A, 28 March 2020)
This is the first part of my mini blog series, where I show how to model and consume data in SAP Analytics Cloud. I have decided to use data sets from the Coronavirus/COVID-19 area, and hopefully you will be able to use this in your area of work as well.

In order to do this you'll need the following:

The images in this blog post are screenshots that I have made. The data sources I have used were available online at the time of writing this blog post.

SAP Analytics Cloud is our full-blown analytics platform that enables you to make end-to-end decisions with all analytics capabilities in one place. By all we mean BI, planning & predictive/ML in one single solution. No other vendor has business intelligence (BI), collaborative enterprise planning, and augmented analytics in one platform built for software as a service (SaaS). SAP Analytics Cloud allows you to gain enterprise-ready insights and take action within your business processes. Furthermore, we can make smarter decisions faster with artificial intelligence (AI)-driven insights.

In my case I will create & use planning models in SAP Analytics Cloud, but you can also work just with an analytical model.

USE CASE

Let's first define the business use case that we plan to work on. This is also super important in your non-COVID-19-related scenarios: every platform has several different options for how to get to the finish line, so we need to stay focused on the outcome. Otherwise it's just another feature & function POC, pilot, etc.

In our case I want to track the Coronavirus outbreak by following different KPIs or measures. To be more specific, I would like to track the number of confirmed cases, deaths, recoveries, mortality rate, etc. Furthermore, I would like to track the progress or dynamics of the disease and how it spreads in different countries. To make my analysis easier, I would also like to visualise the data on a map. Since we also need to monitor how it spreads from day to day, I need to make sure the data is organised on a daily basis. At the end of the day, I need to know if I can go to the supermarket or should wait for better times.

First draft result could look something like this:

Overview page:

GEO View:

In general we have to perform following tasks:

  1. build the model by uploading data into SAP Analytics Cloud,
  2. transform data,
  3. finalise building a model,
  4. create story to visualise our data,
  5. use insights to improve our life.

We will work with three different files as specified below (confirmed, deaths, recovered):

We could load them into three separate models and then use the blending functionality in SAP Analytics Cloud, but for the end business user it is easier to consume data from just one model. So we plan to load the three data sets into one model. All three data sets have the same granularity.

Upload data into SAP Analytics Cloud

Select Create>Model>Import a file from your computer and use the file with confirmed cases that you have downloaded from the site above.

After the data is uploaded, this is the structure of the data set that we can see in SAP Analytics Cloud.

There are several things that are worth pointing out before we proceed with data transformations & modelling.

  1. The first column, “Province/State”, might have some data quality issues – blank values.
  2. The Date dimension is currently part of the column names. Furthermore, we can expect that every time a new slice of data is added to the data set for a new date, an additional column is added at the end of the data set.
  3. We do have LAT & LONG available in the data set, which we can use for GEO encoding (and visualising data on a map).
  4. Create a GEO hierarchy with Province & Country data.
  5. Data is organised as a BALANCE in the data set, which means that the number of Confirmed Cases is defined per date. We would therefore get complete NONSENSE if we aggregated the data across the DATE dimension.

AD1 – Fix the data quality problem of the “Province/State” column: Since we plan to use “Province/State” as the Location ID in our GEO encoding, we need to make sure this column doesn't contain any blanks. After further inspection of the data set, we can clearly see that not all countries have data available at the province/state level. In order to manage this exception properly, we'll implement a “business rule” where we define a new column called “CALC_Province”. If a value exists in the “Province/State” column, we take it; otherwise we take the value of the “Country/Region” column. We'll use an IF statement as shown below. For better visibility/readability, and in order to avoid duplicate names in the PARENT-CHILD hierarchy and during GEO encoding later on, we will also add a “P_” prefix (as in Province).

Create new calculated column:

Add IF statement & press ok after you’re done:

IF([Province/State]="",
   CONCAT("P_", [Country/Region]),
   [Province/State]
)

New calculated column is created as shown below:
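The same business rule in plain Python, together with the “C_”-prefixed Country column that this walkthrough also creates:

```python
def calc_province(province, country):
    # CALC_Province: fall back to the country (prefixed "P_") when blank
    return province if province != "" else "P_" + country

def calc_country(country):
    # Country: "C_" prefix to avoid duplicate names in the hierarchy
    return "C_" + country

print(calc_province("Hubei", "China"))  # Hubei
print(calc_province("", "Italy"))       # P_Italy
print(calc_country("Italy"))            # C_Italy
```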

AD2 – Transpose Date columns to rows: Since the date dimension is currently embedded into column names, we need to use a TRANSPOSE function in order to move data into rows of one date column.

Select “Transpose Columns to Rows”:

In the “Transpose” panel first select all fields and then deselect top 4 fields (Province/State, Country/Region, Lat, Long) and (IMPORTANT), the last field of the data set – calculated column CALC_Province

You can also activate “Preview” and inspect if the transpose works in expected way:

Press OK when you're done and the transformation will be performed. We can also add a more user-friendly description for our Date dimension and make sure that the system has recognised it as a DATE dimension.

As you can see from the picture below, SAP Analytics Cloud has properly recognised dimension type as DATE, so we only need to add description to it. Double click on column name “Key” and rename it to “Date”:

Result after renaming the “Key” column to “Date”:
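The transpose step in miniature: each date column becomes a (Date, Value) row, while the four leading columns stay fixed (figures are illustrative):

```python
header = ["Province/State", "Country/Region", "Lat", "Long",
          "3/15/20", "3/16/20"]
row = ["Hubei", "China", 30.97, 112.27, 67794, 67798]

fixed = dict(zip(header[:4], row[:4]))        # the columns we do NOT transpose
transposed = [{**fixed, "Date": d, "Value": v}
              for d, v in zip(header[4:], row[4:])]

print(len(transposed))         # 2 rows, one per date column
print(transposed[0]["Date"])   # 3/15/20
print(transposed[1]["Value"])  # 67798
```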

AD3 – Perform GEO encoding by LAT/LONG: In this step we are going to add GEO encoding to the model. That way we'll be able to visualise the data on a map when building a story in SAP Analytics Cloud.

Select “GEO Enrich by Coordinates”:

In the pop-up menu, please select/define the fields as shown and press “Create” when done:

A new “GEO” column called “Location” has been created as shown below including the GEO encoding of all locations in the backend, which means we’ll be able to use this data on a GEO map.

AD4: Create GEO hierarchy: If the data allows (meaning this information is available), we can also create a hierarchy within a certain dimension. SAP Analytics Cloud allows us to create PARENT-CHILD or LEVEL-BASED hierarchies. In our case I will create a PARENT-CHILD hierarchy with two columns, “CALC_Province” and a copy of “Country/Region”. I do this because in some use cases it makes sense to have the parent of the hierarchy also as a separate dimension; that way we can use it as a standalone dimension in charts/tables if needed.

Create a new calculated column as a CONCAT between “C_” and the “Country/Region” column and name it “Country”:

In the next step I will define “Country” as parent of “CALC_Province” dimension:

If everything was done properly, the icon on the left side of the “Country” has changed to a hierarchy icon.

We will also rename the “VALUE” column to “Confirmed Cases” and check that SAP Analytics Cloud has assigned the correct type to it. The column needs to be defined as a measure:

Furthermore, you can also delete the original column “Province/State”, because we have created the calculated column “CALC_Province”, which has better-quality data in it.

Before I create the model, I will also make sure that I create a planning model (no problem if you don't have this option available – just ignore it; it means that your SAP Analytics Cloud environment does not come with the planning license enabled, which is a separate purchase). I will also define EUR as my currency. Note that for the time being we don't have any measures that carry currency information, but in case we would like to bring this into the model, we can define it.

When finished, press “Create” and the model will be created:

Save the model to the desired location:

After you save the model, it is available for further inspection, or you can just use it to visualise the data. As you can see, we have a message that 220 rows were rejected. We are going to deal with this later, but a quick look into the REJECTED file gives us an understanding of the reason for it.

REJECTED file content:

Before I do any additional changes in the model I always like to explore the data from newly created model.

Create a new story and select the option to explore the data. This will give us the possibility to select/filter the data in a simple way, and SAP Analytics Cloud will provide chart suggestions:

Select “Data acquired from an existing model” and then select the model that you’ve just created:

As you can see from screenshot below, we are now in explorer mode, where we can “search for the needle in the haystack” in a simple and easy way.

First we are going to add all available dimensions into our view.

In the next step we’ll select on the measure “Confirmed Cases” and also select Date dimension:

As you can see, the value of “Confirmed Cases” is around 3.6 million, which is way above the official number that we have (please note this was done on 17.3.2020). If we drill down to the day level, we get more appropriate numbers. So obviously we need to make additional tweaks to the model so that aggregation works correctly across the date dimension. Remember, the number of COVID-19 cases in our data set is organised as a balance, so we need to define an EXCEPTION AGGREGATION in the model.

Just before we jump into the modeller, let's build a simple story containing the chart that we plan to use to verify that the tweaks made in the model were correct.

Change the chart type to Bar/Column:

and move the chart to “Page 1” of the Story:

Resize the chart and use CTRL-C and CTRL-V to copy/paste the chart:

In the second chart drill-up to the level of month and save the Story.

Let's open our model again and adjust some settings. Go into the ACCOUNT dimension. Please note that the ACCOUNT dimension contains all measures defined for the model.

We will change the setting of exception aggregation for Confirmed Cases measure. The exception aggregation type will be defined as LAST and we’ll select Date dimension as the exception aggregation dimension.

Please note this is a powerful capability that we can use to properly work with NON-ADDITIVE and SEMI-ADDITIVE measures. In our example, “Confirmed Cases” is a SEMI-ADDITIVE measure, which means it can be aggregated across all dimensions and their hierarchies except the time dimension. Similar to this measure, we have many others like inventory levels, subscribers, balance sheet items, etc.
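The same semi-additive logic applies to a measure like inventory on hand – sum across warehouses, LAST across time (warehouse names and numbers are illustrative):

```python
inventory = {  # {date: {warehouse: units on hand}}
    "2020-03-30": {"Hamburg": 120, "Vienna": 80},
    "2020-03-31": {"Hamburg": 110, "Vienna": 95},
}

# Across warehouses a plain SUM is fine
per_day = {d: sum(units.values()) for d, units in inventory.items()}
print(per_day["2020-03-31"])  # 205

# Across time we must take the LAST value, never the sum
month_end = per_day[max(per_day)]
print(month_end)  # 205

naive = sum(per_day.values())
print(naive)      # 405 -- double-counted stock, not a real quantity
```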

I will also hide the LAT and LONG measures since I don't need them in my analysis.

After we save the changes, we can observe the difference in BEFORE and AFTER chart:

BEFORE:

AFTER:

Stay tuned for Part 2 of the blog series where I will show how to bring other data (Deaths, Recovered) into the same model, deal with rejected rows and show how to build some analysis.
