SAP HANA SPS 12: New Developer Features; Database Development


HANA Deployment Infrastructure (HDI)

The HANA Deployment Infrastructure, HDI for short, was first introduced in HANA 1.0 SPS11 as part of the rollout of XS Advanced (XSA). While XSA was officially released at the time, there was very little tooling for it, and the developer experience was not quite complete. Now that we have shipped the SAP Web IDE for SAP HANA in mid-March, we’ve improved the experience quite a bit and filled several important gaps. So with SPS12, we want to re-introduce the concept of the HANA Deployment Infrastructure.

The vision of the HANA Deployment Infrastructure is to simplify the deployment of database objects into the HANA database. We wanted to describe the HANA persistence model using file-based design-time artifacts, for example .hdbcds and .hdbprocedure files. We wanted an all-or-nothing approach: if there are many dependent artifacts within your container and any one of them fails to be created in the database, then the entire build fails and nothing is created. We wanted a dependency-based incremental deployment, which means we don’t drop everything and recreate it every time we build; only the changed objects are adjusted. We wanted complete isolation of the application’s database objects. This is achieved by the concept of containers, where each container corresponds to an underlying schema. This adds additional security, since each underlying schema, containing its deployed objects, is owned by a schema-specific technical user.

Formally defined, the HANA Deployment Infrastructure is a service layer of the HANA database that simplifies the deployment of HANA database artifacts by providing a declarative approach for defining database objects and ensuring a consistent deployment into the database, based on a transactional all-or-nothing deployment model and implicit dependency management. HDI is based on the concept of containers, which allows for multiple deployments of the same application; this means you could have two versions of the same application running on the same HANA instance at the same time. Additionally, the focus of HDI is deployment only, so there is no versioning or lifecycle management built into it. You would use Git for repository and version management, and Gerrit for code review and approval workflows. Lastly, HDI supports the creation of database objects only, so it does not support JavaScript, OData services, or any other application-layer artifacts.

Again, we use a container concept in HDI, where each container is a database schema. Actually, it is a set of schemas: one main schema where the runtime objects reside, and several supporting schemas used by the infrastructure itself. All artifact definitions, such as a CDS file, need to be defined in a schema-free way; whereas in the past you would put a schema annotation in your CDS file, this is no longer supported when using XSA/HDI. The database objects within the container are owned by a container-specific technical user, and only this user has access to the objects within that container. You can reference artifacts outside of your container via database synonyms.

HDI lives on top of the database conceptually. The HDI build interface is implemented as a Node.js module, which simply calls the HDI APIs within HANA. These HDI APIs, implemented as stored procedures, are copied into each container schema in HANA, so each container gets its own copy of the API upon creation. Within these APIs, plain SQL is used to create the runtime objects themselves. Finally, HDI runs inside its own process as part of the overall HANA core, one process per logical database.


The following is a list of artifacts which are supported by HDI. New artifacts added as of SPS12 largely deal with Text Analysis and Text Mining.

Virtual Tables & Configurations
Fulltext Indexes
Projection Views & Configurations
Scalar/Table Functions
Virtual Functions & Configurations
Table Types
Procedure Libraries
Graph Workspaces
Synonyms & Configurations
BI Views/Calculation Views
Core Data Services
Data Control Language (DCL)
Analytical Privileges
AFLLANG Procedures
Virtual Function Packages
Table Data w/ CSV Files
Table Data w/ Properties Files
Search Rule Sets
Replication Task
Structured Privileges
Public Synonyms
Text Analysis Configuration
Text Analysis Dictionaries
Text Analysis Extraction Rules
Text Analysis Extraction Rules Includes
Text Analysis Extraction Rules Lexicons
Text Mining Configurations


Core Data Services (CDS)

There have been several new features added in SPS12 for Core Data Services, or CDS. CDS was introduced in HANA 1.0 SPS06 and continues to be enriched with each SPS.

Graphical Editor

We introduced a new graphical editor for CDS artifacts in the SAP WebIDE for SAP HANA during the SPS11 delivery of the tool. This new graphical editor displays types as well as entities and views. It also shows the associations between entities as well as external dependencies. Of course you still have the ability to open the CDS artifact via the text editor as well.


Auto Merge

AUTO MERGE has been supported for quite some time in HANA core SQL, but is only now supported in the context of CDS with SPS12. AUTO MERGE is used to enable the automatic delta merge. You simply include AUTO MERGE or NO AUTO MERGE within the technical configuration section of an entity definition.

entity MyEntity {
    // element definitions
} technical configuration {
    [no] auto merge;
};


Index Sort Order

As of SPS12, within the definition of an index, you can now define the sort order per column. You can use ASC for ascending order, and DESC for descending order. You have the option to sort by a column grouping or by an individual column. Ascending is the default order when the order specification is omitted.

entity MyEntity {
    key id : Integer;
    a : Integer;
    b : Integer;
    c : Integer;
    s {
        m : Integer;
        n : Integer;
    };
} technical configuration {
    index MyIndex1 on (a, b) asc;
    unique index MyIndex2 on (c asc, s desc);
    index MyIndex3 on (c desc, s.n);
};

Explicit Specification of CDS Types for View Elements

In a CDS view definition, it is now possible in SPS12 to specify the type of a select item that is based on an expression. In the example below, if you define a select item as a+b as s1, the information about the user-defined type is lost. You can now explicitly specify the type.

type MyInteger : Integer;

entity E {
    a : MyInteger;
    b : MyInteger;
};

view V as select from E {
    a,          // has type MyInteger
    a+b as s1,  // has type Integer; information about
                // the user-defined type is lost
    a+b as s2 : MyInteger  // has type MyInteger, explicitly specified
};


Defining Associations in Views

Another new feature in SPS12 is support for associations in view definitions. In order to define an association as a view element, you need to define an ad-hoc association in the MIXIN section and then put this association into the select list. In the ON-condition of such an association, you need to use the pseudo-identifier $PROJECTION to signal that the following element name is to be resolved in the select list of the view rather than in the entity in the FROM clause.

entity E {
    a : Integer;
    b : Integer;
};

entity F {
    x : Integer;
    y : Integer;
};

view VE as select from E mixin {
    f : Association[1] to VF on f.vy = $projection.vb;
} into {
    a as va,
    b as vb,
    f as vf
};

view VF as select from F {
    x as vx,
    y as vy
};

CDS Extensibility

The CDS extension mechanism, delivered with SPS12, allows adding properties to existing artifact definitions without modifying the original source files. The benefit is that a complete artifact definition can be split across several files with different lifecycles and code owners. The EXTEND statement changes the existing runtime object; it does not define any additional runtime object in the database.

The extensibility feature uses the concept of extension packages. An extension package, or simply package, is a set of extend statements, normal artifact definitions (e.g. types which are used in an extend declaration), and extension relationships or dependencies. Each CDS source file belongs to exactly one package, i.e. all the definitions in this file contribute to that package. On the other hand, a package usually contains the contributions of several CDS source files. A package is defined by a special CDS source file named .package.hdbcds. The name of the package defined in the file must be identical to the namespace that is applicable for the file (as given by the relevant .hdinamespace file).

With this new feature, we are able to extend several different aspects of a CDS file, including: adding new elements to a structured type or an entity, adding new select items to a view, adding new artifacts to an existing context, assigning further annotations to an artifact or element, and extending the technical configuration section of an existing entity.

Let’s have a look at a greatly simplified CRM scenario below. The base application has a CDS file called “Address.hdbcds”, which contains a type called “Address”; it also has another CDS file called “CRM.hdbcds”, which uses the “Address.hdbcds” CDS file. Within the “CRM.hdbcds” file, we have a context called “CRM” containing an entity called “Customer”, which has a column called “name” of type String and a column called “address” of type “Address”.


In this first extension package, called “banking”, we extend the “CRM” context and add a new type called “BankingAccount”. We then extend the “Customer” entity and add a new element called “account” which uses the type “BankingAccount”, so we have new columns called “account.BIC” and “account.IBAN” added to the “Customer” table.

In the second extension, we further extend the “Customer” entity by extending the types which it uses. First we will extend the “Address” type from the original “Address.hdbcds” file, and then extend the “BankingAccount” type which was defined by the previous extension. So in this case the “onlineBanking” extension depends on the “banking” extension, hence the reason why we have the DEPENDS clause in the package definition.
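Sketched in CDS, the two extension packages described above might look roughly as follows. Only the artifact names come from the scenario; the extend syntax is abbreviated and the element types and lengths are assumptions, not verified against the SPS12 grammar.

```
// .package.hdbcds of the "banking" extension
package banking;

// Banking.hdbcds
extend context CRM with {
    type BankingAccount {
        BIC  : String(11);   // lengths are illustrative
        IBAN : String(34);
    };
};

extend entity CRM.Customer with {
    account : CRM.BankingAccount;   // yields columns "account.BIC" / "account.IBAN"
};

// .package.hdbcds of the dependent "onlineBanking" extension
package onlineBanking depends banking;

// OnlineBanking.hdbcds
extend type CRM.BankingAccount with {
    PIN : String(8);   // yields column "account.PIN" on "Customer"
};
```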


The final result is that we have new columns in the “Customer” entity: “account.BIC” and “account.IBAN”, which were created by the “banking” extension, and further columns, including “account.PIN”, which were created by the “onlineBanking” extension.



SQLScript

SQLScript continues to be the stored procedure language used to take full advantage of the core capabilities of HANA, such as massively parallel processing. Several new language features have been added in HANA 1.0 SPS12.

Global Session Variables

As of SPS12, we now have the concept of global session variables in SQLScript. Global session variables can be used to share scalar values between procedures and functions that are running in the same session; they are not visible from any other session. We can use the SET statement to set a key/value pair in one procedure, and use the built-in function SESSION_CONTEXT to retrieve that value in a nested procedure or function call.

-- Set Session Variable Value
SET 'MY_VAR' = :new_value;
-- Retrieve Session Variable Value
SELECT SESSION_CONTEXT('MY_VAR') FROM dummy;
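The set-and-retrieve pattern can be sketched end to end across a nested call as follows; the procedure names are illustrative, and the value is read via SESSION_CONTEXT in a plain SELECT:

```sql
-- Nested procedure reads the variable set by its caller
CREATE PROCEDURE read_var AS
BEGIN
    SELECT SESSION_CONTEXT('MY_VAR') AS my_var FROM dummy;
END;

-- Caller sets the variable, then calls the nested procedure,
-- which sees the value because both run in the same session
CREATE PROCEDURE set_var AS
BEGIN
    SET 'MY_VAR' = 'sps12';
    CALL read_var();
END;
```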

DEFAULT EMPTY for Table User Defined Functions

In SPS10, we introduced the ability to use the DEFAULT EMPTY extension when defining IN and OUT parameters of a procedure. This is useful for initializing a table parameter before its use within the procedure. As of SPS12, we now bring this same functionality to Table User Defined Functions as well.


SELECT * FROM myfunc();
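The call above assumes a table function whose input parameter defaults to an empty table. A minimal sketch, using a hypothetical table type my_tab_type (the column list is illustrative):

```sql
CREATE TYPE my_tab_type AS TABLE (id INTEGER, val NVARCHAR(100));

CREATE FUNCTION myfunc (IN im_tab my_tab_type DEFAULT EMPTY)
RETURNS TABLE (id INTEGER, val NVARCHAR(100)) AS
BEGIN
    -- im_tab is initialized as an empty table when the caller omits it
    RETURN SELECT * FROM :im_tab;
END;
```

With DEFAULT EMPTY in place, calling the function without the parameter simply returns an empty result set instead of raising an error.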

Signatures for Anonymous Blocks

Anonymous Blocks were released in SPS10 and allowed us to write SQLScript code in the SQL console without having to create a container object such as a procedure or function. This was a nice feature for quick-and-dirty test coding. The only problem was that it did not support input and output parameters. As of SPS12, we have added this capability. You can now define these parameters in the same way you would when defining parameters for a procedure. Both simple types and table types are supported, as well as types defined via Core Data Services.

DO ( IN im_var INT => 5,
     OUT ex_var INT => ?,
     IN im_tab "" =>
     OUT ex_tab "" => ?)
BEGIN
    ex_var = :im_var;
    ex_tab = SELECT * FROM :im_tab;
END;

Enhancements for Implicit SELECT in Nested Calls

With this new feature in SPS12, implicit results from nested procedure calls are carried up to the outermost procedure’s result. You must first set a parameter in the indexserver.ini configuration file; note that this changes the default behavior system wide.

alter system alter configuration ('indexserver.ini', 'system') set ('sqlscript', 'carry_nested_implicit_result') = 'true' with reconfigure;

Until SPS11, a nested implicit result was not carried to the caller; its lifecycle was tied to the nested call’s lifecycle. In this example, it is closed when the nested call statement (“call proc2”) is closed. As of SPS12, you can carry the nested implicit result forward with this configuration change. When the configuration is on, the callee’s implicit result sets are carried to the caller.
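A minimal sketch of this scenario (“proc2” is named in the text; the outer procedure “proc1” is an assumed caller):

```sql
-- The nested procedure produces an implicit result set
CREATE PROCEDURE proc2 AS
BEGIN
    SELECT * FROM dummy;
END;

-- The outer procedure only calls proc2
CREATE PROCEDURE proc1 AS
BEGIN
    CALL proc2;
END;

-- With carry_nested_implicit_result = 'true', calling proc1
-- also returns proc2's implicit result set to the client
CALL proc1;
```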

Enhancements for Header-Only Procedures and Functions

Header-only procedures and functions were first introduced in SPS10 and allowed developers to create procedures and functions with minimum metadata first using the HEADER ONLY extension. The body of the procedure and function could then be injected into the container later using the ALTER PROCEDURE or ALTER FUNCTION statement. This allowed procedures and functions to be created without having to worry about the interdependencies between the procedures and functions. As of SPS12, we now have the ability to call a HEADER ONLY procedure from within a trigger, as well as the ability to create a view on a HEADER ONLY Table User Defined Function.
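A rough sketch of the header-only pattern (the procedure name and signature are illustrative):

```sql
-- Create the procedure with metadata only; no body yet
CREATE PROCEDURE myproc (OUT result INT) AS HEADER ONLY;

-- Later, inject the implementation via ALTER PROCEDURE
ALTER PROCEDURE myproc (OUT result INT) AS
BEGIN
    result = 1;
END;
```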

