SAP SuccessFactors HXM Suite Archives - ERP Q&A
https://www.erpqna.com/category/sap-successfactors/sap-successfactors-hxm-suite/

Final Review Step in Onboarding 2.0 Process
https://www.erpqna.com/final-review-step-in-onboarding-2-0-process/ (Thu, 02 May 2024)

In most cases, customers want to verify new hire data before moving the onboardee to Manage Pending Hires.

With a recent release, SuccessFactors provided the flexibility to add a step called “Final Review”.

Final Review is an optional configurable step on the Process Variant Manager. This step can be added to your existing active workflows after personal data collection, additional data collection, or document flow.

The process owner or onboarding coordinator can then verify all of the information entered by the onboardee during the onboarding process before the new hire is moved to Manage Pending Hires for the hiring process.

Note: The Final Review step can only be added to process flows where personal data collection is already present.

Here are the detailed steps to design this in SuccessFactors.

1- Create/Modify Process Variant Manager:

Create a new process flow or update an existing Onboarding workflow by adding the Final Review step in the Process Variant Manager.

2- Add the Final Review Data Correction Message Template to the active email template list:

Add the Final Review Data Correction Message Template using the Reusing Preconfigured Email Templates option under Email Services.

3- Assign a Responsible Group for Data Review:

Configure the Assign a Responsible Group for Data Review business rule scenario. Select Onboarding from the Process Type dropdown if you want to configure a business rule for New Hire Data Review or Final Review. Select Offboarding if you want to configure the rule for Employee Data Review.

4- Rule for Onboarding Task Participant Configuration:

Select the rule instances for Rule for Onboarding Task Participant Configuration and Rule for Offboarding Task Participant Configuration in Admin Center > Manage Data > Onboarding Configuration > DEFAULT_ONB2_CONFIG.

Once added to the workflow, the responsible user can review and edit the new hire data or request a correction from the new hire on the Provide Personal Data page.

On selecting the Review Completed button, the new hire appears in Manage Pending Hires as Ready to Hire once all the other configured onboarding tasks are complete.

Alternatively, if the responsible user selects the Request Correction button, the Request Personal Data Correction pop-up is displayed, where the responsible user can add a comment explaining the reason for the correction request.

When a correction is requested, personal data collection is reopened, and both the new hire and the responsible user receive email notifications. The Provide Personal Data page displays a pop-up with the responsible user’s comments.

Note: If the responsible user edits the new hire data and then requests a correction, changes are automatically saved.

How to Avoid Accidental Data Deletion in MDF Portlets
https://www.erpqna.com/how-to-avoid-accidental-data-deletion-in-mdf-portlets/ (Tue, 14 Mar 2023)

The frequent occurrence of accidental data deletion is largely attributable to human error and can result in the loss of critical employee or system data, which may cause a domino effect across the system landscape, among other consequences. While there are remedial measures such as running audit reports and data retrieval to address such incidents, it is essential to acknowledge that prevention is preferable to cure.

Therefore, organizations should take a proactive approach to prevent such incidents from occurring. Both technical and non-technical measures can be employed, but this article focuses on a technical solution that can prevent accidental data deletion.

In this blog post, we’ll discuss how to remove the delete option for any custom portlet to avoid accidental data deletion.

Let’s delve into two scenarios under each of the options – RBP and Configuration UI – to help you find the best fit for your needs.

Role Based Permissions (RBP):

Let’s start with the RBP option. This option is perfect if you want to grant different permissions for different roles, allowing you to keep the delete option for a few roles and remove it for others. However, keep in mind that this option can be time-consuming as you’ll need to update all the concerned roles with the required permission for the concerned objects.

Scenario 1: Portlet with base object which has effective dating

Let’s go through the steps to remove the “Delete” button of the portlet highlighted below.

First, secure your MDF object: head to “Configure Object Definitions,” choose the required object, update the fields in the Security section as highlighted in the screenshot, and save the object.

Once you’ve secured the object, go to “Manage Permission Roles” to remove the delete option for the concerned role. Open the required role, click the Permission button to see the permission settings, and uncheck the delete option in the selected permission category before saving the role.

You’ll no longer see the “Delete” button for the people who have that role.

Scenario 2: Portlet with a base object that has no effective dating

For a portlet that has a composite association, you can see a “Trash” icon for each child record along with the “Delete” button at the bottom of the portlet, as in scenario 1.

You’ll need to secure both the parent and child objects using “Configure Object Definitions”, as in scenario 1.

The following parent object has its effective dating set to none and has a composite association with the child object.

Child object has effective dating from parent

Update the security section of both objects as below

Once the objects are secured, go to “Manage Permission Roles” to remove the delete option for the concerned role, as in scenario 1.

As you can see, there is no delete option for the parent object as it has no effective dating; it’s only available for the child object. Uncheck the delete option for the child object and save the role.

Unfortunately, the delete permission of the child object only works for the “Trash” icon and not for the “Delete” button of the portlet, as it’s one of the limitations under RBP option for objects without effective dating.

Configuration UI:

If you’re looking for an easier option, the Configuration UI option is the way to go. This option removes the delete option for all roles, saving you the time and effort of updating individual roles. Here we don’t need to perform either of the two steps performed under the RBP option.

Scenario 1: Portlet with base object which has effective dating

Go to “Manage Configuration UI,” search for the UI ID created for the base object, click Edit Properties, set the Delete Record field to “No” under the Control Options section, and save.

This will remove the “Delete” button for all roles, as this change is at the UI level.

Scenario 2: Portlet with a base object that has no effective dating

Go to “Manage Configuration UI” and search for the ID of the UI that you created for the base object.

Click Edit Properties and set the Delete Record field to “No” under the Control Options section for both the parent and child objects.

You’ll no longer see either the “Trash” icon or the “Delete” button for the portlet.

In conclusion, by using one or both of these options, organizations can rest assured that their MDF portlets are safe from accidental deletions. This can help prevent data loss, reduce downtime and ensure that the applications run smoothly. It is important to assess the specific needs and requirements of the application to determine which option or combination of options will provide the best protection for the MDF portlets.

Learning License User Types in SAP SuccessFactors Learning
https://www.erpqna.com/learning-license-user-types-in-sap-successfactors-learning/ (Mon, 13 Feb 2023)

SAP SuccessFactors licenses for Learning classify users into two license user types, “Active” or “Functional”. It is important to classify users into the right type for compliance with the licenses purchased by the customer.

Difference between Functional user and Active user:

The Learning management system does not differentiate between an Active and Functional User. The License type classification is only used for licensing and reporting reasons.

Who is an Active Learning User?

Employees with active profiles in the system who are not classified as functional are classified as ACTIVE.

Who is a Functional Learning User?

Employees and non-employees whose learning history records are recorded within Learning but who are not given access to the Learning management system are classified as FUNCTIONAL.

Below are a few examples of how customers can categorize users under the two license types.

  • Functional Users: Users who are seasonal, mastered in SAP HCM, contractors, retirees, or former employees with access
  • Active Users: Users who are full time, part time, or on leave of absence

A customer can categorize any user as Functional user only if they have purchased the Functional licenses.

How are the Active/Functional users measured?

The License User type is measured by SAP as below:

Active User –

Active user count includes:

  • All users who have ACTIVE status, are not classified as Functional users, and are not using the Learning marketplace in the LMS
  • All users who have at least one learning history record in the Learning system for an external or internal course, or who have enrolled in or launched a course on their learning plan

Functional User –

Functional user count includes:

  • All users who have ACTIVE status, are classified as Functional users, and are not using the Learning marketplace in the LMS
  • All users who have at least one learning history record in the Learning system for an external or internal course, or who have enrolled in or launched a course on their learning plan

Methods of updating the License user type

The License User type field on the User record can be updated using any of these methods:

  1. Manual update of License User Types in Learning Administration
  2. Importing License User Types in Learning Administration (Batch Import)
  3. Importing License User Types in Learning Administration (using User Connector)
  4. Updating the Succession data model with Standard element-“LearningLicenseUserType” for sending the License information from Bizx to Learning

1. Manual update of License User Types in Learning Administration

If customers have purchased SAP SuccessFactors Learning external or functional licenses and are using the shopping account type or any other custom field in the system to classify users as external, the license user type of those external users must be set to Functional.

Procedure to set License user type:

  • Go to Learning Administration> People> Users
  • Search for the user with the incorrect license user type and edit the record.
  • In License User Type, select the license type of the user:
    • Active
    • Functional

Search for users and their status

Search Users by Learning License User Type

2. Importing License User Types in Learning Administration (Batch Import)

To update multiple user license types in Learning, you can import the user records with the license types.

Procedure:

  • Go to Learning Administration > System Administration > Tools > Import Data.
  • Select the record type User, then select Download Template to download the latest import template.
  • Add the users in the User ID column, add the correct License User Type for each user, and save your changes.
  • Select Import Data to import the updated file.
  • You may run the import immediately, run it as a background job, or schedule it.
  • Select Finish.

Note: When adding new users, if the license user type value is left blank, the system defaults the type to Active.
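The import steps above can be pictured programmatically. This is a minimal sketch only: the column names here (`User ID`, `License User Type`) are assumptions for illustration, and the actual template downloaded from Import Data is authoritative.

```python
import csv
import io

def build_license_import(rows):
    """Build a CSV import file for license user types.

    Column headers are hypothetical placeholders; blank license types are
    left blank so the system applies its own default of Active on import.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["User ID", "License User Type"])  # assumed header names
    for user_id, license_type in rows:
        writer.writerow([user_id, license_type or ""])
    return buf.getvalue()

content = build_license_import([
    ("emp001", "FUNCTIONAL"),
    ("emp002", "ACTIVE"),
    ("emp003", None),  # blank -> system defaults to Active
])
print(content)
```

Generating the file from your HRIS extract rather than editing it by hand helps avoid typos in the license type column.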

Import User Data

Download the Import data Template for User import

Import User Data

3. Importing License User Types in Learning Administration (using User Connector)

If the license user type is not maintained in your HRIS and you would like to update it directly in the LMS, you may use the User connector.

For updating license user type of users using User Connector, follow the below steps:

  1. Update the connector configuration under System Administration > Configuration > System Configuration > Connectors: user.field.mapping.data.LICENSE_USER_TYPE=FUNCTIONAL
  2. Identify the user records whose license user type needs to be corrected
  3. Download the User connector template under System Administration > Connectors > Download connector template > User connector
  4. Create the User connector input file with the users, the license type (values are FUNCTIONAL and ACTIVE), and the other required fields
  5. Import the user data by placing the input file in the SFTP folder as per the connector configuration
  6. Run the User connector
  7. Verify the status of the import and the connector results
  8. Check a user record to confirm the license user type was updated as per the file

4. Updating License User Types in SuccessFactors Learning using Employee export job (provisioning)

The standard element “Learning License User Type” is optional in the Succession Data Model by default. Hence, it must be added to the Succession Data Model and imported in Provisioning, and the license user type data synced with the LMS using the SF User connector.

There are two options for creating the Learning License User Type in Foundation:

  • Create a custom field in Employee profile
  • Or enable the Standard element “Learning License User Type” in Succession data table.

You could use a custom field in the SAP SuccessFactors employee profile and map it to the SAP SuccessFactors Learning license user type. However, you cannot use a custom field in SAP SuccessFactors Learning for the license type.

Steps to configure:

1. In Provisioning:

Configure the Succession Data Model by adding the standard element named “Learning License User Type” in Provisioning as below:

<standard-element id="LearningLicenseUserType" max-length="255" required="false" matrix-filter="false">
  <label>Learning License User Type</label>
</standard-element>

When you import Learning license types, the supported values are “active” and “functional” in the case variations listed below. If you enter values other than these, they won’t be saved.

ACTIVE, Active, and active.

FUNCTIONAL, Functional, and functional.
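The value rule above can be sketched as a small normalizer. This is only an illustration of the rule described in the text (the system performs this validation itself):

```python
def normalize_license_type(value):
    """Return the canonical license user type, or None if unsupported.

    Accepted inputs are the case variations of "active" and "functional";
    anything else is rejected, mirroring the import behavior described above.
    """
    accepted = {
        "active": "ACTIVE",
        "functional": "FUNCTIONAL",
    }
    if value is None:
        return None
    return accepted.get(value.strip().lower())

print(normalize_license_type("Functional"))  # FUNCTIONAL
print(normalize_license_type("ACTIVE"))      # ACTIVE
print(normalize_license_type("external"))    # None (not saved)
```

Running your source data through a check like this before import surfaces unsupported values early instead of silently losing them.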

2. In SuccessFactors Learning:

To set the license type for internal users via the data flow between the foundation (BizX) and Learning LMS, the following connector parameters have to be set up under:

System Configuration > Connectors

sfuser.field.mapping.data.LICENSE_USER_TYPE=ACTIVE or FUNCTIONAL (case sensitive!)

sfuser.field.transform.data.LICENSE_USER_TYPE.ACTIVE=Active, active

sfuser.field.transform.data.LICENSE_USER_TYPE.FUNCTIONAL=Functional,functional

sfuser.connector.defaultValue.LICENSE_USER_TYPE=ACTIVE

Learning Analytics for measuring the Active and Functional users

The LMS Usage Metrics Report in the LMS shows the number of Active/Functional users in the system, along with users using Opportunity Marketplace. This report can be used for audit purposes and for comparing the number of each user type against the license user types purchased by the customer.

All Functional users with active profiles in the system are audited against the number of Functional licenses purchased by the customer.

All Active users with active profiles in the system are audited against the Full licenses purchased by the customer.

If the customer only has Full licenses, then all internal or functional users with active profiles in the system will be audited against the Full licenses purchased by the customer.

An administrator with permission for this report can run it to check whether the number of users in the system matches the license agreement. It is good practice for a designated admin to check compliance by running the LMS Usage Metrics report periodically.

If you have multiple productive instances, the report must be run in each instance and the cumulative numbers compared with the licenses purchased.
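The cumulative comparison across instances is simple arithmetic; here is a sketch with made-up example figures (the instance names and counts are illustrative, not real report output):

```python
# Per-instance counts as read from each LMS Usage Metrics Report (example figures)
instances = {
    "prod-eu": {"active": 1200, "functional": 300},
    "prod-us": {"active": 800,  "functional": 150},
}

# Hypothetical license entitlements purchased by the customer
purchased = {"active": 2500, "functional": 400}

# Sum each user type across all productive instances
totals = {
    kind: sum(counts[kind] for counts in instances.values())
    for kind in ("active", "functional")
}

for kind, total in totals.items():
    status = "within" if total <= purchased[kind] else "OVER"
    print(f"{kind}: {total} used / {purchased[kind]} purchased -> {status}")
```

In this example the functional total (450) exceeds the 400 functional licenses, which is exactly the situation the periodic compliance check is meant to catch.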

Note: The report is based on data collected each weekend and stored in LMS. It is not a real time report. Inactive or deleted users are not included in the report.

LMS Usage Metrics Report

I created this blog to present a few options that help customers stay within their license entitlements and, with a partner’s help, change Learning License User Types when they exceed the licenses subscribed to.

Auto Generate Career Path External Code
https://www.erpqna.com/auto-generate-career-path-external-code/ (Sat, 11 Feb 2023)

Purpose:

In this blog, the user will learn how to create system-generated external code for a career path in SAP SuccessFactors Career Development Planning.

Career Path:

The Career Path displays information about the expected path for a position or various positions.

HR managers and HR administrators can create multiple new career paths and associate them with a given role, as well as restrict employee access to the career paths displayed in the Career Worksheet based on criteria such as role, department, or division. The career paths created are instance-wide.

However, many clients want a system-generated external code when creating a career path, as it is confusing for the end user to enter a value in the external code field. To meet this requirement, I found a solution, explained below in detailed steps, so that anyone who receives the same request from a client can benefit from it.

Create Career Path Screen
Detailed Career Path Screen

Auto-Generated External Code Configuration Steps:

  • Create Sequence: Initially, we need to create a number sequence based on the client’s requirements.

Below is the navigation path to create the sequence in SAP SuccessFactors:

Admin Center > Employee Files > Manage Data.

Manage Data Screen
Create Sequence Screen
  • Create Business Rule: Once the Sequence is created, the next step is to link this sequence to the business rule.

Below are the steps to create a business rule in SuccessFactors:

Admin Center > Company Settings > Configure Business Rules

Business Rule Screen
Business Rule
  • Assign Business Rule to Career Path Object: The final step is to assign the business rules to the Career Path object.

Below are the steps to assign a business rule to the Career Path object in SuccessFactors:

Home > Company Settings > Configure Object Definition > Select Career Path Object > Edit the object

Configure Object Definition Screen

Go to the “Initialize Rule” field and select the created business rule from the drop-down and hit the “Save” button.

Rule Mapping

Now that we have completed all the configuration steps, users can access the Career Path from the Home navigation drop-down > Development > Career Path.

Career Path Screen

Click on the “Create New Career Path” button to see the system-generated external code based on the above configuration.

Create New Career Path Screen

This workaround should keep clients happy, since automatically generating the external code has been a common request.

Note: These same steps can be followed to auto-generate external code for the talent pool in Succession Planning.

SuccessFactors Alert Message & Message Definitions: Enhance User Experience
https://www.erpqna.com/successfactors-alert-message-message-definitions-enhance-user-experience/ (Wed, 08 Feb 2023)

Introduction

User experience has become a critical part of a successful SuccessFactors implementation. One way to help customers improve the user experience is by enhancing alert messages and message definitions. These two functionalities are used to raise error, warning, or info messages during HR transactions. Alert messages are widely used to remind certain employees about a specific system event that may require their attention.

In this blog, I’ll provide some examples that you can use or adapt to your current configuration.

Details

Before jumping to the samples, it’s important to understand what can be used in both features to format the text.

  • Alert messages support HTML tags. HTML stands for HyperText Markup Language – it is the standard markup language for web pages. HTML elements are composed of opening tags, content, and closing tags.
  • On the other hand, message definitions support BBCode (“Bulletin Board Code”). It is similar to HTML in that BBCode also uses tags to format specific text, but with a different convention.

Sample 1 – E-mail address in Alert Message:

E-mail addresses are defined as per the below tags:

<a href="mailto:people@bestrun.com">people@bestrun.com</a>

Let’s decode the above expression:

  • The <a> tag defines a hyperlink, which is used to link from one page to another. In our case, we want to open a new e-mail of our default mail client
  • Next we have the href attribute (short for “Hypertext Reference”) – this attribute specifies the link’s destination. In this case, we want to activate the default mail client on the computer to send an email – so we use a “mailto:” link.
  • The last part is the content that would be displayed in the alert message with the embedded link (marked in bold). For this example, I have used the e-mail itself, but you could have used any other text.
Image from Demo

Sample 2 – URL in Alert Message:

HTML links are defined as per the below tags (HTML):

<a href="https://www.google.com/" target="_blank">here</a>

Let’s decode the above expression:

  • The <a> tag and href element were already clarified above
  • target=”_blank” is used to open the link in a new browser window or tab
Image from Demo

Sample 3 – Tables in Alert Message:

Tables are defined as per the below tags (HTML):

<table style="width:100%">
  <tr>
   <th>Document Type</th>
   <th>Renew Start Date</th>
  </tr>
  <tr>
   <td style="text-align:center">Permanent Resident</td>
   <td style="text-align:center">6 months before exp date</td>
  </tr>
  <tr>
   <td style="text-align:center">Conditional Permanent Resident</td>
   <td style="text-align:center">90 Days before exp</td>
  </tr>
 </table>

Let’s decode the above expression:

  • <table> – this tag defines the overall table
  • width – set here via the style attribute as a CSS property – specifies the width of the element; since I have chosen 100%, the table will occupy all available width within the page
  • <tr> – this HTML element defines a row of cells in a table
  • <th> – this HTML element defines a cell as the header of a group of cells
  • <td> – this HTML element defines a standard data cell in a table
  • The text-align property specifies the horizontal alignment of text in an element – since I have used center – it means my content will be displayed horizontally centered in the standard cells
Image from Demo
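Since alert messages accept raw HTML, a table like the one above can also be generated from data instead of being hand-written. A small helper sketch, using the sample rows from the table (the helper itself is illustrative, not a SuccessFactors feature):

```python
def html_table(headers, rows):
    """Render a simple centered HTML table like the alert-message sample above."""
    parts = ['<table style="width:100%">']
    # Header row uses <th> cells, data rows use centered <td> cells
    parts.append("<tr>" + "".join(f"<th>{h}</th>" for h in headers) + "</tr>")
    for row in rows:
        cells = "".join(f'<td style="text-align:center">{c}</td>' for c in row)
        parts.append("<tr>" + cells + "</tr>")
    parts.append("</table>")
    return "".join(parts)

table = html_table(
    ["Document Type", "Renew Start Date"],
    [
        ["Permanent Resident", "6 months before exp date"],
        ["Conditional Permanent Resident", "90 Days before exp"],
    ],
)
print(table)
```

Generating the markup this way keeps the inline styles consistent when the list of document types changes.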

Sample 4 – Background images in Alert Message:

A background message is defined per the below tags (HTML):

<p style="background-image: url('https://www.sap.com/dam/application/imagelibrary/photos/291000/291639.jpg/_jcr_content/renditions/291639_homepage_3840_1200.jpg.adapt.1920_600.false.false.false.false.jpg/1644994056676.jpg');">

(…remaining content…)

</p>

Let’s decode the above expression:

  • The <p> tag defines a paragraph in a webpage and indicates a line change.
  • Next we have the style attribute – it is used to define inline CSS styling for the element that you’re currently working on
  • The last part is the background-image CSS property and the URL of the image that we would like to display in our HTML element (please be aware that this image should be placed in a public directory accessible through the public internet).
Image from Demo

Sample 5 – URL in Message Definition:

Now, let’s take a quick look at how the message definitions can be enhanced using BBCode.

In the below example, I have enhanced the message definition with the following:

  • set the risk level in bold and with a different color;
  • add links to external sites to share more information.

Here you can find the different codes that I used in my example:

Be aware that the country you are visiting has a [b][color=#FF9B4F]moderate[/color][/b] COVID-19 risk. Please check the local guidelines [xurl=https://google.com]here[/xurl] and regulations [xurl=https://www.google.pt]here[/xurl] or reach out to HR.

Tags:

  • [b] – this tag sets the specific text in bold
  • [color] – this tag defines the color of the text. I have used a hexadecimal color value, but you can also use a color name
  • [xurl] – this tag creates a hyperlink with the value defined and opens it in a new browser tab. If you want to open the link on the same web page, you can use the BBCode tag [url].
Image from Demo
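SuccessFactors renders the BBCode itself; purely as an illustration of how the three tags used above map to HTML, here is a toy converter (this is not the product's actual rendering logic):

```python
import re

def bbcode_to_html(text):
    """Toy conversion of the [b], [color] and [xurl] tags used in the sample."""
    # [b]...[/b] -> bold
    text = re.sub(r"\[b\](.*?)\[/b\]", r"<strong>\1</strong>", text)
    # [color=#hex]...[/color] -> inline-styled span
    text = re.sub(r"\[color=(#?\w+)\](.*?)\[/color\]",
                  r'<span style="color:\1">\2</span>', text)
    # [xurl=...]...[/xurl] -> link opening in a new tab
    text = re.sub(r"\[xurl=(.*?)\](.*?)\[/xurl\]",
                  r'<a href="\1" target="_blank">\2</a>', text)
    return text

msg = ("Risk level: [b][color=#FF9B4F]moderate[/color][/b], "
       "details [xurl=https://google.com]here[/xurl].")
print(bbcode_to_html(msg))
```

Seeing the tag-to-HTML mapping side by side makes it easier to predict how a message definition will look before testing it in the system.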

Sample 6 – Dynamic Fields in Message Definition:

Another option to increase the user experience in Message Definition is by adding dynamic fields. The dynamic fields can be defined in the Parameters section of the Message Definition creation page.

In my example, I have added the country in curly brackets (since it is my dynamic field) and I have added the parameter country in the Parameters Section.

Image from Demo

The next step is to create a business rule. Once you define the Then statement, the parameter defined in the message definition will be displayed in bold. You just need to select the field that you would like to display – in my case I am picking the name of the GO Country.

Image from Demo

Here you can find the code that I added to the message definition text:

Be aware that {country} has a [b][color=#FF9B4F]moderate[/color][/b] COVID-19 risk. Please check the local guidelines [xurl=https://google.com]here[/xurl] and regulations [xurl=https://www.google.pt]here[/xurl] or reach out to HR

Image from Demo
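The substitution the business rule performs can be pictured as a simple template fill. This sketch assumes a single {country} parameter, as configured above; the business rule supplies the value at runtime, and here it is filled in by hand for illustration:

```python
# Message definition text with the {country} dynamic field (BBCode omitted for brevity)
template = (
    "Be aware that {country} has a moderate COVID-19 risk. "
    "Please check the local guidelines or reach out to HR."
)

# In the system, the business rule's Then statement supplies the parameter value;
# here we substitute it directly to show the resulting message.
message = template.format(country="Portugal")
print(message)
```

The curly-bracket placeholder in the message definition behaves like the format field here: one template serves every country the rule evaluates.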
SAP Integration Suite, advanced event mesh: Using SAP SuccessFactors solutions as an Event Source
https://www.erpqna.com/sap-integration-suite-advanced-event-mesh-using-sap-successfactors-solutions-as-an-event-source/ (Sat, 24 Dec 2022)

SAP SuccessFactors solutions are cloud-based HCM software applications that support core HR and payroll, talent management, HR analytics and workforce planning, and employee experience management. SuccessFactors solutions are used by more than 235 million users in over 200 countries and territories around the world.

SAP SuccessFactors Intelligent Services Events

SuccessFactors already comes with Intelligent Services events – HTTP-based events that help simplify HR workflows. A number of SAP-built events are already available in SAP SuccessFactors; they can be adjusted to specific use cases and needs and therefore used in event-driven business scenarios around SuccessFactors.

Intelligent Services Events include for example:

  • Employee Hire – a new worker is created with a specified start date
  • Change in Manager – published after a job information change for an employee that has been assigned to a new manager
  • Change in Employee Location – a worker has moved to a new location

A list of all available Intelligent Services Events can be found in the SAP SuccessFactors solutions documentation here.

SAP Event Mesh as an event broker for SuccessFactors

A few years ago my colleague Sai Harish Balantrapu has written an excellent blog on using SAP Event Mesh as an event broker for SuccessFactors.

We will take the approach this blog has described and adjust it for usage with our new offering SAP Integration Suite, advanced event mesh. In the end the approach remains the same, with just a few adjustments that are needed.

A lot of ground we are covering here has been described in the original blog. I was wondering whether it would make sense to just describe the differences, in the end decided to give you the full picture here to make it as easy as possible to follow.

Again, Kudos to Sai Harish for all the groundwork!

SAP Integration Suite, advanced Event Mesh

SAP Integration Suite, advanced event mesh is a fully managed event streaming and management service that enables enterprise-wide and enterprise-grade event-driven architecture. Advanced Event Mesh (AEM) is a distributed mesh of event brokers that can be deployed across environments, both in the cloud and on-premise. It offers a full set of eventing services covering all relevant use cases: AEM supports event streaming, event management, and event monitoring. Brokers fully scale as required and come in T-shirt sizes to fit different needs.

High Level Overview of our Approach

On the SuccessFactors side we will create an integration in the Integration Center. The destination for this integration is going to be REST and we will choose JSON format for the event. We will add selected fields to the event. Then we will have to create the destination settings. We will use basic authentication and REST. The information for the destination we will have to look up in Advanced Event Mesh, so keep it open in parallel.

There is a very important step that we, most likely and depending on your individual settings, have to take beforehand: we have to change the ports we use. The standard settings for SAP SuccessFactors ports don’t match the standard settings for Advanced Event Mesh ports. So either we open up our AEM standard ports in SuccessFactors, or we adjust the ports on the Advanced Event Mesh side. Here we will adjust the AEM port settings, since this is very straightforward.

Preparation on the Advanced Event Mesh side

Go to the Cluster Manager and select your Event Broker

Click on Manage

Click on Advanced Options

Scroll down to Port Configuration

Expand Public Endpoint

Check the value for Secured REST Host. The AEM standard setting here is 9443, which is typically blocked by SAP SuccessFactors. By default, the Secured Web Messaging Host is set to port 443 in AEM.

If you would like to adjust the ports on the AEM side (remember, you could open up the port on the SuccessFactors side as well), click on Edit.

Change the Secured Web Messaging Host port to a different value (e.g. 7443)

Then change the Secured REST Host to use port 443.

It might take some time for these settings to reflect.

Steps on the SuccessFactors side

Step 1

Logon to the SuccessFactors Home Page

Then search for Integration in the search field

Select Integration Center

Step 2

Click on the tile My Integrations

Step 3

Click the Create Button to create a new integration between SuccessFactors and Advanced Event Mesh

Step 4

Select More Integration Types

Step 5

On the next screen, select:

Trigger Type → Intelligent Services
Destination Type → REST
Source Type → SuccessFactors
Format → JSON

Click the Create Button

Step 6

A list of all available Intelligent Service Events is displayed. Let us look at the Employee Hire event, so select Employee Hire.

Step 7

The Employee Hire event information is displayed in the right-side panel and you can see the fields including a data preview.

Click the Select button

Step 8

Enter an Integration Name and a Description and click Next

Step 9

On the next screen, click the + button and select Insert Sibling Element

Step 10

Select the newly added Element and enter the following in the Label field: context

Step 11

Select the Element “Context” and click the + button and select Add Child Element

Step 12

Select the newly added Element and enter the following.

Label: userId
Description: userid of the new hire.

Click the mapping button to map the userId field.

Step 13

Click Entity Tree View and Select User ID

Step 14

Click Change Association to “User ID”

Step 15

Select the context Element, click the + button and choose Add Child Element

Step 16

Select the Element and adjust as follows:

Label: managerId
Description: Manager of the new Hire
Default Value: “Enter your user id”
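Putting Steps 9 through 16 together, the JSON body that the integration produces should look roughly like this sketch (the user IDs are placeholder values, not data from a real instance):

```python
import json

# Hypothetical event body mirroring the structure built above: a "context"
# element with a child "userId" (mapped from the event's User ID field)
# and a child "managerId" (filled from the default value in Step 16).
event_body = {
    "context": {
        "userId": "100042",     # placeholder - mapped User ID of the new hire
        "managerId": "100007",  # placeholder - default value from Step 16
    }
}

print(json.dumps(event_body, indent=2))
```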

Step 17

Click on Next and then on the Response Fields screen click on Next again until you have made it to the Filter screen

Step 18

Expand Advanced Filters, then enter the following:

  • Field: context/userId
  • Operation: is equal to
  • Value: <a user in your system>.

The filter value enables you to test the integration.

Click on Next.

Step 19

In the destination settings click on REST Server Settings and enter the following pieces of information from Advanced Event Mesh:

Connection Name → <Any Name>
REST API URL → <From AEM Secured REST Host>+ your topic (e.g. /successfactors)
User Name → <From AEM Username>
Password → <From AEM Password>

To get this information, in Advanced Event Mesh go to your broker. Then select Connect.

IMPORTANT: you need to add a topic to write to. So add for example /successfactors as a topic at the end of the REST API URL

Click Next.
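With these destination settings, the call that the Integration Center makes against Advanced Event Mesh is essentially an HTTPS POST with basic authentication to the REST host plus the topic path. The following sketch (with a made-up broker host and credentials) shows how the pieces fit together:

```python
import base64

def build_publish_request(rest_host: str, topic: str, username: str, password: str):
    """Assemble the URL and headers for publishing to an AEM topic over REST.

    rest_host is the Secured REST Host from the broker's Connect page
    (after the port change it listens on 443); topic is the path appended
    to the REST API URL (e.g. successfactors)."""
    url = f"https://{rest_host}/{topic.strip('/')}"
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    headers = {
        "Authorization": f"Basic {token}",   # basic authentication, as configured
        "Content-Type": "application/json",  # the event is sent as JSON
    }
    return url, headers

# Placeholder broker host and credentials, not real AEM values:
url, headers = build_publish_request(
    "mybroker.example.com", "successfactors", "solace-user", "secret"
)
print(url)  # https://mybroker.example.com/successfactors
```

An actual POST of the event body to this URL is what SuccessFactors performs when you click Run Now.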

Step 20

Save the Integration by clicking the Save button

Step 21

Click on “Run Now” to test the event generation.

Steps on Advanced Event Mesh side

Employee Hired events are now configured on the SAP SuccessFactors side and will be written to a topic. In order to receive these events in Advanced Event Mesh we will create a queue and a queue subscription to our topic.

Step 22

Go to the Cluster Manager in Advanced Event Mesh, select your broker and click on Manage

Step 23

Click on Queues

Step 24

A new window opens up. Click on the +Queue button

Step 25

Enter a name for the queue, e.g. SuccessFactors, and click Create

Step 26

On the next screen, click Apply

Step 27

Click on the queue you have just created. Then click on Subscriptions.

Step 28

Click on the button +Subscriptions

Step 29

Enter the topic you used earlier on the SuccessFactors side as part of the REST URL, most likely successfactors.

Click Create

Step 30

Go back to the Queues screen. Your queue is now subscribed to your topic.

Summary and Test

You have set up an event end-to-end, all the way from SAP SuccessFactors to SAP Integration Suite, advanced event mesh, where it ends up in your queue based on a queue subscription. As a next step, you could now consume the event from the queue.

This diagram again shows the steps you have taken in SuccessFactors.

Now you can test your setup.

Go back to SuccessFactors and select your integration. Click the Run Now button to test the integration.

Once the integration has run successfully you can see this in the status of Last Run Time.

Once that has happened, go back to your queue in Advanced Event Mesh and check on whether the event has ended up in your queue.

You should now have a basic setup for event exposure from SAP SuccessFactors to SAP Integration Suite, advanced event mesh up and running.


Generate report in Excel format and send as Email attachment to users using Report Distributor (Wed, 09 Nov 2022) https://www.erpqna.com/generate-report-in-excel-format-and-send-as-email-attachment-to-users-using-report-distributor/

Introduction:

The Report Distributor tool allows users to schedule and distribute reports via e-mail, send reports to an FTP server, or run reports offline. First, create bundles containing the necessary reports, then select the items to distribute, the destination, the recipients, and the schedule.

This blog gives details on creating a Report Bundle with the necessary reports and then distributing the reports via e-mail in Excel/CSV format as an e-mail attachment to users.

Third-party/external service providers like security, cafeteria, or office administration require updated employee data on a day-to-day basis for providing services to employees; however, they do not necessarily need access to the SAP SuccessFactors system. Also, accurate filters can be used to ensure that only department-, BU-, or location-specific data is sent to the respective external service providers across various sites.

Environment: Report Distributor

Constraints:

  • External user would be setup in Employee Central as a Contingent Worker with basic information.
  • External user will not have Login permissions into SAP SuccessFactors instance.
  • The report should be in Excel format enabling the user to operate on the data for further usage.

Solution:

Prerequisite:

Foremost, you should have a report designed, which is to be sent through email.

Navigation:

Report Center > View Schedules > Switch to Legacy Report Distributor Tool

Resolution:

The Report Distributor tool allows users to schedule and distribute reports via e-mail.

It also allows sending reports to an FTP server.

In this blog, we will see how the Report Distributor tool can be used to –

  1. Schedule a bundle with Export Type as MS Excel OR CSV
  2. Email report to specific user/s in MS Excel or CSV format as an attachment

Step 1. In the Report Distributor, under Menu option, click on New Bundle.

Report Distributor – Menu

Add New Bundle Window pops up.

Add New Bundle Window

Enter a unique identifiable name of the Bundle.

Select the Page Size, Export Type and Export Format.

Export Type & Export Format define the form in which the user receives the report. Note that Export Type does not offer MS Excel or CSV as a download option. Nevertheless, it is possible to download/export the report in CSV or MS Excel format; the option to export the report in MS Excel/CSV format appears after the bundle has been created.

Report Export Type Options

After the creation of the bundle, it will appear under the bundle list.

Report Distributor – New Bundle Creation

Step 2. Click on the “Add Item” button under Items to add the desired reports to the bundle. Multiple reports can be added to a single bundle.

The following three options are available –

  1. Add Report
  2. Add Excel Table
  3. Add CSV Table
Report Distributor – Add Item
  • Add Report

Allows adding reports from Report Center to the bundle.

Add Report to bundle
  • Add Excel Table

The next-generation XLSX format was introduced in 2007 and is Microsoft Excel’s default format. XLSX delivers the same content within a smaller, more efficient file, benefiting both SAP SuccessFactors resources and your local environment.

Add Excel Table to bundle

Single-domain and cross-domain table reports, as well as Export mode, are downloaded in the XLSX format by default.

Multi-domain reports are downloaded in the XLS format.

5 million rows is the limit for report queries.

There are also native limitations in the Microsoft Excel product itself.

Microsoft Excel limitation
  • Add CSV Table

Reports created using the Integration Center have the file encoding UTF-8, whereas reports created using ORD and scheduled via Report Designer in CSV format have the encoding UTF-8-BOM. The UTF-8 BOM marker is expected for CSV exports; without it, Excel cannot correctly open CSV files that contain foreign characters.

Add Csv Table to bundle

There is no row limit for CSV reports (which means CSV reports exceeding 1,048,576 rows can be exported from SAP SuccessFactors), but Microsoft Excel limits the number of rows to 1,048,576.
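The encoding behavior described above is easy to reproduce: writing a CSV file as UTF-8 with a BOM (Python's `utf-8-sig` codec) prepends the three marker bytes that Excel looks for. A small sketch:

```python
import csv

# Write a small CSV file as UTF-8 with BOM ("utf-8-sig" prepends the
# byte order mark), so Excel opens foreign characters correctly.
rows = [["userId", "name"], ["100042", "Müller"]]

with open("employees.csv", "w", encoding="utf-8-sig", newline="") as f:
    csv.writer(f).writerows(rows)

with open("employees.csv", "rb") as f:
    data = f.read()

print(data[:3])  # b'\xef\xbb\xbf' - the UTF-8 BOM marker
```

Without those first three bytes, Excel falls back to a legacy code page and garbles non-ASCII characters such as “Müller”.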

Option 2 supports a Microsoft Excel based output, whereas option 3 supports the CSV format. Thus, although PDF or MS Word may have been selected as the Export Format during creation of the bundle, it is superseded by the selection of an Excel Table or a CSV Table.

Step 3. Click on “Destination” tab to configure the report destination. In this blog, we will focus on Email option as the destination wherein Excel file is transferred as an email attachment to the recipient’s setup for this bundle.

Report Distributor Destination: E-mail

Click on “Edit Recipients” link. This link will route to the “Recipient” Tab.

On the “Recipients” tab, click on the “Add Recipients” button.

Add E-mail recipients in bundle

Add Recipients pops up a wildcard search window to search for users (recipients).

In the Wildcard Search box, enter the user name of the user and click Refresh List. Select the user and click Add.

Add Users to Bundle

The user will be added to the list of recipients. Likewise, you can add or remove recipients of the report for the selected bundle from the screen below.

Recipients list for this bundle

Step 4. Schedule the bundle.

To schedule the Report Distributor to run at a certain time, click on Add button to configure the schedule.

Schedule Report Distributor for periodic run

In the scheduler, enter the start date, time (data center time) and the frequency.

Report Distributor: Scheduler

Step 5. Customize Report Output Format

Once the bundle is successfully created, you can customize the output format of the report. This step validates that the report will be generated and sent as an attachment in Excel format. Furthermore, the Excel format can be toggled between XLS and XLSX as required.

Select the bundle -> Menu -> Edit

Edit Bundle

The Edit Bundle option shall show Excel file format based on Excel table added to the bundle as per Step 2.

Here you can choose between XLS and XLSX as the report generation option. Likewise the file would be generated and sent as an E-mail attachment.

Step 6. Customize email template used in Report Distributor.

In the Report Distributor, under the Menu option, click on Edit E-mail option in the dropdown.

Report Distributor: E-mail

This opens a window for configuring E-mail message.

Bundle E-mail message

A default set in one bundle will be reflected in all existing bundles.

To change the e-mail template for specific bundles, select the ‘Override for bundle’ option.

The ‘On Completion’ option sends a completion message to the sender upon completion of the Report Distribution process.

E-mails sent from the Report Distributor use the default Reporting@successfactors.eu / Reporting@successfactors.com e-mail addresses.

You can change the ‘From Address Prefix’ ONLY, i.e., replace ‘Reporting’ with a different term.

SAP currently does not offer the option to change the domain to anything other than successfactors.com / successfactors.eu.

Result: This blog helps you successfully schedule reports for offline execution, generate the reports at a regular cadence, and then distribute them via e-mail in Excel/CSV format as an e-mail attachment to users.


How to setup Time Evaluation in Time Tracking (Fri, 21 Oct 2022) https://www.erpqna.com/how-to-setup-time-evaluation-in-time-tracking/

Time Evaluation is needed to process recorded times in order to calculate overtime premiums, shift premiums, working time account or overtime account balances, and to create alerts and warnings if employees have times outside their planned hours or outside the flextime bandwidth, if they record more than 10 hours per day, and for many more reasons. SuccessFactors Time Evaluation provides a very flexible way to create time valuation rules to accommodate your company agreements, trade union contracts, or even processes derived from laws and regulations like the European Directive on working time.

Time Evaluation might be easy – but it can also be complex. This depends on the business processes you want to cover. This blog describes the mechanism of SuccessFactors Time Evaluation and each time valuation type in more detail, in order to give some hints and help you decide which valuation type to use in which constellation.

Target readers are time management consultants who want to learn more about the configuration of SuccessFactors Time Evaluation.

Time Evaluation

Definitions

Time Evaluation is a valuation of attendance, break, on-call, and absence times by comparing the recorded hours against company-internal, contractual, and collective agreement provisions.

Time Evaluation runs periodically to calculate overtime, calculate time off in lieu postings and working time accounts (flextime), validate attendance recordings, and calculate wage types (for example, bonuses) for payroll.

  • Time Evaluation is the process of calculating valuation results from a set of input time records to generate an output time record.
  • An output time record can be an interim calculation result or a time valuation result
  • Input time records, output time records and the valuation results are represented by time type groups.
  • The complete Time Evaluation contains several small calculation steps with time type groups as input and output. These steps are called time valuations and are combined in a time recording profile.

Definition by Example

Time Evaluation – Definition by Example

Basics

Time Type Groups – General and Usage

  • General
    • Time type groups are containers for interim calculation results and final time pay types.
  • UI Component
    • Output of time valuations that can be displayed on the Time Sheet UI.
  • Time Pay Type
    • The calculated time valuation results are stored on the database and available either for payroll processing or conversion to time off in lieu or working time account.
  • Time Collector
    • Calculated time valuation result which is aggregated daily, weekly, or monthly and stored on the database.

Usage and Storage of time valuation results

Usage and Storage of time valuation results

Time Type Groups – Time Category

Overview

The Time Type Groups on the left-hand side are defined by an assignment of time types. These groups are set-like groups.

The Time Type Group on the right-hand side is calculated by a time valuation.

Time Type Groups – Time Category

Set-like Groups

The set-like groups are defined by an assignment of time types. You can use absence, break, attendance and on-call and even mix them in a time type group just as you need. You can create different set-like time type groups based on the need of your valuations.

The set-like Groups are the starting point to calculate more complex (calculated) groups.

Time Type Groups – Set-like Groups

Calculated Groups

The calculated groups are calculated by one single or several time valuations in a time recording profile.

The calculated groups have the time category ‘Calculated Time’ or ‘Counted Events’.

Time Type Groups – Calculated Groups

The calculation method for a specific calculated result group is defined as a valuation rule, which describes how one or two result groups are generated from a set of input groups.

The calculated groups have the time category Calculated Time or Counted Events. The calculated groups having the time category Calculated Time can be Input groups for subsequent time valuations.

Time valuation

  • Input
    • Input Groups with
      • Reverse Sign (optional)
      • Factor (optional)
  • Processing
    • Valuation Type (e.g. Aggregate and Split)
    • Additional fields depending on Valuation Type
  • Result Groups (Time Category = ‘Calculated Time’ / ‘Counted Events’)
    • Time Type Group Above
    • Time Type Group Below
  • Error message
    • Error flag
      • No Error message
      • Raise error on Time Type Group Above / Below
    • Additional Info
    • Message
    • Error Type
Time Valuation

Time valuation – UI

Time valuations can be setup in Manage Data.

Time Valuation – UI

How are Time valuations processed?

General

  • In general, all time valuations in a Time Recording Profile are processed on a daily basis for the entire time sheet period.
  • All time valuations in a Time Recording Profile are processed first with the data (e.g. scheduled working time and recorded working time) of Monday. After this, the data of Tuesday is processed, and so on.
  • The order in which the different time valuations are processed cannot be influenced in the configuration. It is determined automatically, so that the necessary Time Type Groups are already calculated when a time valuation uses them as input, threshold, comparison, or deduction group.

Time valuation Methods

  • When using the Valuation Methods “Valuate Per Day” or “Valuate Up To Today”, only the entries for a single day are considered. If breaks are deducted from the scheduled working time, this is done with the values of a single day; for instance, on Monday a break of 1 hour is deducted from Monday’s scheduled working time of 8 hours.
  • The processing is stopped after the current day when the Valuation Method “Valuate Up To Today” is used. Days in the future within the same time sheet period are not processed.
  • When using the Valuation Method “Valuate Whole Sheet”, the entries for the complete time sheet (usually one week) are considered. This is necessary, for instance, to calculate times of the whole time sheet greater than 40 hours per week.
Valuation Method “Valuate Per Day”
Valuate Per Day
Valuation Method “Valuate Up To Today”
Valuate Up To Today
Valuation Method “Valuate Whole Sheet”
Valuate Whole Sheet

Valuation Types

Overview

Valuation Types – Overview

Aggregate Input Groups and Split

This type of valuation collects all the time records from a set of input groups. At a certain threshold value, the collected records shift from the below group to the above group. The threshold (i.e. the point at which the switch from below to above happens) can be defined per day and per time sheet period.

The threshold can either be a fixed value or a reference to a Time Type Group that was calculated before as a result of other time valuations.

Possible use case

  • Which working time is up to 8 hours a day (= regular time) and which is above 8 hours a day (= overtime)?
Aggregate Input Groups and Split
Aggregate Input Groups and Split – How it works
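In a simplified hours-as-numbers model (the real valuation operates on time records), the split described above can be sketched as:

```python
def aggregate_and_split(hours, threshold):
    """Aggregate the hours of the input groups and split them at the
    threshold: everything up to the threshold goes to the below group
    (e.g. regular time), the remainder to the above group (overtime)."""
    total = sum(hours)
    below = min(total, threshold)
    above = max(0.0, total - threshold)
    return below, above

# 9.5 recorded hours against an 8-hour daily threshold:
print(aggregate_and_split([4.0, 5.5], 8.0))  # (8.0, 1.5)
```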

Filter Input Groups

The filter used in this valuation is a type of day filter. All time records from the input groups that pass the day filter with all attributes are collected in the below group. Time records not fulfilling at least one filter attribute are put into the above group.

This valuation type supports using a Time Segment Filter as well.

Possible use case

  • Which working time is on full holidays, non-working days, Sundays, at night shifts, …?
Filter Input Groups – How it works

Time Records Filter – Overview

Time Records Filter – Overview

Filter Segments from Input Groups

The segmentation uses a special time segment filter, which is a list of time segments. Time segments are clock-time intervals. The valuation moves the parts of the input time records that overlap with one of the time segments to the below group. The parts not overlapping are put into the above group.

This means that input records can be split into parts overlapping a time segment and a part not overlapping any time segment.

This valuation type can only be used for the clock-time recording variant.

Possible use case

  • Which working time is after 20:00 and needs to be considered for premium calculation? (Segment Filter: 20:00 – 6:00)
Filter Segments from Input Groups – How it works
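As a sketch of the segment filter (using minutes since midnight for clock times; the part of the night window after midnight would in practice carry the next-day indicator), the overlap logic looks like this:

```python
def filter_segments(record, segments):
    """Split a clock-time record (start, end in minutes since midnight)
    against a list of time segments: the overlapping parts go to the
    below group, the non-overlapping remainder to the above group."""
    start, end = record
    below = []
    for seg_start, seg_end in segments:
        lo, hi = max(start, seg_start), min(end, seg_end)
        if lo < hi:
            below.append((lo, hi))
    covered = sum(hi - lo for lo, hi in below)
    above_minutes = (end - start) - covered
    return below, above_minutes

# The 20:00-06:00 premium window, written as two same-day segments:
night = [(20 * 60, 24 * 60), (0, 6 * 60)]

# A record from 18:00 to 23:00: the 20:00-23:00 part is premium-relevant.
print(filter_segments((18 * 60, 23 * 60), night))  # ([(1200, 1380)], 120)
```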

Excurs on cross midnight processing

Excurs on cross midnight processing

All data of a shift day is processed on the shift day, even if some of it falls on a different physical day. All time valuations are processed on the shift day as well.

For several night shift requirements it’s necessary to process data from the next physical day. For instance, a night shift can end on a public holiday, and special premiums for the part of the night shift on the public holiday have to be paid.

Because the complete night shift is processed on the shift day (the day on which the shift started), a next-day indicator was introduced in the time segment filter. The time records filter was also enhanced so that it can check whether the next physical day is a public holiday.

Time Segment Filter – Next Day Indicator
Time Records Filter – Public Holidays

Deduct Group from Input Groups

The Valuation Type “Deduct Group from Input Groups” is an enhancement of “Filter Segments from Input Groups”:

  • The deduction group defines a list of time segments.
  • An input time record overlapping any time segment (of the deduction group) → the overlapping part is put into the below group, the remaining parts into the above group.
  • An input time record overlapping none of the time segments (of the deduction group) → it is put into the above group.

Example

Which is the (net) working time without breaks?

Deduct Group from Input Groups – Example
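The break deduction in this example can be sketched as interval subtraction (minutes since midnight; a simplified model of the actual time records):

```python
def deduct_group(records, deduction_segments):
    """Deduct the segments of the deduction group (e.g. breaks) from the
    input records: overlapping parts go to the below group, the
    remaining parts to the above group (the net working time)."""
    above, below = [], []
    for start, end in records:
        cursor = start
        for seg_start, seg_end in sorted(deduction_segments):
            lo, hi = max(cursor, seg_start), min(end, seg_end)
            if lo < hi:
                if cursor < lo:
                    above.append((cursor, lo))  # part before the break
                below.append((lo, hi))          # the deducted break itself
                cursor = hi
        if cursor < end:
            above.append((cursor, end))         # part after the last break
    return above, below

# Work 09:00-17:00 with a 12:00-12:30 break:
above, below = deduct_group([(540, 1020)], [(720, 750)])
net_hours = sum(e - s for s, e in above) / 60
print(net_hours)  # 7.5
```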

Difference Between Threshold and Input

This valuation type compares all time records from a set of input groups to the specified threshold value per day or per time sheet period. The purpose is to calculate the difference “input minus threshold” and put this result into the above group.

If the threshold is reached, the time records above the threshold are moved to the above group. If the threshold is not reached, a new time record with the missing negative difference is created and put in the above group.

The time type of this new time record is the main attendance time type and the date is set to the valuated day or – in case of valuation per time sheet period – the end date of the time sheet period.

Thereby, the above group always contains the positive or negative difference “input minus threshold”.

Possible use case

  • What is the delta between working time and scheduled working time per day or week? This delta can then be accrued to a working time account (WTA) for each day or for the whole week.
Difference Between Threshold and Input – How it works
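The working time account use case boils down to a signed delta, which the valuation stores in the above group:

```python
def difference_threshold_input(hours, threshold):
    """Return the signed difference 'input minus threshold': positive
    hours are credited to a working time account (WTA), negative hours
    are deducted from it."""
    return sum(hours) - threshold

print(difference_threshold_input([9.0], 8.0))  # 1.0  -> WTA credit
print(difference_threshold_input([6.5], 8.0))  # -1.5 -> WTA deficit
```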

Compare Threshold with Input Groups and Count Events

Result groups from this valuation are a special kind of calculated time type group: The above and the below group are so-called ‘counter’ groups with ‘counter’ records as time records.

A counter (time type) group is defined by time category COUNTED_EVENTS and has only time records with an internal counter time type of time data type ‘counter’.

This valuation aggregates all input records of the day (or the whole sheet period depending on the valuation method).

In case of day valuation, if the amount of all records is above the threshold a “counter record of 1” (for the counted event above threshold) is put to the above group. If not, the “counter record of 1” is put to the below group (for the counted event below threshold).

In case of whole sheet valuation method, all input time records together are compared to the threshold for the entire week, and depending on the result the “counter record of 1” is put either into the above group or into the below group on the end date of the valuation period.

Possible use case

  • How many days (in the time sheet week) have got working time above 8 hours?
Compare Threshold with Input Groups and Count Events – How it works
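A sketch of the per-day counting (simplified to plain hour values):

```python
def count_events(days, threshold):
    """For each day, aggregate the day's records and put a counter of 1
    into the above group if the sum exceeds the threshold, otherwise
    into the below group; return the two counter totals."""
    above = sum(1 for records in days if sum(records) > threshold)
    below = len(days) - above
    return above, below

# A week of daily hours: how many days have working time above 8 hours?
week = [[8.0], [9.5], [4.0, 4.5], [10.0], [7.0]]
print(count_events(week, 8.0))  # (3, 2)
```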

Compare Threshold with Comparison Group to Route Input

This valuation allows moving all input time records together for each day (or for the whole week, depending on the valuation method) either to the above group or the below group.

Therefore, independently of the input time records, the comparison group is compared to the threshold per day or time sheet period. If the time records of the comparison group are above the threshold, the corresponding input time records are moved to the above group, otherwise to the below group.

Possible use case

  • What is the planned working time for days on a public holiday if there is recorded working time for that day?
Compare Threshold with Comparison Group to Route Input – How it works
Compare Threshold with Comparison Group to Route Input – How it works

Filter Single Records from Input Groups (Time Tracking subscription necessary)

This valuation type loops over all time records of the input time type group.

For every record, the record attribute (e.g. Start Time) is compared with the attribute that is configured for the comparison group (e.g. Earliest Start Time).

The comparison operator can be configured as “Equal To”, “Less than or Equal To”, or “Greater than or Equal To”.

Time records that fulfill the condition are moved to the time type group below. All other time records are moved to the time type group above.

Possible use case

  • Find gaps in recorded working time at the beginning or the end of the scheduled working time.
Filter Single Records from Input Groups – How it works

Configuration Options for Valuation Types

Configuration Options for Valuation Types

Examples

Some information in front about the examples

  • The example configurations described in this document must be adapted accordingly to fit into existing Time Recording Profiles and meet requirements.
  • They need to be adapted with regard to the used set-like Time Type Groups (e.g. WT-SCHED for scheduled working time) and the naming of the necessary Time Type Groups and time valuations.
  • In part, the configurations are simplified for a better understanding of the basic functionality.

Overtime Recorder with Premium for Overtime above 40 total hours per week

Requirement

  • An Overtime Recorder records only overtime, using a certain Time Type Group. For all times (scheduled working time + overtime) above 40 hours per week, a premium must be paid.
  • The premium must be calculated for the day on which the overtime was recorded, not at the end of the week.

Solution in a nutshell

  • The recorded overtime is available in a Time Type Group of Time Category „Recorded Overtime“
  • The difference between 40 hours and the scheduled working time for a week is calculated and stored in a Time Type Group
  • The part of the recorded overtime above the value of the Time Type Group that contains the difference between 40 hours and the scheduled working time is moved to a Time Type Group for the premium.
Overtime Recorder with Premium for Overtime above 40 total hours per week
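Reduced to weekly totals, the solution above can be sketched as follows (a simplification: the actual configuration also attributes the premium to the day on which the overtime was recorded):

```python
def premium_hours(scheduled_week_hours, recorded_overtime_hours):
    """Premium applies to the part of the recorded overtime that pushes
    the total time (scheduled + overtime) above 40 hours per week."""
    # Difference between 40 hours and the scheduled working time:
    headroom = max(0.0, 40.0 - scheduled_week_hours)
    # Overtime above that difference carries the premium:
    return max(0.0, recorded_overtime_hours - headroom)

# 38 scheduled hours and 5 hours recorded overtime: 3 premium hours.
print(premium_hours(38.0, 5.0))  # 3.0
```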

Weekly overtime calculation

Requirement

  • For the first two hours of overtime per week, a premium OT 1.5 has to be paid. For every additional overtime hour, a premium OT 2.0 has to be paid.
  • The OT 2.0 premium has to be paid for working time on non-working days as well, regardless of whether the employee worked overtime in this week at all.

Solution in a nutshell

  • Use a time records filter to filter work on non-working days. Recorded working time on non-working days can be moved directly to a Time Type Group OT 2.0.
  • Use a time records filter to filter work on working days.
  • Calculate daily overtime by splitting recorded working time after the scheduled working time is reached.
  • Split the weekly overtime at two hours. Overtime up to two hours is paid OT 1.5; overtime above two hours is paid OT 2.0.
  • Aggregate OT 2.0 from non-working days and working days in one Time Type Group.
Weekly overtime calculation
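The weekly classification described above can be sanity-checked with a small sketch (illustrative only; the names are invented and the valuation itself happens via Time Type Groups, not code):

```python
def classify_weekly_overtime(daily_overtime_hours, nonworking_day_hours):
    """Split weekly overtime: the first two hours go to OT 1.5, everything
    above goes to OT 2.0. Work on non-working days always counts as OT 2.0,
    regardless of other overtime in the week."""
    ot_15 = ot_20 = 0.0
    taken = 0.0  # overtime hours already assigned to the OT 1.5 bucket
    for ot in daily_overtime_hours:
        to_15 = max(min(2.0 - taken, ot), 0.0)
        ot_15 += to_15
        ot_20 += ot - to_15
        taken += to_15
    ot_20 += sum(nonworking_day_hours)
    return ot_15, ot_20
```

With daily overtime of 1.5, 1.0 and 2.0 hours plus 4 hours on a non-working day, this yields 2.0 hours at OT 1.5 and 6.5 hours at OT 2.0.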

Daily overtime calculation

Requirement

  • A positive recorder records working time and overtime, and both are considered for calculating an overtime premium of 100% for more than two hours per day.
  • For the overtime up to two hours per day, a premium of 50% must be calculated.

Solution in a nutshell

  • Breaks are deducted from overtime relevant time
  • Breaks are deducted from the scheduled working time
  • Overtime per day is calculated by deducting the scheduled working time without breaks from the overtime relevant time without breaks
  • Breaks are deducted from recorded overtime
  • Calculated overtime and recorded overtime are aggregated to the total overtime per day
  • The part of the overtime above two hours is moved to a Time Type Group for the 100% premium; the overtime up to two hours is moved to a Time Type Group for the 50% premium.
Daily overtime calculation
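The daily calculation above (net out breaks, add calculated and recorded overtime, then split at two hours) can be sketched as follows. This is a simplified illustration with invented names, not the valuation configuration itself:

```python
def daily_overtime_premiums(work_hours, break_hours, scheduled_hours,
                            recorded_ot_hours, ot_break_hours, threshold=2.0):
    """Derive total daily overtime after break deduction and split it at
    the two-hour threshold into the 50% and 100% premium buckets."""
    # Overtime calculated from recorded working time vs. the schedule.
    calc_ot = max((work_hours - break_hours) - scheduled_hours, 0.0)
    # Add separately recorded overtime, also net of breaks.
    total_ot = calc_ot + (recorded_ot_hours - ot_break_hours)
    premium_50 = min(total_ot, threshold)
    premium_100 = max(total_ot - threshold, 0.0)
    return premium_50, premium_100
```

For example, 10 recorded hours with a 1-hour break against an 8-hour schedule, plus 2 recorded overtime hours with a half-hour break, gives 2.5 hours of total overtime: 2.0 hours at 50% and 0.5 hours at 100%.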

Core night detection

Requirement

  • In Germany, if working time for a night shift starts before midnight, the tax-free portion of the night premium is 25% until 00:00, 40% in the time frame from 00:00 to 04:00, and again 25% from 04:00 until 06:00 in the morning.
Core night detection

Solution in a nutshell

  • For further usage in the core night detection we need to deduct breaks from the recorded working time.
  • We need to filter recorded working time without breaks for the following time frames:
    • 8:00 PM – 12:00 AM
    • 12:00 AM – 4:00 AM on the day after the start day of the night shift
    • 4:00 AM – 6:00 AM on the day after the start day of the night shift
  • Any recorded working time in the time frame 12:00 AM – 4:00 AM is moved to a Time Type Group for tax-free night premiums when there was working time recorded in the time frame 8:00 PM – 12:00 AM.
    • If there is no recorded working time in the time frame 8:00 PM – 12:00 AM then the recorded working time 12:00 AM – 4:00 AM is moved to a Time Type Group for taxable premiums
  • The taxable premiums 8:00 PM – 12:00 AM, 12:00 AM – 4:00 AM, and 4:00 AM – 6:00 AM are aggregated in one Time Type Group
Core night detection
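The window filtering behind the core night detection amounts to interval overlaps. A minimal sketch (illustrative names; it assumes the time record starts on the calendar day the shift begins, i.e. before midnight, as in the scenario above):

```python
from datetime import datetime, timedelta

def overlap_hours(a_start, a_end, b_start, b_end):
    """Hours of overlap between two datetime intervals."""
    start, end = max(a_start, b_start), min(a_end, b_end)
    return max((end - start).total_seconds() / 3600.0, 0.0)

def core_night_split(work_start, work_end):
    """Split one night-shift record into the premium windows from the text:
    20:00-24:00, 00:00-04:00 (core), 04:00-06:00. The core part is tax-free
    only if the shift also covers time before midnight."""
    day0 = work_start.replace(hour=0, minute=0, second=0, microsecond=0)
    evening = overlap_hours(work_start, work_end,
                            day0 + timedelta(hours=20), day0 + timedelta(hours=24))
    core = overlap_hours(work_start, work_end,
                         day0 + timedelta(hours=24), day0 + timedelta(hours=28))
    early = overlap_hours(work_start, work_end,
                          day0 + timedelta(hours=28), day0 + timedelta(hours=30))
    core_tax_free = core if evening > 0 else 0.0
    return evening, core_tax_free, early
```

A shift from 22:00 to 06:00 yields 2 evening hours, 4 tax-free core hours, and 2 early-morning hours.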

One remark

  • The documented solution is simplified and needs to be adapted, because part of the night shift can fall on a Sunday or a public holiday; in these cases, higher premiums need to be calculated.

Detection of Late comers

Requirement

  • An employee must be counted as a late comer on every day on which he/she clocks in more than 15 minutes after the beginning of the scheduled working time. For every occurrence of late coming, a warning message must be raised.
  • When there is a gap at the beginning of the scheduled working time but the employee recorded additional working time before the beginning of the scheduled working time, he/she must not count as a late comer.
  • When there is no working time at all recorded the employee must not count as a late comer.
  • The monthly occurrences of late coming must be counted and a warning message is raised when the employee came too late more than three times in a month.

Solution in a nutshell

  • A time valuation finds gaps within the scheduled working time, in which no working time is recorded, and stores them in a Time Type Group. This Time Type Group would also contain a gap e.g. caused by a lunch break.
  • In a time valuation, a possible gap at the beginning of the scheduled working time is filtered out of all gaps of the entire day. If such a gap is found, it is stored in a Time Type Group.
  • A time valuation checks whether there is any recorded working time for the day at all, and routes the gap at the beginning of the scheduled working time to a Time Type Group only if working time is recorded. If no working time at all is recorded for a day, the employee is not a late comer; he/she might have forgotten to record attendances, or might be absent.
  • A time valuation checks if there is recorded working time before the beginning of the scheduled working time and stores it in a Time Type Group.
  • A time valuation routes the gap at the beginning of the scheduled working time to a Time Type Group in case that there is no working time recorded before the beginning of the scheduled working time.
  • A time valuation routes the gap at the beginning of the scheduled working time to a Time Type Group when the gap is longer than the threshold value 15 minutes. In this case, a warning is raised. In this time valuation a “real” late comer is detected according to the described requirements.
  • A time valuation counts the occurrences of late coming in a month in a Time Type Group. This Time Type Group is of Time Category “Counted Events”.
  • A last time valuation raises a warning when an employee clocked in too late more than three times in a month.
Detection of Late comers
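The decision logic of the valuation chain above reduces to a few conditions per day plus a monthly count. A compact sketch, with invented names and clock-in times expressed as minutes after midnight:

```python
def is_late(clock_in_min, sched_start_min, worked_before=False, tolerance=15):
    """True when the first clock-in is more than `tolerance` minutes after
    the scheduled start. No recording at all (forgotten attendance or an
    absence), or working time before the scheduled start, does not count."""
    if clock_in_min is None or worked_before:
        return False
    return clock_in_min - sched_start_min > tolerance

def monthly_late_warning(clock_ins_min, sched_start_min, max_occurrences=3):
    """Count late-coming occurrences in a month and flag a warning when the
    employee came too late more than `max_occurrences` times."""
    count = sum(is_late(c, sched_start_min) for c in clock_ins_min)
    return count, count > max_occurrences
```

With a scheduled start of 08:00 (480 minutes), a clock-in at 08:20 counts as late, while 08:10 or a missing record does not.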

Special Topics

Check Tool

The configurations in a Time Recording Profile can be checked by several checks in the check tool.

That should be done before a Time Recording Profile is used to avoid errors caused by erroneous configurations during runtime.

Check Tool

Time Valuation – Employee Time Valuation Result

Employee time valuation results are shown in the Employee Time Sheet object in Manage Data

  • Condensed per time type group, day and cost center
  • Posting Target defines further processing (Payroll, Time Account, Working Time Account)
  • Contains also allowances for payroll
Employee Time Valuation Result

Time Valuation – Trace

The trace, which can be called from the Time Sheet UI, is a tool used by admins to analyze the results of the time evaluation for one week of an employee.

Permission needs to be granted in Role Based Permissions.

Trace – Permissions
  • Shows the result of a new valuation run (results can differ from the results for Time Type Groups shown on the Time Sheet UI)
  • Two parts:
    • Time Valuation Result
      • Valuated Time Records (not equal to UI Time Records)
      • Time Type Groups with Results (incl. Time Records)
      • Time Valuations with Results (incl. Time Records)
    • Time Valuation Trace
      • Runtime view
      • According to the internal recursive time valuation algorithm
  • No allowances
Trace

Troubleshooting / Hints & Tips / Q&A

  • Q: A Time Type Group is not filled with a value, also Time Type Groups that contribute to this Time Type Group are not filled. For these Time Type Groups “No time records” is shown in Trace.
    • A: Time Type Groups are only calculated when they are configured as pay type, UI component, or Time Collector or they contribute to such a Time Type Group.
  • Q: Threshold with minutes is not considered correctly
    • A: Please check the format in which the threshold is configured. If you want to use half an hour as a threshold, you need to configure 0.5, not 0.3 (decimal hours, not hours:minutes).
  • Q: The weekly result of a Time Type Group for the current week is wrong because future days are considered.
    • A: Please check if in affected time valuations the valuation method “Valuate Up To Today“ needs to be used.
  • Q: How can I use < as comparison operator?
    • A: You can use “Reverse Sign“ for the Input Time Type Group and compare it with a negative threshold.
  • Q: How can I trigger a time valuation only on a certain weekday, on a working (non-working) day, on a day with a certain shift classification, on a public holiday?
    • A: Use a Time Records Filter within a time valuation of type “Filter Input Groups“
  • Q: I only want to aggregate Time Type Groups without Split. How can I configure this?
    • A: Create a time valuation of type “Aggregate Input Groups and Split“ with the Threshold type “Fixed Value“ and the Threshold Value 0.
  • Q: I need to change a time valuation but do not know in which Time Recording Profiles it is used so that I can estimate the impact of my change and I can test it.
    • A: You can export (Import and Export Data) the MDF object “Time Recording Profile-Time Valuation”. In the column “timeValuation.externalCode” you can find the external code of the time valuation and in the column “externalCode“ you can find the external code of the Time Recording Profile in which the time valuation is used.
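The algebra behind the reverse-sign trick from the Q&A above can be sanity-checked in one line: a strict less-than comparison is equivalent to a greater-than comparison after negating both the input and the threshold. Illustrative code only; in the system this is done via the "Reverse Sign" option, not a formula:

```python
def less_than_via_reverse_sign(value, threshold):
    """value < threshold is equivalent to -value > -threshold, which is
    what 'Reverse Sign' on the input group plus a negative threshold
    expresses with the available comparison directions."""
    return -value > -threshold
```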

The post How to setup Time Evaluation in Time Tracking appeared first on ERP Q&A.

Automate Mass Compensation related changes with Integration Center
Tired of imports/changes/exports?

Unable to mass update current & future (specific) pay components with X% and maybe some extra logic in a more automated fashion?

Do you want to mass change pay scale levels within a Compensation Cycle based on performance cycle outcomes?

And/or are there employees that do not have the right manager in the system right now?

Then seek no further, because I’ll take you through the possible solutions shortly with integration center.

After some time and moving to McCoy & Partners, it’s now time to help you find some advanced use cases for it that drastically increase accuracy and capabilities within SuccessFactors. I will be taking you through 3 use cases where you can mass update/create data:

  1. Fixed (CLA) Increase % per DATE and future data records
  2. Create/Process New Pay Scale Levels based on PM cycle of employees with fixed salaries
  3. Automatically assign employees to (newly hired) manager

Use Case 1:

Fixed (CLA – in the Netherlands: CAO) % Increase per DATE and future dated records

– Prerequisite (general): please ensure you don't have existing records with errors and/or impactful onSave business rules turned on during processing that could block this process.
– Prerequisite when min/max of pay ranges needs to be respected: ensure you have the field Pay Range in Comp Info and that the employees have this data.

Most companies (for instance due to CLA changes) wish to apply a flat increase (i.e. 4%) once per year or every so often, and do this via mass exports, changes, and imports. This is error-prone and not required at all!
1000 or fewer records to process? You can fix this with 2 integrations via Integration Center.

More than 1000? You may need 4 integrations (2 for filtering, 2 for processing) plus an SFTP server.

Integration Center selections based on Scenario

In both situations you basically aim to (also specifically in this order):

I. Create a Compensation Record (to ensure it's present) per DATE.
II. Update existing pay components per a specific DATE + future dated records.
(III. Note: Business Rules (BR) can still perform extra actions based on the Integration Center data if needed, for instance to adjust the salary based on min/max or to use a fixed salary for pay scale levels. It may be necessary to turn some onSave rules off for the purpose of this exercise.)

Create a Compensation Record (i.e. per 1 April):

1. After first going to integration center and selecting the right scenario (as shown above), you have to choose: EmpCompensation

2. Dependent on your scenario, you may have to take different steps here:

2a. <=1000 records (you could get this up to 2000, but then you need to sort A-Z and Z-A on user-id) – Press Select to Proceed and give the Integration a name (and description)

– Add 1 as the default value for the sequence (assuming you only want records per 1 April for those that don't have one yet), add i.e. 1 April as the start date (I recommend using a calculated formula, as you can for instance select the first of the month if it is April), directly map the User ID, and apply a default value for the event reason (in this case DATACHNG)

Note! Dependent on your mandatory field settings you may need to fill more fields; you can also leave the sequence blank, as the system will auto-generate it

– Apply the right filtering, i.e. Employee Status (I recommend the external code) = Active (A), no records per 1 April (if you don't want to create extra comp records), but records that are Effective As Of 1 April and lack a record per 1 April (= Time Based Filter)

– Schedule/Save (at least do this)/Run

Example details can be found in the prior mentioned blog (steps 8 and 9)
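Under the hood, an Integration Center definition like this upserts EmpCompensation records via the OData API. A rough sketch of the record body being built, assuming the standard OData v2 date format; field names and mandatory fields can differ per instance, and the actual upsert would be POSTed to the `/odata/v2/upsert` endpoint of your tenant:

```python
from datetime import date, datetime, timezone

def odata_date(d):
    """Format a date the way SuccessFactors OData v2 expects it:
    /Date(<milliseconds since epoch, UTC>)/."""
    ms = int(datetime(d.year, d.month, d.day, tzinfo=timezone.utc).timestamp() * 1000)
    return f"/Date({ms})/"

def comp_record_payload(user_id, start_date, event_reason="DATACHNG", sequence="1"):
    """Build an upsert body for one EmpCompensation record, mirroring the
    field mapping described above (sketch, not a complete integration)."""
    return {
        "__metadata": {"uri": "EmpCompensation"},
        "userId": user_id,
        "startDate": odata_date(start_date),
        "sequenceNumber": sequence,
        "eventReason": event_reason,
    }
```

For a 1 April 2022 record, `comp_record_payload("1001", date(2022, 4, 1))` produces a start date of `/Date(1648771200000)/`.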

2bI. >1000 records (filtering)

– Select the mandatory fields (note this may differ for your instance) + “Select”

– Give it a valid name (and description) + apply the right settings (simply header):

– You could add/transform fields (i.e. default effective date & event reason), but you can commonly skip this (as you can also do it with the import)
* Example Default Date (assuming you do this in April and want 1 April 2022)

* Example Default Event Reason

* Example Default Sequence

– Apply the right filtering, i.e. Employee Status (I recommend the external code) = Active (A), no records per i.e. 1 April (if you don't want to create extra comp records), but records that are Effective As Of i.e. 1 April and lack a record per 1 April (= Time Based Filter)

(optional), you can also sort the data:

– Fill out the destination details (i.e. SuccessFactors SFTP, FEED folder, filename COMP – note this is sample data and may differ for you)

(note! SHOULD have file extension: csv)

– Schedule/Save (at least do this)/Run
Example details can be found in the prior mentioned blog (steps 8 and 9)

2bII. >1000 records (processing)

– Make sure that you've completed all steps of 2bI and that you've run the integration; log in to the SFTP server, go to the FEED folder, and observe & download the file
(hint! If you have exceptions, this is great, because you can still download + adjust + re-upload)

– Go to Integration Center, select the right options (3rd picture in 1st screenshot on the top of this blog) & give it a fitting name (and description)

– Upload the file that you’ve downloaded (& accept the warning if you get it)

– On the left-hand side, swap to the middle icon and drag the fields to be mapped to the according places
(note: if you still need to transform/default fields, please don't map those fields yet;
green = always to be mapped, orange = only map if you transformed it in the prior file)

– On the left hand side, swap to the 2nd icon (detailed view) and make changes if needed

* Example Default Sequence – default value 1

NOTE! You can also leave sequence blank as the system will auto generate it

* Example Default 1st day of the month as effective date

* Example Default Event Reason DATACHNG

– Set the SFTP Settings (to pick-up the file – note this is sample data and may differ for you), i.e.

– Schedule/Save (at least do this)/Run
Example details can be found in the prior mentioned blog (steps 8 and 9)

Update Pay Component(s) – Please only take these steps after completing all Compensation steps:

1. After first going to integration center and selecting the right scenario (as shown above), you have to choose: EmpPayCompRecurring

2. Dependent on your scenario, you may have to take different steps here:

2a. <=1000 records (you could get this up to 2000, but then you need to sort A-Z and Z-A on user-id)
– Press Select to Proceed and give it a name (and description)

– Go to the Configure fields Tab and select the second icon on the right hand side.

– Most fields can be dragged and dropped directly from the left to the right

– For the amount, however, you first need to swap to the middle icon on the left-hand side and then add a calculated field

Note! Dependent on your Mandatory field settings you may need to fill more fields

– Apply the right filtering, i.e. Employee Status (I recommend the external code) = Active (A), only for the SALARY and ALLOWANCE pay components (in the Netherlands common for extra salary, personal allowance, shift allowance etc.), but records that are active as of 1 April or later (Time Based Filter).

(optional), you can also sort the data:

– Schedule/Save (at least do this)/Run

Example details can be found in the prior mentioned blog (steps 8 and 9)

2bI. >1000 records (filtering)

– Select the mandatory fields (note this may differ for your instance) + “Select”

– Give it a valid name (and description) + apply the right settings (simply header):

– Click on the Configure fields tab and add fields / calculations that make sense, i.e.

* Add the percentage increase (i.e. 4%) directly to the pay component amount

(Note! You can also use IF THEN ELSE if you have up to 4 scenarios!)

* add a field that indicates the flat increase (i.e. 4%):

* Assuming different % increase per pay scale type, use a look-up table to get the %

– Apply the right filtering, i.e. Employee Status (I recommend the external code) = Active (A), only for the SALARY and ALLOWANCE pay components (in the Netherlands common for extra salary, personal allowance, shift allowance etc.), but records that are active as of 1 April or later (Time Based Filter).

– Fill out the destination details (i.e. SuccessFactors SFTP, FEED folder, filename PAYCOMP – note this is sample data and may differ for you)

(note! SHOULD have file extension: csv)

– Schedule/Save (at least do this)/Run

Example details can be found in the prior mentioned blog (steps 8 and 9)

2bII. >1000 records (processing)

– Make sure that you've completed all steps of 2bI and that you've run the integration; log in to the SFTP server, go to the FEED folder, and observe & download the file
(hint! If you have exceptions, this is great, because you can still download + adjust + re-upload)

– Go to Integration Center, select the right options (3rd screenshot on the top of this blog) &
give it a fitting name (and description)

– Upload the file that you’ve downloaded (& accept the warning if you get it)

– On the left-hand side, swap to the middle icon and drag the fields to be mapped to the according places
(note: if you still need to transform/default fields, please don't map those fields yet;
green = always to be mapped, orange = only map if you transformed it in the prior file)

– On the left hand side, swap to the 2nd icon (detailed view) and make changes if needed

* Example Calculation Pay component amount if you have a % increase

– Set the SFTP Settings (to pick-up the file – note this is sample data and may differ for you), i.e.

– Schedule/Save (at least do this)/Run

Example details can be found in the prior mentioned blog (steps 8 and 9)

3. When taking the actions above, you could also ensure SuccessFactors performs extra checks
(i.e. based on min/max pay ranges) to correct the new salary. An example below reflects one of the things you could do if you need to consider those (i.e. put SALARY to the minimum if it falls below that). The same could be done for instance based on the maximum (variable visible in the picture below too).

NOTE! As a safety measure, you could specify in the business rule that a specific user must run the integration, to ensure nothing else could potentially trigger or re-trigger the increase.
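The combination of the flat percentage increase with the min/max correction from step 3 boils down to a simple clamp. A sketch of the arithmetic only; in the system this lives in the calculated field and the business rule, and the names here are invented:

```python
def apply_increase(amount, pct, range_min=None, range_max=None):
    """Apply a flat CLA percentage increase and, where pay ranges must be
    respected, clamp the result to the range (as the safety business rule
    in step 3 would)."""
    new_amount = round(amount * (1 + pct / 100.0), 2)
    if range_min is not None:
        new_amount = max(new_amount, range_min)  # lift to the range minimum
    if range_max is not None:
        new_amount = min(new_amount, range_max)  # cap at the range maximum
    return new_amount
```

A 4% increase on 3000 gives 3120; with a pay-range maximum of 3100 the result is capped at 3100.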

Use Case 2:
Create/Process New Pay Scale Levels based on PM cycle of employees with fixed salaries

Prerequisites:
– Need to ensure your Compensation Worksheet calculates the new pay scale level based on for instance the Performance Rating
– Need an MDF object to store the new data based on the compensation cycle
– Need to have the minimum mandatory fields in your compensation worksheet to fill your job info
(i.e. Pay Scale Group, new code of pay scale level, user-id etc.)
– Job Info needs to be error free (especially for customers with many employees this can be challenging)
– Need to turn on/off onSave business rules temporarily for final integration center processing

Some companies wish to go through a compensation cycle with their CLA population (often those with pay scale levels and fixed salaries), but may have a lot of logic to determine the amount of pay scale levels that an employee can get (i.e. bad performance = no new pay scale level, very good performance = increase of 2 steps, manual correction of the pay scale level to use, etc.).

To manage this (mainly via compensation) you could:

1. Ensure you have an MDF object with the right fields. Example:

2. Ensure you have the right fields and calculations setup to get to the next pay scale level within
your Compensation Worksheet.

Example:
– Uses various default fields
– Uses some custom fields (i.e. user id, pay scale group and pay scale level) – Marked: Yellow
– The standard & custom fields are used to apply a guideline & default value to the merit field
(in this case a movement of 0.5 pay scale level up = 1% increase and a full level = 2% increase)
– In the custom (pink area) fields for the steps to increase, the merit increase could be halved to get the number of steps, and these are then added to the custom pay scale level to calculate the new pay scale level.

Note! If your pay scale level is not numeric, you may need to get the according one with a lookup instead.
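The level calculation described in the example (halve the merit percentage to get the step movement, then add it to the current level) can be sketched as follows. Illustrative only, and it assumes numeric pay scale levels; the optional cap is an invented parameter, not part of the worksheet:

```python
def new_pay_scale_level(current_level, merit_pct, max_level=None):
    """Halve the merit percentage to get the step movement (2% = one full
    step, 1% = half a step) and add it to the numeric current level,
    optionally capping at a top level."""
    level = current_level + merit_pct / 2.0
    if max_level is not None:
        level = min(level, max_level)
    return level
```

So a 2% merit moves an employee from level 4 to level 5, while 1% moves them to level 4.5.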

3. Ensure you configure the publishing to the MDF object (+ complete the comp cycle & publish)

4. Ensure you have an integration center object to export the MDF data to the SFTP

– Go to Integration Center and choose the right options & the custom object:

– Select the correct fields (this may differ from those that you have) + press Select

– Give the file a name and apply the correct settings

– Go to Configure Fields & concatenate the Pay Scale Group + the new Payscale Level

– Filters may not be required, but it could make sense to filter for active employees (t) per a
specific effective date (i.e. 1 April) in the Filter and Sort Tab:

– Fill out the destination details (i.e. SuccessFactors SFTP, FEED folder, filename JOB – note this is sample data and may differ for you)

(note! SHOULD have file extension: csv)

– Schedule/Save (at least do this)/Run

Example details can be found in the prior mentioned blog (steps 8 and 9)

5. Ensure you have an integration center object to import the MDF data from the SFTP to job info

– Make sure that you’ve completed all steps of 1-4 and that you’ve ran the integration &
login to the SFTP server + go to the FEED folder and observe & download the file
(hint! If you have exceptions, this is great because you can still download + adjust + re-upload)

– Go to Integration Center, select the right entity (empJob), select the right options &
give it a fitting name (and description)

– Upload the file that you’ve downloaded (& accept the warning if you get it)

– On the left hand side, swap to the middle icon

& drag the fields to be mapped to the according places
(note, if you still need to transform/default fields, please don’t map those fields yet
green = always to be mapped)

(You may notice I haven’t mapped sequence, I did that deliberately as SF generates it for me now).

– Set the SFTP Settings (to pick-up the file – note this is sample data and may differ for you), i.e.

– Schedule/Save (at least do this)/Run
Example details can be found in the prior mentioned blog (steps 8 and 9)

Use Case 3:
Automatically assign employees to (newly hired) manager

Prerequisites:
– Need to have position management in place
– Ideally you don’t have multiple managers on one position
– the manager on the parent position is always the manager of the employee on the position
– Job Info needs to be error free (especially for customers with many employees this can be
challenging)
– Need to turn on/off onSave business rules temporarily for final integration center processing
– Make sure you have the standard canvas report: “Disparities between Reporting Line and Position” report in your instance (or an equivalent) & that you’ve ran it to identify the mismatches

This is not directly compensation-related, but as the compensation module relies heavily on correct manager-employee relationships, this can be crucial. It often happens that a manager leaves the company and the employees are temporarily assigned to another manager. When the new manager is in place, there often isn't an easy way to mass-assign the employees back; Integration Center, used correctly, changes that.

– Identify the users where the manager deviates from the one of the parent position.

– Go to Integration Center, select the right entity (empJob), select the right options &
give it a fitting name (and description)

– Press Select to Proceed and give it a name (and description)

– Leave the sequence blank, add 1 April as the start date (I recommend using a calculated formula, as you can for instance select the first of the month if it is April), directly map the User ID, apply a default value for the event reason (in this case DATACHNG), and add the new Supervisor ID(s)

* Example Default 1st day of the month as effective date

* Example Manager of Parent Position

(In this case I’ve added just 1 manager user ID, but you can also use a lookup table based on the user ID field to update the manager)

Note! Dependent on your mandatory field settings you may need to fill more fields.
You may notice I haven't mapped the sequence; I did that deliberately, as SF generates it for me now.

– Unfortunately, Incumbent is empty for parent positions, hence you have to add the codes of the respective parent positions as a filter (based on the report)

– Schedule/Save (at least do this)/Run
Example details can be found in the prior mentioned blog (steps 8 and 9)
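The disparity that the canvas report surfaces (and that this integration then corrects) can be expressed as a small lookup: for each employee, compare the current supervisor with the incumbent on the parent position. The dict-based data model below is illustrative, not the real EC entities:

```python
def find_manager_mismatches(employees, positions):
    """Return (userId, expected_manager) pairs for employees whose
    supervisor differs from the incumbent on the parent position."""
    incumbent = {p["code"]: p.get("incumbent") for p in positions}
    parent = {p["code"]: p.get("parent") for p in positions}
    mismatches = []
    for e in employees:
        expected = incumbent.get(parent.get(e["position"]))
        # Skip when the parent position has no incumbent (top of hierarchy).
        if expected and expected != e["supervisor"]:
            mismatches.append((e["userId"], expected))
    return mismatches
```

An employee on position P2 who still reports to a temporary manager, while the parent position P1 now holds the new manager, would be flagged with that new manager as the expected supervisor.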


The post Automate Mass Compensation related changes with Integration Center appeared first on ERP Q&A.

Child Care Leave China Workaround in SuccessFactors

Overview of requirement

Childcare leave refers to a period of paid or unpaid leave granted to employees for the care and/or support of their children under a certain age. In China, we have a common requirement that each couple can enjoy five days’ childcare leave each year until their child reaches three years. In this blog, I am going to demonstrate a workaround solution to achieve this in EC Time Off.

Disclaimer: Please consult your partner before implementing this solution. Perform thorough testing, including the performance of the jobs listed here. To reiterate: this is a workaround solution, not a standard approach. Please do your due diligence before adopting it.

For simplicity, we will consider only three children for childcare leave. Any additional children need to be handled manually by adding the balance of 5 days to one of the existing accounts.

What are the key challenges here?

  • Keeping track of every newly born child along with the youngest dependents
  • Avoiding any manual intervention whenever a new child is born. For example, the system should be able to detect a child-born event and generate the time account with balance automatically
  • Time accounts are not fixed. They do not start on a fixed date like the hire date or run from January to December. Every account should start from the child's birth date, and this process needs to be repeated for the next three years, i.e. until the child turns 3 years old

Design

Keeping in mind above challenges the below would be the high level design of my solution

  • A custom MDF for every user with such a time profile. An Integration Center job runs daily to ensure this custom MDF is created/updated whenever a new child is born
  • The custom MDF is updated with childbirth details each time a child is born, using relevant onSave rules. With every new child we get a new youngest dependent, and the youngest dependent keeps changing as the years progress and new children are born.
  • Ad hoc accounts: since every account should start from the child's date of birth, an ad hoc account is best suited for this requirement.
  • Since we need 3 accounts for 3 years for 3 children, we would need 9 similar ICs to create these individual accounts. All ICs are basically copies of the first IC, with minor changes. This sounds alarming at first sight, but the good part is that each IC has its own filter, such that it picks only relevant records where the field for the time account date is not null. Also, they only pick records for those users whose MDF was recently changed due to a new child, based on a last-modified-date filter. This should take care of performance too.

Since the design is clear, let’s look at configuration now

Configuration

First, we would build a custom MDF with the following fields and structure

In total we would need 12 fields

External code: This is the user field. For each user we would need an instance of Custom MDF

Child Date of Birth: For each child we have three DOB fields

ChildTA1: We have to store the time account dates for each of the children. For example, say the 1st child is born on 1 June 2022. Then we would need 1 June 2022, 1 June 2023, and 1 June 2024 to be stored as time account dates for Child 1. Since we support three children, this requires 9 dates for 3 years.
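Generating those per-child dates is a simple anniversary calculation. A sketch of what the rules and ICs effectively compute (the 29 February fallback is my own assumption about how a leap-day birthday would be handled, not something the blog specifies):

```python
from datetime import date

def child_account_start_dates(dob, years=3):
    """The ad-hoc time account start dates for one child: the birth date
    plus each anniversary until the child turns three."""
    dates = []
    for i in range(years):
        try:
            dates.append(dob.replace(year=dob.year + i))
        except ValueError:
            # 29 February birthday in a non-leap year: fall back to 28 Feb.
            dates.append(date(dob.year + i, 2, 28))
    return dates
```

For a child born on 1 June 2022, this yields 1 June 2022, 1 June 2023, and 1 June 2024, which is exactly the set of dates stored in the ChildTA fields.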

Now let’s get into the on Save Rules

fillChild1DOB:

The rule checks whether this is the first child (using the count of dependents) and gets the youngest dependent's date of birth, which in this case is the first child itself. We also check for the null case ("child1DOB is null"): only if it is null are the Child 1 DOB and Time Account 1 Date updated. This is because there can be cases like hire or migration where the data needs to be fed manually into this custom MDF (if the employee has existing children). In such manual cases, these onSave rules should not override the existing data.

Similarly create the other two on save rules for child 2 and child 3

fillAccountDatesCHN:

This is the final onSave rule, which fills the child time account dates for years 2 and 3 as and when that birth anniversary is reached.

Time Account Type and Time Type Configurations

Since we have three children, we would need three time account types and three time types added to the time profile. For demonstration purposes, I will use only two time account types for two children.

The following is the time account type structure for adhoc accounts

Similarly, create two time types for the 2 children and link them to the corresponding time account types. This is straightforward, so I am skipping this part.

Integration Center 1

The first Integration Center report creates/updates this custom MDF on a daily basis. The starting entity is User, and the target is our custom MDF “cust_ChildDOBDetailsCHN”

Once the custom MDF is created or updated, details like the child’s DOB and the relevant time account dates are filled in

Please note: The custom MDF is saved only if there are new changes. If there are no changes since the last save from the job, the custom MDF is not saved again. This is necessary for the other ICs (which I will explain shortly), which filter only recently modified records.

Only the User ID field from the User entity is mapped, to externalcode (the user field) of the custom MDF

Integration Center 2

As explained in the sections above, we would need 9 ICs to create three accounts for three children for three consecutive years. The ICs are also built in such a way that only relevant records are read: only those which have the child’s time account date fields filled in the custom MDF.

For example, if no child is born, none of the ICs picks that user. If the user has only 1 child, the IC meant for the 2nd child will not even pick that user. Lastly, if the user’s MDF has not gone through any changes since the last IC job run, the entire MDF record for that user is filtered out. Overall, this should take care of the efficiency of the IC runs.

I will explain the concept of the 1st IC. The remaining 8 ICs are basically clones of the 1st one with some attribute changes. Once you look at the 1st IC design, you can build the other ICs using the Save As feature of Integration Center in no time.

IC to create 1st time account of 1st Child:

You would need to create an SF-to-SF OData integration with the starting entity “Custom Child MDF” and the target “Time Account”

1) External Code of the time account is a calculated field which concatenates the user ID with the Child Date of Birth and Child No from the custom MDF

2) Time Account Type can be hard-coded with the TAT you use. For example, if it’s child 1, use the time account code for child 1; if it’s child 2, use the time account code for child 2

3) Booking possible from, account valid from and account valid to are all mapped to the corresponding time account date field of each child. If we are dealing with child 1 and time account 1, use child1TA1 from the custom MDF; if it’s child 1 and time account 2, use child1TA2, and so on.

4) Booking Possible Until is mapped to child Time account date (refer to point 3) + 12 months – 1 day

5) User ID is mapped to external code of custom MDF

6) Time account Details (Child Object of Time account)

  • Time_Account_External_Code – This is the external code of the time account parent object. Use the same calculation and the same code as described in point 1)
  • External code – This is the external code of the child time account detail. Use the same code and calculation as described in point 1)
  • Amount posted – 5 days of child care per child per year. You can change it according to your requirement
  • Posting Date – Drag and drop the effective start date from the child MDF
  • Posting Type – Defaulted to “MANUAL_ADJUSTMENT”
  • Posting Unit – Defaulted to “DAYS”
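
The two calculated fields above (points 1 and 4) reduce to simple string and date arithmetic. A hedged Python sketch, mirroring the logic rather than the actual Integration Center calculated-field syntax:

```python
from datetime import date, timedelta

def ta_external_code(user_id: str, dob: date, child_no: int) -> str:
    """Point 1: user ID + child DOB + child number, e.g. '1032582021-04-01C1'."""
    return f"{user_id}{dob.isoformat()}C{child_no}"

def booking_possible_until(ta_date: date) -> date:
    """Point 4: time account date + 12 months - 1 day."""
    try:
        plus_12m = ta_date.replace(year=ta_date.year + 1)
    except ValueError:  # 29 Feb start date in a non-leap target year (assumed convention)
        plus_12m = ta_date.replace(year=ta_date.year + 1, day=28)
    return plus_12m - timedelta(days=1)

print(ta_external_code("103258", date(2021, 4, 1), 1))  # 1032582021-04-01C1
print(booking_possible_until(date(2021, 4, 1)))         # 2022-03-31
```

Note that the computed values match the externalCode and bookingEndDate in the sample payload.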

The payload looks like this

{
  "entityName": "TimeAccount",
  "payload": [
    {
      "externalCode": "1032582021-04-01C1",
      "accountType": "Child Care CHN (Child1)",
      "bookingEndDate": "/Date(1648684800000)/",
      "bookingStartDate": "/Date(1617235200000)/",
      "endDate": "/Date(1617235200000)/",
      "startDate": "/Date(1617235200000)/",
      "userId": "103258",
      "timeAccountDetails": [
        {
          "TimeAccount_externalCode": "1032582021-04-01C1",
          "externalCode": "1032582021-04-01C1",
          "bookingAmount": 5,
          "bookingDate": "/Date(1617235200000)/",
          "bookingType": "MANUAL_ADJUSTMENT",
          "bookingUnit": "DAYS",
          "__metadata": {
            "type": "SFOData.TimeAccountDetail",
            "uri": "TimeAccountDetail(TimeAccount_externalCode='1032582021-04-01C1',externalCode='1032582021-04-01C1')"
          }
        }
      ]
    }
  ]
}
The last step is to add a filter. You can go with a combination of a last-modified-date filter and a calculated filter

The last-modified filter picks only recently modified records of the custom MDF. This ensures that user records are not picked up unless a new child is born for them
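Conceptually, the combined filter looks like the OData expression below. The field names (`lastModifiedDateTime`, `cust_child1TA1`) and the null-check syntax are assumptions for illustration; verify them against your instance’s OData metadata:

```python
from datetime import datetime, timezone

def mdf_filter(last_run: datetime) -> str:
    """Build an OData $filter keeping only custom-MDF records modified since
    the last job run whose Child 1 time-account date field is filled.
    Field names and literal syntax are hypothetical."""
    stamp = last_run.strftime("%Y-%m-%dT%H:%M:%SZ")
    return (f"lastModifiedDateTime ge datetime'{stamp}' "
            f"and cust_child1TA1 ne null")

print(mdf_filter(datetime(2024, 5, 1, tzinfo=timezone.utc)))
```

Each of the nine ICs would swap in its own child/year date field, so users without that field filled are skipped entirely.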

Testing

The 1st IC (user to custom mdf) is run first in sequence and this should create the custom MDF for the user first.

Let’s assume the employee’s first child is born on April 01, 2021

The on-save rules on the custom MDF check the youngest dependents and fill in the child details as and when new children are born.

Now I run the IC for time account 1, child 1:

Next, I run the IC for time account 2, child 1:

Next, let’s say a new child is born today and the dependent data is filled in (child 2 should be filled by the main IC1 when it is scheduled and run)

The main IC1 would again fill the existing MDF with child 2 details for this year. See below: the TA1 date for child 2 is filled

Then the IC for time account 1, child 2 runs and creates the relevant ad hoc account too, as you can see below

To summarize, IC1 for creating the custom MDF should be run first so that it periodically keeps track of newly born children, and then the remaining time-account-creation ICs should run. These can be scheduled daily or weekly during non-business hours, with a gap of, say, 2 hours between each of them

Other use cases and recommendations

  • Migration/ New hire :

For new hires, there can be two scenarios:

  1. New hires without any children – This works without any manual intervention; any new children born are tracked via the IC and the MDF.
  2. New hires with existing children below three years – This info has to be fed manually into the custom MDF, just like in any migration scenario. (HR can maintain this info just as they maintain job info, compensation info, dependents info, etc. for any new hire.) Once the information is in the custom MDF, the time accounts get created automatically via the ICs. It is reasonable for HR to do this since they feed in the other personal and employment details manually during hiring anyway.
  • Twins – For twins, only the 2nd child’s info is filled automatically. You would need to fill in the 1st child’s info manually.
  • This approach is recommended for customers with a relatively small employee count. If you have a larger population, you can split the main IC (user to custom MDF) into multiple ICs using filters like legal entity, department, employee class, type, etc. For example, if 5,000 employees need this solution, split the main IC into 5 equivalent ICs and schedule them with a gap of at least 2–3 hours. I have tested this for a smaller sample size; for a larger one, I would recommend you try out the above options, test completely in a sandbox or QA, and then deploy.
  • Also, in the IC we have the operations “Upsert Multiple” and “Upsert Single”. If there are data issues, troubleshooting becomes difficult in Upsert Multiple mode, as the entire batch of records fails. For the initial load and testing, use Upsert Single to see if it works for the majority of the population. Once you have tested this and run it multiple times, you can switch to Upsert Multiple.

The post Child Care Leave China Workaround in SuccessFactors appeared first on ERP Q&A.
