THR96.2211.EN-US
PARTICIPANT HANDBOOK
INSTRUCTOR-LED TRAINING
Course Version: 2205
Course Duration: 2 Day(s)
Material Number: 50159726
SAP Copyrights, Trademarks and
Disclaimers
No part of this publication may be reproduced or transmitted in any form or for any purpose without the
express permission of SAP SE or an SAP affiliate company.
SAP and other SAP products and services mentioned herein as well as their respective logos are
trademarks or registered trademarks of SAP SE (or an SAP affiliate company) in Germany and other
countries. Please see https://www.sap.com/corporate/en/legal/copyright.html for additional
trademark information and notices.
Some software products marketed by SAP SE and its distributors contain proprietary software
components of other software vendors.
National product specifications may vary.
These materials may have been machine translated and may contain grammatical errors or
inaccuracies.
These materials are provided by SAP SE or an SAP affiliate company for informational purposes only,
without representation or warranty of any kind, and SAP SE or its affiliated companies shall not be liable
for errors or omissions with respect to the materials. The only warranties for SAP SE or SAP affiliate
company products and services are those that are set forth in the express warranty statements
accompanying such products and services, if any. Nothing herein should be construed as constituting an
additional warranty.
In particular, SAP SE or its affiliated companies have no obligation to pursue any course of business
outlined in this document or any related presentation, or to develop or release any functionality
mentioned therein. This document, or any related presentation, and SAP SE’s or its affiliated companies’
strategy and possible future developments, products, and/or platform directions and functionality are
all subject to change and may be changed by SAP SE or its affiliated companies at any time for any
reason without notice. The information in this document is not a commitment, promise, or legal
obligation to deliver any material, code, or functionality. All forward-looking statements are subject to
various risks and uncertainties that could cause actual results to differ materially from expectations.
Readers are cautioned not to place undue reliance on these forward-looking statements, which speak
only as of their dates, and they should not be relied upon in making purchasing decisions.
Demonstration
Procedure
Warning or Caution
Hint
Facilitated Discussion
TARGET AUDIENCE
This course is intended for the following audiences:
● Technology Consultant
Lesson 1
SAP SuccessFactors People Analytics Overview for Administrators
Lesson 2
Introduction to SAP SuccessFactors Workforce Analytics and Planning for Consultants
Exercise 1: Interpret the Metrics Pack Documentation
UNIT OBJECTIVES
LESSON OBJECTIVES
After completing this lesson, you will be able to:
● Describe People Analytics
SAP SuccessFactors People Analytics combines all facets from Operational Reporting to
Strategic Analytics and Planning in a holistic solution. This is critical to drive HR operational
and business success whilst also helping to improve the manager and employee experience.
Note:
Data Protection and Privacy Features
With the Q1 2018 release, several new data protection and privacy features have
been made available to our customers and some existing features have been
enhanced.
The data protection and privacy features include, for instance, the ability to report
on personal data changes and the capacity to report on all the data subject’s
personal data available in the application. Customers will also have options to
configure data retention rules at country level for active and inactive employees
that will permanently purge personal data from SAP SuccessFactors applications.
It is the customer’s responsibility to adopt the features that they deem
appropriate. More information can be found on the SAP Help Portal: https://
help.sap.com/docs/SAP_SUCCESSFACTORS_HXM_SUITE
At the very center of People Analytics is the employee experience: on the one hand, HR operational success; on the other, business success. People Reporting with Stories is the first part of the consistent user experience. By embedding SAP Analytics Cloud technology directly in the SAP SuccessFactors platform for People Analytics, SuccessFactors provides this unified reporting format, called a 'Story', for a smooth user experience.
For example, you may want to find out how many people have registered for a course, or how many people have registered for a course that they have not yet taken this month. This reporting runs off your live SAP SuccessFactors data.
The next step is to transform the SuccessFactors data and combine it with any other HR-related data. With Workforce Analytics, this data is transformed so that more complex analytics can be performed.
For example, is the trend on this particular course going up or down? What is the ratio of the money spent on learning to the number of courses that people take?
You can also analyze all data from SAP and non-SAP software as it relates to people.
SuccessFactors provides a connector between Workforce Analytics and SAP Analytics Cloud Enterprise edition, which enables SAC to consume more complex result sets and offers better hierarchy handling.
Within advanced analytics, there comes a point where you need to start combining people data with business data from outside HR. We offer customer-managed analytics, where customers can perform advanced analytics and planning across SuccessFactors as well as other business sources, including other SAP solutions and non-SAP third-party data sources beyond people data.
For example, to prove the business impact of a training intervention, you need to see that salespeople who have taken the sales training actually generated more sales revenue.
For workforce planning purposes, you fundamentally need all of the business data to generate
the people supply.
Available with the SuccessFactors Suite, People Analytics reporting with Stories provides the following to help you make better decisions:
● Standardized HR metrics
● Data trending
● Instant data-driven insights
Additional input:
● Everything organized and accessible from one place to create and deploy all reporting
needs
● Intuitive, guided experience to create and share content; role-based sharing and simple
scheduling
● Single-click, in-line access to analytics and related links from any page in the suite
● Contextually aware, immediately relevant
● Based on live transactional data for all customers
● Drill to detail and take action
● Flexible ad hoc reporting
● Standard reporting accelerated with report templates
● Filter and aggregate live data, drill down to parts of the organizational structure
● Easy formatting of data and distribution
● Create configurable dashboards with drag and drop
Note:
Stories are out-of-scope for this training course.
To get deeper analysis of the data SuccessFactors has Workforce Analytics (WFA).
Note:
A separate license is required for Workforce Analytics
WFA provides:
● Standardized HR metrics - 2,000 pre-delivered HR and talent metrics built as a powerful discovery tool for hypothesis testing – Truth or Myth?
● Data trending - Track trends through time and across different periods, such as annual,
quarterly, monthly, and seasonal time models.
● Actionable analytics - Answer key questions about your workforce and spot risks and
opportunities rapidly with visual, interactive HR analytics.
● Integrated data foundation - Builds on operational reporting from Employee Central with
metrics & predictive analytics and also integrates data from multiple systems to create a
solid data foundation and rely on SAP software to help manage data quality.
SuccessFactors also provides a managed service to integrate, govern, refresh, and publish
your data for analytics. We provide the staff resources and technologies to manage this for
you.
Enterprise Analytics
Enterprise Analytics enables you to explore data across the organization and deliver insights at the point of decision. While no pre-delivered content is included, with this solution you can:
● Gain insights and take action in context of business processes
● Analyze, predict, plan, and execute within your business process for instant insight to
action
● Increase engagement and accountability with built-in collaboration capabilities
● Design multi-page reports with sections, dynamic fields, tables, and charts; schedule publications to SAC and non-SAC recipients
Enterprise Planning
Enterprise Planning enables you to execute your business strategy with the right people, with the right skills, at the right time and cost. While no pre-delivered content is included, with this solution you can:
● Act in the moment by planning, analyzing, and simulating your data directly within your
enterprise solution
● Continuously collaborate with your team on plans within context for increased
accountability
● Discover predictive planning by uncovering top performance influencers with predictive
forecasting and machine learning tools
SAP SuccessFactors Workforce Analytics (WFA) helps customers to gain insights into how
investments in talent are impacting business outcomes, increase visibility into workforce
trends, risks, and opportunities, and improve the distribution of talent management metrics
to front-line managers. WFA offers insight into workforce dynamics and composition, brings
focus on areas that matter, and helps a customer find answers to key questions about current
workforce challenges and how to solve them. WFA can source its data from SAP core HR,
SuccessFactors Employee Central (EC) and Talent Solutions, or any 3rd party HR and talent
management system.
trending through time, report filtering and role-based security. Publish and distribute them
in PDF, Word, or Excel format.
● Report Distributor (Scheduling and Delivery) - Automate distribution of reports to
designated users and schedule reports to run at specific times.
Metrics Packs
SAP SuccessFactors employs a modular design that enables customers to focus on metrics
most relevant to their organization. These ‘modules’ are referred to as Metrics Packs: Each
metrics pack contains a related set of measures and reporting dimensions that:
● Define a standard set of metrics specific to a range of topics (Performance, Recruiting,
etc.)
● Define a standard set of dimensions/hierarchies (e.g. high/low performer) to support
further analysis of the metrics
● Map metrics to the data needed to populate them
● Includes common business logic which can be customized by clients
The modular approach significantly reduces the time to implement multiple data sets (it avoids the need to “start from scratch”) and publishes metrics according to pre-built standards and logic. SuccessFactors' extensive knowledge of workforce measurement means that each metric pack is specifically targeted to provide insights into the chosen topic.
By working directly with organizations' core data systems, SAP SuccessFactors differentiates itself from other benchmark providers by offering rich and standardized formulas and definitions.
The SAP SuccessFactors benchmarking application is the most comprehensive online resource for workforce metrics on the market today, as the SAP SuccessFactors approach to benchmarking delivers a true “apples-to-apples” experience for the user and provides access to granular data cuts, with minimal time investment for organizations. The benchmark values are embedded in organizations' dashboards and are superimposed onto actual company results for quick visual comparisons.
provision of this transaction level data to SAP SuccessFactors standards is all that is
required to participate in the benchmarking program.
● Provision of Results in a Benchmark Spectrum – So that results are not ‘lost in the
average,’ the SAP SuccessFactors program provides benchmark results in a spectrum
(25th, 75th, and 90th percentiles) so that organizations can have access to the true range
of benchmark results.
● Analysis and Organization-Type Cuts – The nature of the data collection (at the employee-
ID level) allows SAP SuccessFactors to slice data by organization, employee, and job
function characteristics.
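As a quick illustration of the benchmark-spectrum idea, the sketch below computes the 25th, 75th, and 90th percentiles over a set of metric values using only the Python standard library. The data and function names are invented for illustration; this is not SAP code or real benchmark data.

```python
# Hypothetical sketch of a benchmark spectrum - not SAP code.
import statistics

def benchmark_spectrum(values, percentiles=(25, 75, 90)):
    """Return the requested percentiles of a list of metric values."""
    # statistics.quantiles with n=100 yields the 1st..99th percentile cut points.
    cuts = statistics.quantiles(values, n=100)
    return {p: cuts[p - 1] for p in percentiles}

# Illustrative only: voluntary termination rates (%) from 10 organizations.
rates = [4.2, 5.1, 6.0, 6.8, 7.3, 8.0, 9.1, 10.5, 12.0, 15.4]
spectrum = benchmark_spectrum(rates)
# spectrum is roughly {25: 5.775, 75: 10.875, 90: 15.06}
```

Reporting the three cut points rather than a single mean is what keeps an organization's result from being "lost in the average".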
Note:
Benchmark data is available only in certain global regions.
LESSON SUMMARY
You should now be able to:
● Describe People Analytics
LESSON OBJECTIVES
After completing this lesson, you will be able to:
● Describe SAP SuccessFactors Workforce Analytics
● Describe Introductory WFAP Concepts Specific to Implementation Consultants
● Describe the Consultant Roles and Relevant Training
Workforce Planning
Workforce Planning: In Maintenance Mode
Caution:
SAP SuccessFactors is no longer selling licenses for SuccessFactors Workforce Planning (WFP), as it has entered Maintenance Mode. Information is included in this course because a customer that obtained a license before maintenance mode could still implement and use WFP.
Over the past several years, SAP has been investing in a new set of cloud-native analytics
solutions: SAP Analytics Cloud. Included in these solutions is a set of cloud-based business
planning capabilities, SAP Analytics Cloud for Planning, which consolidates both workforce
and business data, enabling customers to effectively perform comprehensive and advanced
enterprise planning scenarios, including workforce planning.
As a part of this evolution, SuccessFactors made the decision to deliver end-to-end People
Analytics and planning, through the combined use of SAP SuccessFactors People Analytics
(specifically Workforce Analytics) and SAP Analytics Cloud for Planning. Therefore,
SuccessFactors announced intent to deprecate SAP SuccessFactors Workforce Planning
(WFP).
The result of the announcement (in January 2021):
Maintenance Mode: SAP SuccessFactors Workforce Planning has entered ‘Maintenance
Mode’. Generally, this solution lifecycle phase means that while critical issues will be fixed, no
additional product enhancements will be made.
Continued Use: If you have already purchased SAP SuccessFactors Workforce Planning –
regardless of adoption status – you can continue to use it within ‘Maintenance Mode’ until
your next renewal.
Workforce planning is a systematic process that aligns business and HR needs to ensure
organizations have the right people, with the right skills, at the right time and cost to execute
efficiently and successfully. With proper planning, you minimize risks associated with
executing business strategy. There are two main components of workforce planning: strategic
and operational. Strategic workforce planning takes the long view and forecasts critical roles
the organization will need in the future. Operational workforce planning looks at the short
term demands of the business, compares them to current talent supply, and determines
whether additional resources are needed.
● Forecasting - Forecast demand, create models, and perform gap analysis for skills across
critical job roles. Project the size and shape of the future workforce required to execute on
business strategy.
● Risk Analysis - Identify critical gaps that pose a risk to business strategy execution.
● Strategy, Impact, and Cost Modelling - Determine interventions that will most effectively
mitigate identified risk. Model the impact and costs of these interventions to best
determine financial implications of all possible futures.
● Action and Accountability - Determine how to integrate workforce strategies into broader
corporate strategies and measure their success.
● Best Practice Strategy Bank - Create a library of workforce planning designs and strategic
interventions or select designs and interventions from the Best Practice Strategy Bank.
The Best Practice Strategy Bank includes over 30 years of field experience and research
and lists almost 100 different tactics and strategies that organizations can adopt to
strengthen attraction and retention efforts.
Align long-term workforce plans with business strategy. With SAP SuccessFactors strategic
planning, you can view, assess, and design your workforce to support your organizational
strategy and goals through five simple steps.
SAP Analytics:
● Flexible enterprise toolset based on SAP BW, SAP HANA, SAP BI, and SAP Analytics Cloud
● Selected pre-defined content for dashboards, cubes, and queries
● Standard SAP systems integrations
● Deployed on premise and in the cloud
*Analytics Value Chain From Case Studies by the Google People Analytics Team as presented
by Brian Ong during 2012 HCI Events: http://www.hci.org/lib/case-studies-google-people-
analytics-team
For more information, you can review the following blog. Please note that at the time of its
writing SAP Analytics Cloud was SAP Business Intelligence: https://blogs.sap.com/
2013/05/10/successfactors-workforce-analytics-or-sap-business-intelligence-for-human-
resources/
Figure 16: SAP Analytics Cloud/ Digital Boardroom and SAP SuccessFactors
SAP SuccessFactors is one of many data sources for SAP Analytics Cloud.
SAP Analytics Cloud (SAC) includes a connector supporting Workforce Analytics. This
connector allows users to import data from WFA for analysis and inclusion within SAC models
and stories. Stories form the basis of Digital Boardroom presentations, allowing WFA data to
appear in these presentations.
How are SAP SuccessFactors Analytics and SAP Analytics Cloud Different?
The implementation project for WFAP has two consultant roles: functional and technical. The implementation process is designed for each role to be filled by separate individuals or groups, as the primary skills desirable in the roles differ. Therefore, separate courses exist for each role, with some shared content between the two courses.
In either role, the consultant must know:
● The high-level overview of the product and where it fits in the SAP SuccessFactors
analytics solution
● How to utilize the application
● The Implementation Methodology and the actions the functional / technical consultant
must perform
For the functional consultant, additional content is included that focuses on administering
WFA, including:
● Managing users, roles, and permissions
● Configuring Insights
● Managing customer-facing administrative tools for WFA on SAP HANA
● Managing administrative features of Strategic Forecasts
For the technical consultant, additional prerequisite knowledge and courses are recommended, related to:
● Understanding Employee Central data and data structures
● Solid knowledge of SQL for configuring SAP HANA databases
● General knowledge of Business Intelligence (BI)
WFA can be implemented as traditional WFA (on SQL Server) or WFA on SAP HANA. While
the implementation process is similar in either case, there are some differences. The primary
difference is in traditional WFA, the technical consultant role is performed by a dedicated SAP
team of consultants, while in the new WFA on SAP HANA, the technical role can be supplied
by the SAP partner implementation consultants.
Note:
For more information on the differences in traditional WFA and WFA on HANA,
please review the Workforce Analytics on SAP HANA FAQ document in the SAP
Help portal or the Implementation Methodology section of this course.
Please read the content of The Core Workforce and Mobility Metric Pack Documentation and
complete the exercise that reviews the content. You can access the exercise in the learning
content.
Abstract
This exercise will guide you in interpreting the metrics pack documentation.
Objectives
Once you have completed this exercise, you will be able to:
● Describe the metrics pack documentation
● Determine standards of an SAP SuccessFactors metrics pack configuration
Customer Requirements
You will be supporting a new implementation of WFA for the company ACME Boots. The first project will implement the Core Workforce and Mobility metrics pack, as their HR information system is SAP on premise. You need to familiarize yourself with the metrics pack to anticipate questions and explain the documentation to ACME's project team. To facilitate this study, the exercise asks both questions that emphasize points of understanding and questions that the customer may also ask.
Reference Documentation
You should complete the exercise by using the Core Workforce and Mobility Metrics Pack.pdf
located in the learning content.
Note that these documents are also available in the Product Documentation ZIP file that you
should have downloaded as part of Module 1. After the course, you can download the latest
version from the SAP Help Portal.
Documentation Interpretation
Using the Core Workforce and Mobility Metric pack documentation, please answer the
following questions.
You should be able to explain these concepts to customers at the end of the training.
1. Where can you locate the common key business questions that the Core Workforce and Mobility (CWM) metric pack addresses? Would you be able to locate the key business questions for the Absence Management Metric Pack?
4. From the documentation, can you explain the difference in the three measures (base
input, derived input, and result)?
5. What do the following acronyms stand for: EOP, FTE, and SOP?
6. What is the description of the Base input Measure: Movements – Out? Does it support
drill-to-detail?
8. In WFA, time dimensions provide for the breakdown of measure values across what
periods of time?
9. What are the standard 3 breakdown levels of the dimension Age? What are the standard
labels for Recruitment sources?
10. What is the dimension filter applied to the Derived Measure: Terminations – voluntary?
11. What is the calculation for the result measure: Termination Rate – Voluntary? What
Category and Sub-category is it located in?
Result
Well done! You are now all set to work. You now know how to:
● Describe the metric pack documentation
● Determine standards of an SAP SuccessFactors metric pack configuration
Abstract
This exercise will guide you in interpreting the metrics pack documentation.
Objectives
Once you have completed this exercise, you will be able to:
● Describe the metrics pack documentation
● Determine standards of an SAP SuccessFactors metrics pack configuration
Customer Requirements
You will be supporting a new implementation of WFA for the company ACME Boots. The first project will implement the Core Workforce and Mobility metrics pack, as their HR information system is SAP on premise. You need to familiarize yourself with the metrics pack to anticipate questions and explain the documentation to ACME's project team. To facilitate this study, the exercise asks both questions that emphasize points of understanding and questions that the customer may also ask.
Reference Documentation
You should complete the exercise by using the Core Workforce and Mobility Metrics Pack.pdf
located in the learning content.
Note that these documents are also available in the Product Documentation ZIP file that you
should have downloaded as part of Module 1. After the course, you can download the latest
version from the SAP Help Portal.
Documentation Interpretation
Using the Core Workforce and Mobility Metric pack documentation, please answer the
following questions.
You should be able to explain these concepts to customers at the end of the training.
1. Where can you locate the common key business questions that the Core Workforce and Mobility (CWM) metric pack addresses? Would you be able to locate the key business questions for the Absence Management Metric Pack?
Key business question examples are located in the Overview section of the documentation
for each metrics pack. Therefore, you would find the answer for Absence Management in
the Absence Management Metrics Pack document.
The overview section provides a list of other metric packs. Additionally, there is a document called 'SAP SuccessFactors Metric Pack Overview' available in the Product Documentation download that includes a summary of each metric pack.
Drill-to-detail functionality is provided via the SAP SuccessFactors website and allows
users to build data queries that return the details of the individual employees who meet
the criteria of the query. Drill to detail is NOT available on all measures.
4. From the documentation, can you explain the difference in the three measures (base
input, derived input, and result)?
Base Input Measures form the basic building blocks of any business intelligence solution.
Filtering of Base Input Measures by Dimension Hierarchies generates a rich set of Derived
Input Measures. Finally, Derived Input Measures are combined in formulas to generate
Result Measures (in percentages, ratios, etc.) which are commonly used in analysis and
reporting, as well as in benchmark comparisons.
5. What do the following acronyms stand for: EOP, FTE, and SOP?
EOP stands for End of Period, FTE stands for Full Time Equivalent, and SOP stands for Start of Period.
6. What is the description of the Base input Measure: Movements – Out? Does it support
drill-to-detail?
Movements – Out refers to the number of movements out of positions in the reported
organization unit. Yes, it supports drill-to-detail.
Workforce Mobility
8. In WFA, time dimensions provide for the breakdown of measure values across what
periods of time?
Year, quarter, and month (in both a calendar and a fiscal year). This does NOT include the new time models of WFA on SAP HANA.
9. What are the standard 3 breakdown levels of the dimension Age? What are the standard
labels for Recruitment sources?
10 years, 5 years, and 1 year. The standard labels are: Hire, Rehire, Promotion, Demotion, Transfer, Other Internal Movements, Acquisition, and Restructures.
10. What is the dimension filter applied to the Derived Measure: Terminations – voluntary?
11. What is the calculation for the result measure: Termination Rate – Voluntary? What
Category and Sub-category is it located in?
Result
Well done! You are now all set to work. You now know how to:
● Describe the metric pack documentation
● Determine standards of an SAP SuccessFactors metric pack configuration
LESSON SUMMARY
You should now be able to:
● Describe SAP SuccessFactors Workforce Analytics
● Describe Introductory WFAP Concepts Specific to Implementation Consultants
● Describe the Consultant Roles and Relevant Training
Learning Assessment
1. Which solution integrates data from multiple sources to analyze talent trends like
voluntary turnover rate?
Choose the correct answer.
X A WFP
X C WFA
X D Dashboards
Lesson 1
Workforce Analytics Basic Navigation
Exercise 2: Navigate Standard WFA Views
Lesson 2
Workforce Analytics Custom Queries
Exercise 3: Create Query with Query Workspace
Lesson 3
Workforce Analytics Custom Reports
Exercise 4: Create a Report with Report Center
Lesson 4
Workforce Analytics Investigate
Exercise 5: Create an Investigation
UNIT OBJECTIVES
LESSON OBJECTIVES
After completing this lesson, you will be able to:
● Navigate Workforce Analytics
After you log into the SAP SuccessFactors Instance, you can navigate to the WFA application,
also referred to as the WFA portal.
To access the WFA portal:
1. From the main navigation list, select Workforce Analytics. Note that the label can be renamed in individual instances.
● It eases user navigation by providing access to WFA standard views and areas of the site at
one entry point.
● Users can choose their own Landing Page.
The Analytics Portal Page, also known as the landing page, provides the following navigation
components:
3. Filters: Apply time, structural and/or analysis dimensions to further customize your
report.
A button is available for users to set and clear the Workforce Analytics Landing Page. The
landing page is the first page shown when navigating to WFA and can be set to any canvas
report created in Report Center.
A user can save predefined dimensions as a filter profile. Additionally, a filter profile can be set
as the default.
The tools panel allows you to access a variety of WFAP tools within the application. Some of
these tools will be covered in more detail in this course.
Measure Views (also called Metrics views) provide a quick and easy way to navigate the rich
content of over 2,000 metrics provided by Workforce Analytics.
These Measure Views allow for different views of the WFA data without any need to build
dedicated reports or queries.
To locate a measure, you can expand the different measure categories and subcategories.
The location of the measure category and subcategory is documented in the Core Workforce
and Mobility Metric Pack Documentation.
You can also search for a measure with the search box. To search for a measure:
1. Type part of the name in the search box and either click the Search icon or press Enter.
2. You can click next and previous to navigate through matching measures.
Query Workspace allows WFA users to create custom queries against WFA data and visualize
the query results in tables, charts, or embed them in custom reports. It allows users to
analyze data in ways that are not available in the default measure views.
The use of Query Workspace is detailed in a lesson later in the materials.
To access Query Workspace:
● Select Query Workspace from the Tools panel in the navigation bar.
Note:
You can also navigate to Query Workspace by selecting the Switch to Classic View link inside an Investigation.
Navigating to Investigate
Investigate combines a clean and modern look with improved usability based on SAP Fiori
design principles and our comprehensive analytical library to make interactive analytics
simple and readily accessible. Investigate allows you to create queries and adjust
visualizations within a single tool.
The use of Investigate is detailed in a lesson later in the materials.
To access Investigate:
● Select Investigate from the Tools panel in the navigation bar.
Note:
If Investigate has not been enabled, you cannot access the tool. Details for
enabling Investigate can be found in the Lesson on the Investigate tool.
Report Center is a report management tool. One of its functions is to allow WFA users to
create custom reports. These reports can include data from a variety of data sources,
including the Query Workspace and Investigate.
The use of Report Center is detailed in a lesson later in the materials.
To access the Report Center:
● Select Reporting from the main navigation menu.
Input Measure: An input measure is information sourced or calculated from the raw data extract that members provide from their Human Resource Information Systems (HRIS) for conversion into metrics.
The following examples are input measures:
● End of Period Headcount
● FTE
● Average Headcount
● Terminations
● Hires
● Performance Appraisal Participants
Derived Input Measure: A derived input measure is a base input measure that has been
filtered by a dimension category. The following examples are derived input measures:
● EOP Headcount – Male
● FTE – Part-Time
● Terminations – Voluntary
● External Hires – Minority
Result Measure: A result measure is a new piece of information that has been calculated by
combining multiple input measures. This information represents information that was not
present within the original HRIS extract.
The following examples are result measures:
● Average Workforce Age
● Staffing Rate – Female
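The relationship between the three measure types can be sketched in a few lines of Python. This is an illustrative toy, not SAP code: the field names are invented, and EOP headcount stands in as a simplified denominator for the average headcount a real metric pack formula would typically use.

```python
# Illustrative sketch (not SAP code) of how the three measure types relate.
# Base input measures come straight from HRIS extract rows; derived input
# measures filter them by a dimension; result measures combine inputs.

employees = [  # invented sample rows standing in for an HRIS extract
    {"id": 1, "gender": "F", "active_eop": True,  "term_type": None},
    {"id": 2, "gender": "M", "active_eop": True,  "term_type": None},
    {"id": 3, "gender": "F", "active_eop": False, "term_type": "Voluntary"},
    {"id": 4, "gender": "M", "active_eop": False, "term_type": "Involuntary"},
]

# Base input measures
eop_headcount = sum(1 for e in employees if e["active_eop"])
terminations = sum(1 for e in employees if e["term_type"] is not None)

# Derived input measures: base inputs filtered by a dimension value
eop_headcount_female = sum(
    1 for e in employees if e["active_eop"] and e["gender"] == "F"
)
terminations_voluntary = sum(
    1 for e in employees if e["term_type"] == "Voluntary"
)

# Result measure: a rate not present in the original HRIS extract
termination_rate_voluntary = terminations_voluntary / eop_headcount * 100
```

The point of the layering is that every result measure can be traced back, through its derived inputs, to rows in the source extract — which is what makes drill-to-detail possible.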
Navigational time models are multi-level models that allow the user to navigate through the time dimension using the left-hand navigation menu. WFA supports time models based upon calendar years or fiscal years.
The Time Filter supports three levels for the time dimension: years, quarters, and months.
Yearly time models are based on calendar years like 2018 and 2019 and on fiscal years like
2017/2018 and 2018/2019.
Quarterly time models are based on the calendar quarters of Quarter 1 (January to March),
Quarter 2 (April to June), Quarter 3 (July to September) and Quarter 4 (October to
December) or, alternatively, on the fiscal quarters.
Monthly time models are based on the twelve months of the year.
Note:
WFA on SAP HANA also supports weekly and daily time models, but they are NOT
available within the Time Filter.
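As a rough illustration of the three Time Filter levels, the following Python sketch builds a calendar-year node tree; the nested-dictionary shape is an assumption for illustration, not the actual WFA data structure:

```python
# Level 2 (quarters) and level 3 (months) of a calendar-year time model.
QUARTERS = {
    "Q1": ["January", "February", "March"],
    "Q2": ["April", "May", "June"],
    "Q3": ["July", "August", "September"],
    "Q4": ["October", "November", "December"],
}

def time_model(year: int) -> dict:
    """Build the three Time Filter levels for one calendar year: year -> quarters -> months."""
    return {str(year): QUARTERS}

model = time_model(2018)
print(list(model["2018"]))  # prints ['Q1', 'Q2', 'Q3', 'Q4']
```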
Note:
A dimension must be configured as a Structural Dimension to be used in Tree
Security. Tree Security allows WFA administrators to limit access to areas within
the hierarchy of a dimension. Dimensions implemented as Analysis Dimensions
can only be enabled or disabled for the entire hierarchy. For more information on
Tree Security, please see the section on Workforce Analytics Administration.
Analysis dimensions allow analysis by groups that logically segment the measure. For
example, Workforce measures such as Termination Rate may be analyzed by Gender or Job
Role; Financial measures such as Operating Expense may be analyzed by Expense Type; and
Staffing measures such as Applicant may be analyzed by Applicant Source.
Analyze By: Choose an analysis dimension to “slice” the data.
Filter By: Choose an analysis dimension to filter the data.
Note:
Analysis Dimensions may vary according to the modules implemented and the
data available.
Note:
Analysis Dimensions must be utilized to support Benchmarking and Derived Input
Measures.
Benchmark Gauges
Note:
Benchmark results will appear only for applicable benchmarked measures.
Benchmarking data availability varies by region.
Measure Explain
When selected, the ? icon displays an explanation of the measure that is currently being
viewed.
Footnotes
The + icon displays any footnotes available for the currently selected measure. Footnotes are
specific to an organization and can be added by any user to provide more detail/context on
the measure for other users within their organization.
Use the dropdown navigation on each measure page to see the other measure pages
available.
Note:
The asterisk on the menu indicates the page you are currently on.
In addition to the common features of all measure views, the Analysis View for Result
Measures contains:
1. A trended graphical representation of the measure for the whole organization for a
selected period of time
2. A table comparison of each structural dimension unit by each analysis dimension unit for
the measure for the selected time period
In addition to the common features of all measure views, the Benchmark View for Result
Measures contains:
1. A stacked benchmark chart showing the organization's results for the selected time period
against the benchmark ranges
2. A table comparison of the measure by the selected analysis option compared to the
selected benchmark category
In addition to the common features of all measure views, the Inputs View for Result Measures
contains:
1. A trended graphical representation of the measure for the whole organization for a
selected period of time
2. A table comparison of each of the measure inputs for the selected time period
In addition to the common features of all measure views, the Table View for Result Measures
contains:
1. A table representation of the measure by the selected analysis option for the selected
time period
2. A table representation of the measure by the selected analysis dimension for the selected
structural dimension
In addition to the common features of all measure views, the Trend View for Result Measures
contains:
1. Trended graphical representation of the measure for the selected time period including
the average over time
2. Trended table representation of the measure by the selected structural dimension for the
selected time period
Use the dropdown navigation available on each measure page to see the other measure pages
available.
Note:
The asterisk on the menu indicates the page you are currently on.
In addition to the common features of all measure views, the Analysis View for Input
Measures contains:
1. A trended graphical representation of the measure for the whole organization sliced by the
analysis dimension
2. A table representation of the measure broken down by the selected structural dimension
and sliced by the selected analysis dimension
In addition to the common features of all measure views, the Pie Chart View for Input
Measures contains:
1. A pie graph representation of the measure by the selected analysis dimension for the
selected time period
2. A pie graph representation of the measure by the selected structural dimension for the
selected time period
3. A table comparison of the measure by the selected structural and analysis dimensions at
the selected point in time.
In addition to the common features of all measure views, the Table View for Input Measures
contains:
1. A table representation of the measure by the selected time period and the selected
analysis dimension
2. A table representation of the measure for a single point in time, by the selected analysis
and structural dimensions
In addition to the common features of all measure views, the Trend View for Input
Measures contains:
1. A trended graphical representation of the measure for the selected time period including
the average over time
2. A trended table representation of the measure by the selected structural dimension for
the selected time period
Administrative Features
The Admin link that appears in the top right corner of the WFA Portal provides access to the
administrative features available for your role.
Note:
The options available in the Admin menu depend on an individual user’s role
permissions.
Note:
The administrative features are covered in a variety of areas in this document. Not
all apply to WFAP and therefore may not appear in this document.
Export report pages, or an entire report, to Microsoft Word, Adobe PDF, or Microsoft
PowerPoint using the Export icons in the menu bar.
Exporting to Excel
You might also need to export the results of an individual table. Each table has a button to
export that table to Microsoft Excel.
Certain individual charts can be copied and pasted into an image, document, or presentation
editing program.
Right-click the chart and select Copy; then right-click in the program and select Paste.
WFA Standard templates are pre-built templates applicable only to customers with Workforce
Analytics. They are available in the Customer Community and on the SAP Help Portal to be
uploaded to the organization's instance. Customers can then customize these reports per
business requirements. The data is easily exportable for use in other tools or applications
such as Microsoft Excel.
Note:
You can find these reports via the SAP Help Portal → SAP SuccessFactors People
Analytics.
Analytic Scorecards are one-page report templates that can be used and customized in the
Workforce Analytics environment. Modules included from the Workforce Analytics Metric
packs are Core Workforce and Mobility, Learning, Recruitment, Succession, Compensation,
Availability and Performance and Goals.
Note:
The scorecards that are available depend upon which Metric Packs have been
implemented.
● New Manager: Provides a new manager with quick insight into their new team.
● Employee Profitability: Provides a quick and easy visual that assists reporting on "how"
human capital impacts an organization's financial performance.
● Business Beyond Bias: From a planned series of "issue focused reports"; this report
specifically addresses business beyond bias.
● Human Capital Reporting: This template provides all required metrics for external reporting
based on the ISO 30414 Human Capital Reporting Standard.
Some other reports offer more pages and/or detail than scorecards.
Note:
The document Workforce Analytics Report Templates – Examples on the SAP
Help portal has examples of each report.
Note:
In this exercise, you will determine the voluntary termination rate at Research and
Development (an OU under Corporate) for employees with a position tenure of less
than 1 year in Q4 2017. You will also determine how the voluntary termination rate
is calculated and what the benchmark value is.
1. Click .
2. Click .
3. Click .
4. Click .
5. Click .
6. Click .
7. Click .
8. Click .
9. Click .
10. Click .
11. Click .
13. Clicking in the scroll area displays the desired screen area.
Note:
The Research and Development rate was 19.7% for employees with 1 year or
less of job tenure.
14. Click .
15. Click .
Note:
The formula is displayed in Explain, along with other useful information.
16. Click .
17. Click .
18.
Note:
The 2017 benchmark for Voluntary Termination Rate was 6.9%.
Note:
In this exercise, you will determine the voluntary termination rate at Research and
Development (an OU under Corporate) for employees with a position tenure of less
than 1 year in Q4 2017. You will also determine how the voluntary termination rate
is calculated and what the benchmark value is.
1. Click .
2. Click .
3. Click .
4. Click .
5. Click .
6. Click .
7. Click .
8. Click .
9. Click .
10. Click .
11. Click .
13. Clicking in the scroll area displays the desired screen area.
Note:
The Research and Development rate was 19.7% for employees with 1 year or
less of job tenure.
14. Click .
15. Click .
Note:
The formula is displayed in Explain, along with other useful information.
16. Click .
17. Click .
18.
Note:
The 2017 benchmark for Voluntary Termination Rate was 6.9%.
LESSON SUMMARY
You should now be able to:
● Navigate Workforce Analytics
LESSON OBJECTIVES
After completing this lesson, you will be able to:
● Create custom queries in Workforce Analytics
Query Workspace
Introduction to Query Workspace
Using the Query Workspace tool, you can analyze data in ways that are not available in the
default measure views.
To access the tool, please review the basic navigation lesson earlier in the documentation.
When first entering Query Workspace, the view will be populated with the last measure
previously viewed.
All dynamic options are enabled and one can immediately begin to drill through and change
analysis using the Filters tile.
Note:
Dynamic Options (Selectable Time, Selectable Structure, Selectable Analysis, and
Selectable Filter) allow the user to select the dimension setting at runtime.
Query Management
If you want to create a new query from scratch, or change an existing query, you will be
working with the design panel. The query design panel includes a measure and dimension list
and the ability to toggle between table view and design view. In either case, you can drag and
drop measures and dimensions onto the workspace. You access the design panel from the
Edit menu.
Measures and Dimensions: a list of all measures and dimensions currently available in the
portal.
The Show button allows you to toggle between Table view and Design view.
Note:
Weekly and daily time models are only available in WFA on SAP HANA
implementations.
Time Period: the report user can select a time node using the filter panel. Selecting a node will
display the child nodes of the selected node. In the example, the year 2016 was selected.
Time Period (Current node only): the report user can select a time node using the filter panel.
Selecting a node will display the selected node only. In the example, the year 2016 was
selected.
Time View: the report user can select a time type (Year, Quarter, Months of a Year) using the
filter panel. Selecting a node will:
● Year: display all years
● Quarter: display all quarters
● Months > Year: Display all months of the selected year
Many time models exist that display nodes representing years.
Many time models exist that display nodes representing quarters.
Note:
Seasonal View displays the same quarter (Q1, Q2, Q3, Q4) of all the years, for
example Q1 of 2016 and Q1 of 2017.
Many time models exist that display nodes representing months.
Many time models exist that display nodes representing weeks.
Many time models exist that display nodes representing days.
Tokens
Tokens allow you to display textual information that is dynamically generated. This
becomes a powerful tool when you consider how users might navigate and adjust selection
criteria in your report at runtime, or how you might decide to adjust your report in the future.
With tokens, your report automatically tells its own story without you needing to manually
adjust the text.
For example, you are building a report that contains several charts on a page. Each of these
charts shows a distinct view of data that when shown together on a report page portray a data
story that is important to your report. As the author of the report you understand this story
but you want to make sure that other report users can easily discern what each chart is
depicting. To do this you plan to add titles and axis descriptions to each chart to make this
clear.
You could simply type a title or axis description for each chart manually. However, due to
the dynamic nature of these tools, there is a way to let the chart do this for you automatically,
using tokens that describe the selected properties of the chart. If you are constructing a chart
using the End of Period Headcount measure, you could type this into the title manually.
Alternatively, you could include a token in the title that means "automatically display the
name of the measure that this chart represents". Then, if you later change the chart to use the
Start of Period Headcount measure, the title will remain correct without any manual change.
Types of Tokens
Tokens can be divided into categories that are largely defined by the major components of a
chart or table. You can think of these components as being the major building blocks of the
underlying query.
Token categories include:
● Organization – presenting an organization specific disclaimer, and the date data was
published
● Page – User selected filter in the filter panel
● Measure - the measure/s used in the query
● X and Y Axis (Columns and Rows) - the data selections (dimensions) that appear on the
two axes of a chart (X and Y) or table (columns or rows)
● Filter – any analysis dimension filters that are applied to the query
● Context – any time or structural filters that have been applied to the query
● Benchmark – any benchmark categories that have been applied to the component
All tokens share the same format: [%TOKENNAME%]. The '[%' and '%]' delimiters
distinguish tokens from normal text that is typed manually.
You can build up sentences using multiple tokens combined with normal, manually entered
text.
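The substitution idea behind tokens can be sketched in Python; the token names MEASURE and XAXIS are invented for illustration and are not actual WFA token names:

```python
import re

def resolve_tokens(text: str, values: dict) -> str:
    """Replace each [%NAME%] token with its runtime value, leaving normal text alone."""
    return re.sub(r"\[%(\w+)%\]", lambda m: values.get(m.group(1), m.group(0)), text)

title = "[%MEASURE%] by [%XAXIS%]"
print(resolve_tokens(title, {"MEASURE": "End of Period Headcount",
                             "XAXIS": "Calendar Years"}))
# prints: End of Period Headcount by Calendar Years
```

Changing the chart's measure only changes the runtime value, so the title updates without any manual edit.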
Using Tokens
Tokens can be used in a variety of locations in the WFAP tools. The token list can be retrieved
by selecting "…" next to a text box in a variety of locations, or by selecting "Token Selector"
within certain components.
Targets
A Target is a desired level of performance for a measure that can be achieved through
proactive management. Targets can be configured by WFA administrators. A variety of query
and report components can utilize configured targets. Targets are defined for a measure
using a time dimension, a structural dimension, and optionally an analysis dimension. For
example, you can set termination rate targets for the Location Organizational Units for the
years 2016 and 2017.
Note:
Full description of target configuration is outside the scope of this document.
You can use thresholds to display a close-to-target (near) range. Thresholds can be
configured as a number or a percentage above or below a value. Additionally, they can be set
for one side of a target, or for both.
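As an illustration of how a percentage threshold defines a near range around a target, consider this Python sketch; the 10% threshold and the assumption that lower values are better are both invented for the example:

```python
# Hedged sketch: classify a result against a target with a one-sided near range.
def classify(actual: float, target: float, threshold_pct: float = 10.0) -> str:
    """Return 'on target', 'near target', or 'off target' for a rate where lower is better."""
    near_limit = target * (1 + threshold_pct / 100)  # upper edge of the near range
    if actual <= target:
        return "on target"
    if actual <= near_limit:
        return "near target"
    return "off target"

print(classify(actual=6.5, target=6.0))  # prints: near target
```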
Table View
Table view gives the user a tabular view of the output of the current query. When in table view,
you can view the results of any changes to the query. You can drag metrics onto the row or
column to include them in the query. You can drag a dimension onto the row / column / filter
to include those in the existing query. Table view provides the simplest way to modify an
existing query.
Find the required measure from the list in the left side pane, then click and drag the measure
to either the Rows or Columns bar. When creating a new query, a measure must be added
before any dimensions. In this example, Average Headcount has been dragged and dropped
onto the Rows bar.
Note:
Custom Measures can be found in the Custom Calculations folder. It is possible to
include more than one measure in the same query.
Once a measure has been selected, the Dimensions list will automatically open. If the query is
to be displayed in a chart format, it must contain a dimension in the columns. In this example,
the structural dimension All Organizational Units has been dragged and dropped onto the
Rows bar. A time dimension has not been selected, so the most recent year will appear as the
default time period.
Note:
Custom Dimensions you have permission to access can be found in the Special
Functions folder.
Time Dimensions
A time period for the query can be selected by dragging and dropping a time dimension onto
the query pane. Using a time dimension in the rows/columns of the query makes the
dimension visible as part of the query. In this example, the time dimension Calendar Years has
been dragged and dropped onto the Columns bar. Calendar Years can now be seen as the
columns in the query results.
Note:
Reliable queries will include a measure, a time dimension and a structural
dimension.
Selecting a Filter
A filter for the query can be selected by dragging and dropping a dimension node onto the
query pane. In this example, the node Generation X (1964-1978) from the Generation analysis
has been dragged and dropped onto the Filters bar. By adding a filter to the example, we now
have a query showing the Average Headcount for All Organizational Units by Calendar Years
filtered to Generation X employees.
Design View
Design view provides the user more detailed management of the output of the query. There
are several actions that can be completed that cannot be done in table view:
● Remove measures / dimensions
● Change the query order of dimensions
● Swap dimensions between rows / columns
● Include a subset of a structural or analysis dimension
● If more than one measure is used in the query, the ability to push measures up or down
the query order
● A measure Edit option to toggle between Actual/Annualized results (if applicable for that
measure)
● If more than one dimension is used in columns/rows, the ability to push dimensions up or
down the query order
Dimension Levels
Dimension Levels can be changed by selecting Edit for the dimension you wish to change.
Dimension Levels consist of Qualifiers and Distance.
● Qualifiers determine which dimension nodes will be included in the query, according to
their relationship to the selected node.
● Distance refers to the difference in depth between the dimension node selected and its
associated target node(s). For example, if you wanted to examine the items immediately
below the dimension node (the child elements), then a distance of one (1) applies.
Similarly, the grandchild nodes are identified by a distance of two (2).
In this example the Qualifier All descendants to distance with a Distance of 1 has been used
with the All Organizational Units node of the Organizational Unit structural dimension – this
will show all the dimension nodes immediately underneath All Organizational Units.
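The effect of the All descendants to distance qualifier can be sketched with a small Python example; the tree below is a hypothetical dimension hierarchy, not actual configuration data:

```python
# Hypothetical parent -> children map for a structural dimension.
tree = {
    "All Organizational Units": ["Corporate", "Operations"],
    "Corporate": ["Research and Development", "Finance"],
    "Operations": ["Manufacturing"],
}

def descendants_to_distance(node: str, distance: int) -> list:
    """Collect nodes up to `distance` levels below `node` (children, grandchildren, ...)."""
    result, frontier = [], [node]
    for _ in range(distance):
        frontier = [child for parent in frontier for child in tree.get(parent, [])]
        result.extend(frontier)
    return result

print(descendants_to_distance("All Organizational Units", 1))
# prints: ['Corporate', 'Operations']
```

With a distance of 2, the grandchild nodes would be included as well.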
To change the existing selection:
● Click Add.
Note:
More than one qualifier selection may be added to the query if desired.
A time period for the query can be selected by adding a Time Context. Time Context is used
when time is not intended to be seen as rows/columns in the query, but simply to act as a
background filter to the query. Time Context is added to the query via the Design View.
Select a time context and click Set, then close the window by clicking the up arrow. In this
example, the time model Last Year has been selected.
Note:
If a Time Dimension has already been added to Rows/Columns, it will not be
possible to also add a Time Context.
Additional Data Items allows you to include benchmarking results in your query.
Note:
Additional Data Items is only available when constructing a query in Page
Designer, using the Chart View Showing Benchmark Lines component.
Other Options
Clear Query Parameters allows you to clear all query settings and begin a new query. Use
Structural Context if an organizational structure is not selected on the rows or columns and
the query is to be used in a report. Ticking this box enables the organizational structure to be
selected at runtime.
From the Edit menu, select Edit Query Formatting to change the look and feel of the query
display.
Note:
Any formatting applied in Query Workspace will NOT be applied if the query is
used in a custom canvas report. You will have to format the component in the
report separately.
From the View menu, select Table or Chart to toggle the query display between table/chart
view.
Note:
To view a query as a chart, the query must have a dimension on the columns.
the desire may be to group ages above 40 and look at the total result of those groupings.
Furthermore, rather than having this view pertain to a single query, the user may want this
view to be available at any time for query work without having to recreate it.
The Custom Dimensions functions make the above possible, as well as providing the scope
for many other uses.
Custom Measures
The Custom Measures function provides users with the ability to create custom calculated
measures for use within Query Workspace, Investigate, and Page Designer.
Create custom measures by dragging and dropping measure inputs and formula operators
onto a designer and see your measure build itself in real time.
The tool also supports one or more filters for each input measure as the measure is constructed.
Custom Measures/Dimensions can be accessed via the Edit menu in Query Workspace. Any
previously created custom views appear within the folder structure. The tools tile displays the
two custom options not currently selected.
View Management
Any Custom Dimensions you create will be hidden from other users of the site unless you give
them a public status. Click Status to see a list of the dimensions you have access to and their
statuses. The view's status is displayed as an 'x' if the dimension is not shared, and a '√' if it
is shared. It is important to note that users can edit/delete their own custom views and
any views created by other users that have a public status. When accessing views for query
use within Query Workspace, users can use their own custom views and any public custom
views. Role Based Security permissions continue to apply – a user needs the correct RBS
permissions to view results in a query using Custom Members, Sets and/or Calculations,
even if the view has a public status. Share your custom dimension with other users by
selecting Status, then Make Public.
Note:
Users that are assigned the Custom Metric and Dimension Admin permission can
view, edit, make public, and delete any custom metric or custom dimension.
Folders can be created to simplify the management of custom views and views can be moved
to different folders by dragging and dropping. Manage the folder structure with the New, Edit
and Delete buttons. If a folder is not selected when a dimension is created, the dimension will
save to the (Default) folder.
Custom members and custom sets are created in the same way, but will display different
results when used in a query.
● Custom Members: Groupings within a dimension will create a single total result made up of
all the selected groupings.
● Custom Sets: Groupings within a dimension will be made up of only the selected
groupings.
Note:
Both custom members and custom sets can be referred to as custom dimensions.
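The difference between the two can be sketched in Python; the Generation nodes and headcounts are invented for illustration:

```python
# Illustrative headcounts by Generation dimension node.
headcount = {"Gen X": 300, "Millennial": 450, "Boomer": 250}
selected = ["Gen X", "Millennial"]

# Custom Member: the selected groupings collapse into one combined total.
custom_member = {"Gen X + Millennial": sum(headcount[n] for n in selected)}

# Custom Set: the selected groupings stay as individual results.
custom_set = {n: headcount[n] for n in selected}

print(custom_member)  # prints: {'Gen X + Millennial': 750}
print(custom_set)     # prints: {'Gen X': 300, 'Millennial': 450}
```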
To create a new member or set, choose Custom Members or Custom Sets from the left tools
tile and click File New. The Name field is used to give the custom dimension an appropriate
label. Build a custom dimension by selecting the structural or analysis dimension you
wish to use.
As can be seen in the example, the organization has an Employee Preference Survey cube
containing measures and dimensions based on Employee Survey results.
● Cube Id: The Cube Id selection defines the type of data that the analyses are used to slice.
The workforce cube houses the majority of data and analyses in most instances, but other
measures, such as those involving financial data, are often housed in other cubes, which
must be selected if the goal is to create a custom view for analyzing those particular
measures.
● Dimension Id: The Dimension Id specifies the analyses that can be used to create the
custom dimension. Continuing the Age example, the relevant dimension in this case would
simply be Age. Only dimensions that are available in the organization's cube can be used to
create a custom dimension. Custom dimensions cannot be created from other custom
dimensions.
Note:
The procedure for creating members and sets is exactly the same. The only
difference is how they display within query results.
To edit the nodes of the member or set, choose the Paths tab, then adjust the nodes from the
selected Dimension. Click OK to save, or Cancel to discard the changes. Select the
appropriate nodes from the dimension tree. Use CTRL + Click to select more than one node at
a time, and drag to add them to the custom dimension. Nodes can be deleted from the
Members list by dragging them out or by using Remove All.
To create a new calculation, choose Custom Calculations from the tools tile. Click File New.
Enter the Measure details in the Custom Calculation Editor pane:
1. Enter a Name.
Select a Filter
The Select Dimension Filter window will automatically open each time an input measure is
dropped onto the designer. Filter the measure by selecting one or more filter nodes from the
required analyses:
Note:
Should a filter not be required, simply click OK.
Adding an Operator
Continue building the formula by dragging operators and more measures into the designer.
After the addition of the first measure, each subsequent operator and measure will need to be
dropped onto the appropriate grey bar to either side of every component in the formula.
Select from the list of available operators and drag them onto the designer.
To remove an input, drag the input from the designer and drop it onto the Delete button.
The Preview Window displays your calculation.
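The left-to-right build-up of a custom calculation can be sketched as follows; the measure names, values, and evaluation order are illustrative assumptions, not the tool's actual calculation engine:

```python
# Illustrative sketch only: measure names and values are invented.
inputs = {"Terminations - Voluntary": 83, "Average Headcount": 1200}

def evaluate(formula: list) -> float:
    """Evaluate inputs and operators left to right, as they line up in the designer."""
    result = inputs[formula[0]]
    for op, operand in zip(formula[1::2], formula[2::2]):
        # An operand is either another input measure or a plain number.
        value = inputs[operand] if isinstance(operand, str) else operand
        if op == "+":
            result += value
        elif op == "-":
            result -= value
        elif op == "*":
            result *= value
        elif op == "/":
            result /= value
    return result

rate = evaluate(["Terminations - Voluntary", "/", "Average Headcount", "*", 100])
print(f"Voluntary Termination Rate: {rate:.1f}%")  # prints 6.9%
```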
The Validate function can be used to check the validity of the custom dimension or measure
and is accessible by clicking the Validate button. Validation will check all dimensions/
measures saved in the tool; if no errors are detected an “All is valid” message will be
displayed. Validation becomes an increasingly necessary function throughout the life cycle of
the cube and portal as measures and dimensions are added, modified and, on occasion,
removed as part of the refresh process.
To use the custom measure/dimension, the user must create a query within Query
Workspace, Investigate or Page Designer.
Query Workspace and Page Designer use the same drag and drop paradigm.
Custom dimensions are included in the Special Functions category and can be located under
Dimensions.
Custom measures can be located in the Measures panel, either in the Custom Calculations
category or in the Category and SubCategory selected during their creation.
For more information about using custom measures and custom dimensions with Investigate,
please review the Investigate Lesson in the course.
Note:
In this exercise, you will create a query in Query Workspace that displays the
Termination Rate – Voluntary for 2016. The table will display the value broken
down for organization tenures under 3 years. Finally, it will be filtered to North
America employees only.
1. Click .
2. Click .
4. Click .
5. Click .
6. Click .
7. Click .
8. Click .
9. Drag .
10. Drop on .
11. Drag .
12. Drop on .
13. Drag .
14. Drop on .
15. Click .
16. Click .
17. Click .
18. Click .
19. Click .
20. Click .
21. Click .
22. Click .
23. Click .
24. Click .
Note:
You would repeat the steps for the nodes: 1-<2 Years and 2-<3 Years.
25. Click .
26. Drag .
27. Drop on .
28. Click .
29. Click .
30.
Note:
In this exercise, you will create a query in Query Workspace that displays the
Termination Rate – Voluntary for 2016. The table will display the value broken
down for organization tenures under 3 years. Finally, it will be filtered to North
America employees only.
1. Click .
2. Click .
4. Click .
5. Click .
6. Click .
7. Click .
8. Click .
9. Drag .
10. Drop on .
11. Drag .
12. Drop on .
13. Drag .
14. Drop on .
15. Click .
16. Click .
17. Click .
18. Click .
19. Click .
20. Click .
21. Click .
22. Click .
23. Click .
24. Click .
Note:
You would repeat the steps for the nodes: 1-<2 Years and 2-<3 Years.
25. Click .
26. Drag .
27. Drop on .
28. Click .
29. Click .
30.
LESSON SUMMARY
You should now be able to:
● Create custom queries in Workforce Analytics
LESSON OBJECTIVES
After completing this lesson, you will be able to:
● Create custom reports
Managing Reports
Introduction to Custom Reporting with Report Center
Custom Reports
Through custom reports, WFA report designers can present a wealth of human resources
data and build engaging reports that deconstruct metrics, organization units, occupational
groups and other analysis options.
You may need to manage report access, delete old reports, organize reports, and change
report ownership.
Management of custom reports is handled via Report Center. Report Center provides
centralized access to and management of all reports, not just custom reports on WFA data.
Navigation to Report Center is covered in the basic navigation section earlier in this course.
The Report Center provides all report management in a single, unified interface. Each item is
categorized by a report type. Reports that can utilize WFA data are of the Canvas report type.
The actions available in Report Center are:
● Filter and sort reports: filters are persistent between logins, and the number indicates how
many types of filter (Type, Author, Last Modified, Labels) are enabled.
● Report types: Canvas Report, Table Report, Tile, Dashboard, Custom Report, Story Report.
● Perform an action: Action menu, Label As, Run. The actions are described below.
● Add a report as a favorite by selecting the star, or remove a report from your favorites by
clearing the star. You can access the list of your favorite reports from the home page.
● Views: My Reports displays reports authored by you or shared with you; All Reports
displays all reports.
Note:
The report views are only visible when you are a Report Administrator.
Actions on Reports
In the Report Center interface, you manage reports via the Action menu. You can perform the
following actions:
● Run
● Edit
● Share
● Rename
● Delete
● Duplicate
● Export
● Schedule
● Assign a custom label
● Change Author (owner)
● Copy Link
Note:
Not all options may be available depending upon permissions.
You can run reports on the actions taken on individual reports in Report
Center with the Report Event Audit and Report Execution Audit reporting
domains. For more information on working with other types of reports, please see
the course HR882 – SAP SuccessFactors People Analytics: Reporting and
Administration.
You can use Report Center to add or remove favorite reports that are available on the home
page.
To use favorite reports:
● Select or clear the star icon that precedes the report name to add the report to, or remove
it from, the list of favorite reports. For example, select the star before Goal Status in Report Center.
● On the latest home page, choose the new Favorite Reports quick action to view your list of
favorite reports, arranged by the report types. For example, Goal Status is listed under
Dashboards. The quick action allows you to search and remove reports from your favorites
list.
Custom labels provide a way to organize reports within Report Center, helping users find the
reports they are looking for and grouping related reports together. You can also create groups
of labels, which allows reports to be organized in a hierarchical display if required.
You can filter reports by the assigned label. Turn on the filter panel and select the label for the
reports you would like to display.
Private labels:
● Are only visible to the user that created the private label.
● Can be created by any reporting user.
● Can be assigned to reports that the user can edit.
Creating Labels
You can create labels through either the dropdown next to the Labels tab or through the action
area of a report. Report administrators can create public and private labels. Other reporting
users can only create private labels.
By default, when creating a new label, the label is set to private.
When working with label names, consider the following:
● Labels are NOT case-sensitive. Retail and retail are considered one label.
● You CAN create one public and one private label with the same name.
● Label names of the same type (public, private) must be unique. For example, you CANNOT
create a private label named FAVORITES and another private label named favorites.
● Private labels with the same name CAN be created by different users. For example, Sally
can create a private label named Favorites and Sam can create a private label named
Favorites.
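The label-name rules above can be summarized as a small validation sketch (illustrative Python, not SAP code; the function name, the data shape, and the assumption that public label names are unique across all users are introduced for this example):

```python
def can_create_label(name, label_type, owner, existing):
    """Check the label-name rules described above (illustrative sketch).

    existing: list of (name, type, owner) tuples already in the system.
    Names compare case-insensitively; private names are unique per owner,
    public names are assumed unique across all users.
    """
    key = name.lower()
    for n, t, o in existing:
        if n.lower() != key or t != label_type:
            continue  # different name, or different type (public vs. private), is allowed
        if t == "public":
            return False  # a public label with this name already exists
        if o == owner:
            return False  # this user already has a private label with this name
    return True

existing = [("Favorites", "private", "Sally"), ("Retail", "public", "Admin")]
print(can_create_label("favorites", "private", "Sally", existing))  # False: not case-sensitive
print(can_create_label("Favorites", "private", "Sam", existing))    # True: different user
```

This mirrors the Sally/Sam example in the text: two users may hold private labels with the same name, but one user cannot reuse a name within the same label type.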
Figure 111: Other Reporting Users Can Only Create Private labels
To create a Label:
4. To nest the label inside another, select Nested label in, and select its parent label from the
dropdown list.
6. Click Create.
7. Click Done.
3. Click Apply.
Managing Labels
You can also manage labels, such as deleting labels, editing the label color, marking as public
or private, and grouping/nesting labels.
To manage labels in Report Center:
● Edit an existing label with the pencil (edit) button.
All reporting users can manage their own private labels. Only Reporting Administrators can
manage public labels. See the following example of the Manage Labels screen for two
separate administrators and a reporting user.
Note:
In the images in the examples, the text (Public) and (Private) has been added to
the label names for clarity. Typically, you can differentiate public and private
labels by color: public labels have a color assigned, while private labels have a
grey label color.
Figure 115: Administrator B Can Manage All Public Labels, But Not Administrator A’s Private Labels
Figure 116: Report Creators and Consumers cannot Manage Public Labels
Figure 117: Reports (formerly flat) Displays Labels under the Report Name
You can select the view in Report Center. In Report view, the labels appear underneath the
corresponding report titles in the flat report list. Clicking on a label will filter for that label.
Figure 118: Label view (formerly grouped) shows public and private labels that can be expanded / collapsed
When you enable Labels view, Report Center displays public labels and the reporting user's
private labels. Labels that do not contain any reports the user has access to are not displayed.
Reports without labels appear under Unlabeled Reports.
You can select multiple reports at once in Report Center using the checkboxes to the left of
the report names. Up to 50 reports may be selected at a time.
To take action on multiple reports within Report Center:
Note:
With the current release, the only actions available for multiple selected
reports are Export and Delete. Additional actions are planned for multiple
reports.
If you export multiple reports, they are contained within a single ZIP file. The
single ZIP file can then be imported into a separate instance.
Create a Report
2. Click New.
5. You can return to Report Center by selecting Report Center from the breadcrumb menu
(Home / Report Center / Report name) in the top left corner.
Note:
For more information on working with Report Center reports, please see
course: HR882- SAP SuccessFactors People Analytics: Reporting and
Administration.
When you create a new canvas report with Report Center, a report with a single blank page is
generated. The content of canvas report pages is built using Page Designer. Page Designer
builds pages based upon components. Details on adding components are in the next section.
Page Designer also allows you to manage the pages within a report, which includes:
● Adding Pages (new pages or pages that exist in another report)
● Renaming Pages
● Reordering Pages
● Deleting Pages (orphaned pages or pages from the current report)
● New Page to add a new, blank page with an auto-generated name to the report.
● Add Existing Page to select to add either a page from another report or an orphaned
page that is not assigned to any report.
Note:
You can delete existing orphaned pages from the Add Existing Page dialog
box.
3. Click OK.
3. Click OK.
When creating or editing a page, the page properties panel becomes available. You can use
the panel to access several options:
● Edit page properties allows you to:
- Name the page
- Set page orientation
- Configure page margins
- Configure a page (canvas) size.
● Edit designer properties allows you to:
- Configure a background grid to assist with the layout of components
- Manage overlapping components
● Validate Page checks the page for errors.
● Grid Options allows you to quickly configure some of the same grid options that are
available in Edit designer properties.
Pages are designed using components. There are a number of different components, each
with its own purpose, that can be used in canvas report pages.
Note:
This document does not go into the depth of detail required for the full use of each
of these components. Some additional detail on several components is available in
the appendix of course HR886: SAP SuccessFactors Workforce Analytics
Administration.
Use the fly out menu or right click the blank page to select a chart type.
To add a WFA query from Query Workspace or Investigate, use Chart view based on published
data.
To add a chart that includes benchmarking data, use Chart view showing benchmark lines.
Use the fly out menu or right click the blank page to select a gauge type.
Use the fly out menu or right click the blank page to select a table type.
To add a WFA query from Query Workspace or Investigate, use Table view based on published
data.
To add a single table utilizing multiple queries for comparison, use Composite table view
based on published data.
Use the fly out menu or right click the blank page to select the following text types:
● Simple Text: Link to reports and measures
● Rich Text: Multi-line text
● Explain Text: Insert standard measure formulas and descriptions
● Workforce Planning Explain Text: Insert standard WFP explanations
Use the fly out menu or right click the blank page to add an image:
● Company Logo: Add the preconfigured company logo used in the WFA portal.
● Upload Image: Select a saved image to upload to the report.
Note:
You cannot resize an image (logo or uploaded image) in Page Designer. The
image size must be correct before adding it to the canvas.
Use the fly out menu or right click the blank page to add miscellaneous items. Examples of
items that can be added are:
● Small Text with Hover
● Navigating Selectable Items
● Transition Diagram
● Transition Table
● Measure Index
Note:
Transition Diagrams are only available on portals that have implemented the
Talent Flow Analytics Module.
Add a chart based upon a query or pivot from a variety of data sources.
Editing Components
When you select an existing component on the page, you can use the quick options panel to
set the most commonly used component settings. You can right click the component to
access the full editor, which often has more settings than are available in the quick options
panel. Each component has its own configurable settings.
The Share action is active if you have access to edit the report. Find and select individual
users to share your report with.
The Share action is active if you have access to edit the report. Find and select RBP Groups
or Dynamic Groups to share your report with.
Note:
You need Share Reports to Groups & Roles permission to share your report with
groups.
The Share action is active if you have access to edit the report. Find and select RBP Roles to
share your report with.
Note:
You need Share Reports to Groups & Roles permission to share your report with
groups.
Note:
Only users with Schedule Reports to SFTP Destination permission can schedule
reports to be sent to SFTP.
To schedule a report:
c. Enter the e-mail addresses to which you want to send notifications. These e-mail
addresses do not receive the resulting scheduled report. You can choose to send
notifications on Job Start or Job Completion.
4. On the Destination tab, select Offline to download the report later from the View
Schedules section, or schedule the report to be sent to a Secure File Transfer Protocol (SFTP) destination.
2. Enter the File Path where the import or export file is located.
3. Enter the File Name with extension, for example, New_Report.xlsx, and select the Date
Format for Ad Hoc reports.
Note:
For canvas reports, enter the Folder Name. Date Format is not available.
On the Job Occurrence tab, select how often the report should run and the first scheduled
occurrence of the report.
The job runs at the specified time for each occurrence:
On the View Schedules page, you can see the schedules you have set up and the jobs that are
running. You can perform the following actions from the Action menu:
● Run the schedule.
● Delete the schedule.
● Cancel the job to make it inactive.
● View jobs for an individual schedule.
● Edit the schedule.
View Schedules
Report Distributor is a legacy tool that helps automate report distribution. With the release of
Report Center, access to Report Distributor remains available while its functionality is migrated
to Report Center scheduling.
With Report Distributor, reports can be delivered by SFTP, run offline, or sent by email.
Report Distributor can output data to PDF, Word, Excel, or PowerPoint formats. Tables and
reports are collected into bundles, and the bundles can then be scheduled to run at a set time
or be run manually.
The report distributor provides many benefits, including:
● Simplified integrations using SFTP
● Bursting reports to managers' inboxes as file attachments
● Scheduling reports offline, so users can continue using the application while a report
is running
Note:
The most common use of the Report Distributor is to email reports to recipients
since that functionality is not yet available in Report Center’s Scheduler.
Menu Tile
The menu tile provides management functions for Report Distributor. The file menu allows
you to copy, edit, or delete the selected bundle. Additionally, if the bundle's destination is E-
mail, you can adjust the content of the subject and body of the email.
You can also create a new bundle from scratch using the New Bundle button. Run
report distributor allows you to manually run a bundle immediately. Job status allows you
to check the status and history of any bundles that have been run, as well as access the
results of bundles that you ran offline. Finally, Export to PDF allows you to preview the
output of the bundle in PDF format.
Bundle
As an administrator, you will create a new bundle when you wish to distribute one or more
reports to users on a regular basis. When you create a new bundle, you must specify several
criteria:
● Bundle name
● Page size
● Export format
● Report/tables to include in the bundle
● Destination (E-mail, SFTP, run offline)
● Users to receive the report (if destination is e-mail)
● Schedule to run the bundle (Optional)
Additionally, tables can be added to a bundle directly when viewing a report, using the Add to
Bundle button. Once the bundle is created, it can be run manually or scheduled to run.
4. From the Items tab, click Add Item. Then click Add Report or Excel Table.
5. Once the report or table is added you will see it listed in the Items tab.
6. To email the data from the selected report bundle, select the E-mail radio button on the
Destination tab, and click Edit Recipients to set up specific users to receive the report
bundle.
8. Select the users and click Add. All email addresses added must be associated with a user
in the system. You CANNOT add freeform email addresses. Click Close when finished. To
have the report sent at a specific time, click Schedule → Add.
9. Enter the required information in the Scheduler box and click OK.
Note:
● You can only select report recipients that have a valid WFA user account.
Note:
If the total attachment size on a Report Distributor email exceeds 13 MB, the
attachments are zipped. If the zipped size still exceeds 13 MB, the attachments are
discarded and a message is included in the email to notify the recipient.
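The attachment handling described in the note amounts to a simple decision rule. As an illustrative Python sketch (the function name and size values are invented for this example, not SAP's implementation):

```python
ATTACHMENT_LIMIT_MB = 13  # threshold described in the note above

def attachment_action(raw_size_mb, zipped_size_mb):
    """Decide how an email's report attachments are handled (sketch)."""
    if raw_size_mb <= ATTACHMENT_LIMIT_MB:
        return "attach as-is"
    if zipped_size_mb <= ATTACHMENT_LIMIT_MB:
        return "attach as single ZIP"
    # Even zipped, the bundle is too large: drop it and notify the recipient.
    return "discard and notify recipient"

print(attachment_action(20, 9))  # attach as single ZIP
```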
Note:
Emails sent using Report Distributor arrive with a SuccessFactors domain name.
While it is not possible to edit the "@successfactors.com" in the email address,
some customers choose to edit the prefix with their own company name:
CompanyName@successfactors.com.
Note:
In this exercise you will create a report in Report Center that displays a logo,
report date, and 2 queries saved in query workspace.
The numbered steps in this exercise refer to on-screen elements that are shown only in the live training environment.
LESSON SUMMARY
You should now be able to:
● Create custom reports
LESSON OBJECTIVES
After completing this lesson, you will be able to:
● Create investigations
Investigations
SAP SuccessFactors Investigate
Overview of Investigate
SAP SuccessFactors delivers the capability to accelerate the analysis process for HR
Analysts by surfacing relevant metrics and analysis dimensions and by recommending
compelling visualizations that clearly communicate the findings.
Investigate is a query and presentation tool available to analyze WFA data. It combines a clean
and modern look with improved usability based on SAP Fiori design principles and our
comprehensive analytical library to make interactive analytics simple and readily accessible.
Investigate Features
● SAP Fiori design: Investigate combines a clean and modern look with improved usability
based on the SAP Fiori design principles to make interactive analytics readily accessible.
● Recommended Metrics/Dimensions: Investigate accelerates the process that HR Analysts
go through to analyze issues by surfacing recommended metrics and analysis dimensions.
Note:
For more information on using Investigate, please see the Workforce Analytics
Investigate Guides in the SAP Help Portal.
Note:
All roles that have been granted the Query Workspace permission also have access to
Investigate.
After you enable Investigate, the Investigate icon appears in the Tools panel in place of Query
Workspace.
Figure 148: Accessing Query Workspace After Investigate Has Been Enabled
After you have enabled Investigate, it becomes the default framework for querying and
investigating data. However, you can switch back to the classic view of Query Workspace at
any time by clicking Switch to Classic View at the top of the page.
From Query Workspace, you can switch back to Investigate by selecting Investigate from the
Tools panel on the left.
Note:
Investigate queries are not available in Query Workspace (QWS). However, you
can open a QWS query in Investigate. Custom Calculations and Custom Members
and Sets created in Query Workspace will be available in Investigate.
For instructions on navigating to Investigate, refer to the previous lesson on basic
navigation. Once you launch the Investigate tool, the Investigate Home Page is displayed.
You can change your display preference to large icons by clicking the grid icon.
1. Click Rename to rename an investigation. The checkbox for the investigation must be
selected for the Rename option to be enabled.
3. Click Share to share an investigation and type the names or email addresses of the
recipients to share the investigation with. Once an investigation has been shared, the
Shared icon is displayed and hovering on the icon displays the recipients.
4. Click Delete to delete an investigation or folder. Deleting a folder will delete the folder and
its contents.
5. Click New to start a new investigation. To open previously saved investigations, click the
investigation title in list view or the investigation tile in grid view. To open a query created
in Query Workspace, select New → Open Query from Query Workspace.
6. Click Add New Folder to create a new folder. Folder structures help you manage and
organize your investigations.
4. Click Open.
5. Click Save to include the query as an Investigation. Click Cancel to close the query without
saving.
You can select metrics from Recommended for Analysis or from the Metrics category in the
right-hand panel, under the Data tab. You can also use the new search capability to easily filter
and find metrics or dimensions from the comprehensive analytic suite.
The Search capability removes the need to know and navigate the old navigation hierarchy
tree. From the right-hand panel, the user can also select dimensions, custom dimensions, and
time models.
Investigate will display a trend analysis chart with a future projection and the data table.
Note:
The dotted line represents the future projection. This forecasting function is a new
feature which returns a predicted value in the chart and data table using a best fit
linear regression.
1. Click the Filter icon on the quick ribbon to open the filter panel. Click Manage Filters to
display and select dimensions to add to the panel. The user can then apply the required
filters by selecting the nodes from the dropdown menu.
Note:
This feature enhances the filter panel to allow multiple selections per
dimension.
2. Select or Deselect Chart or Table from the quick ribbon to enable or disable the view as
required.
3. Click the Chart Library icon to display and select chart visualizations available to best fit
your data. Charts are grouped in Recommended and Other categories. See
Recommended Views.
4. Click Swap Axis from the quick ribbon to swap the values on the X and Y axes (column and
row).
5. Click the Chart Styling icon to turn on or off Axis Titles, Axis Labels, Data Labels or the
option to Always Start at Zero (0), or to change Color Themes. The icon has options to
control the display of the following:
● Color themes (Default, High Contrast Black and High Contrast White)
Note:
Collections is an easy way to collect, export or send a quick email to initiate
a conversation about a result. See Metric Explanation, Collections and
Saving Investigations.
Comparison Options
You can enable or disable comparison options from the right-hand panel under the Analyze
tab.
Forecast Function
The forecasting function returns a predicted value(s) in the chart and data table using a best
fit linear regression.
By default, the forecast switch is OFF.
Note:
● The time period can be forecast up to +5 periods.
● The calculation is based on the Excel FORECAST function: https://support.office.com/en-us/article/FORECAST-function-50ca49c9-7b40-4892-94e4-7ad38bbeda99
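The FORECAST calculation referenced above is an ordinary least-squares linear fit. As an illustrative sketch (Python, with invented sample data; this is not SAP code), the predicted value for a future period could be computed like this:

```python
def forecast_linear(x, known_xs, known_ys):
    """Excel-style FORECAST: fit y = a + b*x by least squares over the
    known points, then return the predicted y for the given x."""
    n = len(known_xs)
    mean_x = sum(known_xs) / n
    mean_y = sum(known_ys) / n
    b = (sum((xi - mean_x) * (yi - mean_y)
             for xi, yi in zip(known_xs, known_ys))
         / sum((xi - mean_x) ** 2 for xi in known_xs))
    a = mean_y - b * mean_x
    return a + b * x

# Four historical periods of a metric, forecasting period 5
print(forecast_linear(5, [1, 2, 3, 4], [100.0, 102.0, 104.0, 106.0]))  # 108.0
```

The dotted projection line in the chart extends this kind of fit forward, up to five periods.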
Trendline
● A linear trendline can be overlaid in the chart to quickly visualize the trend of the data
values.
● By default, the trendline switch is Off.
To enable the trendline, toggle the switch On.
Annualized
● Metrics can be viewed by actual or annualized values.
● By default, Annualized is switched Off, meaning that the actual value for the period is
returned.
To view the annualized value, toggle the switch On.
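As a hedged illustration of what annualization means here (a simple linear scale-up; the exact formula Investigate applies is not documented in this handbook):

```python
def annualize(period_value, periods_per_year):
    """Scale a single-period value to a full-year equivalent.
    Simple linear annualization; invented example, not SAP code."""
    return period_value * periods_per_year

# 25 terminations in one quarter -> 100 on an annualized basis
print(annualize(25, 4))  # 100
```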
Recommended Views
From the right-hand panel, under the Analyze tab, recommended views are suggested to
visualize data in different ways. Views are suggested based on the underlying data query.
To manage tokens:
1. Drag and drop the Token to swap the axis position. For example, from X (Column) axis to Y
(Row) axis / from column to row. This updates the visualization and data table.
2. Toggle to display parent or child data. Click the down arrow to display the selected
dimension only (parent value). Click the left arrow to display the nodes directly under the
selected dimension (children or distance to 1). This is the default behavior when selecting
a dimension.
3. Click the X to remove the Token (metric or dimension) from the query.
Drilldown Capabilities
Drilldown capabilities add interactive drilling through visualizations by clicking the chart
component or the data table (column or row header) in Investigate.
1. Click the chart column bar or the table row header (for example, South) to quickly drill
down to the next level. This displays the next level in the hierarchy (for example, Alabama,
Delaware, Georgia, North Carolina, and so on).
Note:
Not all metrics have drill to detail enabled.
Click the + icon in the Custom Metrics or Custom Dimensions category in the right-hand
panel under the Data tab.
Custom Metrics
Users can create and save their own custom metrics for use within Investigate.
The user interface is clean and simple to use either by clicking measure and formula
operators in the dialog wizard or by using keyboard commands.
To create a custom Metric:
2. On the right pane, under the Data tab, choose the + (add) icon next to Custom Measures.
3. Choose the function you want to use to calculate the custom metric.
4. Define the calculation for the custom metric, and choose Next.
5. Enter the Name and set the visibility of the metric to private or public.
6. Select a unit for the custom metric and set the number of decimal places that the value of
your custom metric displays.
7. Select the relevant Measure Category and the Measure Sub Category to correctly align
the custom metric within the Metrics hierarchy.
8. Choose Create.
Custom Dimensions
Users can create and save their own custom dimensions in Investigate.
Custom dimensions can be used to analyze data that predefined dimensions don't
automatically track or can be especially useful where there is a requirement to view and/or
aggregate parts of analysis dimensions on an ongoing basis.
Note:
Private custom metrics and private custom dimensions are hidden by default. To
display these in the category list, you can toggle the Show private custom metrics
or Show private custom dimensions switch to ON.
1. If permissions are maintained using SAP SuccessFactors HCM Role-Based Permissions
(RBP), you can grant the new Custom Metric and Dimension Admin permission from
Admin Center > Manage Permission Roles > [Role] > Permission > Analytics permissions >
Custom Metric and Dimension Admin.
2. If permissions are maintained within Workforce Analytics, you can grant the new Custom
Metric and Dimension Admin permission from Admin > Roles Maintenance > [Role] >
Permissions, adding Custom Metric and Dimension Admin to the Action Restrictions group.
3. Click Collections to see previously collected insights and research. Collections are a new
and engaging way to export and share key research and insights. Collected insights can be
exported (PDF, PPT, Word, Excel) or emailed to initiate a conversation or quickly share
results.
4. Click Save and type a name for your investigation (if you do not want to use the default
name) or click Save As if you have previously saved the investigation and want to keep
multiple versions. Click Cancel to exit the current investigation and return to the
Investigate home page. You will be prompted to save your work with a different name (for
example: ‘copy’).
Metric Explanation
Internationalization in Investigate
Investigate supports internationalization, so it can easily be adapted to specific languages
and cultures.
If you select a language that reads from right to left, such as Arabic, a mirrored version of the
application is displayed.
Note:
In this exercise, you will create a custom query with Investigate. You will view the
Termination Rate for employees with less than 1 year organizational tenure for all
locations.
The numbered steps in this exercise refer to on-screen elements that are shown only in the live training environment.
Note:
You can change from displaying the child nodes of a dimension to the parent
node by clicking the upside-down caret (^).
LESSON SUMMARY
You should now be able to:
● Create investigations
Learning Assessment
1. Which type of measure combines other measures to produce outputs as percentages and
ratios?
Choose the correct answer.
X A Dimension Hierarchies
X D Result Measures
Lesson 1
Overview of WFAP Implementation Methodology
Lesson 2
Overview of Workforce Analytics on HANA
Lesson 3
Fundamentals of Analytics Cubes and Fact Tables
Lesson 4
Understanding the WFA on HANA Configuration
Lesson 5
Post OLAP Configuration Tasks
UNIT OBJECTIVES
LESSON OBJECTIVES
After completing this lesson, you will be able to:
● Outline the WFAP Implementation Methodology
For WFA projects, customer resources can be loosely defined as Technical Roles, Strategic
Roles, and Functional Roles.
Resource Key
The Executive Sponsor (ES) is the primary executive advocate for the human capital
measurement capability at the organization. The Executive Sponsor may not be involved in
the day-to-day project management over the course of the year, but their engagement and
sponsorship will be critical to the successful implementation of the portal and other tools
(such as reports) among the user population.
The Project Manager (PM) is the person within the organization who is responsible for the
ongoing progress of the company's human capital measurement capabilities. This person
will work closely with the SAP SuccessFactors Consultant and SAP SuccessFactors Project
Manager assigned to the project to ensure both strategic and operational goals are met.
The Functional/Technical Specialist (FS) (commonly an HR Analyst role) is the primary
resource for workforce data interpretation and will need to be familiar with the Human
Resource system, its data structures, and business rules. The role also includes other
activities such as data verification and user testing.
The Technical IT Resource (IT) is the primary resource for data extraction and transfer
activities.
For SAP or Partner resources, we generally define three roles. Each of these roles is carried
out by separate individuals or teams, as the required skills for each role are quite different.
The high-level responsibilities of the roles are:
Project Management:
● Managing Project Risks
● Managing Internal Team
● Ensuring Quality Gates are met
● Translating business and technical
● Support of Customer through the process
Technical Stream:
● Data Extraction Verification
● Creation of Data Staging Framework
● Resolution of application issues
● Configure data refresh
Functional Stream:
● Specification Generation
● Customer Data Verification/Auditing
● Application Verification and Knowledge Transfer
Note:
The technical consultant role is performed by an assigned SAP Technical
Consultant for partner implementations of traditional WFA. For WFA on HANA,
partners can perform those tasks. Please review the 'SAP SuccessFactors WFAP
SAP Technical Tasks and Effort' document on PartnerEdge.
ACTIVATE is the implementation methodology used by consultants to help customers get the
most out of their SAP SuccessFactors applications in the shortest amount of time. First, the
consultants establish the framework for a successful project, then design the solution and
develop a detailed project approach. They then proceed to configure the solution to enable
each customer to achieve their strategic business objectives. Once the configuration is
complete, they perform thorough testing, complete the training, and execute the rollout strategy.
SAP SuccessFactors will implement each metric pack independently, starting with Core
Workforce and Mobility. The following is an overview of the implementation methodology
phases, objectives and deliverables.
The approximate timeline for a traditional WFA project is 100 days.
The implementation of WFA on HANA can be shorter for a couple of reasons:
● Currently WFA on HANA uses only SF data replicated into the analytical cube. Therefore,
the creation of extraction scripts and testing of those scripts is not required (Data Audit).
● Changes to the HANA analytics cube can be applied iteratively by the consultant and
tested individually, instead of the consultant applying all required configuration at once.
● Multiple metrics packs can be initially applied at once. However, each metrics pack must
still go through individual configuration, modification, and testing.
Please note that just because an implementation uses WFA on SAP HANA, it will not
necessarily be faster than the approximately 100 days for traditional WFA. Often, the
bottleneck for completion of the project is the pace at which the customer can complete their
required steps. This can be the case for either traditional or WFA on HANA projects.
Project Timeframes
Project Management Summaries are created from a blank template for each new customer.
During the pre-sales process a summary may have already been generated to give sample
timeframes.
Summaries should be updated on a regular basis, or when a milestone is achieved.
Customers are not expected to manage these summaries, and the summary is often provided
as a PDF to remove edit functionality.
Metrics Packs
Each metrics pack focuses on a logical group of data items that support the generation of a
related set of measures and reporting breakdowns. Each metrics pack is based on a highly
structured framework that:
● Defines a standard set of core metrics which includes a high percentage of commonly
selected key performance indicators (KPIs).
● Defines a standard set of core business dimensions/hierarchies to support further
analysis/breakdown of measure values.
● Maps requirements to a source set of core data items.
● Includes templated extract programs and scripts for common computer systems.
● Includes common business logic which can be customized for SAP SuccessFactors
Member-specific business rules.
● Includes a default gallery of reporting templates.
This modular approach facilitates the sourcing of data and the calculation of measure values
by significantly reducing both the time to implement and the cost of providing a
comprehensive reporting solution. The primary output is a data warehouse ready to support
the human capital management (HCM) reporting and analysis agenda of the SAP
SuccessFactors Member organization.
Traditional WFA applies the foundational metrics pack first, then additional metrics packs can
be added individually. Each additional metrics pack is its own implementation project. The
foundation metrics pack is referred to as Core Workforce and Mobility (CWM). It is applicable
whether the HRIS source is SF Employee Central or another external HRIS (SAP, PeopleSoft,
etc.). It is also applicable for both WFA only customers and WFA & WFP customers.
Strategic Workforce Planning (WFP) is NOT available without WFA.
WFA on HANA implementation can consist of only Core Workforce and Mobility metrics pack,
or a collection of metrics packs including CWM initially loaded into the system as a single
configuration. The list of supported metrics packs is available on the SAP Help Portal.
Note:
Not all metrics packs are supported on WFA on HANA. Please review the section
later in the course.
For each Metric Pack there is a defined set of source data fields which are extracted from a
customer's human resource information system (HRIS) and other business systems. These
data items are processed by SAP SuccessFactors Data Transformation, using predefined
formulae and the customer's business logic, to generate Base Input Measures and Dimension
Hierarchies. These two components form the basic building blocks of any business
intelligence solution.
Base Input Measures: The building blocks that enable generation of the WFA content.
● They cover basic counts like Headcount, FTE, Hires, Terminations and Internal Movements
Structural Dimensions:
● Typically used to allow breakdowns by the organizational, geographical or similar types of
reporting structures.
● They support tree security. Administrators can limit access to only certain areas of the
hierarchy for different users.
Analysis Dimensions:
● Allow slicing and filtering of measures by a number of analysis criteria: Demographic,
Job-based, Tenure etc.
● They support Dimension Restriction. Administrators can remove access to the dimension
for different users. Access is granted to the entire hierarchy of that dimension.
All Measures (Base Input, Derived Input and Result Measures) and Dimension Hierarchies are
available to support an SAP SuccessFactors Member’s reporting and analytics agenda.
Workforce Planning
● The FTE measure set included within the Core Workforce and Mobility metric pack
supports the Workforce Planning application.
● Core Workforce and Mobility includes additional measures that allow organizations to
forecast using either headcount or FTE metrics.
● Inclusion of Critical Job Roles as an Analysis Option allows workforce planning for a
reduced subset of the workforce that may have a significant impact on the organization.
Note:
Operational headcount planning (or Headcount Planning) does NOT rely on
WFA’s Core Workforce and Mobility data. It works with SAP SuccessFactors
Employee Central and Position Management data. Therefore, Headcount
Planning does NOT require an implementation process as described in this
course. After a customer meets all the prerequisites for Headcount Planning, the
product is enabled via a checkbox in Provisioning. It does NOT require WFA
configuration.
Additional Structures
Organizations can customize their reporting solution according to their reporting
requirements. Additional dimensions, hierarchies, time periods and special time categories
are supported through customization.
Historically, WFA has been implemented on Microsoft SQL Server, but the move to HANA
results in a number of changes to the implementation process. In addition to Core Workforce
and Mobility, the following metrics packs are available:
● Performance Management
● Compensation Management
● Recruitment Management
● Succession Management
● Goals Management
● Fieldglass Contingent Workforce
● Employee Relations
● HR Delivery
Additional Metrics Packs are in development and planned for future releases.
Stage I: Prepare
Note:
This lesson presents all the potential steps of the implementation process for
traditional WFAP (on Microsoft SQL Server). WFA on HANA affects certain steps
and the potential timeline of a WFA implementation project. Not all steps are
required for WFA on HANA. Where possible, the differences in implementation
projects have been noted at the different phases.
The prepare phase is primarily handled by the project managers. Project planning is the focus
of this phase; however, it is beneficial to understand the purpose of the data questionnaire.
The following topics will be covered in this section:
1. Implementation Documents
2. Welcome Call
3. Project Summary
Welcome Call
The Welcome Call typically occurs during the first two weeks of the SAP SuccessFactors
subscription. The objective of the Welcome Call is to define the customer’s goals and
expectations. Key timeframes, deliverables and communications plans will also be
discussed.
The SAP SuccessFactors project team is happy to include any agenda items the customer
would like; however, the following provides an outline of the standard Welcome Call agenda:
● Introductions, discussion of personal background / position / experience
● General discussion (identifying key decision makers, key project managers, client’s hot
HR topics, client’s critical success factors for the SAP SuccessFactors membership)
● Key project outcomes
● Schedule key events (e.g. periodic calls / meeting, existing reports to be supplanted)
● Overview of the Implementation Process and next steps
Implementation Documents
At the Welcome Call, clients meet the people they will be working with and begin discussing
how the project will be conducted. After the Welcome Call, you should begin providing the
client with documents to review on their own time and to complete, in order to support the
implementation.
1. The first document should be a Project Summary. This is an overview document of the
implementation process. It describes the steps from the beginning through to the
production site. The project summary helps the customer to understand the process and
its milestones. In some instances, you will walk through that document in some detail
during the Welcome Call.
2. The second document is the Data Questionnaire. The Data Questionnaire has several
questions that the client has to respond to.
3. The third document is the Metric Pack Document. If the customer does not already have a
copy, it is important that they receive the document at the very beginning so they
understand what the standard metrics and dimensions for the metric pack are.
Data Questionnaire
Following the Welcome Call the customer is requested to complete the Data Questionnaire.
The purpose of the Data Questionnaire (DQ) document is to gather information to enable the
development of a data specification document that will form the framework around which the
customer’s site will be built. A separate questionnaire will be provided for each of the metric
packs to be implemented on the site. The questions throughout the document will focus on
detailed information about data sourcing and specific rules that may be applicable for various
topics and data elements, as well as general information about HRIS and reporting
requirements. Each data source provided will contribute towards populating different parts of
the WFA portal. The information provided will be used to create a customer-specific data
specification document.
Note:
Data questionnaires also exist for any additional metric packs that a customer
purchases, and are used in the same way. They can be obtained on the
PartnerEdge site.
Figure 201: Interaction of the Data Questionnaire and the Specification Template
The Data Questionnaire (DQ) is used to build the Specification Document; it asks the
customer a series of questions about their HRIS and its setup. The answers to these
questions, in combination with the template specification document, create a customer-
specific technical blueprint for the portal build.
If necessary, a follow up Technical Call will be scheduled to discuss the responses provided in
the questionnaire and, where needed, request clarification and additional information.
The following provides an outline of the standard Technical Call agenda:
● Team Introductions
● Overview of the Implementation process (recap if required)
● Detailed review of the Data Questionnaire
● Next steps, key dates and timeframes
Traditional WFA
For traditional WFA (WFA on SQL server), it is customary for the project team to have a
demonstration site so the customer can see examples of how decisions made during the
alignment of the specification document affect the outcome of the project. Many consultants
already have a demonstration instance that will suffice for the project. If a new demonstration
instance is needed, then it can be requested from the SAP SuccessFactors HCM Cloud
Operations Portal. The demonstration instance is NOT modified to the customer’s
configuration, it only serves to visualize the outcome of certain decisions. Customizations will
be applied and tested on a Beta site that is created later in the project.
WFA on HANA
WFA on HANA takes a different approach. Early in the project the technical consultant can
publish an alpha site. As customizations are requested, they can be applied directly to the
alpha site and viewed as the changes occur. The initial alpha site can be configured quickly by
applying a standard template, then customization can occur throughout the project as part of
the realize phase. The alpha site can be set up by following the ‘WFA on HANA Template
Implementation Guide’ from the WFA on HANA implementation kit.
Based on the responses received to the Data Questionnaire, you will create a document called
the Specification Document. That specification document becomes the blueprint for the
customer implementation containing all of the business logic and source extract data
locations. As a partner, you will need to be prepared to walk a customer through what is
included and not included in the specification/metric pack, compared to initial requirements.
It’s a living document; it will have different versions over the course of the project. The first
version you create is based on that questionnaire. It is important to make it as accurate and
complete as possible because it is the document also that the technical consultant team will
use to develop the staging, the framework, and the publication of the site.
After the initial specification document is created, the questionnaire is no longer needed.
It is important that the customer understands and reviews the specification document
carefully. Identification of customization outside of the Metric Pack offering is critical during
specification design. Once signed off, changes to the specification will result in project
delays and additional development days.
The review of the initial Specification document is handled as part of the Kickoff Meeting.
Framework: The specification document matches the metric pack; it controls the
configuration of the solution and contains all the sourcing, the data extracts, and any
business logic that is specific to the customer. It also maintains data standards for the
customer base, which is important for benchmarking.
The specification documents are usually in Excel and they have several worksheets or tabs,
and each tab has a different type of information.
Note:
Configuring the Specification document is covered in detail in Course THR89 WFA
Academy for Functional Consultants
Note:
The entire Explore phase (Data Acquisition and Data Audit) is not required for a WFA on
HANA implementation, as it uses replication instead of the extraction and transfer
methods described here. Additionally, certain steps are not required when the
data is SAP SuccessFactors data for traditional WFAP implementations.
Now that the data has been identified, we will develop a plan for extracting the data from the
customer’s existing systems, transferring the data to the SAP Technical team (if required)
and the initial steps taken for verification of the data.
The tools and processes used in the data extraction phase:
Note:
Data transfer methods do not need to be considered for SAP SuccessFactors data
sources.
SAP SuccessFactors supports the following data transfer methods: SFTP, FTPS, and FTP +
Encryption. Once a client is in Production status, data delivered via one of these methods to
a specific SAP SuccessFactors site is automatically downloaded, put through
predetermined verification processes, and queued for processing.
Alternatively, SAP SuccessFactors can also download data from a client’s website. This
method is not recommended because it does not take advantage of the automated data
retrieval processes which are in place for the other listed methods and can result in delays in
processing the regular refreshes.
Encryption Methods:
The data delivered to SAP SuccessFactors should be encrypted. Our recommendation is to
use PGP with a public encryption key provided by SAP SuccessFactors.
Discussions should be held early in the project to identify the most appropriate Data Delivery
and Encryption method for each client.
Data Extraction
Note:
Data extraction scripts are not required for WFA on HANA data. Currently, WFA on
HANA only supports SAP SuccessFactors data sources. Customers do not need
to create data extraction scripts for SAP SuccessFactors data sources.
For external data sources, SAP SuccessFactors provides templated extract scripts and can
provide generic table column listings to help customers generate their own scripts. In some
cases, these standard scripts will require modifications to satisfy specific system
requirements. In these instances, the customer’s team is requested to modify the script to
suit the need.
Note:
We do not provide technical assistance beyond the stock templates. As a
software provider, SAP SuccessFactors does not have in-depth data extraction
knowledge for all systems.
Output Formats
The SAP SuccessFactors standard is to receive TAB, Pipe (|) or comma separated text (.TXT
or .CSV) files.
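As a minimal sketch, a pipe-delimited extract in this standard can be produced with Python's csv module. The column names and rows here are hypothetical, illustrative only, and not the actual WFA extract layout:

```python
import csv
import io

# Hypothetical extract rows -- field names are illustrative, not an
# actual WFA extract specification.
rows = [
    {"employee_id": "1001", "event": "HIRE", "effective_date": "2024-01-15"},
    {"employee_id": "1002", "event": "TERMINATION", "effective_date": "2024-02-29"},
]

# Write a pipe-delimited .TXT/.CSV-style extract (in memory here for brevity;
# a real extract would be written to a file and delivered via SFTP).
buf = io.StringIO()
writer = csv.DictWriter(
    buf,
    fieldnames=["employee_id", "event", "effective_date"],
    delimiter="|",
    lineterminator="\n",
)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue(), end="")
```

Swapping `delimiter="|"` for `"\t"` or `","` produces the TAB or comma-separated variants.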
To give a customer confidence in the published figures on the portal, every implementation
requires a set of customer expected results reports. These reports are used at a later stage of
the project, after the framework has been built.
The expected results report request during implementation relates to the following
verification items:
● End of Period Headcount
● Hires
● Terminations
● Internal Movements (Recommended)
- Transfers
- Promotions
- Demotions
● Analysis Options (used for verification support)
It is important that expected results reports are run the same day that the extract scripts are
executed.
For adequate sampling, SAP SuccessFactors will compare three separate quarters of data.
A document to aid the customer in completing verification reports is available on
PartnerEdge.
Note:
The expected results reports should be based on the same data used for the
extracts.
Expected Results reports were formerly called ‘verification reports’ and still may
appear in some documentation.
Handover
Once the following requirements are met, the project can be handed over to the Technical
Consultant team:
● The Specification document has been accepted and signed-off.
● The Extract Scripts have been developed and tested (if required).
● The initial data extracts (if required) and the expected results reports are created.
At this point all the documentation that is relevant to the project should be made available to
the technical team by an agreed upon mechanism. Management of documentation should
include:
● The Completed Questionnaire and any additional supporting documentation (e.g. lists of
codes and descriptions, table descriptions)
● The signed-off Specification version and all the prior versions
● Any sample data that may be useful
The technical team will then proceed to start the next phase of the project.
Data Audit
Note:
Data audit is not applicable for WFA on HANA data.
Audit Findings
The Data Audit is completed by the SAP technical team for external data sources. The data is
audited against the specification document to ensure that the necessary tables, fields and
business logic have been captured and the data received is as expected and contains the
values identified. A ‘Data Audit Issues Log’ is created and provided to the functional lead for
review, with a request to provide responses to the issues raised.
Partners need to work with the customer to answer questions logged during audit, before the
technical build can commence.
It is common that as a result of the Data Audit Issues reported, the specification document
may need to be updated. The responsibility for these updates should be agreed upon between
the Implementation Functional Lead and the SAP technical lead.
Once all the Issues identified have been addressed and any changes to the specification
completed, the technical team is ready to start the development of the Staging Framework
and the processing of the data.
The staging framework is based on a standard SAP SuccessFactors model where the
appropriate tools and steps of this framework will be tailored to capture and interpret the
specific logic applicable to the customer as defined in the Data Specification Document.
Figure 211: Realize Phase Data Staging Framework and Verification Overview
Note:
Data staging and discrepancy reporting for WFA on HANA implementations are
covered in more detail in course THR96: WFA Academy for Technical Consultants.
Currently Discrepancy Reports and Structural Dimension Pivots are not part of a
WFA on HANA project. For WFA on HANA, this phase captures and applies the
customer-specific configuration in the framework.
Data staging for traditional WFA builds can take up to three weeks.
During this time the technical consultant team will create the staging framework, process the
data received and create the Verification Discrepancy Reports and the Structural Hierarchies
Pivot Tables.
A verification process is followed to ensure WFA is interpreting the data in the same manner
as the customer interprets the same data. Before SAP SuccessFactors can publish a beta
site, the customer must be comfortable with the data that will be displayed. The outcome of
the discrepancy verification process will be the customer acceptance to publish a site.
Once the data has been staged and processed, the technical team will generate reports and
compare the results with the verification reports received from the client. Discrepancies
found between these reports will be recorded and reported in a series of Discrepancy Reports
that will contain details of the apparent reasons for the discrepancies found. As an example,
the technical team can identify the employees that the customer shows as part of headcount
but WFA does not, and the employees that WFA identifies in headcount but the customer
does not.
The technical implementation lead will deliver these discrepancy reports. The client with the
support of the functional lead will be requested to investigate the issues raised and provide
feedback.
Discrepancy Report
Discrepancy reports identify differences in the customer generated expected results reports
and the SAP SuccessFactors data after the customer extract has been loaded into the WFA
framework.
At the top of the report, WFA includes 94 employees in its counts that are not included in the
customer’s counts. The other two discrepancies are for rehires without terminations.
Along the bottom, we have found employees in the customer’s report that are not in the WFA
report.
Continuing the example, 12 have been identified. Four have termination records with no
rehire since, so those employees should not be counted. The last 7 could not be associated
with a record in the job table. For WFA, those individuals are not considered employees.
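The comparison behind such a report can be sketched as two set differences. The employee IDs below are made up; the real discrepancy reports are produced by the SAP technical team after the extract is loaded into the WFA framework:

```python
# Hypothetical employee-ID sets from the two sides of the verification.
wfa_headcount = {"E01", "E02", "E03", "E05"}
customer_headcount = {"E01", "E02", "E04"}

# Employees WFA counts that the customer does not (top of the report).
wfa_only = sorted(wfa_headcount - customer_headcount)
# Employees the customer counts that WFA does not (bottom of the report).
customer_only = sorted(customer_headcount - wfa_headcount)

print(wfa_only)
print(customer_only)
```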
Detail Section
For each section, there will be one worksheet giving the details for each type of discrepancy.
For each of those people there is a comment. Having the employee ID and this level of detail
simplifies the process for the customer to investigate each of these people and determine
the issue.
In some cases customers need to recreate expected results reports to facilitate the
verification process. One common reason is a timing issue between the creation of the
expected results report and the data extract. Another common issue is when someone
terminates on the last day of the month: WFA does not include that person in counts,
whereas a customer might count them. The verification process helps determine whether you
have the correct logic and the correct data, or whether you need to adjust something in the
extraction, the logic, or the customer reporting. Needing to change something is common.
The same process is done for all the expected results reports we have received.
Figure 214: Realize Phase Beta Site Publication and Testing Overview
Note:
Currently the plan is for WFA on HANA implementations to have a beta site available
early in the project (referred to as an alpha site), with customizations applied
periodically throughout the data staging process. Traditional WFAP will have a beta
site publication after all the customization and configuration have been captured.
It is recommended that the project manager and/or functional lead review the beta site
before the client is told that the site is available. They should review the site to see if there
are any glaring issues that can be resolved before the client sees it for the first time.
Once the beta site has been published, the client needs to receive site navigation training and
review the site in detail. Following this, there is a joint effort between the functional lead and
the client to review the site as part of the User Acceptance Testing process.
For traditional WFAP clients, once written sign-off on beta acceptance is received from the
customer and verification activities are complete, the technical team will schedule the
publication of the data in the beta site to the Production site. A customer may decide to
provide a more up-to-date data extract before the first Production site publication. In this
case, the new data extract will be processed and published to the beta site, and the customer
will be asked to approve it before the beta site data is moved to the Production site.
For traditional WFAP clients, upon release the client will have two sites: a pre-production
site and a production site. The reason for these two sites is to support the regular
refresh cycle, for example a refresh on a monthly basis. When the SAP technical team
processes the data they will only publish to the pre-production site. The production site
remains as it is, so as not to affect current use.
Upon publishing to the pre-production site, the SAP technical team will communicate to the
client that the data is updated; the client can then review it and give the team the authority to
move it to production. Once the update is signed off, the team will replicate what is on the
pre-production site into production.
For WFA on HANA clients, once written sign-off on alpha/beta acceptance is received from
the customer and verification activities are complete, the technical consultant can copy the
configuration and build the initial data cube on the production instance.
Data updates for customers on WFA on HANA are automated and do not follow the
same refresh cycle. Typically, the data is refreshed daily. The data is replicated directly from
the SF transactional data, so the approval process required for the periodic loading of data
on WFA on SQL Server does not apply. Also, for WFA on HANA, while a new analytics
cube is being updated, the WFA site will continue to function with the previous data. If the
cube update/build fails, the WFA site is still accessible to WFA users with the last successful
cube build.
The project team should spend one to two days onsite with key users from the customer to
conduct training on administration and usage of WFA. Knowledge transfer to key users is
critical to the success of a WFA project.
Note:
For information on the roles and administration framework, review the HR886:
Workforce Analytics Administration Guide and supporting materials on the SAP
Help Portal.
During the final stage of a traditional WFAP implementation, the technical team will configure
the customer’s existing SAP SuccessFactors (BizX) environment to connect to the tested
beta site for go live. This configuration is completed in provisioning by configuring Single Sign
On (SSO) between the environments. If you need to perform this configuration, for example
connecting a demonstration SAP SuccessFactors environment to an existing WFAP
environment, you should follow the instructions provided in the document SAP
SuccessFactors HXM Suite to Analytics Single Sign-On available on the SAP Help Portal.
SuccessFactors provides the common configuration for the data center with the WFA URL
set.
Note:
If the customer already has the analysis environment enabled, for example for
Canvas Report in Report Center or Analytics for EC, then check with the technical
team or support before making any configuration changes.
The WFA refresh is a task to update the WFA instance with the new information supplied by
the extract file uploaded to the SAP SuccessFactors WFA SFTP site.
Traditional WFA is based on extracted files that are provided by the customer from a different
HRIS, or extracted from the customer's SAP SuccessFactors environment by an
internal script. The SAP SuccessFactors internal Professional Services Information Services
Team (PS-IS) imports the WFA data once the files are provided or received on the WFA SFTP
site.
Terms used in the flowchart:
● ADR: Automated Data Retrieval
● PS-IS: Professional Services Information Services Team - This is the team that processes
the WFA monthly refresh.
● Variance Report: Report produced by PS-IS to highlight variances in key input measures
across multiple time periods (End of period headcount, Terminations, Hires). This report
can indicate potential data issues for the refresh.
● Missing Codes Report: Customers typically update dimension values, for example Job
Type, Recruitment Source. These new values may begin to appear in the employee data
however the dimension hierarchy may not have been updated by the customer. This report
highlights these 'unallocated' codes to allow the customer to update their HRIS in
preparation for the next refresh.
● Preview (now referred to as Pre-Production): Staging instance for data validation by
customer.
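The logic behind the Missing Codes Report can be sketched as a set difference between codes seen in the employee data and codes present in the dimension hierarchy. The code values below are hypothetical:

```python
# Hypothetical Job Type codes appearing in the employee data versus codes
# currently allocated in the dimension hierarchy.
codes_in_employee_data = {"FT", "PT", "CONTRACT", "INTERN"}
codes_in_dimension_hierarchy = {"FT", "PT", "CONTRACT"}

# Codes that would be flagged as 'unallocated' in the report.
unallocated = sorted(codes_in_employee_data - codes_in_dimension_hierarchy)
print(unallocated)
```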
LESSON SUMMARY
You should now be able to:
● Outline the WFAP Implementation Methodology
LESSON OBJECTIVES
After completing this lesson, you will be able to:
● Provide an overview of Workforce Analytics on HANA
WFA on HANA
Workforce Analytics (WFA) on HANA provides a framework for creating and delivering
analytics, including partner enablement tools and customer tools. This document will guide
you to each of the steps required to implement WFA on HANA, using the templates and
documents contained in the Implementation Kit.
Implementing WFA on HANA prerequisites
1. Begin with an instance that has Employee Central enabled.
• WFA on HANA uses data from Employee Central. Ensure your instance has EC enabled,
configured and some EC data loaded.
2. Confirm instance has Advanced Reporting enabled.
• If the instance does not have Advanced Reporting enabled, please refer to the Employee
Central Advanced Reporting Implementationguide in SAP Help.
4. Confirm your user is appropriately permissioned to access WFA on HANA Data Factory
● This permission will allow you to access the WFA on HANA configuration screens.
● Navigate to the Analytics > Admin menu and confirm that the WFA on HANA Data Factory
is available.
Note:
Pre-created templates are the starting point for a typical configuration of WFA on
HANA. You should not expect all metrics and dimensions to work after uploading
the template. Manual configuration, adjustment, and troubleshooting is required
for any pre-created templates.
1. Load an implementation template from the Implementation Kit.
2. Update the Data Specification document with the code mappings used in your instance.
• A template for the specification is found in the Implementation Kit; complete the customer
details in the specification.
• Update the Code Mappings tab with how each of the instance codes was mapped into its
relevant dimension in Dimension Editor.
• The Data Specification will become a living document shared between you and the customer
and will give the customer insight into which tables, columns and business logic have been
used in their WFA on HANA implementation.
3. Perform validation
• A guide to validation (WFA on HANA – Validation Guide) is found in the Implementation Kit.
4. Configure Drill to Detail Display
• A guide to configuring Drill to Detail display (WFA on HANA Drill to Detail Administration) is
found in the Implementation Kit.
After verifying the prerequisites, the next step is to load an implementation template. The
depth of the configuration applied will vary by the supplied template. Typically, there is a Core
Workforce and Mobility metrics pack template for metrics that are standard across all
regions. There may be more templates available on the implementation forum that apply
additional metrics packs / metrics.
After the basic CWM template is loaded and validated, the configuration can be used as a beta
site as you apply customer customizations to the standard configuration.
This course will focus on understanding the configuration of WFA on HANA, therefore it will
utilize the WFA on HANA – Manual Implementation Guide.
It is recommended that you read the WFA on HANA – Template Implementation Guide to
understand the required steps for implementing the baseline configuration.
1. Log in with an account that has access to the WFA on HANA Data Factory tool.
2. Navigate to the WFA portal.
Note:
If you do not see this option, your user does not have permission to access the
tool. Please raise a Support case to have this permission assigned.
LESSON SUMMARY
You should now be able to:
● Provide an overview of Workforce Analytics on HANA
LESSON OBJECTIVES
After completing this lesson, you will be able to:
● Explain the fundamentals of Analytics Cubes and Fact Tables
An OLAP (online analytical processing) cube is a data structure that allows fast analysis of
data. It structures data according to multiple dimensions. A multi-dimensional cube for
reporting on employees might, for example, be composed of six dimensions: Employee,
Time, Department, Company, Division, and Location.
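As a minimal sketch of the idea, fact rows can be pre-aggregated along dimensions at build time so that report queries become simple lookups. The employees, departments and locations here are invented, not WFA's actual cube structures:

```python
from collections import defaultdict

# Hypothetical fact rows: (employee, department, location, headcount).
facts = [
    ("Ava",  "Sales",   "Berlin", 1),
    ("Ben",  "Sales",   "Paris",  1),
    ("Caro", "Finance", "Berlin", 1),
]

# Aggregate along the Department and Location dimensions, as a cube
# would at build time rather than at report runtime.
by_dept_loc = defaultdict(int)
for _, dept, loc, headcount in facts:
    by_dept_loc[(dept, loc)] += headcount

# Cell lookup: headcount for Sales in Berlin.
print(by_dept_loc[("Sales", "Berlin")])
# Roll-up across the Location dimension: total Sales headcount.
print(sum(v for (dept, _), v in by_dept_loc.items() if dept == "Sales"))
```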
Building a Cube
Employee Central provides a rich navigation experience, allowing users to see at a glance all
information relating to an individual employee. We can also see employee history. For
example we can view prior positions, managers, job details and other information back to
their hire date.
The Employee Central information is stored in a variety of tables. Reporting tools like the
table report builder and Advanced Reporting query these tables. Table reports and Advanced
Reporting produce lists and pivots of data extracted from Employee Central with little to no
transformation.
Transforming data is what allows us to move from Reporting into Analytics. Transforming
allows us to:
• Allocate and distribute employees across a span of time
• Make hierarchical relationships and allocate employees to those hierarchies
• Aggregate and collate data at the time of transformation, providing faster performance at
report runtime
If we look at an employee's Organizational Info and Compensation Info, the current and historical data for each of these blocks is stored in separate tables. Employee Central uses Effective Start Dates and Effective End Dates. These fields provide the history of changes and when they occurred for the employee.
However, the start and end dates apply only to the individual table. If an employee transfers departments (stored in the organizational info table) without a pay change (stored in the compensation info table), then a new record (row) is entered only in the organizational info table; the compensation table is unchanged.
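To make the effective-dating idea concrete, here is a hedged sketch of how a single effective-dated table is read as of a given date. The table and column names are illustrative, not the actual Employee Central schema:

```sql
-- Hypothetical, simplified query: find the department that was effective
-- for one employee on a given date. Each row is valid from its Effective
-- Start Date through its Effective End Date.
SELECT DEPARTMENT
FROM   EMP_JOB_INFO
WHERE  PERSON_ID            = 'HB001'
  AND  EFFECTIVE_START_DATE <= '2015-06-30'
  AND  EFFECTIVE_END_DATE   >= '2015-06-30';
```

A department transfer inserts a new row into this table only; a compensation table read the same way would still return the unchanged pay record.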
Time Splicing
For the cube, the value of each applicable field (dimension) needs to be captured at each change for the employee. Building the complete history of the employee is called time splicing. In
the example, Hella Buhr is:
• Hired on 6/5/2005
• Transferred to a new department on 2/2/2015
• Has a compensation change on 8/19/2016
WFA on HANA takes all the records and aligns them into a single time sequence for Hella
Buhr.
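Time splicing can be sketched in SQL along the following lines. This is a hedged illustration only; the table and column names are hypothetical, and the real build is performed by the WFA on HANA engine:

```sql
-- Collect every change date for one employee from both effective-dated
-- tables, then read each table as of each change date. The result is one
-- spliced row per change: 2005-06-05 (hire), 2015-02-02 (transfer),
-- 2016-08-19 (compensation change).
WITH CHANGE_DATES AS (
    SELECT EFFECTIVE_START_DATE AS SPLICE_DATE
    FROM   EMP_JOB_INFO  WHERE PERSON_ID = 'HB001'
    UNION
    SELECT EFFECTIVE_START_DATE
    FROM   EMP_COMP_INFO WHERE PERSON_ID = 'HB001'
)
SELECT d.SPLICE_DATE, j.DEPARTMENT, c.PAY_RATE
FROM   CHANGE_DATES d
LEFT JOIN EMP_JOB_INFO j
       ON j.PERSON_ID = 'HB001'
      AND d.SPLICE_DATE BETWEEN j.EFFECTIVE_START_DATE AND j.EFFECTIVE_END_DATE
LEFT JOIN EMP_COMP_INFO c
       ON c.PERSON_ID = 'HB001'
      AND d.SPLICE_DATE BETWEEN c.EFFECTIVE_START_DATE AND c.EFFECTIVE_END_DATE
ORDER BY d.SPLICE_DATE;
```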
These time-spliced records become the WFA on HANA Data Warehouse, built specifically to optimize Analytics. This warehouse is in the form of a star schema, with the employee represented in key/main Fact tables (shown here in yellow) linked to secondary tables containing the dimensional/hierarchical data. The WFA on HANA Data Warehouse then
forms the basis for a “cube” where all the employee data is collated and aggregated.
LESSON SUMMARY
You should now be able to:
● Explain the fundamentals of Analytics Cubes and Fact Tables
LESSON OBJECTIVES
After completing this lesson, you will be able to:
● Understand the WFA on HANA configuration
WFA on HANA allows scripts to be written in SQL (Structured Query Language). While it is valuable to understand SQL, it is not critical to using this guide, as the required syntax is included in the relevant sections.
The Tables and Columns function uses a snippet of SQL.
The Tables and Columns function allows you to enter a snippet of SQL for the [column] section of the SQL statement; this snippet is then automatically inserted into the full SQL statement that is required. For example: SELECT GENDER FROM PERSONAL_INFO.
The Dimensions function uses full SQL statements.
The Dimensions function allows you to enter full SQL statements.
Note:
HA150 - SQL and SQL Script Basics for SAP HANA is a recommended prerequisite
for this course.
There are two ID columns in employee data that are used to identify which records belong to which employee: Users Sys ID and Person ID. The Emp Employment Info table has both of these employee ID columns and is used as the mapping between the two. Employees can have more than one record in this table, with the combination of Person ID + User Sys ID being the unique identifier for the employee and their employment assignment.
Person ID: The identifier for an individual. Each employee will only have one Person ID which
uniquely identifies that employee.
User Sys ID: The identifier for an individual’s employment assignment. Employees can have
more than one User Sys ID, typically uniquely identifying each of the employees’ Global
Assignments.
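As a hedged sketch, Emp Employment Info acts as the bridge between the two IDs. The physical column names below follow the descriptions above but are illustrative:

```sql
-- One PERSON_ID can map to several USERS_SYS_ID rows (one per employment
-- assignment); PERSON_ID + USERS_SYS_ID is the unique key.
SELECT e.PERSON_ID,
       e.USERS_SYS_ID,
       j.EVENT
FROM   EMP_EMPLOYMENT_INFO e
JOIN   EMP_JOB_INFO_T      j
    ON j.USERS_SYS_ID = e.USERS_SYS_ID;
```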
Note:
Global Assignments and Concurrent Employments are NOT currently supported
in WFA on HANA. Please see Workforce Analytics on SAP HANA FAQ in the SAP
SuccessFactors People Analytics Help Portal for more information.
Effective End Date can be configured in WFA on HANA in one of two ways:
• Using the “default” functionality: If a column is not specifically defined as an effective to date
(see next option) the default functionality will calculate effective end dates as next record
start date – one day.
• Setting the Special Use Type as Effective to Date (ToDate) on the Effective End Date
column: This option “forces” the use of the specified column value as the end date. This
option is used when the default next record start date – one day calculation is not
appropriate.
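The "default" rule can be expressed with a window function. The following is a hedged illustration of the calculation, not the engine's actual implementation; LEAD and ADD_DAYS are standard SAP HANA SQL, and the table name is illustrative:

```sql
-- Each record ends one day before the next record for the same employee
-- starts. The last record per employee has no following record, so the
-- derived end date is NULL (in practice, an open-ended date).
SELECT PERSON_ID,
       EFFECTIVE_START_DATE,
       ADD_DAYS(
           LEAD(EFFECTIVE_START_DATE)
               OVER (PARTITION BY PERSON_ID
                     ORDER BY EFFECTIVE_START_DATE),
           -1) AS DERIVED_EFFECTIVE_END_DATE
FROM   EMP_JOB_INFO_T;
```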
• Employee was tenured for <1 year (1st Jan 2012 >> 24th Dec 2012 (incl) = 359 days).
• Employee was tenured in the Support department for 6 months (1st Mar >> 31st Aug).
• Employee spent 3 months on Leave with Pay.
• Employee was tenured for 8 months before receiving a promotion.
• Employee had been in promoted role for <3 months before terminating.
Effective Dating is applied at a daily level. If edits to the employee result in more than one record on a single day, Transaction Sequencing is used to determine the most "current" (i.e., the last) record for an employee.
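Transaction Sequencing can be pictured as picking the highest sequence number per employee per day. The following is a hedged sketch with illustrative column names:

```sql
-- Keep only the last record for each employee on each effective date.
SELECT *
FROM (
    SELECT t.*,
           ROW_NUMBER() OVER (
               PARTITION BY PERSON_ID, EFFECTIVE_START_DATE
               ORDER BY EFFECTIVE_SEQUENCE DESC) AS RN
    FROM   EMP_JOB_INFO_T t
) ranked
WHERE RN = 1;
```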
To begin, we will create the structure of what will be included in WFA on HANA by flagging and
configuring the different pieces of data that make up the Employee Central data. This will
create the Base Input Measures and Analysis & Structural Dimensions that will then be used
by WFA on HANA to generate Derived Input and Result Measures.
To start, click on Tables and Columns in the Configuration section of the WFA on HANA Data
Factory Screen:
Active: Enabled
The WFA on HANA manual implementation guide walks through adding all the necessary
tables and columns for an EC WFA on HANA configuration. This course will focus on common
examples. For a full list of configurations, please refer to the guide.
When adding a column from a table, you will configure the Is Tenure, Rollup Type, and Special Use Type settings for the column.
Special Use Type:
WFA on HANA associates columns for special uses for certain WFA on HANA functionality.
The list contains:
None: no special use for the column
PrimaryPerson: identifies the column as the primary employee ID, or Person ID for EC. Used
to build the facts around an employee
FromDate: identifies the column that stores the start date of the record for effective dating,
usually the effective start date column
ToDate: This option “forces” the use of the specified column value as the effective end date.
Standard functionality (no effective to Date) will calculate effective end date as next record
start date minus one day
EffectiveSequence: identifies the column that stores the effective dating sequence number to
determine the order of changes that occur on the same date, usually the effective sequence
number column
SecondaryPerson: identifies the column as the secondary employee ID, or USER SYS ID for
EC. Used to build the facts around an employee
DOB: Identifies the column as date of birth for calculating age dimensions
HireDate: identifies the column for calculating the organizational tenure dimension
AssignmentType: identifies the column for assignment type, which stores if an employee has
a standard assignment or a global assignment
1. Open the Employee Central folder and click on the Emp Employment Info table. A list of
available columns from this table will appear to the right.
2. Select the following columns from the Emp Employment Info columns:
• Assignment Type
• Original Start Date
• Person ID
• Users Sys ID
Final table configuration with unused columns hidden for Emp Employment Info.
Rollup Type
Rollup Type controls how WFA on HANA handles a column's value when a record is spliced while building the fact table.
Normal: the value stays as is; it will be the same on both records.
SOP: the value is maintained only in the original record. This is useful when you want the value of the column to apply only on the effective start date of a record. For example, a movement should only be counted on the actual start date of the record. If the record is spliced, it should not count as a movement on the new records.
Prorata: used for numeric data types. When the record is split, the value is prorated across the old and new records.
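As a worked illustration of the three rollup types (all values hypothetical), suppose a record carrying a numeric value of 10 and covering 100 days is spliced into a 60-day piece and a 40-day piece:

```sql
--   Rollup    First (original) piece    Second (new) piece
--   Normal    10                        10
--   SOP       10                        0  (value kept on start only)
--   Prorata   10 * 60/100 = 6           10 * 40/100 = 4
```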
Is Tenure
Position tenure can be calculated from the date an employee begins in a job or position. The
standard is to calculate based upon a change in the Position ID in the Emp Job Info T table.
Calculated Columns
Some base measures and dimensions are not simple values in an existing column, but must
be created or calculated based upon logic. WFA on HANA can create calculated columns in
the fact table using SQL scripts. The manual implementation guide provides labels, data
types, and example scripts for the standard EC configuration.
7. Enter the SQL Script in the Editor. Remember the script represents one column in the
SELECT statement.
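As a hedged example of such a script, the expression below (modeled on the full/part-time logic used elsewhere in this course for the Regular Temp / Full Part Time dimension) forms exactly one column of the generated SELECT statement:

```sql
-- A single expression: derive a full/part-time flag from the FTE column.
CASE WHEN IFNULL(FTE, 0) = 0 THEN '??'   -- unallocated
     WHEN FTE = 1            THEN 'FT'   -- full-time
     ELSE 'PT'                           -- part-time
END
```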
8. Click Validate to ensure the formula has been entered correctly, and save.
The first row shows the front-end field name, and the second row shows the possible back-end field name. For example, for the Ethnic Group column in the Global Info for USA table, the back-end field is SF_VCHAR1.
For comparison, here is what the setting in Manage Business Configuration looks like for
Ethnic Group:
Note:
The above steps can be used for other modules as well, such as Succession, Recruiting, etc. However, not all fields in Table Reports will have a corresponding back-end field in the database. There are calculated fields specifically defined in the Table Report schema, and these fields will vary depending on the schema you use. For example, the Is Internal and compa ratio fields in the front end are calculated columns in a Table Report, but no back-end field exists in the database for these fields.
Some common complex scripts may have templates that can be added by dragging the
appropriate item from the common algorithms section of the Edit Formula tool. Currently the
generation measure is supplied via common algorithms. Others may be added later.
1. Remove/delete any Key Mappings that use "to be deleted" fact tables.
Note:
If you delete the fact table group without deleting the underlying fact tables first, this will cause issues building the cube.
It is recommended to keep any logic in the Workforce fact table related to other metrics packs, at least for the Table and Column and Lookup configuration, because there may be restricted calculated columns that use these as a source. You can delete any dimensions related to other metrics packs that are not used.
Lookup Tables
Most lookups are used only to change/replace internal codes into a label (or external code).
There are occasions however, where the label (or external code) is specifically required in the
processing of the data. Events and Event Reasons are an example of this, where, to identify
when an employee was hired, terminated or otherwise moved within the organization, the
specific Event or Event Reason external code is required. Lookups allow joining tables that do
not have a primary or secondary person ID.
General Steps to configure a lookup
1. Click on the Lookups tab in the top navigation bar and click Add Lookup
2. Select the appropriate table from the choose lookup table drop down and click OK. If the
table needed doesn’t appear, make sure you have enabled it in the tables and columns section
3. Click Add Join Column and choose the Lookup Column and the Source Column and click
OK
Lookup Example
1. Click on the Lookups tab in the top navigation bar and click Add Lookup.
2. Select the FO Event Reason T table from the drop down and click OK.
3. Click Add Join Column and choose Internal Code for the Lookup Column and Emp Job Info
T > Event Reason Filtered for the Source Column
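Conceptually, the lookup configured above behaves like the following join. This is a hedged sketch; the physical column names are illustrative:

```sql
-- Join FO Event Reason T to Emp Job Info T on the internal code so the
-- external code / label is available while processing the data.
SELECT j.USERS_SYS_ID,
       j.EVENT,
       r.EXTERNAL_CODE AS EVENT_REASON_CODE
FROM   EMP_JOB_INFO_T j
LEFT JOIN FO_EVENT_REASON_T r
       ON r.INTERNAL_CODE = j.EVENT_REASON_FILTERED;
```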
Create Events
For WFA on HANA, event lists define employee movement (Hire, Termination, Promotion, Transfer, Demotion, etc.) based upon codes. Refer to the Base Input Measures tab of the SF WFA on HANA Data Specification for the required codes used in the instance.
How Do Movements Work?
Step 2 is to map each code in the Recruitment Source and Separation Reason Dimensions.
Once all your events are captured in the system as movements, you will need to map each
code from unmapped grouping into other groupings in the dimension hierarchy. Recruitment
Source allows Movement In measures to break down further into categories, for example Hire
or Promotion (In). Separation Reason allows Movement Out measures to break down further
into categories, for example Voluntary Termination or Promotion (Out). How to configure
code mapping for dimensions is covered in the section Configuring the Hierarchies.
What if Step 2 is not configured properly? Although all events/actions have been captured in step 1, they do not know what kind of movements they should be reported as. If you do not map the corresponding codes under Promotion, Demotion, Transfer, etc., then you will not get results for those metrics, because the Data Factory does NOT know which events should be reported in the appropriate dimension nodes, for example Hire, Transfer, Promotion, or Voluntary Termination.
1. Navigate to the Events Lists tab on the top navigation bar and click Add.
2. Label this Event List.
3. Open the Employee Central > Emp Job Info T folders and drag and drop the Event column
onto the Event Code Columns bubble. This will populate the descriptions column.
4. Select the Events that correspond to the movement then click OK to save:
Lookup Example
1. Navigate to the Events Lists tab on the top navigation bar and click Add.
2. Label this Event List Hires.
3. Open the Employee Central > Emp Job Info T folders and drag and drop the Event column
onto the Event Code Columns bubble:
4. Select the Events that correspond to an employee’s Hire (here we have chosen Hire and
Rehire), then click OK to save:
Figure 253: Standard Event Lists for WFA on HANA for Employee Central
Create Conditions
In WFA on HANA, conditions identify when an event occurs. The next step is to create
conditions for the events.
1. Navigate to the Hires, Movements, Terms tab on the top navigation bar and click Add
Condition in the appropriate section.
2. Label this Condition.
3. Drag the event code column onto the Event Code Columns box.
4. Select the Condition Method and configure the resulting method:
5. Click OK to save.
Condition Methods
A condition can be identified by one of five methods:
Change in Value- compares values in the event code column, looking for any change in value.
Event List- compares the values in the event code column to the event lists created in the
previous section
Starts With- does pattern matching on the event code column
Increase in Value- compares values in the event code column, typically used for promotions
and demotions
Decrease in Value - compares values in the event code column, typically used for promotions
and demotions
Additionally, when comparing values you should check the Enable Table Filter option. This limits the comparison of records to the table that contains the event code column. It should always be checked.
Condition Example
The Internal Movements condition will look for events that match the Internal Movements event list configuration.
1. Navigate to the Hires, Movements, Terms tab on the top navigation bar and click Add
Movements Condition.
2. Label this Condition Internal Movement Events.
3. Drag the Event column from the Emp Job Info T table onto the Event Code Columns box.
4. Select Event List from the Choose Condition Method drop down and select Internal
Movements.
5. Click OK to save.
Create Calculations
Once the conditions have been created, then calculations are created to provide codes that
represent the different types of movements. Calculations are created for Hires, Movements,
and Terminations.
1. Navigate to the Hires, Movements, Terms tab on the top navigation bar and click Edit on the
condition.
2. Complete the If / Then /Else configuration for the Condition.
3. Click OK to save.
Note:
For Hire Event conditions, the date for the hire is usually the Effective Start Date
that is applicable at the Hire event. For System Uploads however, the Effective
Start Date is typically the date of the System Upload (not the hire). Checking the
Use Hire Date ensures that the actual Hire Date for the employee is used, not the
System Upload date.
Calculation Example
The hires calculation will output a concatenation of the Event and Event Reason codes when a Hire condition is met, or a dummy code when a System Upload is identified.
1. Navigate to the Hires, Movements, Terms tab on the top navigation bar and click Edit on the
Hires condition.
2. In the IF statement, pull in the Hire Events condition.
3. In the Move Condition statement, pull in Emp Job Info T > Event and FO Event Reason T >
Code columns.
4. Insert an underscore “_” to visually separate the Event from the Code by dragging in a
static character between Event and Code using the A button:
Note:
The red border on the Move Conditions is not an error and can be ignored.
9. Click Toggle Options and ensure the Use Hire Date option is checked for the System
Uploads condition:
10. OPTIONAL: If the employee's Hire Date is to be used for the Hire Event (rather than the default Effective Start Date applicable for the Hire Event), then also select Use Hire Date for the Hire Event condition.
Standard Conditions and Calculations for WFA on HANA for Employee Central
Figure 258: Standard Conditions and Calculations for WFA on HANA for Employee Central
1. On the Event List tab, configure a separate event list for each internal event, such as Promotion, Transfer, Downgrade, etc.
2. Create conditions for each internal event under the Movement section on the Hires, Movements, Terms tab, select the Movement Priority option, and prioritize the internal events with the most significant event at the top.
Note:
● If you want to report a Hire over an internal event, then you need to add another Hire condition (replicate the condition from the Hire section) in the Movement section. For example, if a Hire and a Promotion happen for an employee on the same day and you want to report the Hire over any internal event, then you need to create the hire condition again in the Movement section and place it at the top of the list.
● If you want to report the last event that occurred, then select the Last Event Occurred option. In this case, movement priority will not be triggered.
Describe Calculations
Note:
This function is only available to internal SAP admin users.
This example will use calculations to source and aggregate the necessary Pay Components to
create Annual Salary. Then the example will use the Annual Salary to determine the
appropriate node in the Salary Range.
This section will explain how Annual Salary is calculated and its corresponding dimension:
Salary Range.
Calculation Example
Default Currency Code: This calculated column configures the desired from/to currency. The 'from' currency should come from the "Currency" column in this table. The 'to' currency is what you would like to convert it to; it is not necessarily USD.
Base Salary: This calculated column configures where base salary is sourced from. It could be from one or multiple pay comp IDs, depending on your customer's setting. All we need to keep in mind here is that if a person has multiple pay comp IDs available for base salary, they will overwrite each other based on the sequence of the records. This should not matter, because one could be at a weekly frequency and the other at a monthly frequency; the total annual salary after factoring in frequency should reach a similar figure.
Target Bonus Amount: This calculated column configures the potential bonus that sits on top of salary. Similar to Base Salary, it is only able to track one bonus value. If there are multiple bonuses at different times, the latter one will overwrite the former.
InvertCurrencyConversionRate: This calculated column allows you to invert the conversion rate from the currency conversion table in case the rate is represented in a different way.
Data("[%EMP_PAYCOMP_RECURRING_T.#CALC_COL_5%]") /
CurrencyConversionRate
Else
'Don't invert currency conversion rate
Variables("CurrentBonusAmount") =
Data("[%EMP_PAYCOMP_RECURRING_T.#CALC_COL_5%]") *
CurrencyConversionRate
End If
End If
End If
Second, review some of the coding above. Here are a few key parts:
CurrentSalary variable: This variable calculates salary based on the data configured in:
a) "Base Salary"
b) "annualization factors"
c) conversion rate
d) "InvertCurrencyConversionRate"
This will give you the salary associated with each record. If the current record is a bonus record, the variable will end up storing 0 because it does not have "Base Salary".
CurrentBonusAmount variable: This variable calculates the bonus based on the data configured in:
a) "Target Bonus Amount"
b) conversion rate
c) "InvertCurrencyConversionRate"
Salary Range simply uses the value calculated in "annual salary" and converts it into discrete units. The ID produced from this formula is then used in the dimension Salary Range.
In the example, the highest-value node that would appear in the dimension would be 175,000+ (with an ID of 175). A new dimension node ID would be generated for every 1,000 values beneath that. If the annual salary value is empty or less than 0, the ID returned would be ??.
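The bucketing described above could be reconstructed as follows. This is a hedged sketch only; the actual formula lives in the Salary Range calculated column, and the table name is illustrative:

```sql
-- IDs step in 1,000-unit increments, capped at the 175,000+ node (ID 175),
-- with '??' returned for empty or negative annual salaries.
SELECT CASE
         WHEN ANNUAL_SALARY IS NULL OR ANNUAL_SALARY < 0 THEN '??'
         WHEN ANNUAL_SALARY >= 175000 THEN '175'
         ELSE CAST(FLOOR(ANNUAL_SALARY / 1000) AS VARCHAR(10))
       END AS SALARY_RANGE_ID
FROM   EMP_PAYCOMP_RECURRING_T;
```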
Configure Dimensions
Dimensions for analysis of the data need to be configured in WFA on HANA in accordance with the Hierarchy Options and Analysis Options tabs in the WFA on HANA – Data Specification.
To access the dimension editor, from the Dimensions tab in the top navigation bar, click Add
to open Add Dimension.
For a dimension, complete the following:
1. Standard Dimension: Identifies the standard dimension that this dimension will be linked to, or identifies it as a custom dimension. Dimensions identified as a Standard Dimension will automatically include standard nodes for use in Derived Measures (where applicable), Benchmarking (where applicable), and standard labels for internationalization if languages other than English (US) are enabled for the instance.
2. Dimension Type: ‘Analytical’ for smaller one or two level dimensions or ‘Structural’ for
hierarchical type dimensions. (only applies to Custom Dimensions, Standard Dimensions will
be assigned automatically).
3. Dimension Name: Label to be used for the dimension (only applies to Custom Dimensions,
Standard Dimensions will be assigned automatically).
4. Drop Dimension Columns Here: Shows the column/s the dimension will be sourced from.
Drag columns here from the Available Tables and Columns box. The value of this column in
the fact table will match the node ID in the dimension structure.
5. Dimension Structure: Configure ‘Generated’ or ‘Manually Maintained’.
6. Dimension Structure Source: Configure none, ‘SQL’ or ‘Table’.
Generated dimension structures use SQL code or Selected Fields to define the nodes and
hierarchy for the dimension. Dimensions configured as ‘Generated’ cannot be modified
manually in Dimension Editor. The value(s) from the column(s) in the Drop Dimension
Columns Here must match a node ID in the generated dimension or they will be placed in the
unallocated node.
When creating a generated dimension, you have the following structure sources:
● Table: No SQL code required. Select a source table, then select fields from that table in the drop-down list to populate the structure type section.
● SQL: Populate the SQL code to generate the dimension nodes and hierarchy. Must be used when the logic of the dimension is more complex than selecting fields from a single table.
Generated structures also have a Structure Type of Parent/Child or Flat, regardless of the choice of Dimension Structure Source:
● Flat: The structure is built with one or more levels with each level ID and level Name
sourced as columns from a SQL statement or table. The structure can be a single or
multiple level depth, with each level’s nodes having an ID and Description at that level.
● Parent/Child: The structure is built by aligning each node with a parent where applicable,
recursively building the structure. The configuration must be able to determine the
relationship to other nodes to build the structure. For example, an employee has a
supervisor ID in the job information table. Each child node (employee) can be mapped to a
parent (manager). Structures can have different levels of depth.
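A Parent/Child structure source therefore returns one row per node together with its parent. For the Supervisor dimension covered later in this section, the shape of that result is sketched below (hedged; the actual script is in the manual implementation guide):

```sql
SELECT USERS_SYS_ID,     -- node ID (the employee)
       USERS_SYS_NAME,   -- node description
       MANAGER_ID        -- parent node ID (the supervisor)
FROM   EMP_JOB_INFO_T;
```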
Manually Maintained dimensions can use SQL code or Selected Fields to define the nodes and hierarchy for the dimension. However, they can also have a manually built node structure in the Dimension Editor. These dimensions can be modified in the Dimension Editor. The value(s) from the column(s) in the Drop Dimension Columns Here box can be mapped to the nodes for the dimension in the Dimension Editor.
When creating a manually maintained dimension, you have the following Structure Sources:
● Table: No SQL code required. Select a source table, then select fields from that table in the drop-down list to populate the structure type section.
● SQL: Populate the SQL code to generate the dimension nodes and hierarchy. Must be used when the logic of the dimension is more complex than selecting fields from a single table.
● None: No configuration on the Edit Dimension screen. You manually create the dimension
hierarchy, node ID and descriptions, and mappings in the Dimension editor.
For dimensions constructed from a picklist, choose the relevant picklist table from the Choose Source Dropdown box. This list is currently very long and can be hard to navigate. The manual implementation guide provides the name of the picklist required for each dimension, which you can type into the Choose Source Dropdown box.
Note:
If a dimension override label or label other than the standards supplied are used
on a dimension configured as standard, translations for those labels will not be
automatically supplied.
Note:
For an example of utilizing a picklist in SQL script of a dimension, review the
Employment Type dimension in this section.
2. Click OK to save.
2. Click Choose Source Table and start typing Picklist - Emp Job Info T.Employment Status.
3. Set the ID Column and the Description Column as ID and LABEL respectively.
ID Column → External Code.
Description Column → Label.
Locale Column → Locale.
4. Click OK to save.
● Dimension Column = Emp Job Info T —> Regular Temp and Emp Job Info T —> Full
Part Time
2. Click Edit SQL to add the SQL script that will extract the Regular Temp / Full Part Time labels and enter the following script:
SELECT REGULAR_TEMP.EXTERNAL_CODE || '_' || A."FULL_PART_TIME" AS ID,
REGULAR_TEMP.LABEL
FROM (SELECT DISTINCT CASE WHEN IFNULL(FTE, 0) = 0 THEN '??' --Unallocated
WHEN FTE = 1 THEN 'FT' --Full Timers
ELSE 'PT' END AS FULL_PART_TIME
FROM "[%ODS_DATABASE%]"."EMP_JOB_INFO_T"
WHERE NOT IFNULL(FTE, 0) = 0) A
LEFT OUTER JOIN [%PICKLIST(PICKLIST__EMP_JOB_INFO_T.REGULAR_TEMP)%]
REGULAR_TEMP
ON REGULAR_TEMP.LOCALE = 'en_US'
4. Before exiting the Add Dimension dialog, set the ID Column and the Description Column
as ID and LABEL respectively.
5. Click OK to save.
Figure 270: Example of a Generated with SQL Parent / Child Dimension: Supervisor
2. Click Edit SQL to add the SQL script that will extract the Supervisor structure and labels.
The script can be found in the manual implementation guide. The script returns 3 columns
used in step 5.
4. Before exiting the Add Dimension dialog, select Structure Type as Parent/Child and set
the ID Column as USERS_SYS_ID, the Description Column as USERS_SYS_NAME and the
Parent Column as MANAGER_ID.
5. Click OK to save.
2. Click Edit SQL to add the SQL script that will extract the Location structure and labels. The
script can be found in the manual implementation guide. The script returns 8 columns
used in step 5. Each column is the ID and description at each level of the hierarchy.
Loosely its Country, State / Province, City, Location.
4. Before exiting the Add Dimension dialog, select Structure Type as Flat.
7. Click OK to save.
Add Measures
For a WFA on HANA configuration, the next step is to create the measures.
From the Measures tab in the top navigation bar, click Add to open Add Measure.
Configure the following:
1. Standard Measure: Choose the applicable measure from the drop down selection or
“None” if a standard measure will not be used.
2. Measure Name: The label to apply to the measure (only applies for measures not
identified as standard).
5. Edit Code: Opens the canvas to enter the syntax that will determine how the measure is
calculated.
1. List of columns available for use in a Measure – these include base columns from the data
source, calculations as configured in the implementation and calculations produced by the
WFA on HANA engine.
4. Selecting a column that is used in the measure syntax will display details of that column.
E.g. in the screenshot we are showing details of the Employment Status column.
Note:
For a list of the SQL syntax and measure configuration of all the standard
measures in WFA on HANA for Employee Central, please see the manual
implementation guide.
Custom Measures
The WFA on HANA cube has generic measures available for each type, so technical consultants can implement custom measures without requiring involvement from SuccessFactors. When these generic measures are used, they are allocated under a generic category that the consultant cannot change. Additionally, they come with generic names, which can be adjusted using the measure override tool in the WFA administration panel. These generic measures can additionally be used to aid in testing.
There are five generic measures created in each rollup type:
● SUM: SumG1....SumG5
● AVERAGE: AvgG1....AvgG5
● EOP: EOPG1....EOPG5
● SOP: SOPG1....SOPG5
Follow the same process as standard measures to create the custom measure. Then use the
measure override to assign an appropriate name.
If you need additional custom measures beyond the generic measures, need direct help from
SAP SuccessFactors Technical Services team building the measure, or want to categorize the
custom measures, you can purchase ‘SAP SuccessFactors Workforce Analytics Custom
Measure’ from the SAP Store. There are two options:
● If you would like SAP to do the coding in the Data Factory as well as add the measure to the menu, each purchase will cover one custom measure.
● If you would like to do the coding in the Data Factory yourself and just need SAP to add it to the menu, each purchase will cover four measure definition placeholders.
The Time Setting determines the date from which WFA on HANA will be processed. The default setting is the January falling between 3 and 4 years before the first processing, which means WFA on HANA reporting will show data from that date through to today. If a longer (or shorter) total reporting period is required, click Time Settings on the Global Settings menu and adjust accordingly:
Validation will check the current configuration and attempt to highlight any issues so that they
can be resolved prior to running the full build.
1. From Data Factory Home click Validation in the Configuration bubble:
2. Validation will run through a set of rules and provide pass/fail statuses for each:
3. Use the Workforce Analytics on HANA Validation Guide as a reference and work through
each error and resolve them.
4. Re-run the validator until there are no further issues.
Return to Data Factory Home and click Process Initial Load (Fact Data and Cube) in the Initial
Process bubble. This will automatically direct you to the Load Status screen:
1. Set Filters (Type, Date, Status) for which Jobs are to be listed in the Load History
2. Choose whether the page is to be refreshed continuously (and the interval). Refreshing
continuously is useful when a load is currently running, not refreshing is useful when
reading a log.
3. Load History lists current and previous Jobs within the parameters set in the Filters.
4. Option to stop a currently running Job – this option only appears if there is a currently
running Job.
Note:
If your build fails, please review the Section on Troubleshooting & Debugging Build
Issues in the course and / or the Validation Guide included in the WFA on HANA
Implementation Kit.
The feature provides you with the flexibility to schedule the initial load for a selected time
frame.
Edit Hierarchies
The following section covers the Hierarchy Editor function. SuccessFactors provides a customer-facing dimension editor in the Admin Center called WFA Dimension Editor. You can learn more about the tool in Using Workforce Analytics Dimension Editor in the SAP Help Portal or in course HR886: SAP SuccessFactors Workforce Analytics Administration.
Both Hierarchy Editor and WFA Dimension Editor will perform the same function, however the
WFA Dimension Editor uses a new UI.
Edit Hierarchies allows you to manually control the labels and groupings for the Dimension
Nodes in Dimensions that are Manually Maintained:
A. List of Dimensions applicable for the instance. Note that Generated Dimensions are greyed
out and cannot be modified.
B. Canvas for re-labelling selected nodes.
C. Toolbar.
D. Nodes that have been copied or cut will appear on the clipboard for re-use.
Standard Nodes
Standard nodes are default nodes automatically generated within dimensions and are
identifiable by the "#" in the node ID. Standard nodes are used to identify dimension nodes
required for derived measures (e.g. EOP Headcount – Male). You can manage which
employees are included in these measures by ensuring the right mapping of nodes into these
default standard nodes. For example, to ensure all temporary employees are included in the
EOP Headcount – Temporary measure, map any Employment Type codes for temporary
employees into the #TEMP node in the Employment Type dimension.
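As a hedged illustration of the effect of this mapping, the sketch below uses invented Employment Type codes and an assumed "#FULL" node; only "#TEMP" follows the standard-node convention described above:

```python
# Illustrative sketch only: the Employment Type codes and the "#FULL" node
# are invented. Mapping a code into the "#TEMP" standard node is what makes
# its employees count toward EOP Headcount - Temporary.
node_mapping = {
    "T1": "#TEMP",      # temporary contract code mapped into the standard node
    "CASUAL": "#TEMP",  # another temporary code that must also be mapped
    "F1": "#FULL",      # full-time code (node name assumed for the sketch)
}
employees = [("emp_1", "T1"), ("emp_2", "F1"), ("emp_3", "CASUAL")]

# Only employees whose code maps into "#TEMP" are counted.
temp_headcount = sum(1 for _, code in employees
                     if node_mapping.get(code) == "#TEMP")
print(temp_headcount)  # emp_1 and emp_3 are counted, so this prints 2
```

If a temporary code (such as "CASUAL" here) were left unmapped, those employees would silently drop out of the measure, which is why the mapping check matters.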
Drag and drop – mapping to a parent: When dragging nodes to map them, drag the node on
top of the parent node and the parent node will highlight. In this screenshot the F node will
become a child of Female:
Drag and drop – rearranging nodes: To change the order of sibling nodes, drag the node to the
desired position between its siblings. In this screenshot the M node will be ordered
underneath Male:
A very common question is: Why does all my headcount fall under the Unmapped category
rather than the categories I created in my SQL statement?
Appearing in Unmapped simply means the leaf node IDs generated via SQL do not match the
values returned in the data. A common mistake is to return the external code in the SQL
statement while the column dragged into the top-right window contains the internal code, or
vice versa. To link an employee attribute (whether it is a multi-field concatenated ID or a
single-field ID), the columns you drag into the window must match exactly the IDs you return
in SQL. When in doubt, add columns in Drill to Detail so you can see the values returned from
the fact table.
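The mismatch described above can be sketched in a few lines. The code values below are invented for illustration; the point is only that any fact-table value without an exactly matching leaf node ID lands in Unmapped:

```python
# Hypothetical values: leaf_ids is what your hierarchy SQL returns as leaf
# node IDs (here, external codes); fact_values is what the column dragged
# into the top-right window actually contains (here, internal codes).
leaf_ids = {"FT", "PT", "TEMP"}      # leaf node IDs from the SQL statement
fact_values = {"1", "2", "TEMP"}     # values returned from the fact table

# Any fact value with no exactly matching leaf node ID falls into Unmapped.
unmapped = sorted(v for v in fact_values if v not in leaf_ids)
print(unmapped)  # prints ['1', '2'] - only "TEMP" matches exactly
```

In this sketch the fix would be to make the SQL return the same code type (internal or external) as the column in the mapping window.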
Map the Unmapped codes into their appropriate groups per the WFA on HANA Data
Specification:
Standard Hierarchies that Require Mapping
Map remaining dimensions as outlined in the WFA on HANA Data Specification. This list only
includes Manually Maintained dimensions that will require some manual re-mapping.
Some issues are surfaced in the Build Fact Data & Cube processing and cause the processing
to fail. These issues can often be resolved by using the subsequent error message in the Load
Status log as a guide.
Debugging Process
What happens when your initial load or incremental load fails? Unfortunately, there is no
single answer. You will need to review the error message in the Load Status log in the Data
Factory. You can use the following steps:
1. Search for "error" or "invalid" in the Load Status log. This will likely bring you to the first
error that occurred before the failure.
2. Check the SAP SuccessFactors Partner Delivery Community (PDC). Search the keywords
of your error and see if you are able to find similar cases in the past.
3. If an answer cannot be found in historical posts, create a new post in the SAP
SuccessFactors Partner Delivery Community (PDC).
Keep in mind that a configuration that works in one instance may not work in another due
to data differences. This is very common between a customer's test instance and production
instance.
How to Debug a Calculated Measure
Calculated measures are based on input measures within a formula. If a calculated measure
doesn't work in the application, check whether all the input measures used in the formula
of the calculated measure are working. If there are input measures that are not working, you
need to resolve those issues first.
All input measures must work before the calculated measure can work.
After building a dimension based on a SQL statement, you notice that all of the headcount
falls into the Unmapped category rather than spreading across the levels you just built.
The reason is that the concatenation of the 5 fields (Department, Division, Vchar17, Vchar18,
Cost Center) you dragged into the top-right window doesn't match the bottom level (Level 5
ID) in your SQL query. When the Data Factory determines the category to assign the
employee in the dimension, the ID has to match exactly between the employee's attributes
(in the 'Drop Dimension Columns Here' area) and the bottom level (leaf) ID allocated to your
structure (built from your SQL statement).
How to Debug
If you find yourself in this situation, you can use "Level X ID" in both the ID column and the
Name column. This will show you the ID behind each level, and you can see what is not
matching between your SQL leaf level ID and your employee attribute ID.
You should also consider simplifying your query when initially building the dimension. Start
with 2 levels only, so you get a better idea of how it works. Then build up to the 5 levels of the
entire desired structure. Just keep in mind that for each new level, you need to adjust both
your SQL query and the attribute window's 'Drop Dimension Columns Here' area. They must
match each other.
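A minimal sketch of the matching rule follows. The field values, separator and leaf ID are assumptions for illustration, not the Data Factory's internal format:

```python
# Sketch, assuming the leaf ID is a concatenation of the 5 fields.
employee = {
    "Department": "D01", "Division": "V02",
    "Vchar17": "X", "Vchar18": "Y", "Cost Center": "CC9",
}
fields = ("Department", "Division", "Vchar17", "Vchar18", "Cost Center")

# ID built from the columns in 'Drop Dimension Columns Here'
attribute_id = "_".join(employee[f] for f in fields)

leaf_id_from_sql = "D01_V02_X_Y_CC9"  # Level 5 ID returned by your SQL query
print(attribute_id == leaf_id_from_sql)  # prints True; any difference -> Unmapped
```

A single changed field, a different ordering, or a different separator would make the comparison fail, which is why adding one level at a time makes mismatches easier to spot.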
The WFA on HANA implementation has so far configured the Dimensions and Base Input
Measures as outlined in the specification. These Dimensions and Base Input Measures can
now be used to generate a much wider set of metrics, including Derived Measures, Restricted
Measures and Result Measures.
LESSON SUMMARY
You should now be able to:
● Understand the WFA on HANA Configuration
LESSON OBJECTIVES
After completing this lesson, you will be able to:
● Perform Post OLAP Configuration Tasks
Troubleshooting Tips
1. If you receive an application error when clicking on any measure – check that Measure
Templates have been installed on the instance via Measure Templates in the internal SAP
Admin menu. If you are not able to access this menu, please raise a Support case with the
instance details and request a check for Measure Templates in the instance.
2. If the measure page loads, but the tables/charts do not render, please check RBP settings
for your user and confirm you have the required permissions to access that measure.
Data Validation
Data validation helps to prove that the results displayed in metrics and dimensions are
correct, and requires auditing aggregated counts from the "raw" data against selected
measures. Data validation can also highlight data quality issues with the input data and may
surface incorrect or incomplete stored data. Input measures are generally the ones chosen
for auditing as they provide the basis for derived and calculated measures.
Data validation can surface data or configuration corrections that are required in different
areas:
Examples:
● Who are the employees in total headcount year on year? Is there anyone missing? Is there
anyone included who should not be counted?
● Who are the 59 employees in Australia, the 61 employees in Brazil and the 72 employees in
China?
● Manager X has a team of 23 people, but WFA on HANA is reporting 25 – who are these
additional 2 employees?
● Manager Y had 5 terminations in the last year, but WFA on HANA is reporting 4 – who is
the missing termination?
This section provides one suggested method for data validation – it will not cover all possible
use cases, but the concepts can be extrapolated as appropriate for the individual WFA on
HANA implementation. WFA on HANA implementers can adjust and adapt this process
according to what works best for their implementation. This section is an excerpt from the
Validation Guide.
1. Use Drill to Detail on the measure page to extract a list of the data included in the metric.
One myth is that the number of records returned in Drill to Detail should equal the
number you drill into. This is not true. Drill to Detail is used to show the data from the fact
table that is behind the current measure. There could be more records in the fact table or
fewer. When looking into Drill to Detail, rather than focusing on the number of records, you
should focus on the column that is behind your current measure.
For example, if your measure is a sum of column A, you should have column A in Drill to
Detail. Export it to Excel and sum the numbers from that column. The sum should equal
the number you drilled into.
If your measure is a distinct count of column B, you should have column B in Drill to Detail.
Export it to Excel and apply a distinct count on that column. You should find that the
distinct count in Excel matches the number you drilled into.
2. Construct an Advanced Reporting / Live Data (Ad Hoc) query to extract employee (or
other) counts for the same time period, using the same logic (for example: including
employee statuses of A, U, P) as is configured for WFA on HANA.
3. Compare the two exports and create a list of discrepancies, showing which employees are
in the Drill to Detail export and not the Advanced Reporting / Live Data export (and vice
versa).
4. For the discrepancies, attempt to match the employee's data in Employee Central with the
WFA on HANA configuration and the Advanced Reporting / Live Data query.
For Example:
● Does the employee data in the Employee Central blocks match the Advanced Reporting
query results?
● If they don’t match, at what point does the data between Employee Central, Advanced
Reporting and WFA on HANA stop matching?
If they don’t match, it might indicate something in WFA on HANA or Advanced Reporting
query configuration needs to be adjusted. It MIGHT also indicate a discrepancy between
Employee Central and Advanced Reporting data that will need to be resolved, though this
is less common.
5. Correct / change Employee Central data, update WFA on HANA configuration and / or
Advanced Reporting / Live Data query logic as necessary and repeat the process until all
discrepancies are removed or explained.
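Steps 1 and 3 above can be sketched as follows. The records and IDs are invented for illustration; in practice you would read them from the Excel exports:

```python
# Step 1 sketch: verify the drilled-into number against the exported column.
rows = [
    {"A": 1.0, "B": "100095"},
    {"A": 2.5, "B": "107026"},
    {"A": 0.5, "B": "100095"},   # same employee appears twice
]
total = sum(r["A"] for r in rows)        # for a measure defined as sum(A)
distinct = len({r["B"] for r in rows})   # for a measure defined as count(distinct B)
print(total, distinct)                   # prints 4.0 2

# Step 3 sketch: list discrepancies between the two exports by employee ID.
drill_to_detail = {"100095", "107023", "107026"}     # IDs from Drill to Detail
advanced_reporting = {"100095", "107026", "109999"}  # IDs from Advanced Reporting
print(sorted(drill_to_detail - advanced_reporting))  # in WFA only: ['107023']
print(sorted(advanced_reporting - drill_to_detail))  # in AR only: ['109999']
```

The two set differences are exactly the "and vice versa" comparison in step 3: each list is a starting point for the Employee Central investigation in step 4.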
Start with a simple measure. This may require you to remove all conditions and exclusions
and simply start with the field that you want to sum or distinct count.
Drill to Detail is available in table / pivot views by default on standard Measure pages and can
be made available by report designers in table views on custom report pages. When viewing a
measure's results in a table, Drill to Detail is available if the cursor changes to a hand
(indicating a clickable link) when hovering over an individual result. In the example the Drill to
Detail results will show the employee records that comprise the Average Age of 34 for
Physically Challenged employees in Australia.
Key Columns
Select from key columns that are integral to the measure results.
PERSON_ID: Displays the column chosen as the Primary Person ID in the WFA on HANA
implementation. In a typical implementation, this is the employee's Person ID from their
Employment Information block.
USERS_SYS_ID: Displays the column chosen as the Secondary Person ID in the WFA on
HANA implementation. In a typical implementation, this is the employee's Users Sys ID from
their Job Information block.
Effective From Date: The date at which the employee’s record starts. Typically, this is sourced
from the Effective Start Date in the relevant table.
Effective To Date: The date at which the employee’s record ends. Typically, this is sourced
from the Effective End Date in the relevant table.
Current Measure: The results for that employee record for the measure that the user is
displaying Drill to Detail results for.
Dimension Columns
Select from columns related to each dimension. The list of available dimensions will change
in accordance with what is currently implemented in the instance.
ID: Displays the ID/Code applicable to the employee record for the dimension. In a typical
implementation, this is the object's External Code.
Name: Displays the Name/Label applicable to the employee record for the dimension. In a
typical implementation, this is the object's External Name.
Parent ID: Displays the ID/Code of the parent for the ID (if the dimension ID is part of a multi-
level / parent child structure).
Parent Name: Displays the Name/Label of the parent for the ID (if the dimension ID is part of
a multi-level / parent child structure).
Level X ID: Displays the ID/Code of the selected Dimension Level for the ID (if the dimension
ID is part of a multi-level / parent child structure).
Level X Name: Displays the Name/Label of the selected Dimension Level for the ID (if the
dimension ID is part of a multi-level / parent child structure).
The Level X ID / Name columns in a dimension allow you to display that dimension’s structure
as part of the DTD results.
For example, you can see that the employee in the first record (User Sys ID = 100095) is in
the Philadelphia location, which rolls up through Newtown Square > Pennsylvania > United
States.
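The roll-up in the example can be pictured as reading the Level X Name columns of a DTD record left to right. The level ordering below is an assumption for the sketch; the names come from the example above:

```python
# Sketch: the Level X Name columns flatten each record's dimension path,
# here for the Location dimension of the example employee.
record = {
    "Level 1 Name": "United States",
    "Level 2 Name": "Pennsylvania",
    "Level 3 Name": "Newtown Square",
    "Level 4 Name": "Philadelphia",
}
path = " > ".join(record[f"Level {i} Name"] for i in range(1, 5))
print(path)  # prints "United States > Pennsylvania > Newtown Square > Philadelphia"
```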
Attribute Columns
Select from columns that are attributes of the related dimensions. The list of available
attributes will change in accordance with what is currently implemented in the instance.
Code: Displays the ID/Code of the attribute for its related dimensions.
For example, you see that these two employees (User Sys ID = 107026 and 107023) have Pay
Component code BASE_CN and Frequency code of MON for their Total Workforce Annual
Salary.
If adding a Level X column, choose which level of the dimension you wish to use.
Select a column and click edit columns to change its display label.
Select the column that contains the employee's User Sys ID, click edit columns and check Is
BizX User ID to enable the Quickcard on that column in the DTD results.
This column will now appear with the Quickcard link in the DTD results.
Change the order of columns in the DTD display using Move Up and/or Move Down.
Choose which columns the DTD display will be sorted on in Order Drill to Detail Results.
Note:
Do not forget to save as you make changes to the Drill to Detail settings.
LESSON SUMMARY
You should now be able to:
● Perform Post OLAP Configuration Tasks
Learning Assessment
X A Strategic Consultant
X B Project Manager
X C Technical Consultant
X D Functional Consultant
2. Which implementation role is in charge of managing the project and its risks?
Choose the correct answer.
X A Strategic Consultant
X B Project Manager
X C Technical Consultant
X D Functional Consultant
X A Strategic Consultant
X B Project Manager
X C Technical Consultant
X D Functional Consultant
4. Which type of measure combines other measures using percentages and ratios?
Choose the correct answer.
X A Dimension Hierarchies
X D Result Measures
5. Which documentation describes all the Measures and Dimensions available in WFA?
Choose the correct answer.
X A Project Summary
X C Metrics Packs
X D Configuration Workbook
X C Result Measure
X D Time Dimension
X E Structural Dimension
7. Which of the following are advantages of WFA on HANA over traditional WFA?
Choose the correct answers.
X D Investigate Tool
X A Employee Central
X C Dashboards is Enabled
X A User Sys ID
X B Person ID
X C Username
X D Subject ID
10. Which column is usually associated with the special use type Secondary Person?
Choose the correct answer.
X A User Sys ID
X B Person ID
X C Username
11. Dimensions with their dimension structure configured as “Generated” cannot be modified
manually in Dimension Editor.
Determine whether this statement is true or false.
X True
X False
12. Which tool enables the user to create reports with multiple pages?
Choose the correct answer.
X A Query Workspace
X B Investigate
X C Report Center
14. Which Dimension Structure and Structure Type has leaf nodes of the dimension on the
same level and is NOT configurable in the dimension editor?
Choose the correct answer.
X D Generated Flat
15. To which of the following formats can an entire measure view be exported?
Choose the correct answers.
X A Word
X B PDF
X C XML
X D PowerPoint
X E Excel
X A Data Factory
X B Investigate
X C BIRT
X D Tile Designer
17. How do you update the label of a Standard Dimension to display in the WFA portal?
Choose the correct answer.
18. How do you identify the standard nodes in the WFA Dimension Editor tool?
Choose the correct answer.
X A # in the node ID
X B ? in the node ID
19. If Investigate has never been enabled in a customer instance, where can you enable it?
Choose the correct answer.
X A Provisioning
X B Upgrade Center
X True
X False
21. When importing a configuration in the WFA on HANA Data Factory, the target instance of
the imported configuration maintains its Drill to Detail configuration.
Determine whether this statement is true or false.
X True
X False
22. Which of the following tools can you use to manage manual dimension configuration?
Choose the correct answers.
23. What is the recommended configuration of the null substitute for the fact table in
Employee Central?
Choose the correct answer.
X A ??
X B ##
X C >>
X D !!
24. You can change the column name that displays in drill to detail.
Determine whether this statement is true or false.
X True
X False
25. In which dimension can you limit access to a subset of the entire hierarchy?
Choose the correct answer.
X A Time
X B Analysis
X C Structural