
2021

Summer Internship

Research Methodology

Submitted By:
Name: Shrestha Verma
PRN: 20020141193
Roll No: 43293

Submitted To:
Prof. Yogesh Brahmankar
Internship Program: B.R.E.W. Summer Internship
Organization: AB InBev
Sector: FMCG-Beverage
Project Title: GHQ – General Ledger Master Data Clean-up
Steps/Approach used for the Research Methodology:
1. Problem identification and goal understanding
2. Review of Literature
3. Data collection
4. Data Analysis and Visualization
5. Interpretation and Recommendations

About the Project:


1. Problem Identification and Goal Understanding:
Description:
 Identify OIM GLs with no ledger-specific postings and clear all matched line items (a sketch of this identification step follows this list)
 Identify the list of GLs that require ledger-specific postings and the GLs that do not
 Convert the GLs that do not have ledger-specific postings to OIM
 Map the existing account structure to the S4HANA proposed GCOA and align on differences
 Set up automated clearing of line items in partnership with the tech team
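The identification step can be illustrated with a minimal pandas sketch. The file name, column names and the leading-ledger code "0L" are assumptions for illustration only, not the actual SAP extract or logic used in the project.

```python
import pandas as pd

# Hypothetical GL line-item extract; column names, file name and the
# leading-ledger code "0L" are illustrative assumptions.
lines = pd.read_csv("gl_line_items.csv")  # columns: gl_account, ledger, is_oim

# Accounts with postings in any non-leading ledger count as "ledger specific".
ledger_specific = set(lines.loc[lines["ledger"] != "0L", "gl_account"])

# OIM GLs with no ledger-specific postings: candidates for clearing matched lines.
oim_accounts = set(lines.loc[lines["is_oim"], "gl_account"])
clear_candidates = sorted(oim_accounts - ledger_specific)

# Non-OIM GLs with no ledger-specific postings: candidates for conversion to OIM.
non_oim_accounts = set(lines.loc[~lines["is_oim"], "gl_account"])
convert_candidates = sorted(non_oim_accounts - ledger_specific)

print("OIM GLs to clear:", clear_candidates)
print("GLs to convert to OIM:", convert_candidates)
```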
Purpose:
 Reduce the risk of surprises
 Focus on actual open items rather than data cleansing
 OI analysis and action plan in Blackline
 Enable a global dashboard for governance
 S4 readiness and alignment
 GCOA compliance and governance

Key Expected Learnings:

 Strategy and Concept Creation: Framing a strategy for transformation of the current system
 Planning: How business planning is carried out
 Data Analysis: How analytics works in practice
 Financial Modelling: How a financial model works
Goals:
 OIM GL identification – 100% of accounts to be cleared identified
 100% identification of ledger-specific GLs (single-ledger postings)
 100% conversion of GLs without ledger-specific postings to OIM
 100% creation of new GLs for L1 postings
 Governance of open items to be created and signed off by GHQ Controllers

2. Literature Review:
Central Finance (a.k.a. CFin) lets companies merge their financial platforms into SAP S/4HANA
through incremental transformation rather than an intensive upfront relocation. Using Central
Finance, companies can move off their current ERP landscape in a non-disruptive way instead of
migrating all of their data from one system to another at once. Central Finance implementations
rely on a replication mechanism across all of the data, one that is feasible across IT landscapes
and agnostic in terms of business models.
Once this mapping has been done, it is no longer necessary to transfer data to a separate
application for financial processes such as monitoring, forecasting and restructuring, because the
source systems are now integrated into the Central Finance structure. One can start right away and
make use of the in-memory, predictive, cloud and machine-learning technologies common to the latest
SAP S/4HANA suite.
It is important to complete a stock-take before you start optimizing. A lack of understanding of
one's own data and systems is one of the most common causes of extra cost and delay in migration
projects. Make sure you have a snapshot of the quantity and consistency of the master data, since
this is the best basis for estimating the overall scope of the migration. From there, you can
critically examine which data you actually need. Data that is no longer current, or that would no
longer be operational in the new system, may still be present. When you examine the business
processes, what is truly important should quickly become clear. This is an opportunity to remove
unnecessary ballast and keep legacy burdens out of the new system from the start. Master data that
is no longer used should therefore be archived instead of migrated, for example when it cannot be
deleted for legal reasons.
General Ledger master data is a common source of errors. In this context, the data records in which
duplicates may occur should be specified. To detect duplicates, it is important to define matching
rules that decide the extent to which two records must agree before they are treated as the same
account. This can be done, for example, on the basis of the ledger description and the OIM/NOIM
account classification.
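As a rough illustration of such a matching rule, the sketch below flags GL master records whose normalized descriptions collide within the same OIM/NOIM class. The column names, sample data and normalization rule are assumptions for illustration, not the project's actual logic.

```python
import pandas as pd

# Hypothetical GL master extract; column names and values are illustrative.
master = pd.DataFrame({
    "gl_account": ["100100", "100101", "200100"],
    "description": ["Trade Receivables ", "trade receivables", "Accrued Wages"],
    "oim_flag": [True, True, False],
})

# Normalize descriptions so trivial differences (case, spacing) do not hide duplicates.
master["desc_norm"] = (
    master["description"].str.strip().str.lower().str.replace(r"\s+", " ", regex=True)
)

# Treat two records as duplicates when both the normalized description and the
# OIM/NOIM classification match.
dupes = master[master.duplicated(subset=["desc_norm", "oim_flag"], keep=False)]
print(dupes[["gl_account", "description", "oim_flag"]])
```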

3. Data Collection:

Since the ERP system is at the core of implementing corporate management processes, data collection
in the preliminary requirement-analysis phase is highly significant. The data collected in this
case came primarily from secondary sources. Because the data was second-hand, it involved a
substantial amount of data crunching and validation. The data covered General Ledger accounts
present in different regions such as the UK, the US, Luxembourg and a few others, with the
following attributes (a validation sketch follows the list):
 Number of the GL Account
 Description of the GL Account
 Company Code
 Balances
 Assignment
 Trading Partner
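A minimal validation pass over such an extract might look like the sketch below. The file name and column names mirror the attribute list above but are assumptions about the actual extract.

```python
import pandas as pd

# Hypothetical secondary-source extract with the attributes listed above.
cols = ["gl_account", "description", "company_code", "balance",
        "assignment", "trading_partner"]
gl = pd.read_csv("gl_extract.csv", names=cols, header=0)

# Basic validation: required keys present, balances numeric,
# no fully duplicated rows.
assert gl["gl_account"].notna().all(), "GL account number missing"
gl["balance"] = pd.to_numeric(gl["balance"], errors="coerce")
bad_balances = gl[gl["balance"].isna()]
dupe_rows = gl[gl.duplicated(keep=False)]

print(f"{len(bad_balances)} rows with non-numeric balances")
print(f"{len(dupe_rows)} fully duplicated rows")
```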

4. Data Analysis and Visualization:


Data visualization gives a clear picture of where information stands by presenting it visually
through maps or graphs. This makes the data easier for the human mind to grasp and makes it simpler
to discover trends, patterns and outliers in big data sets.
In Power BI, a visualization can be created in two ways. First, a visual is added to the report
canvas from the Visualizations pane on the right-hand side, where the visualization type (a table,
for instance) is selected. Alternatively, fields can be dragged from the right sidebar onto the
visualization's axis.
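For a quick look outside Power BI, an equivalent chart can be sketched with pandas and matplotlib. The data values and column names below are hypothetical, chosen only to mirror the region-level GL data described earlier.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical aggregated GL balances per region (illustrative values).
balances = pd.DataFrame({
    "region": ["UK", "US", "Luxembourg"],
    "open_item_balance": [1_250_000, 3_400_000, 560_000],
})

# A simple bar chart makes region-level outliers immediately visible.
balances.plot.bar(x="region", y="open_item_balance", legend=False)
plt.ylabel("Open item balance")
plt.title("Open item balances by region")
plt.tight_layout()
plt.show()
```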


