Department of Accounting and Information Systems
Course Title: Applied Research Methodology
Course Code: 506
Term Paper On
“Data, Variables and Model and Empirical Research in Accounting”
SUBMITTED TO:
Tarik Hossain
Associate Professor
Department of Accounting & Information Systems,
Faculty of Business Studies,
Comilla University
Submitted By:
Hossain Ahmed
Group Leader,
On Behalf of the Group-(E)
Department of Accounting and Information Systems
Comilla University
May 7, 2021
Tarik Hossain
Associate Professor
Subject: Submission of term paper on “Data, Variables and Model and Empirical Research in Accounting”
Sir, with due respect, we are submitting this term paper on “Data, Variables and Model and Empirical Research in Accounting” as a requirement of the course “Applied Research Methodology,” as you asked us to prepare it.
We are all very grateful to you for your thorough knowledge of the subject matter, which helped us lead the term paper to its successful completion. We prepared this paper through group work. We hope that this term paper will help us in our future. If you find any errors or mistakes in this term paper, please inform us so that we can correct our mistakes and learn properly.
Sincerely yours,
Hossain Ahmed
Group leader,
Acknowledgement
By the grace of Allah, we have completed our term paper on “Data, Variables and Model and Empirical Research in Accounting”. First of all, a special thanks goes to our honorable course teacher. We have tried our best to make the term paper comprehensive and reliable within the given time period. Nevertheless, some mistakes may have occurred; please view such unintentional mistakes with sympathy and pardon. Your suggestions and comments for the improvement of this term paper will be thankfully received, and if you have any queries about the study, please inform us.
Yours faithfully
Hossain Ahmed
Group leader,
Group Members
Group E
Name ID No.
Data collection
Data collection is the systematic approach to gathering and measuring information from a variety of
sources to get a complete and accurate picture of an area of interest. Data collection enables a person or
organization to answer relevant questions, evaluate outcomes and make predictions about future
probabilities and trends.
To ensure that high quality data is recorded in a systematic way, here are some best practices:
• Record all relevant information as and when you obtain data. For example, note down whether
or how lab equipment is recalibrated during an experimental study.
• Double-check manual data entry for errors.
• If you collect quantitative data, you can assess the reliability and validity to get an indication
of your data quality.
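To make the last point concrete, here is a minimal Python sketch (not from the original paper) that computes Cronbach's alpha, a standard reliability measure for multi-item survey scales; the respondent scores below are hypothetical:

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    # Cronbach's alpha for an (n_respondents x n_items) score matrix.
    n_items = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1 - sum_item_var / total_var)

# Hypothetical responses: 5 respondents answering 4 Likert-scale items.
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 4, 5, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")

Values of alpha above roughly 0.7 are conventionally read as acceptable reliability.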
Primary Data:
Primary data is information collected directly from first-hand experience. This is the information that
you gather for the purpose of a particular research project. Primary data collection is a direct approach
that is tailored to specific company needs. It can be a long process but does provide important first-hand
information in many business cases.
Primary data is the original data – from the first source. It is like raw material.
➢ Interview (personal interview, telephone, e-mail)
➢ Self-administered surveys and questionnaires
➢ Field observation
➢ Experiments
➢ Life histories
➢ Action research
➢ Case studies
➢ Diary entries, letters, and other correspondence
➢ Eyewitness accounts
➢ Ethnographic research
➢ Personal narratives, memoirs
In fact, the source of primary data is the population sample from which you gather your data. The sample is selected using one of the various sampling methods and techniques; a sketch of two common approaches follows.
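As a hedged illustration, this Python sketch draws a simple random sample and a systematic sample from a hypothetical sampling frame (the frame and sample sizes are invented):

import random

# Hypothetical sampling frame of 10,000 population members.
population = [f"respondent_{i}" for i in range(10_000)]
random.seed(42)  # fixed seed so the draw is reproducible

# Simple random sampling: 200 members drawn without replacement.
simple_random_sample = random.sample(population, k=200)

# Systematic sampling: every k-th member after a random start.
interval = len(population) // 200
start = random.randrange(interval)
systematic_sample = population[start::interval][:200]

print(len(simple_random_sample), len(systematic_sample))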
Secondary data:
Secondary data are data that have already been collected for another purpose but have some relevance to your research needs. In addition, the data are collected by someone other than the researcher. Secondary data is second-hand information; it is not being used for the first time, which is why it is called secondary.
Secondary data sources provide valuable interpretations and analysis based on primary sources. They may explain primary sources in detail and often use them to support a specific thesis or point of view.
➢ Previous research
➢ Mass media products
➢ Government reports
➢ Official statistics
➢ Letters
➢ Diaries
➢ Web information
➢ Google Analytics or other sources that show statistics and data for digital customer experience.
➢ Historical data
➢ Encyclopedias
➢ Monographs
➢ Journal articles
➢ Biography
➢ Research analysis
Big Data:
Big Data is a collection of data that is huge in volume, yet grows exponentially with time. Its size and complexity are so great that no traditional data management tool can store or process it efficiently.
Big Data Analysis
Big Data analytics is a process used to extract meaningful insights, such as hidden patterns, unknown
correlations, market trends, and customer preferences. Big Data analytics provides various advantages: it can be used for better decision making and for preventing fraudulent activities, among other things.
➢ Stage 1 - Business case evaluation - The Big Data analytics lifecycle begins with a business
case, which defines the reason and goal behind the analysis.
➢ Stage 2 - Identification of data - Here, a broad variety of data sources are identified.
➢ Stage 3 - Data filtering - All of the identified data from the previous stage is filtered here to
remove corrupt data.
➢ Stage 4 - Data extraction - Data that is not compatible with the tool is extracted and then
transformed into a compatible form.
➢ Stage 5 - Data aggregation - In this stage, data with the same fields across different datasets are
integrated.
➢ Stage 6 - Data analysis - Data is evaluated using analytical and statistical tools to discover
useful information.
➢ Stage 7 - Visualization of data - Big Data analysts can produce graphic visualizations of the analysis.
➢ Stage 8 - Final analysis result - This is the last step of the Big Data analytics lifecycle, where
the final results of the analysis are made available to business stakeholders who will take action.
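The filtering, aggregation, and analysis stages can be pictured with a small pandas sketch; the file name and column names are hypothetical, and a production Big Data pipeline would use distributed tools (e.g., Hadoop or Spark) rather than a single machine:

import pandas as pd

# Stage 3 - data filtering: load hypothetical transaction data, drop corrupt rows.
df = pd.read_csv("transactions.csv")  # hypothetical source file
df = df.dropna(subset=["customer_id", "amount"])
df = df[df["amount"] > 0]  # discard invalid amounts

# Stage 5 - data aggregation: integrate records on a shared field.
by_customer = df.groupby("customer_id")["amount"].agg(["sum", "mean", "count"])

# Stage 6 - data analysis: flag statistically unusual customers (possible fraud).
threshold = by_customer["sum"].mean() + 3 * by_customer["sum"].std()
outliers = by_customer[by_customer["sum"] > threshold]
print(outliers.head())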
Open data analysis
Open data may be defined as data that are available for reuse by everyone; such data are not restricted by patents, copyright, etc. With the use of the internet, the concept of open data has become a tool for business, analysis, discussion, and decision making. Open data is booming as the next big thing around the globe. The concept of open data is not new; rather, it can be considered a replacement term for the open definition, under which a piece of data can be used, reused, and redistributed by anyone. In this paper, we discuss the sources of open data, issues regarding open data, and formats for open data, and present some views and opinions.
Cultural: Data about the cultural works and artefacts of countries. These data are normally held by galleries, libraries, archives, and museums.
Science: Data produced as part of scientific research in all categories from A to Z. The concept of open access to scientific data was institutionally established with the formation of the World Data Center system, in preparation for the International Geophysical Year of 1957-1958.
Finance: Data such as government expenditure and revenue, and information on financial markets such as stocks, shares, bonds, etc.
Statistics: Data produced by statistical offices, such as the census and key socioeconomic indicators.
Weather: Data obtained from satellites and other sources to predict weather and climatic conditions.
Environment: Information related to the natural environment, such as pollution, rivers, seas, mountains, volcanoes, etc.
Government: Nowadays, many national governments publish data catalogues for the sake of transparency of their plans and policies among the general public.
Such government data promote transparency and a better understanding of the plans and policies of the government. Likewise, information about the schools in a particular area can help parents identify the right school for educating their children, even though this school information may not be useful to others.
In general, the advantages of open data include transparency, free reuse, and better-informed analysis and decision making, as the examples above illustrate.
Selecting Variables
After selecting the data source, the next step is to specify the variables to be imported. Three types of
variables can be imported into a project.
The basic steps for variable selection are as follows:
a) Specify the maximum model to be considered.
b) Specify a criterion for selecting a model.
c) Specify a strategy for selecting variables.
d) Conduct the specified analysis.
e) Evaluate the validity of the model chosen.
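As one possible reading of steps (b) and (c), here is a minimal Python sketch of forward selection using adjusted R-squared as the selection criterion; the statsmodels-based code and the response name are illustrative assumptions, not a prescribed method:

import pandas as pd
import statsmodels.api as sm

def forward_select(df: pd.DataFrame, response: str) -> list:
    # Greedy forward selection maximizing adjusted R-squared.
    remaining = [c for c in df.columns if c != response]
    selected = []
    best_adj_r2 = float("-inf")
    while remaining:
        scores = []
        for candidate in remaining:
            X = sm.add_constant(df[selected + [candidate]])
            fit = sm.OLS(df[response], X).fit()
            scores.append((fit.rsquared_adj, candidate))
        score, candidate = max(scores)
        if score <= best_adj_r2:  # stop when no candidate improves the fit
            break
        best_adj_r2 = score
        selected.append(candidate)
        remaining.remove(candidate)
    return selected

# Usage with a hypothetical dataset: predictors = forward_select(data, response="roa")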
Analytical
Researchers who utilize analytical methods base analysis and conclusions on formally modelling
theories or substantiated ideas in mathematical terms. These analytical studies use math to predict,
explain, or give substance to theory.
Archival
Researchers who utilize archival methods base analysis and conclusions on objective data collected
from repositories of third parties. Also included are studies in which the researchers collected the data
and in which the data has objective amounts such as net income, sales, fees, etc.
Experimental
Researchers who utilize experimental methods base analysis and conclusions on data the researcher
gathered by administering treatments to subjects. Usually these studies employ random assignment;
however, if the researcher selects different populations in an attempt to “manipulate” a variable, we
include these as experimental in nature.
Risk management is important for a bank to ensure its profitability and soundness. It is also a concern of
regulators to maintain the safety and soundness of the financial system. Over the past decades, banking
business has developed with the introduction of advanced trading technologies and sophisticated financial
products. While these advancements enhance banks' intermediation role, promote profitability, and better diversify bank risk, they also raise significant challenges to bank risk management. The risk management of banks has been considered weak relative to the rapid changes in the financial markets. In the light
of the recent global financial crisis, bank risk management has become the major concern of banking
regulators and policy makers.
Cross-sectional data:
Cross-sectional data, or a cross section of a study population, is, in statistics and econometrics, a type of data collected by observing many subjects (such as individuals, firms, countries, or regions) at a single point or period in time. The analysis typically has no regard for differences in time. Analysis of cross-sectional data usually consists of comparing the differences among the selected subjects.
For example, if we want to measure current obesity levels in a population, we could draw a sample of
1,000 people randomly from that population (also known as a cross section of that population), measure
their weight and height, and calculate what percentage of that sample is categorized as obese. This cross-sectional sample provides us with a snapshot of that population at that one point in time. Note that we
do not know based on one cross-sectional sample if obesity is increasing or decreasing; we can only
describe the current proportion.
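The obesity example can be made concrete with a short simulation; the weights and heights below are randomly generated, not real survey data:

import numpy as np

rng = np.random.default_rng(0)

# Simulate a cross-section of 1,000 people measured at one point in time.
weight_kg = rng.normal(loc=75, scale=15, size=1_000)
height_m = rng.normal(loc=1.70, scale=0.10, size=1_000)

bmi = weight_kg / height_m**2  # body mass index
obese = bmi >= 30  # common obesity threshold

print(f"Obese share of sample: {obese.mean():.1%}")
# One snapshot only: this says nothing about whether obesity is rising or falling.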
Time series:
Time series analysis is a statistical technique that deals with time series data, or trend analysis. Time series data means that the data form a series over particular time periods or intervals. The data are considered in three types:
Time series data: A set of observations on the values that a variable takes at different times.
Cross-sectional data: Data on one or more variables, collected at the same point in time.
Pooled data: A combination of time series and cross-sectional data, in which several subjects are each observed over multiple time periods.
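A small pandas sketch contrasting the three layouts (all values are hypothetical):

import pandas as pd

# Time series: one variable observed at different times.
gdp_growth = pd.Series([2.1, 2.4, 2.2, 2.8],
                       index=pd.period_range("2018", periods=4, freq="Y"))

# Cross-section: several units observed at the same point in time.
cross_section = pd.DataFrame({"firm": ["A", "B", "C"],
                              "roa_2021": [0.08, 0.12, 0.05]})

# Pooled (panel) data: the same units observed across several periods.
panel = pd.DataFrame({"firm": ["A", "A", "B", "B"],
                      "year": [2020, 2021, 2020, 2021],
                      "roa": [0.07, 0.08, 0.11, 0.12]}).set_index(["firm", "year"])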
Money market:
The money market involves the purchase and sale of large volumes of very short-term debt products,
such as overnight reserves or commercial paper.
An individual may invest in the money market by purchasing a money market mutual fund, buying a
Treasury bill, or opening a money market account at a bank.
Money market investments are characterized by safety and liquidity, with money market mutual fund shares targeted at a stable $1 share price.
Capital market:
A capital market is a financial market in which long-term debt (over a year) or equity-backed securities are bought and sold, in contrast to a money market, where short-term debt is bought and sold. Capital markets channel the wealth of savers to those who can put it to long-term productive use, such as companies or governments making long-term investments. Financial regulators such as the Securities and Exchange Board of India (SEBI), the Bank of England (BoE), and the U.S. Securities and Exchange Commission (SEC) oversee capital markets to protect investors against fraud, among other duties.
While most of the requirements exist within each of the sectors, the level of importance of each requirement varies by sector. For the cross-sector analysis, any requirement that was identified by at least two sectors as significant for those sectors was included in the cross-sector roadmap definition.
Data enrichment
Data enrichment aims to make unstructured data understandable across domains, applications, and value chains.
Data sharing and integration aims to establish a basis for the seamless integration of multiple and diverse data sources into a big data platform. The lack of standardized data schemas and semantic data models, as well as the fragmentation of data ownership, are important aspects that need to be tackled.
Real-time data transmission aims at acquiring (sensor and event) information in real time. In the public sector, this is closely related to the increasing capability of deploying sensors and Internet of Things scenarios, as in public safety and smart cities.
Data Quality
The high-level requirement data quality describes the need to capture and store high-quality data so that analytic applications can use the data as reliable input to produce valuable insights. Data quality has one sub-requirement:
✓ Data improvement
Data improvement aims at removing noise and redundant data, checking for trustworthiness, and adding missing data.
Other related requirements include:
✓ Provenance management
✓ Human data interaction
✓ Unstructured data integration
The high-level requirement data security and privacy describes the need to protect highly sensitive
business and personal data from unauthorized access. Thus, it addresses the availability of legal
procedures and the technical means that allow the secure sharing of data.
An actionable roadmap should have clear selection criteria regarding the priority of all actions. In
contrast to a technology roadmap for the context of a single company, a European technology roadmap
needs to cover developments across different sectors. The process of defining the roadmap included an
analysis of the big data market and feedback received from stakeholders. This analysis gave a sense of which characteristics indicate the higher or lower potential of Big Data technical requirements.
As the basis for the ranking, a table-based approach was used that evaluated each candidate according
to a number of applicable parameters. In each case, the parameters were collected with the goal of being
sector independent. Quantitative parameters were used where possible and available.
In consultation with stakeholders, the following parameters were used to rank the various technical requirements:
✓ Estimated export potential of the sector(s)
✓ Estimated cross-sectorial benefits
✓ Short-term low-hanging fruit
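A minimal sketch of the table-based ranking described above; the requirement names and the 1-5 scores are hypothetical placeholders:

import pandas as pd

# Hypothetical candidate requirements scored 1-5 on each ranking parameter.
candidates = pd.DataFrame({
    "requirement": ["data quality", "real-time transmission", "data enrichment"],
    "export_potential": [4, 5, 3],
    "cross_sector_benefit": [5, 3, 4],
    "low_hanging_fruit": [3, 2, 5],
})
score_cols = ["export_potential", "cross_sector_benefit", "low_hanging_fruit"]
candidates["total"] = candidates[score_cols].sum(axis=1)
print(candidates.sort_values("total", ascending=False))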
Earnings management is the use of accounting techniques to produce financial statements that present
an overly positive view of a company's business activities and financial position. Many accounting rules
and principles require that a company's management make judgments in following these principles.
Earnings management takes advantage of how accounting rules are applied and creates financial
statements that inflate or "smooth" earnings.
Companies use earnings management to present the appearance of consistent profits and to smooth fluctuations in earnings.
One of the most popular ways to manipulate financial records is to use an accounting policy that
generates higher short-term earnings.
The quality of earnings refers to the proportion of income attributable to the core operating activities of
a business. Thus, if a business reports an increase in profits due to improved sales or cost reductions,
the quality of earnings is considered to be high. Conversely, an organization can have low-quality earnings if the changes in its earnings relate to other issues, such as one-time events or accounting maneuvers rather than its core operations.
In general, any use of accounting trickery to temporarily bolster earnings reduces the quality of earnings.
A key characteristic of high-quality earnings is that the earnings are readily repeatable over a series of
reporting periods, rather than being earnings that are only reported as the result of a one-time event. In
addition, an organization should routinely provide detailed reports regarding the sources of its earnings,
and any changes in the future trends of these sources. Another characteristic is that the reporting entity
engages in conservative accounting practices, so that all relevant expenses are appropriately recognized
in the correct period, and revenues are not artificially inflated.
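In empirical accounting research, earnings management is commonly proxied by discretionary accruals estimated with the Jones (1991) model; the sketch below assumes a hypothetical firm-year dataset with the listed columns and is only one of several specifications in use:

import pandas as pd
import statsmodels.api as sm

# Hypothetical firm-year panel: total accruals (ta), change in revenue (d_rev),
# gross property, plant and equipment (ppe), lagged total assets (lag_assets).
df = pd.read_csv("firm_years.csv")  # hypothetical file

# Jones model: scale all terms by lagged total assets.
y = df["ta"] / df["lag_assets"]
X = pd.DataFrame({
    "inv_assets": 1 / df["lag_assets"],
    "d_rev": df["d_rev"] / df["lag_assets"],
    "ppe": df["ppe"] / df["lag_assets"],
})
fit = sm.OLS(y, sm.add_constant(X)).fit()

# The residuals proxy for discretionary accruals: large positive values
# suggest income-increasing earnings management.
df["discretionary_accruals"] = fit.resid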
Value relevance is defined as the ability of information disclosed in financial statements to capture and summarize firm value. Value relevance can be measured through the statistical relations between the information presented in financial statements and stock market values or returns.
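Value-relevance studies often estimate a price-levels regression in the spirit of Ohlson (1995); here is a minimal sketch, assuming a hypothetical dataset with share price, earnings per share (eps), and book value per share (bvps):

import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("firms.csv")  # hypothetical file

X = sm.add_constant(df[["eps", "bvps"]])
fit = sm.OLS(df["price"], X).fit()

# Significant coefficients on eps and bvps, and a high R-squared, are read
# as evidence that the accounting numbers are value relevant.
print(fit.summary())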
Accounting information contained in financial statements is expected to be useful for decision makers.
In order to provide this, financial statements should meet some basic characteristics. “If financial
information is to be useful, it must be relevant and faithfully represent what it purports to represent. The
usefulness of financial information is enhanced if it is comparable, verifiable, timely and
understandable.”
An event study is an empirical analysis that examines the impact of a significant catalyst occurrence or
contingent event on the value of a security, such as company stock.
Event studies can reveal important information about how a security is likely to react to a given event.
Examples of events that influence the value of a security include a company filing for bankruptcy
protection, the positive announcement of a merger, or a company defaulting on its debt obligations.
Key takeaways:
➢ An event study, or event-history analysis, examines the impact of an event on the financial performance of a security, such as company stock.
➢ An event study analyzes the effect of a specific event on a company by looking at the associated impact on the company's stock. If the same type of statistical analysis is used to analyze multiple events of the same type, a model can predict how stock prices typically respond to a specific event.
An event study, also known as event-history analysis, employs statistical methods, using time as the
dependent variable and then looking for variables that explain the duration of an event—or the time
until an event occurs. Event studies that use time in this way are often employed in the insurance
industry to estimate mortality and compute life tables. In business, these types of studies may instead
be used to forecast how much time is left before a piece of equipment fails. Alternatively, they could
be used to predict how long until a company goes out of business. Other event studies, such as an
interrupted time series analysis (ITSA), compare a trend before and after an event to explain how, and
to what degree, the event changed a company or a security. This method may also be employed to see
if the implementation of a particular policy measure has resulted in some statistically significant change
after it has been put in place.
An event study conducted on a specific company examines any changes in its stock price and how it
relates to a given event. It can be used as a macroeconomic tool, as well, analyzing the influence of an
event on an industry, sector, or the overall market by looking at the impact of the change in supply and
demand.
An event study, whether on the micro- or macro-level, tries to determine if a specific event has, or will
have, an impact on a business's or economy's financial performance.
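A minimal Python sketch of a market-model event study computing the cumulative abnormal return (CAR); the file, column names, event date, and window lengths are hypothetical choices:

import pandas as pd
import statsmodels.api as sm

# Hypothetical daily returns for the firm and the market index.
returns = pd.read_csv("returns.csv", index_col="date", parse_dates=True)
event_date = pd.Timestamp("2021-03-15")  # hypothetical event day

# Estimation window: 120 trading days ending before the event window.
estimation = returns.loc[:event_date - pd.Timedelta(days=10)].tail(120)
event_window = returns.loc[event_date - pd.Timedelta(days=5):
                           event_date + pd.Timedelta(days=5)]

# Market model: firm return regressed on market return.
X = sm.add_constant(estimation["market"])
fit = sm.OLS(estimation["firm"], X).fit()

# Abnormal return = actual return minus the market-model prediction.
expected = fit.params["const"] + fit.params["market"] * event_window["market"]
abnormal = event_window["firm"] - expected
print(f"CAR over the event window: {abnormal.sum():.4f}")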
Empirical methods:
Empirical methods are employed in communication studies in an attempt to yield objective and
consistent findings. This approach is positivistic in the sense that the social world is perceived as
governed by laws or law-like principles that make it predictable. Initially, empirical methods were equated with the use of quantitative measures (e.g., content analyses, surveys) and the primary collection and analysis of data. Nowadays, secondary analyses and qualitative research are also considered
empirical. It seems plausible to categorize qualitative research as empirical to the extent that scholars
provide sufficient information that allows the reproduction of their findings (e.g., sampling strategy,
data collection and analysis). However, this categorization is likely to be debatable.
• Qualitative: Qualitative evidence is the type of data that describes non-measurable information.
• Quantitative: Quantitative evidence refers to numerical data that can be further analyzed using
mathematical and/or statistical methods.
There is a reason why empirical research is one of the most widely used methods: it offers several advantages, such as the objectivity and consistency of its findings.