Term Paper For Quantitative Research
Submitted to:
GLORIA P. GEMPES, Ph.D.
Professor
Submitted by:
KARL PATRICK BANO AMPER
Ph.D. in Education major in Applied Linguistics
December 2022
TABLE OF CONTENTS

TITLE PAGE
TABLE OF CONTENTS
CHAPTERS
1) Cluster Analysis
2) Curve Fitting
3) Design of Experiments
4) Diagnostic Tests
5) Distribution Fitting
6) Forecasting
7) Group-Sequential
8) Item Analysis
9) Meta-Analysis
10) Method Comparison
11) Mixed Models
12) Multivariate Analysis
13) Non-Parametric
14) Operations Research
15) Proportions
16) Quality Control
17) Reference Intervals
18) Reliability
19) Survival Analysis
20) Time Series
21) Two-Way Tables
1. CLUSTER ANALYSIS
Cluster analysis is a statistical technique that sorts various items into sets so that the similarity between two objects is maximized if they are members of the same group and minimized otherwise. It is a technique for generating hypotheses rather than testing them. This statistical tool can be used whenever it is necessary to divide vast volumes of data into a manageable number of meaningful groups.
https://www.britannica.com/science/psychoanalysis
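The grouping idea above can be illustrated with a minimal one-dimensional k-means sketch; the data points and starting centers are made up for illustration, and k-means is only one of many clustering algorithms.

```python
# A minimal 1-D k-means sketch (hypothetical data) illustrating how cluster
# analysis groups similar items together and separates dissimilar ones.

def kmeans_1d(points, centers, iters=20):
    """Assign each point to its nearest center, then move each center
    to the mean of its assigned points; repeat."""
    for _ in range(iters):
        clusters = {c: [] for c in range(len(centers))}
        for p in points:
            nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        centers = [sum(v) / len(v) if v else centers[i]
                   for i, v in clusters.items()]
    return centers, clusters

data = [1.0, 1.2, 0.8, 10.0, 10.5, 9.7]      # two obvious groups
centers, clusters = kmeans_1d(data, [0.0, 5.0])
print(sorted(round(c, 1) for c in centers))   # the two cluster means
```

The algorithm converges here in a couple of iterations because the two groups are well separated; real data usually needs a more careful choice of the number of clusters.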
2. CURVE FITTING
Glen (2018) described curve fitting as finding the function that best describes the spread of the data over its full range, in her entry for the Probability and Statistics Topic Index. A fitted curve should identify patterns in the information and support predictions about how the data series will behave going forward. There are thus two (2) different approaches to curve fitting: interpolation, which involves finding a function that passes exactly through the data points, and smoothing, which involves finding a function that only approximately fits the data points while allowing for error. By determining whether the dots on a scatter plot adhere to a linear, exponential, quadratic, or other function, we can use this statistical tool to characterize the relationship between two variables.
Glen, S. (2018). Curve Fitting from Statistics How To: Elementary Statistics for the rest
of us. https://www.statisticshowto.com/curve-fitting/
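A straight-line least-squares fit is the simplest smoothing-style curve fit; the sketch below uses the closed-form slope and intercept formulas on made-up data points.

```python
# A least-squares straight-line fit (a "smoothing" curve fit) computed from
# the closed-form formulas; the data points are hypothetical.

def linear_fit(xs, ys):
    """Return slope and intercept of the least-squares line y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - a * mean_x
    return a, b

xs = [0, 1, 2, 3, 4]
ys = [1.1, 2.9, 5.2, 7.1, 8.8]       # roughly y = 2x + 1 with noise
a, b = linear_fit(xs, ys)
print(round(a, 2), round(b, 2))
```

If the scatter plot suggested a quadratic or exponential shape instead, the same least-squares idea would be applied to that functional form.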
3. DESIGN OF EXPERIMENTS
Planning, carrying out, analyzing, and interpreting controlled tests to assess the
variables that determine the value of a parameter or group of parameters is the focus of
the applied statistics subfield known as the design of experiments. This statistical tool is
employed to (1) ascertain whether a factor or set of factors, has an impact on the
response; (2) ascertain whether factors interact in their impact on the response; (3)
model the behavior of the response as a function of the factors; and (4) optimize the response.
an-essential-tool-for-quality-improvement/
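The starting point of many designed experiments is enumerating every combination of factor levels. The sketch below builds a 2^3 full-factorial design; the factor names and levels are hypothetical.

```python
# A sketch of a 2^3 full-factorial design: every combination of three
# two-level factors. The factor names are made up for illustration.
from itertools import product

factors = {
    "temperature": ["low", "high"],
    "pressure":    ["low", "high"],
    "catalyst":    ["A", "B"],
}

# one dict per experimental run, covering all 2 * 2 * 2 = 8 combinations
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))
print(runs[0])
```

Analyzing the responses measured at these runs is what reveals main effects and interactions between the factors.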
4. DIAGNOSTIC TESTS
According to Altman and Bland (2016), diagnostic tests are often created to
identify certain conditions or features. In order to diagnose diabetes, for instance, a test
would check for physical signs of the disease (such as high glucose blood levels at a
specific time after ingesting a controlled amount of sugar). When utilized, this statistical
method can be used to discover particular strengths and shortcomings among other
things. Another use for diagnostic tests is to identify the root of a behavior or feature. To
find out if a kid has dyslexia or a visual impairment, for example, diagnostic testing may be employed.
Altman, D. G., & Bland, J. M. (2016). Statistics Notes: Diagnostic tests 2: Predictive values.
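The standard accuracy measures of a diagnostic test can be computed directly from a 2x2 table of test results against true status; the counts below are hypothetical.

```python
# Core accuracy measures for a diagnostic test, computed from a hypothetical
# 2x2 table of test results versus true disease status.

def diagnostic_measures(tp, fp, fn, tn):
    """Return sensitivity, specificity, and positive predictive value."""
    sensitivity = tp / (tp + fn)   # how often the test catches the disease
    specificity = tn / (tn + fp)   # how often it clears the healthy
    ppv = tp / (tp + fp)           # chance a positive result is a true case
    return sensitivity, specificity, ppv

# hypothetical counts: 90 true positives, 10 false positives, etc.
sens, spec, ppv = diagnostic_measures(tp=90, fp=10, fn=10, tn=890)
print(round(sens, 2), round(spec, 2), round(ppv, 2))
```

Note that the predictive values, unlike sensitivity and specificity, depend on how common the condition is in the tested population, which is the point of the Altman and Bland note cited above.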
5. DISTRIBUTION FITTING
Distribution fitting is used to determine whether a data set's distribution significantly deviates from an anticipated distribution. Once the type of distribution is selected, fitting consists of estimating the distribution's parameters in such a way that the sample is as likely as possible (in terms of maximum likelihood) or that at least some sample statistics (mean and variance, for example) are as similar to the distribution's statistics as possible. The normal, gamma, Weibull, and smallest extreme value distributions are a few examples of statistical distributions. You might use this tool, for instance, when attempting to assess the process capability of your process.
https://www.spcforexcel.com/knowledge/basic-statistics/distribution-fitting.
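For the normal distribution, the maximum-likelihood estimates happen to be the sample mean and the (population) standard deviation, so fitting reduces to two sums; the sample below is made up for illustration.

```python
# Fitting a normal distribution by maximum likelihood: for the normal, the
# MLEs are simply the sample mean and the (population) standard deviation.
# The measurements below are hypothetical.
import math

sample = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0, 4.7, 5.3]
n = len(sample)
mu_hat = sum(sample) / n
sigma_hat = math.sqrt(sum((x - mu_hat) ** 2 for x in sample) / n)
print(round(mu_hat, 2), round(sigma_hat, 2))
```

For distributions like the gamma or Weibull there is no such closed form, and the likelihood must be maximized numerically.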
6. FORECASTING
Singh (2018) defined the term "forecasting" as the projection of what might occur in the future using statistics based on historical data. Simply said, forecasting is
the process of creating predictions based on historical and current data, most frequently
via trend analysis. This can be done with any quantitative data, including sales, GDP,
property sales, stock market performance, etc. For instance, if one wanted to predict how much snow would fall in 2016–17, a simple approach would be to claim that roughly the same amount would fall as in past winters.
https://blog.arkieva.com/statistical-forecasting-definition/
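The "same as recent history" idea corresponds to the simplest statistical forecast, a moving average; the sales figures below are hypothetical.

```python
# A naive moving-average forecast: predict the next value as the mean of the
# last k observations. The sales figures are hypothetical.

def moving_average_forecast(history, k=3):
    """Forecast the next period as the average of the most recent k values."""
    window = history[-k:]
    return sum(window) / len(window)

sales = [100, 120, 110, 130, 125, 135]
print(moving_average_forecast(sales, k=3))   # (130 + 125 + 135) / 3
```

Trend analysis and more elaborate models refine this idea, but the principle of projecting forward from historical data is the same.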
7. GROUP SEQUENTIAL
Group-sequential analysis allows hypothesis tests to be conducted before all of the data has been obtained when the data is accumulated over time, with interim analyses carried out at planned points during the data collection process. When employed, this statistical tool is mostly useful for designing clinical trials. It entails that the trial's sample size is not fixed in advance; instead, the accumulating data are analyzed at interim times. In actuality, group sequential analysis can arrive at a conclusion far sooner than a classical design would allow. As a result, it can save time and money and limit the number of participants exposed to an inferior treatment.
analysis-in-ncss/.
8. ITEM ANALYSIS
Item analysis employs statistics and professional judgment to assess the quality
of individual items, item sets, and complete sets of items as well as the relationships
between individual items. This statistical method can be used to examine how well specific items performed with respect to other items on the test or in relation to some external criterion. This knowledge is put to use to raise test and item quality.
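Two of the most common item statistics are item difficulty (the proportion of examinees answering correctly) and a simple discrimination index (difficulty in the top-scoring half minus difficulty in the bottom half). The 0/1 item scores below are hypothetical.

```python
# Two classic item-analysis statistics on hypothetical 0/1 scored responses:
# item difficulty (proportion correct) and a simple discrimination index
# (difficulty among top-half scorers minus difficulty among bottom-half).

def item_stats(responses, item):
    """responses: one list of 0/1 item scores per student."""
    totals = [sum(r) for r in responses]
    order = sorted(range(len(responses)), key=lambda i: totals[i])
    half = len(responses) // 2
    low, high = order[:half], order[-half:]
    difficulty = sum(r[item] for r in responses) / len(responses)
    discrimination = (sum(responses[i][item] for i in high) -
                      sum(responses[i][item] for i in low)) / half
    return difficulty, discrimination

responses = [
    [1, 1, 1],   # strong student
    [1, 1, 0],
    [1, 0, 0],
    [0, 0, 0],   # weak student
]
diff, disc = item_stats(responses, item=0)
print(diff, disc)
```

A positive discrimination index indicates that stronger students tend to answer the item correctly more often than weaker students, which is what a good item should do.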
9. META-ANALYSIS
Meta-analysis is the statistical combination of results from two or more independent studies. It can be done when an issue is the subject of several scientific studies, each of which reports measurements that are likely to include some inaccuracy. The objective is then to utilize statistical methods to determine the pooled estimate that is closest to the unobserved shared truth, based on how this inaccuracy is modeled. Meta-analyses are frequently, but not always, important components of a systematic review. To better understand how effectively a medical treatment works, for instance, a meta-analysis of many clinical trials of the treatment may be carried out.
https://training.cochrane.org/handbook/current/chapter-10.
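The most basic pooling rule is the fixed-effect inverse-variance method: each study's effect is weighted by the reciprocal of its variance. The effect sizes and variances below are made up for illustration; real meta-analyses also consider random-effects models and heterogeneity.

```python
# Fixed-effect inverse-variance pooling: each study's effect estimate is
# weighted by 1/variance and combined. The effects and variances are
# hypothetical.
import math

def pooled_effect(effects, variances):
    """Return the inverse-variance weighted mean and its standard error."""
    weights = [1 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return pooled, se

effects = [0.4, 0.6, 0.5]          # hypothetical study effect sizes
variances = [0.04, 0.08, 0.02]     # their sampling variances
est, se = pooled_effect(effects, variances)
print(round(est, 3), round(se, 3))
```

The pooled standard error is smaller than any single study's standard error, which is exactly why combining studies sharpens the estimate.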
10. METHOD COMPARISON
According to Montenij et al. (2016), method comparison research, which is particularly relevant to cardiac output (CO) monitoring, tries to assess the reliability of a new measurement technique against an established reference method. Further studies are required to assess the degree to which novel monitors modify hemodynamic treatment, impact patient outcome, and affect cost-effectiveness once validity has been established in method comparison research.
Montenij, L. J., Buhre, W. F., Jansen, J. R., Kruitwagen, C. L., & de Waal, E. E. (2016). Methodology of method comparison studies evaluating the validity of cardiac output monitors: a stepwise approach and checklist. British Journal of Anaesthesia, 116(6), 750–758. https://doi.org/10.1093/bja/aew094
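A core tool in method comparison studies is Bland-Altman analysis: the mean difference (bias) between paired measurements and its 95% limits of agreement. The paired cardiac-output-like readings below are hypothetical.

```python
# Bland-Altman statistics for comparing two measurement methods: the mean
# difference (bias) and the 95% limits of agreement. The paired readings
# are hypothetical values from two monitors.
import math

def bland_altman(a, b):
    """Return bias and the lower/upper 95% limits of agreement."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

monitor_a = [5.0, 4.2, 6.1, 5.5, 4.8]
monitor_b = [4.8, 4.0, 6.0, 5.2, 4.5]
bias, lower, upper = bland_altman(monitor_a, monitor_b)
print(round(bias, 2), round(lower, 2), round(upper, 2))
```

Whether the limits of agreement are acceptable is a clinical judgment, not a statistical one, which is a central point of the stepwise approach in the cited checklist.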
11. MIXED MODELS
According to Vyawahare (2019), a statistical model that includes both fixed and
random effects is known as a mixed model (or, more precisely, a mixed error-component model). Fields across a broad spectrum of the physical, biological, and social sciences can benefit from these models. When your data contain both global and group-level trends, mixed models are a great extension of linear regression models. They are especially helpful in situations where measurements are taken repeatedly on the same statistical units (as in longitudinal studies).
https://medium.com/analytics-vidhya/introduction-to-mixed-models-208f012aa865
12. MULTIVARIATE ANALYSIS
Multivariate analysis involves the observation and analysis of multiple statistical outcome variables at once. The technique is used in design and analysis to carry out trade studies across various dimensions while accounting for the effects of all variables on the relevant responses. Researchers should employ multivariate statistical methods when data sets contain several measurements (variables) for each person or thing under study, since these methods examine all of the variables together.
Gonzales, D., Golpe, R., Marcos, P.J., Mengual, M., (2015). Multivariate analysis in
https://doi.org/10.3978%2Fj.issn.2072-1439.2015.01.43
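A correlation matrix is one of the simplest multivariate summaries, describing every pairwise relationship among several variables at once; the three measurements below are hypothetical.

```python
# A correlation matrix across several variables at once, a basic building
# block of multivariate analysis. The three measurements are hypothetical.
import math

def corr(x, y):
    """Pearson correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

height = [150, 160, 170, 180]
weight = [55, 60, 62, 75]
age    = [30, 25, 40, 35]

variables = [height, weight, age]
matrix = [[round(corr(u, v), 2) for v in variables] for u in variables]
for row in matrix:
    print(row)
```

Full multivariate methods such as principal component analysis or multivariate regression build directly on this matrix of pairwise relationships.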
13. NONPARAMETRIC
Nonparametric methods do not assume that the data come from predetermined models that are based on a limited set of parameters, such as the linear regression model or the normal distribution model. Consider, for example, a researcher who wants to determine whether the typical number of hours of sleep is related to how frequently people get sick. The distribution of illness frequency is non-normal, being right-skewed and outlier-prone, since many people get sick rarely, if at all, while others become sick far more frequently.
https://www.investopedia.com/terms/n/nonparametric-method.asp
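For exactly this sleep-versus-illness question, a rank-based measure such as Spearman's correlation avoids the normality assumption; the sketch below implements it from scratch on made-up data.

```python
# Spearman's rank correlation, a nonparametric alternative to Pearson's r:
# it works on ranks, so it needs no normality assumption. The data are
# hypothetical sleep hours versus yearly illness counts.

def ranks(values):
    """Rank from 1 (smallest), averaging ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation applied to the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

sleep_hours   = [5, 6, 7, 8, 9]
illness_count = [12, 7, 3, 2, 1]    # right-skewed, outlier-prone
print(spearman(sleep_hours, illness_count))
```

Because the relationship in this toy data is perfectly monotone decreasing, the rank correlation is exactly -1, even though the raw counts are far from normally distributed.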
14. OPERATIONS RESEARCH
In operations research, issues are deconstructed into their simplest forms before being mathematically analyzed and then resolved in a series of steps. The process can be roughly broken down as follows: recognize a problem that needs fixing; create a model of the issue that incorporates elements of the real world and its variables; use the model to generate candidate answers to the issue; test each solution on the model and analyze its success; and put the chosen solution into practice on the actual problem.
https://whatis.techtarget.com/definition/operations-research-OR
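The model-then-optimize cycle can be sketched with a toy production-planning problem: two products, one labor constraint, maximize profit. All numbers are hypothetical, and brute force stands in for a proper linear-programming solver.

```python
# A toy operations-research cycle: model a problem (profit of two products
# under a labor constraint), generate candidate solutions, and pick the
# best. All numbers are hypothetical.
from itertools import product

PROFIT = {"chairs": 30, "tables": 50}   # profit per unit
LABOR  = {"chairs": 2,  "tables": 4}    # labor hours per unit
HOURS_AVAILABLE = 40

# brute-force search over feasible plans (up to 20 units of each)
best = max(
    (plan for plan in product(range(21), repeat=2)
     if plan[0] * LABOR["chairs"] + plan[1] * LABOR["tables"] <= HOURS_AVAILABLE),
    key=lambda plan: plan[0] * PROFIT["chairs"] + plan[1] * PROFIT["tables"],
)
chairs, tables = best
print(chairs, tables, chairs * PROFIT["chairs"] + tables * PROFIT["tables"])
```

At realistic scale this search would be replaced by linear or integer programming, but the steps, model, solve, evaluate, are the same.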
15. PROPORTIONS
A proportion is the portion of the whole that has a particular quality. An illustration is the proportion of deaths among men, which is calculated by dividing deaths of men by deaths of men and women (i.e., all deaths). Consider a sample of four pets: a dog, a cat, a fish, and a bird. We could ask what proportion has four legs. Only the dog and the cat are four-legged pets, so the proportion of pets with four legs is 2 out of 4, or 0.5.
https://stattrek.com/statistics/dictionary.aspx?definition=proportion
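The pet example translates directly into code; the sketch also adds the standard Wald approximation for a 95% confidence interval around the proportion (a crude approximation at such a tiny sample size).

```python
# Computing a sample proportion and a simple (Wald) 95% confidence interval,
# p +/- 1.96 * sqrt(p * (1 - p) / n). The pet sample mirrors the text.
import math

pets = ["dog", "cat", "fish", "bird"]
four_legged = {"dog", "cat"}

n = len(pets)
p = sum(1 for pet in pets if pet in four_legged) / n
se = math.sqrt(p * (1 - p) / n)
print(p, round(p - 1.96 * se, 3), round(p + 1.96 * se, 3))
```

With only four observations the interval is nearly the whole range from 0 to 1, which illustrates why proportion estimates need reasonable sample sizes.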
16. QUALITY CONTROL
Statistical quality control refers to monitoring and preserving the quality of goods and services through the use of statistical methods. One approach, acceptance sampling, decides whether to accept or reject a batch of items based on the quality found in a sample. A second approach, known as statistical process control, makes decisions about whether to continue a process or modify it in order to attain the target quality using graphical displays called control charts.
https://www.britannica.com/facts/statistical-quality-control
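A Shewhart-style control chart reduces to two control limits, mean plus and minus three standard deviations of an in-control baseline; new measurements outside them are flagged. The process data are hypothetical.

```python
# A basic Shewhart-style control-chart check: establish limits from
# in-control baseline data, then flag new measurements outside
# mean +/- 3 standard deviations. The process data are hypothetical.
import statistics

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9]  # in-control history
mean = statistics.mean(baseline)
sd = statistics.pstdev(baseline)
ucl, lcl = mean + 3 * sd, mean - 3 * sd   # upper / lower control limits

new_points = [10.1, 9.9, 12.5]
out_of_control = [x for x in new_points if x > ucl or x < lcl]
print(out_of_control)
```

Real control charts for subgrouped data estimate the limits from within-subgroup variation rather than a plain standard deviation, but the decision rule is the same.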
17. REFERENCE INTERVALS
Reference intervals describe the range of values expected in healthy individuals of a reference population, for example fish raised in culture. Hematological parameters in teleost fish are mostly defined in terms of erythrocyte and leukocyte parameters. Erythrocytes are the primary blood cells in fish, both because of their abundance and because of the crucial functional role they play in oxygen transport.
Witeska, M., (2013). Erythrocytes in teleost fishes: a review. Zool. Ecol. 23, 275—281.
https://doi.org/10.1080/21658005.2013.846963.
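A reference interval is commonly taken as the central 95% of a healthy reference sample, i.e. the 2.5th to 97.5th percentiles. The values below are simulated from a normal distribution, not real fish hematology data.

```python
# A nonparametric reference interval: the central 95% of a healthy reference
# sample (2.5th to 97.5th percentile). The values are simulated, not real.
import random

random.seed(42)
healthy_sample = sorted(random.gauss(100, 10) for _ in range(1000))

def percentile(sorted_vals, q):
    """Nearest-rank percentile on an already sorted list (0 <= q <= 1)."""
    idx = min(len(sorted_vals) - 1, max(0, round(q * (len(sorted_vals) - 1))))
    return sorted_vals[idx]

lower = percentile(healthy_sample, 0.025)
upper = percentile(healthy_sample, 0.975)
print(round(lower), round(upper))   # roughly 100 +/- 2 * 10
```

Guidelines typically recommend at least 120 reference individuals for this nonparametric approach, since the extreme percentiles are estimated poorly from small samples.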
18. RELIABILITY
When a measurement consistently yields similar results, it is considered to have high reliability. Reliability is a trait of a set of test results that concerns how much random error from the measuring process might be present in the results. Highly reliable test results are precise, reproducible, and consistent from one testing session to the next. That is, roughly the same results would be produced if the testing procedure were repeated.
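One widely used reliability coefficient is Cronbach's alpha, which measures the internal consistency of a set of test items; the item scores below are hypothetical.

```python
# Cronbach's alpha, a standard internal-consistency reliability coefficient:
# alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores).
# The item scores are hypothetical.
import statistics

def cronbach_alpha(items):
    """items: one list of scores per test item (respondents in same order)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_vars = sum(statistics.pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_vars / statistics.pvariance(totals))

item1 = [4, 3, 5, 2, 4]
item2 = [4, 2, 5, 1, 4]
item3 = [3, 3, 5, 2, 5]
alpha = cronbach_alpha([item1, item2, item3])
print(round(alpha, 2))
```

Values near 1 indicate that the items move together across respondents, i.e. low random measurement error; a common rule of thumb treats alpha above roughly 0.7 as acceptable.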
19. SURVIVAL ANALYSIS
The purpose of survival analysis is to examine the time leading up to an event rather than the event itself. This period of interest is also called the failure time or survival time. The intervals of time that can be employed in a survival analysis include days, weeks, months, years, etc. Longer studies are typically favored for analysis since they present stronger evidence and more trustworthy findings. Some events, however, are almost impossible to monitor over an extended length of time. For instance, researchers may find a relatively low median survival time in a study of pancreatic cancer, one of the deadliest and fastest-growing types of cancer, which may mean that half of the patients survive no longer than that time.
Etikan I., Bukirova K., & Yuvali M.(2018). Choosing statistical tests for survival analysis.
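The workhorse of survival analysis is the Kaplan-Meier estimator, which lowers the survival probability at each event time while letting censored subjects (those lost to follow-up) leave the risk set; the (time, event) pairs below are hypothetical and assumed to have distinct times.

```python
# A minimal Kaplan-Meier estimator: survival drops at each event time by
# a factor (1 - deaths/at-risk); censored subjects leave the risk set
# without an event. The (time, event) pairs are hypothetical, with
# distinct event times assumed for simplicity.

def kaplan_meier(observations):
    """observations: (time, event) pairs; event=1 death, event=0 censored."""
    at_risk = len(observations)
    survival = 1.0
    curve = []
    for time, event in sorted(observations):
        if event == 1:
            survival *= 1 - 1 / at_risk
            curve.append((time, round(survival, 3)))
        at_risk -= 1
    return curve

data = [(2, 1), (3, 0), (5, 1), (7, 1), (8, 0)]
print(kaplan_meier(data))
```

The censored subjects at times 3 and 8 still contribute information while they are under observation, which is what distinguishes survival analysis from simply averaging event times.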
20. TIME SERIES
According to Hayes (2022), a time series is a collection of data points that appear in a particular order over a certain amount of time. This can be contrasted with cross-sectional data, which records a single point in time. An investment time series records data points at regular intervals and charts the movement of the selected data points, such as the price of an asset, over a given period of time. Because the data are collected at regular intervals, they give the investor or analyst studying the activity the information they need.
Hayes, A. (2022). What Is a Time Series and How Is It Used to Analyze Data?.
Investopedia. https://www.investopedia.com/terms/t/timeseries.asp
21. TWO-WAY TABLES
Contingency tables and cross-tabulations are other names for two-way tables. The levels of one categorical variable are entered as the rows of the table and the levels of the other categorical variable as the columns. If Variable 1 has r levels and Variable 2 has c levels, the data can be condensed into an r × c contingency table. The counts (i.e., frequencies) for each combination of levels of the two categorical variables are entered into the corresponding cells of the table. The output of a two-way table frequently includes percentages of individual rows, columns, and the overall sum (NCSS Data Analysis & Graphics, 2022).
NCSS Data Analysis & Graphics. (2022). Analysis of Two-Way Tables in NCSS.
https://www.ncss.com/software/ncss/analysis-of-two-way-tables-in-ncss/
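Building an r × c table from paired categorical observations is a one-line counting job; the survey records below are made up for illustration.

```python
# Building a 2x2 two-way (contingency) table from paired categorical data
# with a plain Counter. The survey records are hypothetical.
from collections import Counter

records = [
    ("male", "yes"), ("male", "no"), ("female", "yes"),
    ("female", "yes"), ("male", "no"), ("female", "no"),
    ("male", "yes"), ("female", "yes"),
]

counts = Counter(records)
rows, cols = ["male", "female"], ["yes", "no"]
table = [[counts[(r, c)] for c in cols] for r in rows]

for r, row in zip(rows, table):
    print(r, row, "row total:", sum(row))
```

From this table of counts, row and column percentages, and tests of association such as chi-square, follow directly.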