Republic of the Philippines

Department of Education
Caraga Administrative Region
Division of Surigao del Norte

REVIEWER IN PRACTICAL RESEARCH II
S.Y. 2022 - 2023

Probability Sampling Technique

Simple Random Sampling. The chance of selection is the same for every member of the population. The sample is drawn so that all elements have an equal chance of being selected. It is a way of choosing individuals in which all members of the accessible population are given an equal chance to be selected. It is the sampling method that chooses respondents by pure chance.

Systematic Random Sampling. It refers to sampling that follows regular intervals from a list. It may spread the selected samples more evenly across the entire population than simple random sampling.

Stratified Random Sampling. The population is divided into groups called strata, and then simple random sampling is applied in selecting samples from each group.

Cluster Sampling. Used for large-scale surveys and when the target respondents in a research study are spread across a GEOGRAPHICAL LOCATION. The population is grouped into what we call CLUSTERS, and simple random sampling is used in selecting the clusters.
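To make the probability sampling techniques above concrete, here is a minimal sketch in Python using only the standard random module; the 100-member population, the two strata labels, and the sample size of 10 are invented values for illustration.

```python
# Illustrative sketch only: toy versions of the probability sampling
# techniques described above, using Python's standard library.
import random

population = list(range(1, 101))   # hypothetical sampling frame of 100 members
sample_size = 10

# Simple random sampling: every member has an equal chance of selection.
simple_sample = random.sample(population, sample_size)

# Systematic random sampling: take every k-th member from a random start.
k = len(population) // sample_size
start = random.randrange(k)
systematic_sample = population[start::k]

# Stratified random sampling: divide the frame into strata, then draw a
# simple random sample from each stratum.
strata = {
    "Grade 11": list(range(1, 51)),     # hypothetical stratum
    "Grade 12": list(range(51, 101)),   # hypothetical stratum
}
stratified_sample = [
    member
    for group in strata.values()
    for member in random.sample(group, sample_size // len(strata))
]

print(simple_sample)
print(systematic_sample)
print(stratified_sample)
```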
Non-Probability Sampling Technique

Convenience sampling (also called haphazard, grab, opportunity, or accidental sampling). Convenience sampling is a common type of non-probability sampling where you choose participants for a sample based on their convenience and availability.

Convenience sampling also has two subtypes:
1. Consecutive sampling (also known as total enumerative sampling) is the process of doing research with the sample members that meet the inclusion criteria and are conveniently available.
2. Self-selection (also known as volunteer sampling) is a sampling technique that uses volunteers to fill in the sample size until it reaches a specified amount.

Quota sampling (also known as dimension sampling) is a non-probability sampling technique like stratified sampling. In this method, the population is split into segments (strata) and you have to fill a quota based on people who match the characteristics of each stratum.

There are two types of quota sampling:
1. Proportional quota sampling gives proportional numbers that represent segments in the wider population. For this, the population frame must be known.
2. Non-proportional quota sampling uses strata to divide a population, though only the minimum sample size per stratum is decided.

Snowball sampling (also known as referral, respondent-driven, or chain referral sampling) is a non-probability sampling type that mimics a pyramid system in its selection pattern. You choose early sample participants, who then go on to recruit further sample participants until the sample size has been reached.

Purposive sampling (also known as judgmental, selective, or subjective sampling) is a type of non-probability sampling where you make a conscious decision on what the sample needs to include and choose participants accordingly.

Availability sampling is the sampling method that allows researchers to pick out people who are easy to find or locate and who are willing to establish contact with them.

DEFINITION OF FINDINGS

Sustained: Misconduct is substantiated. Misconduct may be as stated in the complaint or as a result of investigation.

Exonerated: Incident occurred; officer acted lawfully and properly.

Unfounded: Incident cited in the complaint is falsely reported or not factual.

Not Sustained: Insufficient evidence exists to prove or disprove the complaint.

Voluntarily Withdrawn: Complainant decided not to pursue the original allegation(s) and requested that the case file be closed and not investigated further.
Refused to Cooperate: Complainant and/or witnesses refuse to assist the investigators handling the case.

Instrument is the general term that researchers use for a measurement device (survey, test, questionnaire, etc.). To help distinguish between instrument and instrumentation, consider that the instrument is the device and instrumentation is the course of action (the process of developing, testing, and using the device).

Validity - A research instrument is considered valid if it measures what it is supposed to measure.

Types of Validity of Instrument

Face Validity. It is also known as "logical validity." It calls for an intuitive judgment of the instrument as it "appears." Just by looking at the instrument, the researcher decides if it is valid.

Content Validity. An instrument that is judged with content validity meets the objectives of the study. It is done by checking whether the statements or questions elicit the needed information.

Construct Validity. It refers to the validity of the instrument as it corresponds to the theoretical construct of the study. It concerns whether a specific measure relates to other measures.

Concurrent Validity. When the instrument can predict results like those of similar tests already validated, it has concurrent validity.

Predictive Validity. When the instrument can produce results like those of similar tests that will be employed in the future, it has predictive validity. This is particularly useful for aptitude tests.

Reliability refers to the consistency of the measures or results of the instrument.

Reliability of Instrument

Test-retest Reliability. It is achieved by giving the same test to the same group of respondents twice. The consistency of the two scores will be checked.

Equivalent Forms Reliability. It is established by administering two tests, identical except for wording, to the same group of respondents.

Internal Consistency Reliability. It determines how well the items measure the same construct. It is reasonable that when a respondent gets a high score in one item, he will also get one in similar items. There are three ways to measure internal consistency: the split-half coefficient, Cronbach's alpha, and the Kuder-Richardson formula.
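As an illustration of internal consistency, the sketch below computes Cronbach's alpha from a small invented table of item scores, using the standard formula alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores).

```python
# Illustrative sketch only: Cronbach's alpha, one of the internal-consistency
# measures mentioned above, computed on made-up scores.
from statistics import pvariance

# Rows = respondents, columns = items of the instrument (hypothetical data).
scores = [
    [4, 5, 4, 3],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
]

k = len(scores[0])                                   # number of items
item_variances = [pvariance(item) for item in zip(*scores)]
total_variance = pvariance([sum(row) for row in scores])

# Cronbach's alpha = (k / (k - 1)) * (1 - sum of item variances / total variance)
alpha = (k / (k - 1)) * (1 - sum(item_variances) / total_variance)
print(round(alpha, 2))
```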
Survey. Data gathering is done through interview or questionnaire. By means of a questionnaire, you use a series of questions or statements that respondents have to answer. Basically, respondents write or choose their answers from given choices.

Interview is when you ask respondents orally to tell you the responses.

A questionnaire is a research instrument that consists of a set of questions that aims to collect information from a respondent. A research questionnaire is typically a mix of close-ended questions and open-ended questions.

A questionnaire is an instrument used to collect data, while a survey is a process of collecting, recording, and analyzing data. Questionnaires can be structured, semi-structured, or unstructured.

There are three structures of questionnaires:
1. Structured questionnaires employ closed-ended questions.
2. Unstructured questionnaires, on the other hand, use open-ended questions.
3. Semi-structured questionnaires are combinations of both the structured and unstructured ones.

Advantages of Using Questionnaire
1. Bulk data can be gathered in less time.
2. Online surveys are quick and cost-effective.
3. Less chance of bias.
4. Respondents can answer the questionnaire without revealing their identity.
5. Easy analysis and visualization.
Disadvantages of Using Questionnaire
1. Questionnaires may not be returned on time.
2. Questionnaires may be lost.
3. Understanding and interpretation of the questions vary among participants.
4. Participants may not be able to complete the required responses.
5. Emotions and feelings are hard to convey.
6. Participants' answers may lack depth.

Frequency distribution - It gives you the frequency and percentage of the occurrence of an item in a set of data.

A frequency distribution of qualitative data is a table of distinct values and their frequencies; it is valuable since it shows the values of the observations as well as the frequency with which each occurs.
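A frequency and percentage distribution like the one described above can be tabulated with a few lines of Python; the survey responses below are invented for illustration.

```python
# Illustrative sketch only: frequency and percentage distribution of
# made-up qualitative responses.
from collections import Counter

responses = ["Agree", "Agree", "Neutral", "Disagree", "Agree",
             "Neutral", "Agree", "Disagree", "Agree", "Neutral"]

counts = Counter(responses)
total = len(responses)

print(f"{'Response':<10}{'Frequency':>10}{'Percentage':>12}")
for value, frequency in counts.most_common():
    print(f"{value:<10}{frequency:>10}{frequency / total:>11.0%}")
```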
A measure of central tendency is a single value that attempts to describe a set of data by identifying the central position within that set of data. As such, measures of central tendency are sometimes called measures of central location.

A quantitative distribution is typically ordered from the smallest count to the largest count and represented graphically so we can eventually examine the shape, center, and spread of the data.

Methods and Techniques of Quantitative Data Analysis

Descriptive statistics, as the name implies, is used to describe a dataset. It helps you understand the details of your data by summarizing it and finding patterns from the specific data sample. It is a quantitative data-analysis technique that provides a summary of the orderly or sequential data obtained from the sample through the data-gathering instrument used.

Inferential statistics aim to make predictions or highlight possible outcomes from the analyzed data obtained from descriptive statistics. They are used to generalize results and make predictions between groups, show relationships that exist between multiple variables, and test hypotheses that predict changes or differences.

Experimental Statistical Technique - There is more than one possible outcome, we can specify each possible outcome in advance, and there is an element of chance. This term is generally used for controlled experiments. Like a coin toss, rolling dice is a statistical experiment.

Exploratory data analysis is a technique data scientists use to identify patterns and trends in a data set. They can also use it to determine relationships among samples in a population, validate assumptions, test hypotheses, and find missing data points. Companies can use exploratory data analysis to draw insights from data and check data for errors.

Measures of central tendency help you find the middle, or the average, of a dataset. The 3 most common measures of central tendency are the mode, median, and mean.

The mode is the most frequently occurring value in the dataset. It's possible to have no mode, one mode, or more than one mode.

The median of a dataset is the value that's exactly in the middle when it is ordered from low to high.

The arithmetic mean of a dataset (which is different from the geometric mean) is the sum of all values divided by the total number of values. It's the most used measure of central tendency because all values are used in the calculation.

Standard deviation (or σ) is a measure of how dispersed the data is in relation to the mean. A low standard deviation means data are clustered around the mean, and a high standard deviation indicates data are more spread out.
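The mode, median, mean, and standard deviation described above can be computed with Python's built-in statistics module; the test scores below are made-up sample data.

```python
# Illustrative sketch only: measures of central tendency and the standard
# deviation, computed on made-up test scores.
import statistics

scores = [78, 82, 85, 85, 90, 91, 85, 76, 88, 80]

print("Mode:", statistics.mode(scores))      # most frequently occurring value
print("Median:", statistics.median(scores))  # middle value when ordered low to high
print("Mean:", statistics.mean(scores))      # sum of all values / number of values
print("Standard deviation:", round(statistics.pstdev(scores), 2))  # spread around the mean
```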
Research Locale - This discusses the place or setting of the study. It describes in brief the place where the study is conducted. Only important features which have a bearing on the present study are included. It also shows the target population.

Research Design - This describes the research mode, whether it is a true experimental or quasi-experimental design, descriptive or survey research, historical research, qualitative research, ethnographic research, etc.
Validity and Reliability of Research Instrument

1. This explains the specific type of research instrument used, such as a questionnaire, checklist, questionnaire-checklist, structured interview, teacher-made test, or standardized instrument adopted or borrowed with permission from the author or from other sources.
2. The parts of the instrument should be explained, along with what bits of information are derived from each.
3. The establishment of validity and reliability should be explained, and only experts should be chosen to validate the instrument. The specific and appropriate statistical test used should be given, together with the computed values derived. Interpretation should be included in the discussion.
4. Equations used should also be presented in the Appendices of the research paper.

Observation is a way of gathering data which involves systematically selecting, watching, listening, reading, touching, and recording the behavior and characteristics of living beings, objects, or phenomena. Observations can be controlled, natural, or participant. It can be used in quantitative research when the observable characteristics are quantitative in nature (e.g., length, width, height, weight, volume, area, temperature, cost, level, age, time, and speed). Observation involves tracking changes during a specified period.

Participant Observation is a form of observation wherein the researcher becomes a complete observer or a participant in the study through the experience of spending time with a group of people and closely observing their actions, speech patterns, and norms, from which the researcher can gain an understanding. It allows the observer to become a member of the group or community that the participants belong to. It can be performed covertly (i.e., participants are not aware of the purpose behind the observation) or overtly, wherein participants know the intention or objectives of the observation.

"Data collection" refers to the process of gathering, measuring, and analyzing correct insights for research purposes through the use of established, approved methodologies. The primary goal of data collection is to ensure that sufficient information-rich and reliable data is collected for statistical analysis so that data-driven research decisions can be made.

Editing raw data is the first step in analysis. Editing detects errors and omissions and corrects them wherever possible. The editor's responsibility is to guarantee that data are accurate; consistent with the intent of the questionnaire; uniformly entered; complete; and arranged to simplify coding and tabulation.
Coding refers to the process of assigning numerals or other symbols to answers so that responses can be put into a limited number of categories or classes. Such classes should be appropriate to the research problem under consideration. Coding is a process wherein the collected data are categorized and organized. It is usually done in qualitative research. In quantitative research, coding is done to assign a numerical value to a specific indicator, especially if it is qualitative in nature.

Tabulation is a system of processing data or information by arranging it into a table. With tabulation, numeric data are arrayed logically and systematically into columns and rows to aid in their statistical analysis.
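As a small illustration of coding and tabulation, the sketch below assigns numerals to verbal answers using a hypothetical four-point coding scheme and then arranges the coded responses into a simple frequency table.

```python
# Illustrative sketch only: coding verbal responses into numerals, then
# tabulating them. The coding scheme and responses are hypothetical.
from collections import Counter

coding_scheme = {"Strongly Agree": 4, "Agree": 3, "Disagree": 2, "Strongly Disagree": 1}

raw_responses = ["Agree", "Strongly Agree", "Agree", "Disagree",
                 "Agree", "Strongly Disagree", "Strongly Agree"]

# Coding: assign a numeral to each verbal answer.
coded = [coding_scheme[answer] for answer in raw_responses]

# Tabulation: arrange the coded data into columns and rows for analysis.
counts = Counter(coded)
print(f"{'Code':<6}{'Response':<20}{'Frequency':>10}")
for label, code in coding_scheme.items():
    print(f"{code:<6}{label:<20}{counts.get(code, 0):>10}")
```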
Presentation and Interpretation of Data

Table helps summarize and categorize data using columns and rows. It contains headings that indicate the most important information about your study.

Graphs are visual representations which focus on how a change in one variable affects another. They are used to organize information to show patterns and relationships. A graph shows this information by representing it as a shape.

Line Graph illustrates trends and changes in data over time, Bar Graph illustrates comparisons of amounts and quantities, while Pie Graph (Circle Graph) displays the relationship of parts to a whole.
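The three graph types above could be drawn, for example, with the matplotlib library (assumed to be installed); the monthly figures below are invented for illustration.

```python
# Illustrative sketch only: a line graph, bar graph, and pie graph drawn
# side by side from made-up monthly data.
import matplotlib.pyplot as plt

months = ["Jun", "Jul", "Aug", "Sep"]
enrollees = [120, 135, 150, 160]   # hypothetical counts

fig, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(12, 3))

ax1.plot(months, enrollees)        # line graph: trends and changes over time
ax1.set_title("Line Graph")

ax2.bar(months, enrollees)         # bar graph: comparison of amounts
ax2.set_title("Bar Graph")

ax3.pie(enrollees, labels=months)  # pie graph: relationship of parts to a whole
ax3.set_title("Pie Graph")

plt.tight_layout()
plt.show()
```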
A vertical bar graph is a data representation technique that depicts data using vertical rectangular bars. It makes comparison and data analysis easier.

P-value Approach - This involves determining the probability (assuming the null hypothesis were true) of observing a test statistic more extreme, in the direction of the alternative hypothesis, than the one observed. If the P-value is less than (or equal to) α, then the null hypothesis is rejected in favor of the alternative hypothesis. If the P-value is greater than α, then the null hypothesis is not rejected.

The significance level is the probability of rejecting the null hypothesis when it is true. For example, a significance level of 0.05 indicates a 5% risk of concluding that a difference exists when there is no actual difference. Lower significance levels indicate that you require stronger evidence before you will reject the null hypothesis.
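A sketch of the P-value decision rule: the example below runs an independent-samples t-test on two made-up groups using scipy (assumed to be available) and compares the resulting P-value against a 0.05 significance level.

```python
# Illustrative sketch only: applying the P-value decision rule to a t-test
# on two made-up groups of scores.
from scipy import stats

group_a = [85, 88, 90, 79, 84, 91, 87]
group_b = [78, 82, 80, 75, 83, 79, 81]

alpha = 0.05                                   # significance level (5% risk)
t_statistic, p_value = stats.ttest_ind(group_a, group_b)

if p_value <= alpha:
    print(f"p = {p_value:.4f} <= {alpha}: reject the null hypothesis")
else:
    print(f"p = {p_value:.4f} > {alpha}: do not reject the null hypothesis")
```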
Recommendations of the study are the added suggestions that you want people to follow when performing future studies.

Research Implications are an important part of your conclusion. They refer to the logical relations and their results in each situation. The conclusions you draw from the findings, and how you link those to a specific theory or practice, comprise the implications of the study. There are two forms of implications: practical and theoretical implications.

Practical Implication is also called convenient implication. It is a realistic explanation of what your research findings might mean and what would arise if those circumstances were met.

Conclusions are statements drawn from the findings which present the implications of the results and answer the research questions presented at the beginning of the paper.

A conclusion is the last part of something, its end or result. When you write a paper, you always end by summing up your arguments and drawing a conclusion about what you've been writing about.

In the process of elaborating the way you analyze data to determine whether the hypotheses are true or false, the researcher must consider the following:
A. The researcher must be observant of its content.
B. The researcher must be resourceful in supporting claims.
C. The researcher must be keen enough to scrutinize data.

External validity refers to the extent to which the results of a study are generalizable to patients in our daily practice, especially to the population that the sample is thought to represent.

The strongest linear relationship is indicated by a correlation coefficient of -1 or 1. The weakest linear relationship is indicated by a correlation coefficient equal to 0. A positive correlation means that if one variable gets bigger, the other variable tends to get bigger.
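For illustration, a correlation coefficient can be computed with the standard-library statistics.correlation function (Python 3.10 and later); the paired values below are invented.

```python
# Illustrative sketch only: Pearson's correlation coefficient for two
# made-up variables.
from statistics import correlation   # requires Python 3.10+

hours_studied = [1, 2, 3, 4, 5, 6, 7, 8]
test_scores = [60, 65, 68, 72, 75, 80, 83, 88]

r = correlation(hours_studied, test_scores)
print(round(r, 2))   # close to 1: a strong positive linear relationship
```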

Non-experimental research is research that lacks the manipulation of an independent variable. Rather than manipulating an independent variable, researchers conducting non-experimental research simply measure variables as they naturally occur (in the lab or the real world).

Intervention research is all about learning what treatments or strategies work best to improve outcomes and make a difference in what matters most to you. Although developing and testing interventions can be a challenging and lengthy process, establishing the efficacy of a new intervention or treatment that improves the health of a population you care deeply about is both personally and professionally rewarding.

Quantitative data are expressed through the following forms:
1. Fractions
2. Numbers
3. Percentages
Data Gathering Procedure. You may include your manner of asking permission from the institution or school where you will be conducting the study, how you will obtain informed consent from the participants, and how you will ensure the confidentiality of personal data collected from participants, as well as explain the research purposes of the collected data.

Statistical Tools - You may ask the help of your teacher in Statistics or your research teacher if you get lost in this part.

Data turn to numbers. To analyze quantitative data means to quantify or change the verbally expressed data into numerical information.

The null hypothesis is rejected in a two-tailed test when the test value falls in either of the two critical regions.

A Type I error occurs during hypothesis testing when a null hypothesis is rejected even though it is accurate and should not be rejected. The null hypothesis assumes no cause-and-effect relationship between the tested item and the stimuli applied during the test.
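A minimal sketch of the two-tailed decision rule above, using a made-up test value and the conventional critical z-value of 1.96 at α = 0.05; rejecting a null hypothesis that is actually true would be a Type I error.

```python
# Illustrative sketch only: two-tailed test decision based on critical regions.
alpha = 0.05
critical_value = 1.96   # two-tailed critical z-value at the 0.05 significance level
test_value = 2.31       # hypothetical computed test statistic

# Reject the null hypothesis if the test value falls in either critical region.
if abs(test_value) >= critical_value:
    print("Reject the null hypothesis: the test value falls in a critical region.")
else:
    print("Do not reject the null hypothesis.")
```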

Validity is harder to assess than reliability, but it is even more important. To obtain useful results, the methods you use to collect data must be valid: the research must be measuring what it claims to measure. This ensures that your discussion of the data and the conclusions you draw are also valid.

What is validity?
Validity refers to how accurately a method measures what it is intended to measure. If research has high validity, that means it produces results that correspond to real properties, characteristics, and variations in the physical or social world.

High reliability is one indicator that a measurement is valid. If a method is not reliable, it probably isn't valid.

Data is raw, unorganized facts that need to be processed. Data can be something simple and seemingly random and useless until it is organized. Data are the facts or details from which information is derived. Individual pieces of data are rarely useful alone. For data to become information, data needs to be put into context.

Prepared by: Nessel D. Auditor
School: Surigao del Norte National High School