PRACRE2-3 (Semi-Final)


Chapters 1, 2, and 3 of a Research Paper (you will have to look this part up yourselves)

Theoretical and Conceptual Framework


Theoretical -the general frame of reference used for observation, defining concepts, developing
research designs, and interpreting and generalizing findings.
Conceptual -a system of ideas, beliefs, assumptions, and theories that inform, support, and
cater to the study.
Theoretical vs. Conceptual
Scope: Theoretical - broader in scope, as it can be used in different studies; Conceptual - more
focused in scope, as it directly relates to a specific study
Focus of content: Theoretical - a particular theory; Conceptual - a set of related concepts that will
be specifically used in the study
Number of theories: Theoretical - only one theory; Conceptual - may combine different theories
into one cohesive framework
Time of development: Theoretical - already existing prior to the conduct of the study; Conceptual -
developed while planning and writing the research
Skills in Drafting Literature Review
Synthesizing -involves reviewing several references that discuss the same subject and
consolidating them into one cohesive text
Note-taking -involves writing down information from a source text and integrating that
information into the current study
Summarizing -involves condensing a lengthy piece of source material.
-can be done in outline or non-outline form.
Paraphrasing -a form of note-taking that involves rewording ideas from the original text in a
more detailed way.
Different Ways of Arranging Literature Reviews
Topical Order -arranged by main topics or issues, showing their relationship to the main problem or topic
Chronological Order -the simplest of all; organized by the publication dates of the literature
(The rest could not be found in the book; this list is still incomplete.)
Different Styles in Citing References
American Psychological Association (APA) -used in business, economics, education…
In-text citation: (Falchikov & Boud, 1989)
Reference entry: name. (&) | article title. | book name, | 59(4), 395-430.
Modern Language Association (MLA) -used in English and some fields in the humanities and arts.
In-text citation: (Falchikov and Boud 415)
Reference entry: name (and) | "article title" | book name | vol. 59, no. 4, 1989, pp. 395-430.
Chicago Manual of Style (CMOS) -used in various disciplines, most popularly music and history.
In-text citation: (Falchikov and Boud 1989, 415)
Reference entry: name (and) | "article title" | book name 59, no. 4 (1989): 395-430.
Types of Synthesis
Microlevel synthesis -consolidation of ideas to explain a concept
Macrolevel synthesis -performed when several studies are consolidated to establish a research gap
and not just to explain a concept
Quantitative Research Designs
Descriptive -to observe and report on a certain phenomenon
Correlational -to determine the nature of the relationship between variables without looking into
the cause
Ex post facto -to infer the causes of a phenomenon that has already occurred
Quasi-experimental -to establish cause and effect.
no random assignment of individuals to groups
Experimental -to establish cause and effect.
individuals are randomly assigned to groups

Hawthorne effect -refers to the observed change in the behavior of participants who know they
are being observed.
Prototyping -a development approach used to improve the planning and execution of creative
designs, such as software and technology development.
Ways of Determining the Sample Size
Heuristics -refers to the rule of thumb for the sample size used in a study
Literature review -serves as a reference for supporting the validity of the sample size that you
plan to use.
Formulas -the formula used by the National Education Association in the United States can be
used to compute the needed sample size.
-it yields the ideal sample size for a given population.
Power analysis -considered the most precise among the four.
-founded on the principle of statistical power, which refers to the probability of
rejecting a false null hypothesis.
-used to determine the sample size sufficient for measuring the effect size, which
refers to the degree of difference between the control and treatment groups (a short
sketch follows this list).
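
As a rough illustration of power analysis, the sketch below solves for the per-group sample size of a two-group comparison able to detect a medium effect size (Cohen's d = 0.5) at a 0.05 significance level with 80% power. It assumes the statsmodels Python package is available; the specific numbers are illustrative, not taken from these notes.

```python
# Hypothetical power-analysis sketch (assumes the statsmodels package is installed).
# Solves for the per-group sample size of an independent-samples t-test able to
# detect a medium effect size (Cohen's d = 0.5) with 80% power at alpha = 0.05.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80)
print(f"Required participants per group: {n_per_group:.1f}")  # roughly 64 per group
```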
Determining the Sample Size Using the Formula Given by the National Education Association in the
United States
Solution:
s = X^2 N P(1 - P) / [d^2 (N - 1) + X^2 P(1 - P)]
where:
s = required sample size
X^2 = table value of chi-square for 1 degree of freedom at the desired confidence
level, (1.96)^2
N = population size
P = population proportion (0.50)
d = the degree of accuracy expressed as a proportion (0.05)

Example with N = 105:
s = [(1.96)^2 (105)(0.50)(1 - 0.50)] / [(0.05)^2 (105 - 1) + (1.96)^2 (0.50)(1 - 0.50)]
  = [(3.8416)(105)(0.50)(0.50)] / [(0.0025)(104) + (3.8416)(0.50)(0.50)]
  = 100.842 / 1.2204
  = 82.63 (or a sample size of 83 participants)
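
The computation above can be checked with a short script. The sketch below is a minimal, stand-alone implementation of the same formula; the function name and its default values are illustrative and mirror the worked example.

```python
# Minimal sketch of the sample-size formula above; defaults mirror the worked example
# (chi-square for 1 df at 95% confidence, P = 0.50, d = 0.05).
import math

def required_sample_size(population, chi_square=1.96**2, proportion=0.50, accuracy=0.05):
    """s = X^2*N*P*(1-P) / (d^2*(N-1) + X^2*P*(1-P)), rounded up to whole participants."""
    numerator = chi_square * population * proportion * (1 - proportion)
    denominator = accuracy**2 * (population - 1) + chi_square * proportion * (1 - proportion)
    return math.ceil(numerator / denominator)

print(required_sample_size(105))  # 83, matching the 82.63 -> 83 result above
```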

Different Sampling Procedures


Purposive sampling -deliberate selection of samples based on their characteristics
Random sampling -selection of a group of participants from a larger population by chance
Simple random sampling -every individual has an equal chance of being selected; considered the best
way to obtain a representative sample
Stratified sampling -similar to simple random sampling, but the population is first divided into
subgroups
Cluster sampling -involves grouping the population into subgroups or clusters
Systematic sampling -participants are selected from a list based on their order in the population or on
a predetermined interval (see the sketch after this list)
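
To make the differences concrete, here is a minimal sketch that draws a sample of 20 from a hypothetical population of 105 students using simple random, systematic, and stratified sampling. It uses only Python's standard library; the student names and the two strata are made up for illustration.

```python
import random

population = [f"student_{i:03d}" for i in range(1, 106)]  # hypothetical frame of 105 students
sample_size = 20

# Simple random sampling: every individual has an equal chance of selection.
simple_random = random.sample(population, sample_size)

# Systematic sampling: pick every k-th individual from the ordered list,
# starting from a random point within the first interval.
k = len(population) // sample_size
start = random.randrange(k)
systematic = population[start::k][:sample_size]

# Stratified sampling: divide the population into subgroups, then randomly
# sample the same number of individuals from each subgroup.
strata = {"lower_years": population[:50], "upper_years": population[50:]}
stratified = [unit for group in strata.values()
              for unit in random.sample(group, sample_size // len(strata))]

print(len(simple_random), len(systematic), len(stratified))  # 20 20 20
```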
Research Instrument Validity
Validity -the degree to which an instrument measures what it is supposed to measure
Face validity -the instrument appears to measure the variables being studied
Content validity -the degree to which an instrument covers a representative sample of the variable to
be measured
Construct validity -the degree to which an instrument measures the variables being studied as a whole
Criterion validity -the degree to which an instrument predicts the characteristics of a variable in a
certain way
Concurrent validity - achieved when the instrument is able to predict results similar to
those of a test already validated in the past
Predictive validity - achieved when it produces results similar to those of another
instrument that will be employed in the future
Research Instrument Reliability
Reliability -refers to the consistency of the measures of an instrument
Test-retest reliability -achieved by administering an instrument twice to the same group of
participants and then computing the consistency of scores.
Equivalent forms reliability -measured by administering two tests identical in all aspects except the
actual wording of items
Internal consistency reliability -measures how well the items in an instrument measure the same
construct
3 ways of measuring internal consistency reliability:
Split-half coefficient – obtained by administering a single instrument aimed at
measuring only one construct
Cronbach's alpha – measures reliability with respect to each item and construct
being examined
Kuder-Richardson formula – tests reliability for instruments of a dichotomous
nature, such as yes-or-no tests
Inter-rater reliability -measures the consistency of scores assigned by two or more raters on a certain
set of results
Kappa coefficient – one of the most popular statistical tools for measuring
inter-rater reliability (a short sketch follows this section)
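
As a rough illustration of how two of these statistics are computed, the sketch below implements Cronbach's alpha and the (Cohen's) kappa coefficient from scratch; the score matrix and rater labels are hypothetical, not data from these notes.

```python
# Hand-rolled sketches of Cronbach's alpha and Cohen's kappa using only the
# standard library; the score matrix and rater labels below are made-up examples.

def cronbach_alpha(scores):
    """scores: one row per respondent, each row holding item scores on the same construct."""
    k = len(scores[0])  # number of items
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

def cohen_kappa(rater_a, rater_b):
    """Agreement between two raters beyond chance, for categorical ratings."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical data: 5 respondents x 4 Likert items, and two raters labeling 8 outputs.
answers = [[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4], [2, 2, 3, 2], [4, 4, 4, 5]]
print(round(cronbach_alpha(answers), 2))                          # about 0.93
print(round(cohen_kappa(list("AABBABAB"), list("AABBBBAB")), 2))  # 0.75
```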
Data Collection Procedure
Before
1. Develop data collection instruments and materials
2. Seek permission from the site where the study will be conducted
3. Select and screen the population using a sampling technique
4. Train the research personnel involved in data gathering
5. Obtain informed consent from the participants
6. Pilot-test the instruments to determine potential problems that may occur when administered
During
1. Provide instruction to participants
2. Administer the instruments, and implement the intervention or treatment
3. Utilize triangulation, in which two or more sources and methods are used for
validating data
After
1. Encode and archive the data
2. Safeguard the confidentiality of the data
3. Examine and analyze the data using statistical methods
Types of Test

1. Multiple Choice
2. Matching Type
3. Sequencing
4. Computation

Remember this:
Parts of Research Methodology

1. Research Design
- Quantitative Research design to be used with definition and citation
- Explanation for choosing the specific quantitative research design
2. Population and Sample
- Number of Respondents
- Characteristics of the Respondents
3. Research Instrument
- Description of the survey questionnaire – construction, validation and scoring of responses
4. Validation of the Research Instrument
- Narration of the critiquing and checking of the questionnaire done by the experts
5. Data Gathering Procedures
- Narration of all the processes done in gathering data
6. Data Analysis
- Discussion of the utilization of statistical measures
