
NAME: Castro, Nonie B.

DATE: Feb 18, 2024

SECTION: BSP – 2A

ARCHIVAL RESEARCH

Archival research: Expanding the methodological toolkit in social psychology.


Heng, Y. T., Wagner, D. T., Barnes, C. M., & Guarana, C. L. (2018). The article advocates for
incorporating archival research into the methodological toolkit of social psychology as a valuable
complement to traditional laboratory experiments. While laboratory experiments offer distinct advantages,
relying on them alone imposes limitations, and the authors stress that diversifying research methods
strengthens the discipline's robustness. Four published examples illustrate the benefits and limitations of
archival research, and the article offers suggestions for capitalizing on its strengths while mitigating its
weaknesses. It also provides useful resources and directions for working with archival data, encouraging
researchers to bolster the scientific literature by combining laboratory experiments with archival research.

CASE STUDY

A case study of the predicting power of cognitive, metacognitive and motivational strategies in girl
students’ achievements

Samadi and Davaii (2012). The researchers explored the impact of cognitive, metacognitive, and
motivational strategies on the academic achievement of 245 third-grade girls in a Tehran middle school.
Validated instruments included the Self-Regulation in Learning and Motivational Strategies questionnaires.
Results from multiple regression analysis showed that while all three strategy types predicted academic
success, metacognitive strategies emerged as the strongest predictor. The study highlights the importance of
developing metacognitive skills to enhance girls' academic achievement.

CONTENT ANALYSIS

Content-Analysis Research: An Examination of Applications with Directives for Improving
Research Reliability and Objectivity

Kolbe, R. H., & Burnett, M. S. (1991). The researchers conducted a comprehensive empirical review of
128 published studies that used content-analysis methods, applying Harold Kassarjian's critical guidelines for
evaluation. They extended these guidelines to an empirical examination of multiple dimensions of
objectivity, while also addressing reliability concerns by examining factors crucial to replication and the
calculation of interjudge coefficients. The findings point to a widespread need for improvement in how
content-analysis methods are applied across the studies reviewed. The authors not only identify this need but
also offer practical suggestions for strengthening reliability coefficients and improving overall research
objectivity. In essence, the article provides valuable insights and directives to guide researchers in elevating
the quality and rigor of their content-analysis research.
