
Key Concepts of the Research Methodology
The Research Process
Research Idea
 Literature Review
 Formulate The Research Problem/Question
Research Design
 Collecting Data
 Data Analysis
 Answering The Empirical Research Questions
 Results Interpretation
 Comparing with Earlier Research
Conclusion
Evaluate/Reflect
Overview
Steps of the Scientific Method
Formulate the Research Problem
Collecting Data
Developing The Research Hypothesis
Conclusion
Evaluate/Reflect
Steps of the Scientific Method - Recap

• Shaped like an hourglass - starting from general questions, narrowing down to focus on one specific aspect.
• Designing the research where we can observe and analyse this aspect.
• Conclude and generalize to the real world.
Steps of the Scientific Method
General Question
Narrow Down
Designing Experiment
Analysis
Conclusion & Publishing
Cycle
Overview
Steps of the Scientific Method
Formulate The Research Question
Collecting Data
Developing The Research Hypothesis
Conclusion
Evaluate/Reflect
Formulate a Question
Define the Question
Review the Literature
Create a Hypothesis
Prediction
Conceptual Variable
Formulate the Research Question

An essential step in research is the formulation of a research question.
Research starts with a question or assumption you have about a real-world phenomenon.
Narrow it down to a research question that defines what you want to figure out, and review the research and literature already done on that subject.
With an understanding of the subject and a well-defined question, form a hypothesis that will be tested against an opposite assumption called the null hypothesis.
How to define a research problem
It is one of the first statements made in any research paper and,
as well as defining the research area, should include a quick
synopsis of how the hypothesis was arrived at.
Operationalization is then used to give some indication of the
exact definitions of the variables, and the type of scientific
measurements used.
This will lead to the proposal of a viable hypothesis.
 As an aside, when scientists are putting forward proposals for
research funds, the quality of their research problem often
makes the difference between success and failure.
Defining the research problem
 Defining a research problem is the fuel that drives the scientific process, and is the foundation
of any research method and experimental design, from true experiment to case study.
 Formulating the research problem is one of the first steps of the research process.
 It is one of the first statements made in any research paper/thesis and, as well as defining the research area, should include a quick synopsis of how the hypothesis was arrived at.
 Interest often arises as a result of a literature review and a study of previous experiments and research.
 Many scientific researchers look at an area where a previous researcher generated some
interesting results, but never followed up.
 It could be an interesting area of research, which nobody else has fully explored.
 A scientist may even review a successful experiment, disagree with the results, the tests used, or
the methodology, and decide to refine the research process, retesting the hypothesis.
 This is called the conceptual definition, and is an overall view of the problem.
 A technical report will generally begin with an overview of the previous research and real-world
observations.
 The researcher will then state how this led to defining a research problem.
Structuring the Research Problem

The research problem is usually written almost like a statement of intent in any scientific paper.
Defining a research problem is crucial in defining the quality of the
answers, and determines the exact research method used.
 A quantitative experimental design uses deductive reasoning to arrive at
a testable hypothesis.
A qualitative research design uses inductive reasoning to propose a research statement.
When researchers are putting forward proposals for research funds, the
quality of their research problem often makes the difference between
success and failure.
Formulation of the Research Problem
Depends On:
• The political and cultural climate
• The intellectual and scientific environment
• The maturity and the collected knowledge of the topic area (field)
• The knowledge of the researcher about existing
achievements in the field
• The ability of the researcher to delimit the problem
• The ability of the researcher to listen
• The patience of the researcher to re-consider and
to develop an idea
Examples of Research Questions
Existence:
• Does X exist?
Description & Classification
• What is X like?
• What are its properties?
• How can it be categorized?
• How can we measure it?
• What are its components?
Descriptive-Comparative
• How does X differ from Y?
Examples of Research Questions
Frequency and Distribution
• How often does X occur?
• What is an average amount of X?
Descriptive-Process
• How does X normally work?
• By what process does X happen?
• What are the steps as X evolves?
Examples of Research Questions
Relationship
• Are X and Y related?
• Do occurrences of X correlate with
occurrences of Y?
Causality
• What causes X?
• What effect does X have on Y?
Examples of Research Questions
Causality-Comparative

Does X cause Y?
 Does X prevent Y?
Does X cause more Y than does Z?
 Is X better at preventing Y than is Z?
 Does X cause more Y than does Z under one condition
but not others?
Design
What is an effective way to achieve X?
 How can we improve X?
When Looking for Research Problems:
Consider if the research will
 Educate you?
 Lead to even better research projects?
 Be an enjoyable way to spend your time? A PhD takes 3 or more
years…
 Serve a goal that will still seem worthy 6, 12, or 48 months from
now?
 Be likely to "succeed" in some sense? (guideline: will it make an
interesting conference/journal article?)
 Get the academic research community interested in you and
your work?
 Prove to an industrial employer that you have what they want?
 Make you a so-called "famous grad student"?
Overview
Steps of the Scientific Method
Formulate The Research Question
Collecting Data
Developing The Research Hypothesis
Conclusion
Overview
Collect Data
Preparation: Make Hypothesis Testable
(Operationalization)
Preparation: Design the Study
Conduct the Experiment or Observation
Organize the Data
Organize the data and analyze it to see if it supports or
rejects your hypothesis.
The exact type of test used depends upon many things, including the field, the type of data, and the sample size, among other things.
The vast majority of scientific research is ultimately
tested by statistical methods, all giving a degree of
confidence in the results.
Operationalize
Operationalization is used to give some indication of the
exact definitions of the variables, and the type of
measurements used.
This will lead to the proposal of a viable hypothesis.
The hypothesis should be both testable and falsifiable.
Then design a study and construct a test/experiment to
collect data.
 Be aware of validity when choosing variables, especially
when studying people.
You might not be measuring what you think you are
measuring.
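For instance (an illustrative sketch, not from the original slides), the conceptual variable “reflex efficiency” could be operationalized as the mean reaction time in milliseconds on a standardized stimulus-response test. A minimal Python sketch of such an operational measure, with invented values:

```python
# Hypothetical operationalization sketch: the conceptual variable
# "reflex efficiency" is measured as mean reaction time (ms) on a
# standardized stimulus-response test. All values are invented.

reaction_times_ms = [231, 258, 244, 239, 261]   # one participant's trials

def reflex_efficiency(reaction_times):
    """Operational measure: lower mean reaction time = higher efficiency."""
    return sum(reaction_times) / len(reaction_times)

print(f"Mean reaction time: {reflex_efficiency(reaction_times_ms):.1f} ms")
```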
Validity
1. Construct validity
 Do the operational measures reflect what the researcher had in mind ?

2. Internal validity
 Are there any other factors that may affect the results ?
 Mainly when investigating causality !

3. External validity
 To what extent can the findings be generalized ?
 Precise research question & units of analysis required

4. Reliability
 To what extent is the data and the analysis dependent on the researcher
(the instruments, …)
 Other categories have been proposed as well
 credibility, transferability, dependability, confirmability
Qualitative vs. Quantitative
Qualitative tends to have more open questions and
hypotheses.
Quantitative research has an experimental approach, focusing more on counting and classifying observations.
Data Collection for Qualitative Studies
Interviews
Field Notes
 Observations
Documents
Log Files/Content Analysis
Biographies
Oral histories and story telling
Newsletters, rules, principles and pictures
Data Collection for Quantitative Research
Quantitative, Experimental
objective
data collection, logging
numerical measurements
performance metrics
questionnaires on large groups
comparison with a control
group or “baseline” system
Types of Evaluation Data
Qualitative, Rich
Subjective
In-depth
Qualitative interviews
Descriptive
Naturalistic
Types of quantitative data

Nominal or categorical data – categories with counts
e.g., no. of men versus no. of women
Ordinal data – ranked data
e.g., grades A, B, C, D, E
Cardinal data – continuous measurements
e.g., velocity, height, age, etc.
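As an illustration (an assumed example, not from the original slides), the three kinds of quantitative data call for different representations and summaries; a minimal Python sketch with invented values:

```python
# Hypothetical sketch: representing and summarizing the three kinds
# of quantitative data. All sample values are invented.

from collections import Counter
from statistics import mean

# Nominal/categorical data: categories with counts
participants = ["man", "woman", "woman", "man", "woman"]
print(Counter(participants))          # counts per category

# Ordinal data: ranked data (order matters, distances do not)
grades = ["A", "C", "B", "E", "B"]
rank = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}
print(sorted(grades, key=rank.get))   # ranking is meaningful

# Cardinal data: continuous measurements
heights_cm = [162.0, 178.5, 171.2, 180.3]
print(mean(heights_cm))               # arithmetic is meaningful
```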
Overview
Steps of the Scientific Method
Formulate The Research Problem
Collecting Data
Developing The Research Hypothesis
Conclusion
Evaluate/Reflect
Different Methods Used
1. Feasibility study

2. Pilot Case

3. Comparative study

4. Observational Study [a.k.a. Ethnography]

5. Literature survey

6. Formal Model

7. Simulation
Feasibility Study
 Is it possible to solve a specific kind of problem … effectively ?
 + computer science perspective (P = NP, Turing test, …)
 + engineering perspective (build efficiently; fast — small)
 + economic perspective (cost effective; profitable)
 • Is the technique new / novel / innovative ?
 + compare against alternatives
 ➡ See literature survey; comparative study
 • Proof by construction
 + build a prototype
 + often by applying on a “CASE”
 • Conclusions
 + primarily qualitative; "lessons learned"
 + quantitative
 - economic perspective: cost - benefit
 - engineering perspective: speed - memory footprint
Pilot Case (a.k.a. Demonstrator)

 Here is an idea that has proven valuable; does it work for us ?


 ➡ Metaphor: Portugal (Amerigo Vespucci) explores western route
 • proven valuable
 + accepted merits (e.g. “lessons learned” from feasibility study)
 + there is some (implicit) theory explaining why the idea has merit
 • does it work for us
 + context is very important
 • Demonstrated on a simple yet representative “CASE”
 + “Pilot case” ≠ “Pilot Study”
 • Proof by construction
 + build a prototype
 + apply on a “case”
 • Conclusions
 + primarily qualitative; "lessons learned"
 + quantitative; preferably with predefined criteria
 ➡ compare to context before applying the idea !!
Comparative Study
 Here are two techniques, which one is better ?
 • for a given purpose !
 + (Not necessarily absolute ranking)
 • Where are the differences ? What are the tradeoffs ?
 • Criteria check-list
 + predefined
 - should not favor one technique
 + qualitative and quantitative
 - qualitative: how to remain unbiased ?
 - quantitative: represent what you want to know ?
 + Criteria check-list should be complete and reusable !
 ➡ If done well, most important contribution (replication !)
 ➡ See literature survey
 • Score criteria check-list
 + Often by applying the technique on a “CASE”
 • Compare
 + typically in the form of a table
Observational Study [Ethnography]

 Understand phenomena through observations


 ➡ Metaphor: Diane Fossey “Gorillas in the Mist”
 • systematic collection of data derived from direct observation of everyday life
 + phenomena are best understood in the fullest possible context
 ➡ observation & participation
 ➡ interviews & questionnaires
 • Observing a series of cases “CASE”
 + observation vs. participation ?
 • example: Action Research
 + Action research is carried out by people who usually recognize a problem or limitation in
 their workplace situation and, together, devise a plan to counteract the problem,
 implement the plan, observe what happens, reflect on these outcomes, revise the plan,
 implement it, reflect, revise and so on.
 • Conclusions
 + primarily qualitative: classifications/observations/…
Literature Survey

 What is known ? What questions are still open ?


 • source: B. A. Kitchenham, “Procedures for Performing Systematic
 Reviews”, Keele University Technical Report EBSE-2007-01, 2007
 Systematic
 • “comprehensive”
 ➡ precise research question is prerequisite
 + defined search strategy (rigor, completeness, replication)
 + clearly defined scope
 - criteria for inclusion and exclusion
 + specify information to be obtained
 - the “CASES” are the selected papers
 • outcome is organized
Formal Model

 How can we understand/explain the world ?


 • make a mathematical abstraction of a certain problem
 + analytical model, stochastic model, logical model, re-write
 system, ...
 + often explained using a “CASE”
 • prove some important characteristics
 + based on inductive reasoning, axioms & lemmas, …
 Motivate
 • which factors are irrelevant (excluded) and which are not
(included) ?
 • which properties are worthwhile (proven) ?
 ➡ See literature survey
Simulation
 What would happen if … ?
 • study circumstances of phenomena in detail
 + simulated because real world too expensive; too slow or impossible
 • make prognoses about what can happen in certain situations
 + test using real observations, typically obtained via a “CASE”
 Motivate
 • which circumstances are irrelevant (excluded) and which are not
 (included) ?
 • which properties are worthwhile (to be observed/predicted) ?
 ➡ See literature survey
 Examples
 • distributed systems (grid); network protocols
 + too expensive or too slow to test in real life
 • embedded systems — simulating hardware platforms
 + impossible to observe real clock-speed / memory footprint / …
 ➡ Heisenberg uncertainty principle
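To make the “what would happen if …?” question concrete, here is a minimal Monte Carlo sketch, assuming a toy network-delay model; every parameter, distribution, and threshold below is an invented illustration, not a prescribed method:

```python
# Minimal, hypothetical simulation sketch: estimate how often a request
# over an unreliable link misses a 200 ms deadline, because measuring the
# real system would be too expensive. All parameters are illustrative.

import random

def simulate_request(mean_delay_ms=80.0, loss_prob=0.05, retry_penalty_ms=100.0):
    """Return the total delay of one request, with at most one retry on loss."""
    delay = random.expovariate(1.0 / mean_delay_ms)
    if random.random() < loss_prob:            # packet lost -> retry once
        delay += retry_penalty_ms + random.expovariate(1.0 / mean_delay_ms)
    return delay

def estimate_deadline_miss_rate(n=100_000, deadline_ms=200.0):
    misses = sum(simulate_request() > deadline_ms for _ in range(n))
    return misses / n

if __name__ == "__main__":
    random.seed(42)                            # make the prognosis reproducible
    print(f"Estimated deadline-miss rate: {estimate_deadline_miss_rate():.3%}")
```

Such prognoses should then be tested against real observations, typically obtained via a “CASE”.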
Hypothesis
 A research question is a statement that identifies the phenomenon to be studied. A well-thought-out and
focused research question leads directly into your hypotheses.
 What predictions would you make about the phenomenon you are examining? This will be the foundation of your
application.
 Hypotheses are more specific predictions about the nature and direction of the relationship between two
variables.
 For example, “Those researchers who utilize an online grant writing tutorial will have higher priority scores on their
next grant application than those who do not.”
 Such a hypothesis can then be verified or falsified.
 It is desirable to create a strong hypothesis. What are the characteristic features of such a hypothesis? It should:
 – give insight into a research question,
 – be testable and measurable by the proposed experiments,
 – spring logically from the experience of the staff
 Normally, no more than three primary hypotheses should be proposed for a research study. A proposal that is
hypothesis-driven is more likely to be funded than a “fishing expedition” or a primarily descriptive study (The
Research Assistant, 2003).
 A hypothesis is the result of the researcher's creativity. It is a rational assumption about the possible cause(s) of the observed phenomenon. A hypothesis is a source of questions focused on the core of the research problem. Answers to these questions may support (verify) or reject (falsify) the formulated hypothesis.
 The research process can never completely verify or falsify the studied hypothesis; it can only do so with a certain probability.
 A researcher who has no doubts about their own research results is not a real researcher.
Creating a testable hypothesis
e.g. “Does our system really work?” – does it lead to
some measurable difference in performance?
Often need to specify:
Independent variable (the aspect of the environment
that we are interested in – “what do I change?”)
Dependent variable (the behaviour that we are
interested in –“what do I observe?”)
Controlled variables (“what do I keep the same?”)
(variable = something that changes, takes different values, and that we can alter or measure)
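As a hypothetical sketch (not from the original slides), these variable roles might map onto a concrete study comparing task completion time on two system variants; all names and values below are assumptions for illustration:

```python
# Hypothetical sketch: the roles of variables in an experiment comparing
# task completion time on two system variants. Names and values are invented.

experiment = {
    # Independent variable: what we deliberately change
    "independent": {"system_variant": ["baseline", "new_system"]},
    # Dependent variable: what we observe and measure
    "dependent": "task_completion_time_seconds",
    # Controlled variables: what we keep the same across conditions
    "controlled": {
        "task_set": "same 10 tasks",
        "hardware": "same laptop",
        "instructions": "identical script",
    },
}

# Each observation records one participant's measurement under one condition.
observations = [
    {"system_variant": "baseline",   "task_completion_time_seconds": 41.2},
    {"system_variant": "new_system", "task_completion_time_seconds": 35.7},
    # ... more participants
]
```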
Test The Hypothesis
Organize the Data
Analyse the Results
Check if the Results Support your Hypothesis
Testing The Hypothesis
More accurately, you should have two hypotheses, the
alternative and the null.
For testing, you will be analyzing and comparing your
results against the null hypothesis, so your research
must be designed with this in mind. It is vitally
important that the research you design produces
results that will be analyzable using statistical tests.
Statistical analysis of data

 When analysing data, your goal is simple: to make the strongest possible conclusion from limited amounts of data.
 To do this, you need to overcome two problems:
 Important differences can be obscured by natural variation and
experimental imprecision (noise).
 This makes it difficult to distinguish real differences from random
variability.
 Is the observed effect genuine or merely due to chance?
 The human brain excels at finding patterns, even from random data.
 Our natural inclination (especially with our own data) is to conclude
that differences are real, and to minimize the contribution of random
variability.
 Statistical rigour helps to prevent you from jumping to false
conclusions.
Hypothesis
 The exact type of statistical test used depends upon many things,
including the field, the type of data and sample size, among other
things.
 The vast majority of scientific research is ultimately tested by statistical
methods, all giving a degree of confidence in the results.
 For most disciplines, the researcher looks for a significance level of 0.05, signifying that there is only a 5% probability of observing such results and trends by chance if the null hypothesis were true.
 For some scientific disciplines, the required level is 0.01, i.e. only a 1% probability that the observed patterns arose from chance or error.
 Whatever the level, the significance level determines whether the null hypothesis is rejected or retained, a crucial part of hypothesis testing.
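As a hedged illustration of how such a test might be run in practice, the sketch below uses a two-sample t-test from scipy on invented data; the data values, the choice of test, and the 0.05 threshold are assumptions for the example, not part of the original slides:

```python
# Hypothetical sketch: testing whether a new system changes task completion
# time, using a two-sample t-test (requires scipy). Data values are invented.

from scipy import stats

baseline   = [41.2, 39.8, 44.1, 40.5, 42.9, 38.7, 43.3, 41.8]
new_system = [35.7, 37.1, 34.9, 38.2, 36.4, 35.0, 37.8, 36.1]

# H0 (null hypothesis): both systems lead to the same mean completion time.
t_stat, p_value = stats.ttest_ind(baseline, new_system)

alpha = 0.05                      # conventional significance level
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < alpha:
    print("Reject the null hypothesis: the difference is statistically significant.")
else:
    print("Fail to reject the null hypothesis.")
```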
Hypothesis: Example
 A null hypothesis (H0) is a hypothesis to be disproved.
 E.g. The hypothesis “Maximum reflex efficiency is achieved after eight hours of sleep.” can
be turned into a working null hypothesis simply by adding “not”, i.e.
 Maximum reflex efficiency is not achieved after eight hours of sleep.

Another null hypothesis is: Sleep does not have an effect on reflexes.
 Null hypotheses are used in the sciences.
 In the scientific method, a null hypothesis is formulated, and then a scientific
investigation is conducted to try to disprove the null hypothesis.
 If it can be disproved, another null hypothesis is constructed and the process is repeated.

 As an example, begin with the null hypothesis: Sleep does not affect reflexes.
 If we can disprove this, we find that sleep does have an effect, and we go to the next null hypothesis: Different amounts of sleep have the same effect on reflexes.
 If we can disprove this, go to: Maximum reflex efficiency is not achieved after eight hours
of sleep.
 And so on. At each stage in the investigation, we conduct experiments designed to try to disprove the null hypothesis.
Hypothesis: Significance Test
Significance tests determine the probability of obtaining the observed results if the null hypothesis were true.
 E.g. we use a significance test and find that this probability is less than 5 in 100, stated as p < .05.
If the chance of obtaining such results under the null hypothesis is less than 5 in 100, it’s a good bet that the null hypothesis is not true.
 If it’s probably not true, we reject the null hypothesis.
Hypothesis: Significance Test
There is no rule of nature that dictates at what probability
level the null hypothesis should be rejected.
 However, conventional wisdom suggests that .05 or less
(such as .01 or .001) is reasonable.
 Researchers should state in their reports the probability level
they used to determine whether to reject the null hypothesis.
Note that when the probability is > .05, we “fail to reject” the null hypothesis and it stays on our list of possible explanations,
 i.e. we never “accept” the null hypothesis as the only explanation.
Conclusion
Measure performance of your approach
allow replication of results by other researchers –
repeatability
Allow objective comparisons
e.g. your new method vs. state-of-the-art method, vanilla method, “straw man”, etc.
lesion experiments – measure contribution of system
components
Statistical testing allows us to express the level of
confidence in a result
Provide benchmarks for future research
Conclusion

Look for Other Possible Explanations
Generalize to the Real World
Suggestions for Further Research
