
Lesson 1
Research Methodology: Introduction
Dr Joe Essien
Course Coverage
• Overview of academic business research
• What must be in a project plan and a project?
• Formulating research aims
• The design of research projects
• Evaluating research
• Statistical analysis for research
• Qualitative data analysis
• Analysing data and presenting results
• Philosophy of research
• Questionnaire design
• Interview design and qualitative research
• Reminders about the project
• Interviews and qualitative research – more detail
• More on literature reviews
Organization of this lecture
Research and Methodology:
• Research defined and described
• Some classifications of research
• Define and discuss methodology
• Description of the research process
• Discuss creativity and its role in the research process
Why do research?
• Validate intuition
• Improve methods
• Demands of the job
• For publication
Purpose and Characteristics of Research
• Purpose:
• Discover truth about something; and/or
• Find a good way of doing something
• Must be:
• Systematic and as thorough and trustworthy as possible
• Clearly written and with sufficient detail for readers to check credibility
• Ethical
Types of research include …
• Large scale surveys (of people, organisations, events, etc) analysed statistically
• Small scale surveys with emphasis on “qualitative” detail
• Case studies (to see how something works in detail)
• Experiments (change something to see what happens)
• Models can be set up, tested and used for …
• Participant observation (observe as participant)
• Action research (combine research and action)
• Evaluation
• … and many other possibilities … be imaginative!
Many projects combine several of these
Research Defined and Described
“Research is the systematic approach to obtaining and confirming new and reliable knowledge”
• Systematic and orderly (following a series of steps)
• Purpose is new knowledge, which must be reliable
This is a general definition which applies to all disciplines
Research is not
Accidental discovery:
1. Accidental discovery may occur in a structured research process
2. Usually takes the form of a phenomenon not previously noticed
3. May lead to a structured research process to verify or understand the observation
Research is not … cont.
Data collection
• an intermediate step to gain reliable knowledge
• collecting reliable data is part of the research process
Research is not … cont.
Searching out published research results in libraries (or on the internet)
• This is an important early step of research
• The research process always includes synthesis and analysis
• But just reviewing literature is not research
Research is…
1. Searching for explanation of events, phenomena, relationships and causes
• What, how and why things occur
• Are there interactions?
2. A process
• Planned and managed – to make the information generated credible
• The process is creative
• It is circular – always leads to more questions
• All well designed and conducted research has potential application.
• Failure to see applications can be due to:
• Users not trained or experienced in the specialized methods of economic research and reasoning
• Researchers often not providing adequate interpretations and guidance on applications of the research
• Researchers are responsible for helping users understand research implications (How?)
Public good
• Public research is a public good
• May be more rigorous and objective because it is subject to more scrutiny
• Private research may also be rigorous
• But research on a company’s product may be questioned as biased.
Classification of Research
• Before classification, we must first define types of research
• Different criteria are used to classify research types
(All of these are somewhat arbitrary and artificial)
Basic vs Applied Research
• Basic – to determine or establish fundamental facts and relationships within a discipline or field of study. Develop theories … (examples in economics?)
• Applied – undertaken specifically for the purpose of obtaining information to help resolve a particular problem
• The distinction between them is in the application
• Basic has little application to real world policy and management but could be done to guide applied research
Disciplinary, Subject-matter, and Problem-solving Research
Disciplinary
• designed to improve a discipline
• dwells on theories, fundamental relationships and analytical procedures and techniques
• In economics, the intended users are other economists
• Provides the conceptual and analytical base for other economic research
• It is synergistic and complementary with subject-matter and problem-solving research
Disciplinary… cont.
• Provides the foundations for applied research
• Circular, as applied research reveals the shortcomings of disciplinary research
• Examples of some economic theories? (supply & demand, price elasticity, consumer utility …)
Subject-matter research
• “research on a subject of interest to a set of decision makers”
• Tends to follow subject-matter boundaries within a discipline (e.g. resource economics, production economics, labor economics)
• Inherently multidisciplinary, drawing information from many disciplines
• e.g. consumer economics draws from psychology, natural resource economics from biology, economic policy from political science
Subject-matter research … cont.
• Provides policy makers with general knowledge to make decisions about various problems.
• A primary source of policy applications for economics
• Subject-matter research is a cornerstone in economics – it involves direct application of economics to contemporary issues.
Problem-solving research
• Designed to solve a specific problem for a specific decision maker
• Often results in recommendations on decisions or actions
• Problem-solving research is holistic – uses all information relevant to the specific problem (while disciplinary research tends to be reductionist)
• Disciplinary research is generally the most “durable” (long lasting); problem-solving research the least durable
Analytic vs Descriptive Research
• Descriptive – the attempt to determine, describe, or identify something
• The intent is often synthesis, which pulls knowledge or information together
• Analytic – the attempt to establish why something occurs or how it came to be
• All disciplines generally engage in both
Methodology Defined & Described
Methodology and method are often (incorrectly) used interchangeably
• Methodology – the study of the general approach to inquiry in a given field
• Method – the specific techniques, tools or procedures applied to achieve a given objective
• Research methods in economics include regression analysis, mathematical analysis, operations research, surveys, data gathering, etc.
Methodology Defined & Described
• Contrast research methodology in economics (the approach to research) with economic methodology (the general approach to economic reasoning and economic concepts)
• While these are different they are interdependent (in the same way as science and research are related)
Some guided platitudes
The following are often assumed (I think wrongly):
• There are two distinct kinds of research:
• Quantitative (= positivist = hypothetico-deductive), and
• Qualitative (= phenomenological = inductive).
Instead …
• Positivist research (only) starts from hypotheses.
Instead …
Qualitative vs. quantitative
• Quantitative usually means statistical – often with largish samples
• Qualitative means focusing on qualities – usually with smallish samples studied in depth
• Disadvantage with statistical approaches is that the data on each case is often very superficial
• Disadvantage with qualitative approaches is that the case(s) studied may be untypical and can’t be used for statistical generalisation
• Often best to use both approaches. This is known as “mixed methods”
Regrettable tendency to reduce things to a simple dichotomy
If you’re a soft and cuddly person:
• Soft and cuddly (e.g. interpretivist, qualitative, inductivist …) … is good
• Hard and spiky (e.g. positivist, quantitative, deductivist …) … is bad
But if you are a hard person you will probably reverse good and bad above. There are really many different dichotomies. Reducing them all to one is neither right nor useful.
But …
• To hard and spiky people, soft and cuddly research is lacking in rigour
• To soft and cuddly people, hard and spiky research is naïve and lacking in richness
Induction vs hypothetico-deductive method
• Generalise from the data without preconceptions (induction)
• Grounded theory. Rigour is in the process used to generate theory from data
Versus
• Use data to test hypotheses or theories (hypothetico-deductive method)
• Karl Popper. Rigour is in the testing.
Theory building vs theory testing is much the same distinction
Other useful approaches besides induction and hypothetico-deduction
• Use a framework or theory or “paradigm” (Kuhn, 1970) to define concepts, questions, and measurements, but without trying to test the theory
• Arguably what most scientists do most of the time (c.f. Kuhn, 1970). Rigour is in ensuring the theory is a good one, and in using it properly.
• Deduction from data, theories and frameworks. E.g. the differences between two quality standards can be deduced.
• Rigour is in checking the deduction and the information you start with
• Differs from the hypothetico-deductive method in that the result is the deduction itself, not a confirmation, rejection or revision of a hypothesis or theory
Lesson 2
Research Methodology: Formulating a Research Area
Dr Joe Essien
How to do research
• Read about topic
• Draft aims of research. Clear, simple, focused.
• Draft literature review.
• Draft research plan – check it is really likely to meet
your research aims. Check again.
• Do research/analysis
• Draft research/analysis and
recommendations/conclusions
• Check it fits together and revise all sections
• If it doesn’t fit together revise aims and …
Choose a subject
• Interest
• Career
• Feasibility
• Usefulness
Finding a suitable topic
• Based on an idea
• Based on your experience
• Based on your reading
• Originality
Identify & Refine your Topic
Using your assignment as a guide, brainstorm several interesting subjects. Refine those subjects into one topic by listing keywords, similar words or phrases, and broader and narrower words.

Sample Assignment:
Find an area of interest and write an in-depth research report (4-6 pages) that investigates a significant issue within that discipline.

Sample Topic:
“What effect does television have on the eating habits of children?”

ERIC has one of the most comprehensive thesauri available.
Keywords & Related Words:
Television (view related words)
eating habits (view related words)
children (view related words)
What information do you need? (1)
1. Do you need facts, figures, statistics?
• View statistical databases
2. Do you need a general overview of the topic?
• World Almanac and encyclopedias
3. Does your information need to be very current?
• Learn how to limit your searches by year
What information do you need? (2)
4. How in-depth does your research need to be?
• Try to use at least 1 source per page for your assignment. Example: If you are writing a 10 page paper, use 10 scholarly sources.
• You will actually need to find more sources than you will end up using. If you need to use 10 sources, you should initially gather 15-20.
5. Do you need to consider different or conflicting points of view?
• Try LexisNexis reference sources (click Reference > Polls & Surveys)
Formulating Research aims or questions
• Usually start from vague idea
• Then formulate a clear aim, or list of aims,
that your research will achieve. Think of
these as hoped-for outcomes.
• Alternatively…formulate a clear question or
list of questions.
• This process may require some creative
thinking
• Techniques like brainstorming and mind
maps may be useful
Aims, objectives, questions
• You can formulate your research
aims as aims (or objectives if you
prefer that word) or questions.
• These are different ways of saying
the same thing. Doesn’t matter
which you use, but don’t confuse
things by having aims and
questions
• May be helpful to have a list or
hierarchy of aims, but keep it
simple
Hypotheses
• Hypotheses are statements whose truth you want to test, or “predicted answers” to research questions (Robson, 2002)
• Occasionally appropriate as a top level research aim
• e.g. to test the hypothesis that “Working at home improves quality of life”
• Usually best to avoid hypotheses when formulating main research aims because questions or aims tend to be more flexible
• e.g. “How does working at home affect quality of life?”
• Null hypotheses have a (controversial) role in some statistical analysis (… as you will see), but they are not relevant to formulating your overall research aims
Research aims or questions
• Research aims or questions should:
• Be clearly and simply expressed
• Fit together (so that you have a coherent
project)
• Clarify the intended outcome and scope
of the research
• Your research aims or questions
should also
• Be relevant to your degree
• Be achievable
• Present a reasonable level of challenge
Research aims or questions
• Must be research aims, not business or personal
aims.
• However, business or personal aims may be part
of the background motivating your research
aims, and research aims would normally include
the aim of making recommendations to people or
organisations.
• Should generally have a limited scope or focus.
• The danger with general aims is that they lead
to superficial research.
• May relate to theoretical issues. You may be aiming
to test, modify or create a theory
Define Your objectives
• Try to keep these simple
• The more variables the more
difficult
• However use the opportunity
• Get help at this stage
• Senior colleagues
• Experienced researchers
Theory
• “Theory” includes models,
explanatory frameworks,
generalisations, recommendations …
• Examples ….
• Your research should link with any
relevant theory. It may
• Use a theory
• Demonstrate that a theory is useful
• Test a theory
• Modify a theory or create a new theory
Also ask yourself
• Is the research worth doing?
• Are there any ethical or political
problems?
• Is it possible? Have you got
access to the necessary data?
Is it really going to be useful?
• What use do you want the results to be? This may
be a practical use – to find out how to make more
money, or to make life easier – or a contribution
to theory, but it should be something that is
really worth achieving. Must pass the “so what?
test.

• May help to clarify your aims if you imagine


you’ve done the research and write down what
you think your conclusions and
recommendations might be.

• Then work backwards from what you want to


achieve to the best methods to achieve it.
Example of research aims
The aims of this research are to
1. Describe the decision making strategies of small investors
2. Determine the effectiveness of these strategies
Any comments? Does this seem reasonable for a Masters project?
Another example of research aims
• The aims of this research project are to
• Evaluate Method X for planning mountaineering expeditions
• If necessary propose and justify Amended Method X for planning mountaineering expeditions
Another example of research aims
• What are the important quality problems in Company X?
• How serious are these problems?
• What is the best strategy for reducing these problems?
Any comments? Does this seem reasonable for a Masters project? Does it matter that they are expressed as questions?
Three more examples of research aims
1. The aim of this research is to investigate the role of the internet in banking.
2. This research project aims to explain activity based costing.
3. The aim of this project is to
• Test the efficient market hypothesis for the Athens stock exchange, and
• Determine how global warming will influence share prices.
Any comments? These are not reasonable for Masters projects! Why not?
Possible research topics
• Research in a specific organisation
• Best if they are likely to implement any recommendations
• Take care you have adequate access to data
• Easier if you have a recognised / paid job there and / or know key players well.
• Research based on publicly available data
• E.g. share prices, the web, published statistics
• Research based on surveys of the “public”
• These are just some possibilities. There are more …
Design of research projects
• Design means deciding on the methods and
approaches which will best achieve your aims
• Needs thinking out carefully starting from your
aims
• Check the proposed design will achieve all your
aims
• The design may require the use of a theoretical
framework – which should be explained and its
use justified
• May incorporate several approaches (e.g. earlier
slide)
• Some advocate “flexible” designs (E.g. Robson,
2002)
• E.g. Poppy Jaman’s summary. Any comments?
• E.g. check aims and designs of these projects.
Designing research is not easy!
• Think about how you can best achieve your aims
• Consider all possible types of research
• Be imaginative
• Think about it again … and again
• Check you’ve found the best way you can for meeting all your aims
An example …
• How would these four approaches work with a project of interest to you …
Karl Popper’s ideas (1)
• Science works by putting forward bold theories (or hypotheses) and then testing them as thoroughly as possible
• Provisionally accept theories that have withstood this testing process
• Theories must be sufficiently precise to be falsifiable – otherwise not proper science (e.g. Freud’s theories are too vague …)
Karl Popper’s ideas (2) – e.g.
• Einstein’s theory of general relativity predicts that light from a distant star will be bent by a small amount by passing close to the sun. Newton’s theory predicts the light will not be bent.
• Only possible to check during a total eclipse of the sun. In an eclipse in 1919 light was bent as Einstein’s theory predicted
• Newton’s theory is falsified; Einstein’s lives on and seems much more credible.
Karl Popper’s ideas (3)
• Theories can come from anywhere –
guesswork, intuition, other theories, etc
• The process of criticising theories and
trying to show they are wrong is vital for
science
• The method applies to both natural and
social sciences
• How would you apply Popper’s ideas to a
management research project? … in
practice, has elements in common with a
“critical” attitude …
Group exercise
Design a research plan for one of the
projects below, and do a pilot study for
part of it. (You may find you need to
make the aims / questions more precise.)
Michael’s project. The provisional aims are:
1. To evaluate the suitability of the PBS
website for prospective PhD students
2. To suggest improvements to the website
from this perspective
Alison’s project on the impact of a
Blackberry on family/work-life balance.
What are the problems and opportunities,
and what would you recommend?
… or …
Email project
How much time do “people” spend on
emails, is it time “well spent”, and if
not how can things be improved?
• Provisional method: Survey to find
how much time is spent on emails,
and respondents’ opinions on
whether this is time “well spent”,
and on recommendations (is this a
satisfactory method?)
• And / or other possibilities … ?
When starting your project
you should …
• Have a clear aim, and a rough idea of your methods
and the relevant literature, and a few ideas about
problems
• Make an appointment with your supervisor and
discuss what you will do and the timescale. Take
your proposal and comments
• Remember your supervisor may have a holiday
planned … agree when you will meet / email. Usual
to send drafts of chapters when completed
• Remember the deadline and plan back from this.
Send your supervisor a draft of the project at least
a month before the deadline
Lesson 3
Research Methodology: Planning & the Research Process
Dr Joe Essien
A general design for a typical degree project
If the aim is to find a good strategy to “improve” X in org Y, then a possible design may be:
1. Survey/case studies of Org Y to investigate problems and opportunities
2. Survey/case studies to see how other organisations do X and which approaches work well
3. Based on (1), (2), the literature, and perhaps creative inspiration, consultations within the organisation, simulation or modelling, devise a strategy likely to improve X
4. Try/test/pilot/monitor the proposed strategy, probably in a limited domain
The Process of Research
• The process is initiated with a question or problem (step 1)
• Next, goals and objectives are formulated to deal with the question or problem (step 2)
• Then the research design is developed to achieve the objectives (step 3)
• Results are generated by conducting the research (step 4)
• Interpretation and analysis of results follow (step 5)
The Process of Research
[Diagram: the five steps of the research process arranged as a cycle, with interpretation (step 5) feeding back into new questions (step 1)]
Creativity in the Research Process
• Research is a creative process
• “…research includes far more than mere logic … It includes insight, genius, groping, pondering – ‘sense’ … The logic we can teach; the art we cannot”
• Research requires (or at least works best with) imagination, initiative, intuition, and curiosity.
• There are different types of creativity, characteristic of different situations – “applied” and “theoretical” most closely associate with economic research
Fostering Creativity
A. Gather and use previously developed
knowledge
B. Exchange ideas
C. Apply deductive logic
D. Look at things alternate ways
E. Question or challenge assumptions
F. Search for patterns or relationships
G. Take risks
H. Cultivate tolerance for uncertainty

Fostering Creativity … cont.
I. Allow curiosity to grow
J. Set problems aside … and come back to them
K. Write down your thoughts
“… frequently I don’t know what I think until I write it”
L. Freedom from distraction … some time to think.

Creativity may provide the difference between satisfactory and outstanding research.
Planning the Study
• Assignment of roles
• Projected time to completion
• Get all equipment before start
• Get ethical approval
• Get funding
• Responsibility
• Data collection
• Accurate testing and measurements
• Stick to the protocol
• Sample size
The protocol
• Write out the introduction and methodology in detail
• Give it to people to read to check for major flaws
• Get help at this stage

Basics of the protocol
• This is where you start writing the paper
• Write intro and methods in detail
• Ethical considerations
• Analytical methods in detail
• Budget
Practical issues
• Timing
• Plan this remembering that your supervisor may suggest extensive changes.
• Gantt chart may help.
• Ethics (remember the form!)
• Access to information.
• Take care: this is often difficult!
What must be in a project and a project plan?
• Reading
• Project guidelines
• Proposal guidelines
• Saunders et al (2007), or another similar book
What must be in a project?
• Abstract (short summary of project including
conclusions)
• Background and aims (what you’re trying to find out and
why it’s important)
• Literature review (of relevant previous research which
you will build on or extend)
• Research methods – plan and justification (what you did
to meet the aims, and why it was a sensible approach)
• Analysis (in detail, to convince sceptical readers and
impress examiners: important tables, diagrams etc must
be in the text, only details in appendix)
• Results, conclusions, recommendations, limitations,
further research
• References (list works cited in text in alphabetical order)
• Appendices – Ethics form, extra details for the reader
Flexible designs can be more flexible – but everything must
be there!
Features of a good project
• Obviously important and interesting
• Difficult to disagree with because
• Arguments and analysis detailed, clear and obviously valid
• Possible objections considered and if possible answered
• Fits together
• Aims met by methods (check this in your project plan)
• Conclusions follow from analysis
References and citations
• You must give references to publications
which you draw on or quote
• Exact (word for word) quotes must be in
“…” and the reference must be given
• Maximum about one paragraph
• Use one of the standard referencing
systems – preferably the Harvard (see
university website)
• Copying word for word without “…” and
reference is treated as cheating and you
will fail!
Harvard referencing system
• Very important to use this (or another
established system)
• Seems easy to me, but causes a lot of
difficulty
• Check library website (search for Harvard)
and/or copy an academic article or book.
• All references in text like Smith (2001)
• Then alphabetical list of references at the
end. Should include everything referred to,
and nothing else.
What must be in your project
plan (proposal)?
See assignment description
• You may be able to put parts of it in your
project!
• You should describe and justify your
research methods in as much detail as
possible
Authorship
• Should be directly involved at the
• Idea stage
• Protocol development
• Actual performance of the study
• Interpretation of results
• Writing up
• All authors must take full responsibility for the study and be fully involved
Experiments (randomised controlled trials)
• Put people (or whatever you are investigating) into randomly assigned groups, give the groups different treatments, and compare the groups to see what differences emerge.
• Used for testing drugs, diets (http://tinyurl.com/yp2t2o, http://tinyurl.com/489hns), educational methods, different designs for websites, social policies, etc. Lots of examples in Ayres (2007)*.
• Advantages of experiments over non-interventionist research
• Disentangle cause and effect. Can control variables you haven’t even thought of. If done well, evidence can be very convincing.
• Can investigate new things
* Ayres, Ian. (2007). Super Crunchers: how anything can be predicted. London: John Murray.
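The random-assignment step can be sketched in a few lines of Python. A minimal illustration (participant IDs, group labels and the seed are invented for the demo, not from the lecture):

```python
import random

def assign_groups(participants, groups=("treatment", "control"), seed=None):
    """Shuffle participants, then deal them round-robin into groups."""
    rng = random.Random(seed)      # seed is only for a reproducible demo
    shuffled = list(participants)
    rng.shuffle(shuffled)
    # shuffled[i::len(groups)] takes every len(groups)-th participant
    return {g: shuffled[i::len(groups)] for i, g in enumerate(groups)}

allocation = assign_groups(["P01", "P02", "P03", "P04", "P05", "P06"], seed=42)
# Each participant lands in exactly one randomly chosen group
```

Because the shuffle, not the researcher, decides who goes where, any later difference between the groups is unlikely to reflect pre-existing variables, which is the point of randomisation.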
But …
• Experiments are often impractical or unethical
• Difficulties include
• Hawthorne effect
• Failure to assign groups at random (this matters a lot because …)
• So use less rigorous quasi-experiments instead (Grant & Wall, 2008)* – e.g. in action research you may do a before and after comparison. This is a sort of crude experiment but it is not as convincing as a proper RCT.

* Grant, A. M. & Wall, T. D. (2008). The Neglected Science and Art of Quasi-Experimentation: Why-to, When-to, and How-to Advice for Organizational Researchers. Organizational Research Methods (published online, July 18, 2008).
Qualitative data analysis
• Aim is detail and depth of understanding
• Demonstrate and understand possibilities,
but not how frequently they occur
• Use direct quotes (“…”) as evidence and to
reduce danger of imposing your
perspective
• Sometimes may be helpful to
• Summarise in a table or similar
• Use coding scheme to analyse
statistically (but be careful if the sample
is very small!)
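The coding-scheme idea above can be sketched with the standard library: tag each extract with thematic codes, then tally the codes. The extracts and code names below are invented for illustration:

```python
from collections import Counter

# Hypothetical thematic codes assigned to four interview extracts
coded_extracts = [
    ["workload", "autonomy"],
    ["workload"],
    ["isolation", "workload"],
    ["autonomy"],
]

# Count how often each code occurs across all extracts
code_counts = Counter(code for extract in coded_extracts for code in extract)
```

As the slide warns, with a very small sample counts like these show what occurred in the data, not how frequently it occurs in any wider population.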
Analysing data and presenting results
• Questionnaires and interview plans, and possibly some data, in an appendix
• Graphs and tables and important quotes from interviewees etc in the main text
• Focus on your research aims, not the questions in your questionnaire
• Readers want an analysis which shows how your aims are met. They don’t want to know the answers to all the questions in your questionnaire!
• Use appropriate summaries – e.g. tables of averages, or of main points from interviews
Questionnaire design: summary
• Read a (chapter of a) book on
questionnaires
• Develop a pilot. Remember questionnaires
are far more difficult to design than they
appear! Check with your pilot respondents:
• Is it clear?
• Is it interesting / appealing / user-friendly
/ not too long? Would you answer it?
• Does it provide (only) the information
you want?
• Are you still sure a questionnaire is a good
idea?
Questionnaire design (1)
Write down what you want to find out
• Closed questions
• Tick boxes
• Rating (Likert) scales
• Open questions
Pros and cons of each …
Check your questions will enable you to find out what you need to for your research
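Rating (Likert) scale responses are straightforward to summarise once collected. A sketch with invented data for one 5-point item (1 = strongly disagree … 5 = strongly agree):

```python
from collections import Counter
from statistics import mean, median

responses = [4, 5, 3, 4, 2, 5, 4]   # hypothetical answers to one item

item_mean = mean(responses)         # central tendency
item_median = median(responses)     # robust alternative for ordinal data
distribution = Counter(responses)   # how many respondents chose each point
```

Likert data is ordinal rather than truly numeric, so many analysts report the median and the full distribution alongside (or instead of) the mean.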
Questionnaire design (2)
• Covering letter
• Pilot it
• 3-4 nice friendly people to tell you
what’s wrong with it
• Pilot the analysis too
• Consider sample to send it to
• Anonymity / confidentiality
• How to send it / get it back (email?)
• What to do about non-response?
Questionnaire design (3)
• Far too many questionnaires about - many
of them very silly. What is the response
rate likely to be? Would you fill it in?
• Are you sure a questionnaire is
necessary???
• Many companies have a policy of not
responding to questionnaires
• Are there any alternatives?
• Check with your supervisor before sending
it out
Take care with opinion surveys
• You can ask someone
• What she did last week
• What she does in general terms
• Her opinion of what she does
• What she thinks other people do
• Her opinion of what she thinks other people do
• How she thinks things can be improved
• What she thinks about particular suggestions
about how things can be improved
• What she likes / wants / values
Etc
Think about what type of question you are using and
whether it is really useful
Interview design: in brief (1)
• Read a (chapter of a) book on interviews
• Follows, or precedes questionnaire, or
stands alone
• Be clear what you want to find out
• Consider telephone interviews
• Small sample. Don’t do too many
interviews.
• Plan your questions. Should be open-ended
and flexible, and aim for a detailed
understanding
• Probes and prompts
Interview design (2)
• Ask for permission to tape record
• Transcribe interesting bits to get quotes
for your project
• Get interviewee relaxed. Anonymity /
confidentiality (take care here!)
• Check you’ve covered everything
• Send interviewee transcript afterwards?
• Some transcripts or parts of transcripts in
appendix?
Primary data collection: interviewing
Useful for accessing people’s perceptions, meanings, definitions of situations, eliciting their constructions of reality, etc.
• Alternative types
• structured
• semi-structured
• in-depth
• Ethical considerations
Forms of qualitative interviews
Qualitative interviews:
• One to one: face to face interviews; telephone interviews
• One to many: focus group interviews
Interview respondents
• Who will be interviewed and why?
• How many will be interviewed and how
many times?
• When and for how long will each person be
interviewed?
• Where will each person be interviewed?
• How will access to the interview situation
be organised?
How to sample
• Clarify target population (the whole group of interest)
• May be a population of people, organisations or …
• Decide sampling approach. There are many methods of taking a sample from your target population, including
• Random
• Stratified
• Purposive
• Convenience (or opportunity, haphazard, accidental)
• Cluster, snowball, quota, etc (see a book)
• Decide size of sample – need to balance cost with information obtained. If your analysis is statistical, statistical theory can help …
Random sampling
• Make a numbered list of the target
population (a sampling frame)
• Use random numbers to choose sample
• Each member of population has the
same chance of being selected (and it’s
independent of any biases)
• Each member of sample selected
independently
• In practice, likely that some members of
the sample can’t be found or won’t help,
so the sample may be biased. Difficult to
deal with this … possibilities …
• The principle is to ignore all variables and
choose at random. This allows for all
“noise” variables.
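The procedure above maps directly onto a few lines of code. A sketch, where the numbered sampling frame of organisation IDs is hypothetical:

```python
import random

def draw_random_sample(sampling_frame, n, seed=None):
    """Simple random sample without replacement: every member of the
    frame has the same chance of selection."""
    rng = random.Random(seed)   # seed only to make the demo repeatable
    return rng.sample(list(sampling_frame), n)

# Numbered list of the target population (the sampling frame)
frame = [f"org-{i:03d}" for i in range(1, 201)]   # 200 hypothetical orgs
sample = draw_random_sample(frame, n=20, seed=1)
```

This solves only the selection step; as the slide notes, members who can’t be found or won’t help can still bias the achieved sample, and no amount of code fixes that.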
Adequacy of study

• Study sample
• must be representative
• large enough size to ensure
sufficient power
• Quality control
• Accurate measurements
• Compliance of cases and controls
Which sampling method?
• Usually random samples are best for
large samples, and purposive samples for
small samples analysed qualitatively.
• Done properly, with a large enough
sample, random or stratified samples
(probability samples) should be
reasonably representative of the
population. Can’t assume this about
purposive or convenience samples (non-
probability samples) because these are
selected by factors that are likely to bias
the result in one direction or another.
Sampling in practice
• Many samples are biased and so will not give a
good idea of the population – regardless of
sample size.
• E.g. NRE, non-response bias in surveys,
survivor bias in …
• Ideal for large samples is random sampling, but
this is often difficult to do properly.
• E.g. Iraq war death rate (see
http://www.iraqbodycount.org/ for another
approach), TV audience research.
• Be suspicious of statistical results from
purposive or convenience samples
• Need to be especially careful with small,
purposive samples for detailed analysis –
consider the purpose and choose accordingly
Three surveys to check accuracy of
NRE phone service – which is right?
1. A Consumer’s Association survey used a sample of
60 calls, mainly about fares. The worst mistake
was when one caller asking for the cheapest fare
from London to Manchester was told £162 instead
of the cheaper £52 fare which was available via
Sheffield and Chesterfield. The percentage correct
was …
32%
2. A reporter rang four times and each time asked for
the cheapest route from London to Manchester.
The proportion of the four answers which were
correct was
25%
3. An NRE sponsored survey found that the answers
were
97% correct
More sampling problems
• An MBA student sends out 100
questionnaires to 100 organisations asking
if they would be interested in a particular
service. Twenty are returned, and of these
6 indicated they may be interested in the
service
• There are 650 firms in the relevant
industry sector. How big is the market
for the service? Are you sure?
• Suppose you wanted to find out how
common it is for women aged 30-40 to
enjoy running.
• How would you choose a sample to ask?
• Other examples and exercises attached
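For the MBA questionnaire example above, a rough sketch of the naive arithmetic – and of why it should not be trusted – might look like this. The Wald interval is a standard approximation, and even it still assumes the 20 respondents are a random sample, which non-response bias makes doubtful:

```python
import math

sent, returned, interested = 100, 20, 6
population = 650

# Naive extrapolation: treat the 20 returns as if they were random.
p_hat = interested / returned            # 0.3
naive_market = p_hat * population        # about 195 firms

# Rough 95% Wald interval for the proportion (shaky at n = 20, and
# the 80 firms that stayed silent may differ systematically).
se = math.sqrt(p_hat * (1 - p_hat) / returned)
low, high = p_hat - 1.96 * se, p_hat + 1.96 * se
print(round(naive_market), round(low * population), round(high * population))
```

Even ignoring bias, the interval spans roughly 64 to 326 firms – so "195" is far less certain than it looks.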
Measurements
(Indicators)
• If you want to find out whether customer
satisfaction, or quality or profits have improved you
must have a sensible way of measuring them.
• Moreno-Luzon (1993) used managers’ “perceived
achievement of objectives” as a measure. Can
you see any problems with this?
• How would you measure quality of service in a
casino?
• How would you check if your proposed measure is
valid / reliable / right / accurate?
Things to remember with
measurements (1)
• Conventional to distinguish between validity (are
you measuring the right thing?) and reliability
(consistency)
• If possible use an existing measurement system
(with acknowledgement / permission). This has two
advantages – there may be evidence validating it,
and you can compare your results with previous
results.
• Remember that some informants may be biased, or
too lazy to give good answers, or just ignorant.
• If possible use triangulation (check with
information from different sources)
• Ask yourself whether your proposed method of
measurement really measures the right thing
Things to remember with
measurements (2)
• Be especially careful with measures of value. This
may have several dimensions (Keeney, 1992)*. E.g.
the success of a firm might depend on profitability,
worker satisfaction, contribution to the community
• If you are measuring the success of a change,
remember there may be several different criteria.
E.g. …
• May be useful to use the average (mean) response
to a series of questions. Use your common sense to
see if this is reasonable, or if they should be kept
separate. (See literature on Tests and scales – e.g.
Robson, 2002: 292-308).
Reliability of measurements
• Same answer at different times?
• If anything depends on subjective
judgments, check agreement between
different judges
• Eg – marking projects
• If you’re asking a number of questions to
get at the same information, check the
relationship between answers to these
questions – with two questions use a
correlation coefficient, with more than
two use Cronbach’s Alpha (if you are
keen on stats!) – see
http://www.statsoft.com/textbook/stathome.htm
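As a sketch of what Cronbach's Alpha actually computes – using invented Likert-scale answers, not real data:

```python
from statistics import pvariance

# 5 respondents x 3 questions aimed at the same underlying construct.
answers = [
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 2, 3],
    [4, 4, 4],
]

k = len(answers[0])                        # number of questions (items)
items = list(zip(*answers))                # answers grouped by question
totals = [sum(row) for row in answers]     # each respondent's total score

# alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
alpha = (k / (k - 1)) * (1 - sum(pvariance(col) for col in items)
                         / pvariance(totals))
print(round(alpha, 2))  # rule of thumb: values above ~0.7 suggest the
                        # items measure the same thing consistently
```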
Theoretical assumptions
• If the research uses a theory, is the theory
right for the purpose? And is it a “valid”
theory? (Some theories, of course, are
stupid or wrong!) You need a critical
evaluation in your literature review.
• A questionnaire or interview plan may be
based on assumptions about what is
relevant. Are these assumptions OK?
Is the research sufficiently imaginative?
• Imagination helpful in
• Thinking of hypotheses to explain things
• Thinking of new methods for researching
• Thinking of new ways of doing things …
• Many recommendations for boosting
imagination and thinking creatively – e.g.
• Brainstorming
• Doing something else and coming back
to the task
• etc
Making sure that you are not
being misled by Chance
• Could your results just be due to chance?
• Have you taken account of sampling
error? (If you repeated your research
with another sample are you sure the
answer would be the same?)
• Is the sample large enough?
Null hypothesis tests or confidence
intervals can be used to answer these
questions.
• Are the measurements reliable?
Sampling for small sample
qualitative research
• Usually best to use theoretical (purposive)
sampling - the selection of individuals who
you think will best contribute to the
development of a theory
• Results apply to immediate situations
• May be tentatively generalised, but the
small sample means …
Difficulties with interviews
• Mistrust by respondents
• e.g. researcher is a management spy
• Loyalty to organisation/colleagues
• Adherence to stereotypical views rather
than their own inner feelings and
knowledge
• Complete indifference
• An opportunity for respondent to ‘sell’ their
ideas
Managing the interview
• Preparation for the interview
• the interview schedule
• Beginning the interview -
establishing rapport
• Communication and listening skills
• Asking questions
• sequence and types of questions
• Closing the interview
Verifying interview data
• Body language
• Material evidence
• e.g. company/factory tour
• Writing notes
• as soon as possible after
interview
• Use informant verification
and secondary sources
Remember
• Need to demonstrate rigour
• Good research acknowledges
bias and the need to expose it.
• Devil’s advocates are useful for
revealing bias and other
problems, but are seldom used.
… Is all research biased?
Lesson
4
Research Methodology
Literature Review
Dr Joe Essien
Literature Review
• the literature should be clearly
focused on your research aims, and
it should be critical in the sense that
you should point out strengths and
weaknesses where appropriate
• 1. Finding material
• 2. Mapping relevant literatures
• 3. Evaluating literature
• 4. Some practical hints
Sources of data: many possibilities
• Interviews
• Including focus groups, Delphi technique
(Robson, 2002:57), various approaches to
eliciting comments (e.g. “photo elicitation” –
Sam Warren)
• Questionnaires, including via email (be careful …)
• Documents (minutes of meetings, company reports,
etc)
• The web
• Databases – within organisation, of share prices,
etc
• Observations of various kinds
• Etc …. Be imaginative!
Sources of literature are a different issue (Judith’s
session is very important for this)
Writing a literature review
Finding material
• There is no prescribed number of sources
you should use; it depends on the topic
• Be wary if you feel that you are drowning
in material you found for your topic; it
probably means you have not narrowed it
down enough
• Be wary if you find no sources or very few
sources. You normally need some
academic sources to be able to write a
meaningful literature review
What secondary sources
should you use?
• Books:
• Use textbooks only to get an overview over
a topic
• Academic monographs and edited books
(collections with chapters by different
authors) can be very useful. They often
explore a topic from different angles or
cover different aspects of a topic
• Don’t use “airport bookstall” books as
serious sources
Secondary sources
• Journals:
• Peer-reviewed academic journal articles
should normally be the backbone of your
literature review
• They provide up-to date discussions of
topics and are usually more narrowly
focused than textbooks
• “Trade journals” (non peer-reviewed) can
provide good introductions to topics and
overviews of developments but carry
considerably less academic “weight” than
academic journals.
(Secondary) sources
• Sometimes you may be able to find
article titles like “ …:A review of the
literature” in academic journals.
They can save you lots of work
• Internet:
• Make sure you are able to
distinguish between credible
sources and Joe Bloggs’s
unsubstantiated views
• Reputed organisations’ websites can
be good sources of information (but
may have a bias/self-interest).
(government agencies, international
organisations)
(Secondary) sources
• Dissertations and PhDs:
• Checking dissertations stocked in
the library may help you to get a feel
for what is expected in a
dissertation as well as provide
information on a topic
• Government reports/EU reports/other
organisations’ reports can be very
useful (but are sometimes biased).
Literature search
• Check to see if your idea is original
• Look for a new slant to present
• Try to get the full article
• Read all the references
• Most of these will be vital when
writing up
Searching for literature
• The key is the use of electronic databases
• Some databases are full text (you can
download articles directly), others are
bibliographic databases (you need to
check with library or use inter-library loan
requests)
• Business Sources Premier/Emerald Full
Text/Econlit/Science Direct are all
recommended
• Be patient and creative in the use of
keywords
Searching for literature
• CD-Rom newspaper databases
(FT, Economist) can be useful
tools
• Financial Data and Marketing
databases mainly provide
primary data
Mapping out relevant
literatures
• Don’t put everything you find or everything
you read in your literature review
• Time spent on familiarising yourself with
and assessing literature for relevance is
never wasted
• Only after you have gained a good
overview over the literature will you be
able to decide on your particular “angle”
and your research questions
Mapping out relevant
literature
• Your database search should tell you how
much and what type of literature is
available
• For some well-researched topic you will be
able to concentrate on the literature
directly dealing with your specific topic
• For other research ideas, you may need to
think about “related areas” or similar
experiences in other industries or possible
insights from other subject disciplines for
enlightenment
Mapping out relevant
literature
• A simple example: If you are interested in
TQM and small firms you may wish to
• Look at the TQM literature in general for
the pros and cons, constraints and motives
• Identify success and failure factors from
the TQM literature
• Check the small business literature for
general business conditions and
constraints
• Check the small business literature to find
out if these success factors apply there
Mapping out relevant
literature
• You can draw this as a conceptual
map of overlapping circles or as a
flow diagram if this suits your
learning style
• Brainstorming and drawing
conceptual maps is best done after
you have gained a feel for the
literature from your literature search
Evaluating literature
• This becomes easier with experience
• When reading literature:
• identify the key arguments that are
made
• The reason(s) for the conclusions
drawn
• These should be derived from
logical deduction (a conclusion
following necessarily from premises)
and/or based on empirical evidence
Evaluating literature
• Check the logic of the arguments made
• Does this necessarily follow?
• Check the supporting evidence
• Is this data relevant? Is it meaningful and
accurate? Could it be interpreted in
another way? Which data would I need to
challenge this?
• Check for flaws: tautologies, simplistic
analogies, redefinition of terms, moral
judgements (ought to)
Some practical hints
• Make sure you refer to key texts that are
frequently cited in the literature
• Find out whether there are different
“schools” or “camps” in the literature and
cover their positions.
• Use your research questions to structure
your literature review
• Check the validity (logic, empirical
evidence) of arguments made
• Make clear on what basis you decide to
side with a “camp” or author or why you
remain unconvinced or oppose a
judgement
Some practical hints
• Don’t overstate your case and be realistic
about what you can conclude
• Be particularly fair to views and arguments
you don’t agree with (avoid being seen as
biased)
• Don’t be shy to critique established “trade
names”(academic gurus)
• Write your literature review for non-
specialists and avoid jargon
• Make it well structured and easy to read
Search Strategies 1
1. Use encyclopedias, almanacs and
dictionaries to find background information
on your topic
• Consult the Kinds of Information Chart
• Browse books in your subject (CBU uses
the Library of Congress system. In this
outline, the letters are specific areas of
the library where you will find books on
your subject)
Search Strategies 2
2. Use the library’s many databases to find
in-depth information in books and journals
• Online Research Databases (articles &
journals)
• WebCat Catalog (books & AV)
Improve your Search Results
Use Boolean operators in all online databases (including
Google) to improve the relevancy of your results.
Evaluate your information 1
• Everything that is written has at least some bias or
point-of-view. You need to evaluate how much that
bias affects the content of the article or website.
• Who is the author?
• Did the author have any authority in what they
wrote? What credentials do they have?
• Why was the article written?
• Many articles and websites were written to
present specific arguments or theories. Make
sure you know if the information you are using
was written for a specific purpose.
Evaluate your information 2
• Where was it published?
• Was it published in a peer-reviewed,
scholarly, or otherwise authoritative
journal? Or, merely on someone’s
personal website?
• Learn how to identify scholarly journals
• When was it published?
• Obvious, yes. But make sure that the
website you use is not outdated.
Lesson
5
Basic Principles Of
Research Design
Dr Joe Essien
Research and research methods
• Research methods are split
broadly into quantitative and
qualitative methods
• Which you choose will depend on
• your research questions
• your underlying philosophy of
research
• your preferences and skills
Research Design
Four main features of research design, which are distinct, but closely related
• Ontology: How you, the researcher, view the world and the assumptions
that you make about the nature of the world and of reality
• Epistemology: The assumptions that you make about the best way of
investigating the world and about reality
• Methodology: The way that you group together your research techniques
to make a coherent picture
• Methods and techniques: What you actually do in order to collect your
data and carry out your investigations
• These principles will inform which methods you choose: you need to
understand how they fit with your ‘bigger picture’ of the world, and how
you choose to investigate it, to ensure that your work will be coherent
and effective
Four main schools of ontology
(how we construct reality)
• Realism
• Summary: The world is ‘real’, and science
proceeds by examining and observing it
• Truth: There is a single truth
• Facts: Facts exist, and can be revealed
through experiments
• Internal realism
• Summary: The world is real, but it is almost
impossible to examine it directly
• Truth: Truth exists, but is obscure
• Facts: Facts are concrete, but cannot
always be revealed
• Relativism
• Summary: Scientific laws are basically created
by people to fit their view of reality
• Truth: There are many truths
• Facts: Facts depend on the viewpoint of
the observer
• Nominalism
• Summary: Reality is entirely created by
people, and there is no external ‘truth’
• Truth: There is no truth
• Facts: Facts are all human creations
However, none of these positions is an absolute.
They are on a continuum, with overlaps between them.
Epistemology
i.e. the way in which you choose to investigate the world
Two main schools are positivism and social
constructionism:
• Positivists believe that the best way to investigate the
world is through objective methods, such as
observations. Positivism fits within a realist ontology.
• Social constructionists believe that reality does not
exist by itself. Instead, it is constructed and given
meaning by people. Their focus is therefore on
feelings, beliefs and thoughts, and how people
communicate these. Social constructionism fits better
with a relativist ontology.
Methodology
• Epistemology and ontology will have
implications for your methodology
• Realists tend to have positivist approach
 tend to gather quantitative sources of data
• Relativists tend to have a social
constructionist approach
 tend to gather qualitative sources of data
• Remember these are not absolutes! People
tend to work on a continuum  role for mixed
methods and approaches
• Also consider the role of the researcher*:
internal/external; involved or detached?
A note about data
• Quantitative data is about quantities,
and therefore numbers
• Qualitative data is about the nature of
the thing investigated, and tends to
be words rather than numbers
• Difference between primary and
secondary data sources
• Be aware of research data
management practices and archives
of data sets (both in terms of
downloading and uploading)
Choosing your approach
• Your approach may be influenced by your colleagues’ views, your
organisation’s approach, your supervisor’s beliefs, and your own
experience
• There is no right or wrong answer to choosing your research methods
• Whatever approach you choose for your research, you need to consider
five questions:
• What is the unit of analysis? For example, country, company or individual.
• Are you relying on universal theory or local knowledge? i.e. will your results be
generalisable, and produce universally applicable results, or are there local
factors that will affect your results?
• Will theory or data come first? Should you read the literature first, and then
develop your theory, or will you gather your data and develop your theory from
that? (N.B. this will likely be an iterative process)
• Will your study be cross-sectional or longitudinal? Are you looking at one point
in time, or changes over time?
• Will you verify or falsify a theory? You cannot conclusively prove any theory; the
best that you can do is find nothing that disproves it. It is therefore easier to
formulate a theory that you can try to disprove, because you only need one
‘wrong’ answer to do so.
Quantitative approaches
• Attempts to explain phenomena by collecting and
analysing numerical data
• Tells you if there is a “difference” but not necessarily
why
• Data collected are always numerical and analysed
using statistical methods
• Variables are controlled as much as possible (RCTs as
the gold standard) so we can eliminate interference
and measure the effect of any change
• Randomisation to reduce subjective bias
• If there are no numbers involved, it’s not quantitative
• Some types of research lend themselves better to
quant approaches than others
Quantitative data
• Data sources include
• Surveys where there are a large
number of respondents (esp where
you have used a Likert scale)
• Observations (counts of numbers
and/or coding data into numbers)
• Secondary data (government data;
SATs scores etc)
• Analysis techniques include
hypothesis testing, correlations
and cluster analysis
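A correlation coefficient, one of the techniques listed above, can be computed by hand. The paired values here are invented, e.g. two survey measures per respondent:

```python
import math

x = [2, 4, 5, 7, 9]
y = [1, 3, 6, 8, 10]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
num = sum((a - mx) * (b - my) for a, b in zip(x, y))
den = math.sqrt(sum((a - mx) ** 2 for a in x)
                * sum((b - my) ** 2 for b in y))
r = num / den  # Pearson correlation coefficient, between -1 and +1
print(round(r, 3))
```

A value near +1 or -1 indicates a strong linear relationship; remember it says nothing about which variable (if either) causes the other.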
Black swans and falsifiability
• Falsifiability or refutability of a
statement, hypothesis, or theory is the
inherent possibility that it can be
proven false
• Karl Popper and the black swan;
deductive cf. inductive reasoning
• Hypothesis testing
• Start with the null hypothesis,
i.e. H0 – that there will be no
difference
• Type I and Type II errors
Analysing quantitative data
• Always good to group and/or
visualise the data initially 
outliers/cleaning data
• What average are you looking for?
Mean, median or mode?
• Spread of data:
• skewness/distribution
• range, variance and standard
deviation
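The averages and measures of spread above can be illustrated with Python's standard statistics module (invented marks; the 90 is a deliberate outlier):

```python
from statistics import mean, median, mode, pstdev

scores = [52, 55, 55, 58, 60, 61, 63, 65, 90]

print(round(mean(scores), 1))     # pulled upwards by the outlier
print(median(scores))             # middle value, robust to the outlier
print(mode(scores))               # most frequent value
print(max(scores) - min(scores))  # range
print(round(pstdev(scores), 1))   # population standard deviation
```

Note how the outlier drags the mean above the median – one reason to visualise the data before choosing an average.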
What are you looking for?
• Trying to find the signal from the noise
• Generally, either a difference
(between/within groups) or a
correlation
• Choosing the right test to use:
parametric vs non-parametric (depends
what sort of data you have –
interval/ratio vs nominal/ordinal and
how it is distributed)
• Correlation does not imply causation!
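One simple non-parametric option for testing a difference between groups is a permutation test, which makes no distributional assumptions. This sketch uses invented group scores, not data from any study:

```python
import random

group_a = [72, 68, 75, 80, 66, 74, 71, 69]
group_b = [65, 63, 70, 61, 67, 64, 66, 62]
na = len(group_a)

observed = sum(group_a) / na - sum(group_b) / len(group_b)

# Under the null hypothesis the group labels are arbitrary, so shuffling
# them should rarely produce a difference as large as the observed one.
random.seed(1)
pooled = group_a + group_b
extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    diff = sum(pooled[:na]) / na - sum(pooled[na:]) / (len(pooled) - na)
    if abs(diff) >= abs(observed):
        extreme += 1

p_value = extreme / trials
print(observed, p_value)  # a small p-value suggests a real difference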
Example correlations
From the ‘Spurious correlations’ website:
http://www.tylervigen.com/spurious-correlations
Interpreting test statistics
• Significance level – a fixed probability of
wrongly rejecting the null hypothesis H0, if it is
in fact true. Usually set to 0.05 (5%).
• p value - probability of getting a value of the
test statistic as extreme as or more extreme
than that observed by chance alone, if the null
hypothesis H0, is true.
• Power – ability to detect a difference if there is
one
• Effect size – numerical way of expressing the
strength or magnitude of a reported
relationship, be it causal or not
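Effect size is often reported as Cohen's d; here is a minimal sketch with invented group scores:

```python
import math

treatment = [14, 15, 13, 16, 15, 14]
control = [12, 13, 12, 14, 13, 12]

def cohens_d(a, b):
    """Difference between group means, in pooled standard deviations."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

d = cohens_d(treatment, control)
print(round(d, 2))  # rough guide: 0.2 small, 0.5 medium, 0.8 large
```

Reporting an effect size alongside the p-value tells the reader how big the difference is, not just whether it is detectable.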
Example of quant data/analysis*
• Matched users were those whose learning styles were
matched with the lesson plan, e.g. sequential users with a
sequential lesson plan. Mismatched participants used a
lesson plan that was not matched to their learning style,
e.g. sequential users with a global lesson plan.
• H0 – there will be no statistically significant difference in
knowledge gained between users from different
experimental groups
• H1 – students who learn in a matched environment will
learn significantly better than those who are in
mismatched environment
• H2 – students who learn in a mismatched environment
will learn significantly worse than those who learn in a
matched environment
Interpreting test statistics
• Statistical testing was carried out using a univariate ANOVA
in SPSS, to determine if there was any significant difference
in knowledge gained.
• Initial conjecture suggests that the mismatched group
actually performed better than the matched group.
• However, the difference between the two groups was not
significant (F(1,80)=0.939, p=0.34, partial eta squared = 0.012), and
hence hypotheses 1 and 2 are not supported.
What quant researchers worry about
• Is my sample size big enough?
• Have I used the correct statistical
test?
• have I reduced the likelihood of
making Type I and/or Type II errors?
• Are my results generalisable?
• Are my results/methods/results
reproducible?
• Am I measuring things the right
way?
What’s wrong with quant research?
• Some things can’t be measured – or
measured accurately
• Doesn’t tell you why
• Can be impersonal – no engagement
with human behaviours or individuals
• Data can be static – snapshots of a
point in time
• Can tell a version of the truth (or a
lie?)
“Lies, damned lies and statistics” –
persuasive power of numbers
Qualitative approaches
• Any research that doesn’t involve
numerical data
• Instead uses words, pictures, photos,
videos, audio recordings. Field notes,
generalities. Peoples’ own words.
• Tends to start with a broad question
rather than a specific hypothesis
• Develop theory rather than start with
one
 inductive rather than deductive
Gathering qual data
• Tends to yield rich data to explore how
and why things happened
• Don’t need large sample sizes (in
comparison to quantitative research)
• Some issues may arise, such as
• Respondents providing inaccurate or false
information – or saying what they think the
researcher wants to hear
• Ethical issues may be more problematic as
the researcher is usually closer to
participants
• Researcher objectivity may be more difficult
to achieve
Sources of qual data
• Interviews (structured, semi-structured
or unstructured)
• Focus groups
• Questionnaires or surveys
• Secondary data, including diaries, self-
reporting, written accounts of past
events/archive data and company
reports;
• Direct observations – may also be
recorded (video/audio)
• Ethnography
Analysing qual data
• Content analysis
• Grounded analysis
• Social network analysis (can
also be quant)
• Discourse analysis
• Narrative analysis
• Conversation analysis
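Content analysis, the first technique listed, often starts by counting how frequently coded themes occur. A toy sketch with invented codes (in practice the coding itself is the hard, judgement-laden step):

```python
from collections import Counter

# Each interview response has been tagged (by the researcher)
# with one or more themes -- all data here is invented.
coded_responses = [
    ["workload", "management support"],
    ["workload"],
    ["pay", "workload"],
    ["management support", "pay"],
    ["workload", "career progression"],
]

theme_counts = Counter(t for tags in coded_responses for t in tags)
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n} of {len(coded_responses)} responses")
```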
Example of qual data research*
• Describing and comparing two
types of audio guides: person-led
and technology-led
• Geolocated audio to enable
public, informal learning of
historical events
• Data sources: questionnaires,
researcher observations, and
small focus groups
Data analysis and findings
• Comparison of the two different walks
• Differences/similarities of the walks
• Issues surrounding participant engagement
• Thematic analysis
• Mode of delivery
• Number of participants and social interactions
• Geographical affordances of places and locations
• User experience
• Opportunities for learning
• Other factors
• Findings, lessons learned, recommendations
What qual researchers worry about
• Have I coded my data correctly?
• Have I managed to capture the
situation in a realistic manner?
• Have I described the context in
sufficient detail?
• Have I managed to see the world
through the eyes of my participants?
• Is my approach flexible and able to
change?
What’s wrong with qual
research?
• It can be very subjective
• It can’t always be repeated
• It can’t always be generalisable
• It can’t always give you definite
answers in the way that
quantitative research can
• It can be easier to carry out (or
hide) ‘bad’ (poor quality) qual
research than ‘bad’ quant
research
Other aspects of
research design
• Validity
• Reliability
• Trustworthiness*
• Dependability: showing that the findings are
consistent and could be repeated
• Confirmability: a degree of neutrality or the
extent to which the findings of a study are
shaped by the respondents and not
researcher bias, motivation, or interest
• Credibility: confidence in the 'truth' of the
findings
• Transferability: showing that the findings
have applicability in other contexts
Summary
• The type of approach you choose will be
determined by your research question, your
epistemological and ontological stances and your
skills or ability to utilise a certain approach
• For most people in ed tech, a mixed methods
approach will be used
• So long as you make an informed choice and can
justify it, it should be fine 
• Just be aware of the limitations of your
approach(es) and try to compensate where
necessary
Choose a study design
• Case report
• Case series
• Case controlled study
• Cross sectional
• Cohort
• Retrospective comparison
• Prospective Comparison
A Case report
• Description of one interesting
and unusual case
• This is anecdotal and may form
the basis for further study
• This may be the only way to
report on something very rare
Case series
• Description of several cases in


which no attempt is made to
answer specific hypotheses or
compare results with another
group of cases.
Cross sectional study
• A survey of the frequency of
disease, risk factors or other
characteristics in a defined
population at one particular
point in time.
Cohort study
• An observational study of a
group of people with a specific
characteristic or disease who
are followed over a period of
time to detect change
• Comparison with a control group
is allowed
Case control study
• An observational study where
characteristics of people with a
disease (cases) are compared
with selected people without
the disease (controls)
Controlled Trials
• An experimental study in which
an intervention is applied to one
group and the outcome
compared with that in a similar
group (controls) not receiving
the intervention
Lesson
6
Research Methodology
Writing the Paper
Dr Joe Essien
Writing the paper
• Two reasons your papers are rejected
• Content
• Format
• Get a copy of the journal you wish to publish
in, a similar article, or the journal’s detailed
instructions for authors
Writing up
• Your paper is reviewed by experts
• Get help before sending it away
• Reading a protocol or a paper or offering
advice does not entitle one to become an
author on a paper
Writing style (1)
• Keep it simple.
• Short sentences
• Clear, short paragraphs
• Clear subheadings
Read it through to make sure you can follow
it. Swap with a friend and check each
other’s.
Writing style (2)
1 I think the EMH was true in this situation…
2 In my opinion the EMH was true …
3 In the author’s opinion the EMH was true …
4 The evidence suggests that the EMH was true …
5 This shows that the EMH was true …
Use 4 or 5.
Avoid 1, 2 or 3 because it gives the impression that
it’s just your opinion and that other, even wiser,
people may see it differently.
Writing style (3)
1 I work for … and the problems are … / I
interviewed three managers.
2 The author works for … and the problems
are … / The author interviewed three
managers.
3 The problems of this organization are …
/ Three managers were interviewed.
Opinions vary here. I (MW) prefer (1). Others
prefer (2) or (3).
Check with your supervisor.
Take care with opinion surveys
• Suppose your research is about risk management
and its effectiveness. You decide to investigate by
means of a questionnaire and come up with:
1. 70% of people in the organisation think our risk
management is unsatisfactory
2. 60% think Method X is the best way of
improving it
• You then present this as the rationale behind your
recommendations to improve risk management.
• But … how do they know?
• Surely the researcher should find out by
rigorous and sensible methods, rather than
asking people who may neither know nor care?
Exercise
• There are many problems with interviews
and questionnaires. Your respondents may
• Not know the answers
• Not understand the questions
• Be too lazy to think about the issues
• Want to deceive you
• Try to design the methods for a research
project without using interviews or
questionnaires. (This is not usually a good
idea but it should help you to consider
alternatives.)
Lesson
7
Research Methodology
Evaluating the Research
Dr Joe Essien
Evaluating research
• Relevant to
• Planning your own research. Use the
following slides to
• Check your proposal
• Check your final project
• Critically reviewing published
research
• These slides are intended as a
checklist for your research and others’
Good research should be:
• As User-friendly as possible
• Simple as possible given the message?
• As Uncriticisable (trustworthy) as possible
• Trustworthiness or credibility is
particularly important. Can you trust the
conclusions? Do you believe them? Are
there any flaws? Essential to give readers
enough detail to check.
• As Useful or interesting as possible
• Clear implications for future? New results?
Then …
• Having designed your research get
someone to act as a devil’s advocate and
tell you
• What’s wrong with it – why it may fail to
deliver what you are aiming for
• What may go wrong
• Would they trust the answer?
Devil’s Advocate
• Use of a devil’s advocate or critical friend.
Remember the problem of confirmation bias –
you are likely to be more enthusiastic about
evidence that confirms your pet ideas than
about evidence that undermines it! Get
someone to try and be critical and find
difficulties with your research – then fix or (if
unfixable) discuss the problems.
• Triangulation – compare results from different
sources. Applies to data, methods, observers,
theories (Robson, 2002: 174).
In groups …
• Choose one of the articles you have been
given
• Assess its
• User-friendliness
• Trustworthiness (pay particular
attention to this)
• Usefulness
• Brief feedback session, then we will
compare your critiques with my slides
Jargon
• Most of these checks are covered by
technical jargon, concepts and techniques
– e.g. lots of types of validity (internal,
external, construct, face …), lots of types
of reliability, ideas about test and scale
construction (see Robson, 2002), etc
• Read up only those areas which you think
are relevant. I have largely avoided jargon
here.
• Always check sampling – always
necessary to consider whether your
sample is likely to be representative of
your area of interest.
Deciding what is Cause and what is effect
• Important to try to work out what causes what, and
how strongly and under what circumstances, so that
you know what you should change to achieve a
particular effect.

• Take care – may be more complicated than it appears.
• Variable you haven’t thought of may be the
important cause!
• Experiments (randomised controlled trials) for
definitive answers, but may be difficult, so …
• Quasi-experiments (e.g. a before/after comparison
of a trial of a new innovation) instead, but …
• May be lots of causes. Be suspicious of simple
explanations
Deciding what is Cause and what is effect –
more examples
• A survey of organizations showed that those that
used the balanced scorecard were more profitable
than those that didn’t.
• Does this show that the balanced scorecard
makes firms more profitable?
• A survey showed that the average job satisfaction
score for a department rose substantially and
significantly between 2006 and 2008. In 2007
everyone was sent on a week’s computer course in
the Seychelles.
• Would you recommend a computer course for
other departments?
• Does high staff turnover cause poor performance or
vice versa? (Glebbeek and Bax, 2004). Does
extraversion help people get promoted, or vice
versa (Moutafi et al, 2007). Does it matter?
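The balanced-scorecard example above can be made concrete with a small simulation. The numbers below are invented purely for illustration: firm size (a confounder) drives both scorecard adoption and profit, yet a naive comparison makes the scorecard look profitable.

```python
import random
import statistics

random.seed(7)

# Invented firms: larger firms are both more likely to adopt the
# balanced scorecard AND more profitable - size is a confounder.
firms = []
for _ in range(2_000):
    size = random.uniform(0, 1)
    uses_bsc = random.random() < size          # big firms adopt more often
    profit = 100 * size + random.gauss(0, 10)  # profit driven by size only
    firms.append((uses_bsc, profit))

bsc_profit = statistics.mean(p for u, p in firms if u)
no_bsc_profit = statistics.mean(p for u, p in firms if not u)
print(f"mean profit with BSC:    {bsc_profit:.1f}")
print(f"mean profit without BSC: {no_bsc_profit:.1f}")
# The scorecard has zero causal effect here, yet adopters look
# clearly more profitable - correlation, not causation.
```

This is exactly why a survey correlation alone does not answer the slide’s question – a variable you haven’t thought of may be the important cause.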
To ensure results are representative … check sampling
1. Decide what you’re interested in – often called the population or target population.
2. Usually this is too big to look at everything, so take a sample. Normally we want the sample to be representative of the population or wider context – so you must check if this is likely.
3. Need to consider how the sample is selected and
its size. Badly chosen samples can be biased and
give very misleading results.
• E.g. TV audience research, word length, NRE,
non-response bias in surveys, survivor bias in
stock price samples
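A quick invented simulation shows why the checks above matter: a self-selected sample (only the more satisfied people respond) gives a misleading mean, while a random sample of the same size does not. All numbers are made up for illustration.

```python
import random
import statistics

random.seed(1)

# Invented population of 10,000 satisfaction scores (1-10)
population = [random.randint(1, 10) for _ in range(10_000)]
true_mean = statistics.mean(population)

# Random sample: roughly unbiased
random_sample = random.sample(population, 200)

# Biased sample: imagine only the more satisfied people respond
responders = [x for x in population if x >= 6]
biased_sample = random.sample(responders, 200)

print(f"population mean:    {true_mean:.2f}")
print(f"random sample mean: {statistics.mean(random_sample):.2f}")
print(f"biased sample mean: {statistics.mean(biased_sample):.2f}")
```

The biased sample overestimates the population mean by a wide margin, and no increase in sample size would fix it – which is the essence of non-response bias.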
Trustworthiness of research:
main things to check

C R I T I C

Each letter represents an issue you should consider.
The first CRITIC
• Cause and effect assumptions OK?
• Representative sample?
• Indicators (measurements) OK?
• Theoretical assumptions OK?
• Imaginative enough?
• Chance ruled out as explanation?
(Checks needed are mostly common sense – except for Chance.)
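Of the six checks, “Chance” is the one that usually needs a formal method. One simple option is a permutation test: shuffle the group labels many times and see how often a difference as large as the observed one arises by chance alone. The scores below are invented for illustration.

```python
import random

random.seed(42)

# Invented satisfaction scores for two departments (illustration only)
group_a = [6, 7, 5, 8, 7, 6, 9, 7]
group_b = [5, 4, 6, 5, 4, 6, 5, 5]
observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)

pooled = group_a + group_b
count = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)  # randomly reassign scores to the two groups
    shuffled_a = pooled[:len(group_a)]
    shuffled_b = pooled[len(group_a):]
    diff = sum(shuffled_a) / len(shuffled_a) - sum(shuffled_b) / len(shuffled_b)
    if diff >= observed:
        count += 1

p_value = count / trials
print(f"observed difference: {observed:.2f}, one-sided p ~ {p_value:.4f}")
```

A small p value like this only says chance is an unlikely explanation; it says nothing about cause, confounding or sampling bias, which are the other CRITIC checks.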
The second CRITIC
• C Claim?
• R Role of the claimant?
• I Information backing the claim?
• T Test?
• I Independent testing?
• C Cause proposed?
Teaching skepticism via the CRITIC acronym and the Skeptical Inquirer. Skeptical Inquirer, Sept–Oct 2002, by Wayne R. Bartz.
Anything else…?
• Is this list complete?
• Does it address all the flaws you noticed in the paper you looked at?
• What would you add or change?
Checklist: the 3 U’s, the
CRITIC and Extra checks
• User-friendly?
• UnCRITICisable (trustworthy)?
• CRITIC
• Useful?
• Extra checks
• Triangulation
• Devil’s advocate (critical friend)
Another measurement problem
• Andy had answers from lots of questions on an SD, D, N, A, SA scale.
• He wanted a measure to tell him which questions produced responses that gave a clear overall view (COV) from his respondents.
• He defined his measurement as
COV = abs(SD + D – A – SA) – N
(where SD is the number of SD responses, etc., and abs = absolute value)
• He then highlighted questions for which COV > 0.
• Do you think this is a sensible measurement?
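One way to judge whether COV is sensible is to compute it for a few hypothetical patterns of responses. The sketch below (invented counts, not real data) shows a property worth noticing: a sharply polarised question scores low because the SD/D and A/SA counts cancel inside the absolute value.

```python
def cov(sd, d, n, a, sa):
    """Andy's 'clear overall view' measure:
    COV = |SD + D - A - SA| - N
    (each argument is the count of responses in that category)."""
    return abs(sd + d - a - sa) - n

# Invented response counts for three questions (not real data):
# Q1: strong agreement, few neutrals -> large positive COV
print(cov(1, 2, 3, 20, 14))   # |1+2-20-14| - 3 = 28
# Q2: polarised (many SD and many SA) -> opposing answers cancel,
# so COV is low even though views are anything but unclear
print(cov(15, 3, 4, 3, 15))   # |15+3-3-15| - 4 = -4
# Q3: mostly neutral -> negative COV
print(cov(2, 3, 20, 3, 2))    # |2+3-3-2| - 20 = -20
```

So COV > 0 really picks out consensus, not clarity in general: polarised questions, which arguably give a very clear (two-sided) picture, are filtered out – one point to raise when answering the slide’s question.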
Critique of an article
• Do you accept what the article says, or are
there flaws in the research?
• Think about the article! Use your common
sense.
• Check the CRITIC.
• Is it worth publishing? Could you do
better?
• Read round the subject – e.g. other
research on the same theme.
• Would the research benefit from some
qualitative work, p values or confidence
intervals, case studies, different
perspectives, experiments…
Some ideas which are worth mulling over
• Detailed study of a small sample vs less detailed study of a large sample
• Induction vs the hypothetico-deductive method (Popper) vs following a framework / paradigm / theory vs deduction
• Subjective vs objective; facts vs values
Critical attitude
• Try to anticipate and discuss criticisms
• Get a friend to act as a devil’s advocate
• Your work should be so convincing that it
can’t be disputed!
• Think of any criticisms you have of articles
you have read. Make sure the same faults
don’t apply to your work.
The word “critical” is sometimes used in a slightly different, more specific, sense.
Reminders about the project
• Research aims should be simple, explicit,
focussed, motivated, useful
• Literature review should focus on relevance to
your project
• References should be complete and in order
• Methods should be the best which are feasible.
• Analysis chapter should show how hard and
skilfully you’ve worked, and why readers should
believe you. You need to convince a sceptical
reader who may want to know details of how your
data was obtained – e.g. source of samples,
location of interviews (pub or office?), etc, etc –
and analyzed.
• Conclusions and recommendations should
summarise what you have found, and clearly meet
the research aims. Also discuss limitations.
• Changing your mind is to be expected – if
necessary rewrite aims after doing the research!
Reminders (2)
• Docs/links at
http://userweb.port.ac.uk/~woodm/projects
• Keep to the 15,000 word limit. You can get a good
mark with 13,000 words but not with 17,000 words.
• Remember the ethics form – no form, no pass!
• Be particularly careful about NHS ethics clearance
• Make use of your supervisor (see Project Guidelines)
• Plan the timescale (Gantt chart) – allow time for
delays
• Allow time at the end for your supervisor to read it for
you to make any necessary amendments
• If it’s good, consider publishing a summary in a
journal. Ask your supervisor.
