BUSINESS AND
MANAGEMENT
BBRC4103
Research Methodology
Copyright © Open University Malaysia (OUM)
BBRC4103
RESEARCH
METHODOLOGY
Assoc Prof Dr Ahmad Shuib
Dr Thinagaran Perumal
Assoc Prof Dr Nagarajah Lee
www.oum.edu.my
5.1 Conceptualisation
5.2 Operationalisation
5.3 Variables
5.4 Measurement
5.4.1 Level of Measurement
5.5 Scaling Techniques
5.5.1 Rating Scales
5.5.2 Ranking Scale
5.6 Measurement Quality
5.6.1 Reliability, Validity and Practicality
5.7 Sources of Measurement Errors
Summary
Key Terms
INTRODUCTION
BBRC4103 Research Methodology is one of the courses offered at Open University
Malaysia (OUM). This course is worth 3 credit hours and should be covered over
8 to 15 weeks.
COURSE AUDIENCE
This course is offered to all learners who need to acquire fundamental knowledge
in research methodology.
STUDY SCHEDULE
It is a standard OUM practice that learners accumulate 40 study hours for
every credit hour. As such, for a 3 credit hour course, you are expected to
spend 120 study hours. Figure 1 shows the student learning time (SLT).
COURSE SYNOPSIS
This course is divided into 10 topics. The synopsis for each topic is listed
as follows (refer to Table 1):
Topic 3: Explains the importance and procedure of the literature review. The topic also explores common weaknesses in literature reviews and explains how to critique journal articles.
(b) Self-Check
This component of the module is included in strategic locations throughout
the module. It may be located after one subtopic or a few subtopics.
It usually comes in the form of a question. When you come across this
component, reflect on what you have already learnt thus far. By attempting
to answer the question, you should be able to gauge how well you have
understood the subtopic(s). Most of the time, the answers to the questions
can be found directly in the module itself.
(c) Activity
Similar to Self-Check, the Activity component is also placed at various
locations or junctures throughout the module. This component may
require you to answer questions, explore short case studies or conduct
an observation or research. It may even require you to evaluate a given
scenario. When you come across an Activity, you should try to reflect
on what you have gathered from the module and apply it to real situations.
You should, at the same time, engage in higher order thinking skills (HOTS), i.e. analysing, synthesising and evaluating, instead of only recalling and defining.
(d) Summary
You will find this component at the end of each topic. It summarises various
important parts of each topic and helps you to recap the whole topic.
By going through the summary, you should be able to gauge your
knowledge retention level. Should you find points in the summary that
you do not fully understand, it would be a good idea for you to revisit the
details in the module.
(f) References
A list of relevant and useful textbooks, journals, articles, electronic contents
and sources can be found in this section. The list may appear in a few
locations such as in the Course Guide (in the References section), at the end
of every topic or at the back of the module. You are encouraged to read or
refer to the suggested references to obtain additional information and
enhance your overall understanding of the course.
PRIOR KNOWLEDGE
There is no prior knowledge needed.
ASSESSMENT METHOD
Please refer to .
REFERENCES
Black, T. R. (1999). Doing quantitative research in the social sciences. Sage Publications.
Sekaran, U. (2003). Research methods for business: A skill-building approach (4th ed.). Wiley.
INTRODUCTION
The purpose of science is to expand knowledge and discover the truth. By
building theory, researchers undertake research to achieve this purpose.
Prediction and understanding are the two purposes of theory and they usually
go hand in hand. To make a prediction, one must know and understand why
variables behave as they do and theories provide this explanation. A theory is a
coherent set of general propositions used as principles to explain the apparent
relationships of certain observed phenomena. The scientific method is a series of
stages used to develop and refine theories.
Scientific methods and scientific thinking are based on concepts. Concepts are
invented so as to enable us to think and communicate abstractions. Higher-level
concepts are used for specialised scientific explanatory purposes that are not
directly observable. Concepts and constructs are used at the theoretical levels
while variables are used at the empirical level. The scientific research process is
used to develop and test various propositions using inductive-deductive
reflective thinking. Scientific research uses an orderly process that combines inductive and deductive reasoning.
People analyse problems differently because they have selective perception and
conditioning of the environment affecting them; the kind of questions asked
would be different depending on how they see the world. Scientific inquiry is
one of the ways to analyse problems. Understanding the relationship between
science and research will help researchers in formulating the study.
The basic goal of science is to obtain, with confidence, valid generalisations and
to establish relationships between variables. By understanding the relationships,
scientists will be able to understand a phenomenon in terms of the patterns of
relationships, to make predictions and to determine causal relationships. Good
science uses the scientific method and can be characterised by the following:
(e) It is logical, meaning that conclusions are drawn from the results based on
logic; and
(f) It is rigorous, meaning that every effort is made to reduce error.
It is noted here that the difference between hard science and soft science lies in the degree of control over confounding variables. For example, in business, there are factors
which may be beyond the control of managers, so there has to be some trade-off
between the rigours of science and the pragmatics of business. There has to be
some give and take between the desires of the businesspeople and the desires of
the researchers.
Although this will lead to error, as long as the researcher informs the decision-
maker of the limitations, and the results are qualified based on the limitations,
the research should go on to produce the information. Good scientific research
also follows the principle of parsimony, that is, a simple solution is better than a
complex solution. Parsimonious research means applying the simplest approach
that will address the research questions satisfactorily.
SELF-CHECK 1.1
To reduce this deficiency, one has to generate the right kind of knowledge
and common sense knowledge needs to be examined systematically to find
the actual cause. The actual cause can be found by setting up experiments
for systematic testing or continually collecting data to examine repeated patterns.
(iii) Premature closure: This often operates with and reinforces the first
two errors. Premature closure occurs when people feel they have all
the answers and do not need to listen, seek information or raise
questions any longer.
From this tentative theory, prediction and hypotheses are derived for further
investigation or testing. The process of further investigation or testing will
continue until the theories and laws derived are refined. The refined laws or
theories are tentative. If an anomaly is found when a new observation does not fit
into a current body of knowledge or the theories or laws are proven wrong, a
modification has to be carried out. The process will continue again and again
when new knowledge is generated from new observations.
(a) Inductive Model: Moves from the particular to the general, from a set of specific observations to the discovery of a pattern that represents some degree of order among all given events; the logical model in which general principles are developed from specific observations.
(b) Deductive Model: Moves from the general to the specific, from a pattern that might be logically or theoretically expected to observations that test whether the expected pattern actually occurs; the logical model in which specific expectations, or hypotheses, are developed on the basis of general principles.
Figure 1.2 illustrates the differences between the inductive and deductive models.
ACTIVITY 1.1
(a) It narrows the range of facts needed to study; any problem can be studied
in many different ways. A theory can suggest the ways that are likely to
yield the greatest meaning;
(b) It suggests a system for the researcher to impose on data in order to classify
them in a meaningful way;
(c) It summarises what is known about an object;
(a) Concepts
A concept is a bundle of meanings or characteristics associated with certain
events, objects, conditions and situations. Concepts may be developed
because of frequent, general and shared usage over time, or acquired
through experience. Some concepts are unique to a particular culture and
not easily translated into another language.
(b) Constructs
A construct is an image or idea specifically invented for a given research
and/or theory-building purpose. Constructs are developed by combining
simpler concepts, especially if the idea or image we want to convey is not
directly subject to observation. Intelligence quotient (IQ), for example, is constructed mathematically from observations of the answers given to a large number of questions in an IQ test. No one can directly observe IQ, but it is treated as a real characteristic of people.
(c) Definitions
If the meaning of the concept is confused, the value of the research may be
destroyed. If the concepts used give different meanings to different people,
it indicates that the parties are not communicating on the same wavelength.
A concept may be defined with a synonym. For research purposes, however, the concept must be measurable, and thus needs a more rigorous definition.
(d) Variables
At the theoretical level, constructs and concepts are used to relate to
propositions and theory; at this level, constructs cannot be observed. At the
empirical level, propositions are converted into hypotheses and tested; at
this level, the concepts are termed as variables. The term "variable" is used as a synonym for a construct or the property being studied. Quantitative variables usually take numerals or values as indicators of the degree or level. The following are some commonly used quantitative variables:
On the other hand, qualitative variables do not have any numerical values and are mostly described in subjective terms.
(e) Propositions
Propositions are statements about concepts which may be judged as true or
false if they refer to observable phenomena.
(f) Hypothesis
Hypothesis is a proposition that is formulated for empirical testing:
(g) Model
A model is a representation of a developed system used to study some
aspects of the system or the system as a whole. It is different from theory
because theory explains relationships in the system whereas a model is a
representation of the relationships in the system.
(h) Framework
A framework is an abstract representation of a phenomenon. It describes
the variables studied and the relationships among the variables. It can be
represented graphically in a diagram. Thus, in the early stage of a research,
a theoretical framework is usually constructed based on initial studies or
literature search. The theoretical framework is used to explain the
relationships that need to be investigated and tested in research. A
framework that has been successfully tested will be considered as the final
framework. A research will report the research findings by presenting the
final framework.
(i) Process
A process is developed for a specific purpose in a business organisation. It
aims to make some change in the organisation. For example, let's say a
company implements a process to improve its quality performance. This
process may involve changes in the structure (for instance, someone is
transferred to a different department) or operations (for example, the
quality inspection procedure is modified) of the organisation. In research, a
process is developed to help solve an organisation's problem or improve its
performance. The output of this research will be in the form of a new
process rather than a framework or model. A process is also called a tool,
procedure, method or system.
SELF-CHECK 1.2
1. What are the differences between proposition and hypothesis?
2. What are the differences between concept and construct?
3. What are the differences between model and framework?
SELF-CHECK 1.3
Variables are concepts and constructs used at the empirical level. They are
numerals or values that represent the concepts for the purpose of testing and
measurement.
A good hypothesis can explain what it claims, is testable and has greater range.
Concept
Construct
Deductive models
Definition
Empirical
Framework
Hypothesis
Model
Generalisation
Process
Proposition
Replicable
Variables
INTRODUCTION
Research usually involves a multi-stage process. Although the actual number of
stages may vary, research must include formulating and identifying a topic,
reviewing literature, planning a strategy, collecting data, analysing data and
writing a report. In discussing the research process, the presentation depicts a stage-by-stage, straightforward and rational discussion, although in the real working conditions of a research project, this is unlikely to be the case. The
researcher may have to revisit each stage more than once because each stage is
interrelated and may influence or be influenced by other stages. Each time a
researcher revisits a stage, he may have to reflect on the associated issues and
refine his ideas; in addition, he has to consider ethical and access issues during
the process.
SELF-CHECK 2.1
Identify the purpose of the research process and the main factors to
make it successful.
Investigative Questions
Once the research problem has been identified, the researcher has to think of the
problem in a more specific or focused way; this is the investigative question.
These are questions that the researcher must ask in order to get the most
satisfying conclusion regarding the research question. The specific questions will
help in determining the types of data to be collected.
Measurement Questions
These are questions that are actually asked of respondents in order to obtain
necessary data for analysis; these are questions that appear in the questionnaire.
If the research uses an observational approach, the measurement questions take
the form of records of the observations of the subject made by the researcher.
ACTIVITY 2.1
ACTIVITY 2.2
The general manager of the company you work for calls you to his office.
He is very worried about the company's engineering department as the
turnover rate is quite high for technicians. He asks you to do a survey
among other major companies in the region to learn how they take care
of the problem of high turnover of technicians.
(a) Level of abstractness: They are more metaphorical than real; for example, profits cannot be observed directly but their effects can be recorded.
(b) Ability to be proven: When the sensory experiences produce the same result consistently, the data is reliable and can be verified.
(c) Difficulty in obtaining data: Obtaining data may be difficult due to the speed at which events occur and the time lapse before observation; changes occur with the passage of time.
(d) Level of representation of the phenomenon under study: That is, how close it is to the real phenomenon.
(a) Secondary data: Data that have been collected and processed by one researcher and are reanalysed for a different purpose by another researcher.
(b) Primary data: Data that have close proximity to the truth and over which the researcher has control over error; careful design of the data collection therefore becomes pertinent.
Example 2.1
A subject or general problem must lead to a good topic, one that raises questions which have not been answered to the satisfaction of all authorities on the topic.
(b) Topic: A reasonably narrow, clearly defined area of interest that could be
thoroughly investigated within the limits of the resources available to
undertake the research. A good topic raises questions that have no simple
answers. However, there are no absolute right or wrong answers. Research
is not geared towards making judgments as to who or what is right but
instead consists of assembling information from various sources in order to
present readers with a composite picture.
Example 2.2
The effects of parental attitudes on teenage pregnancy
The demand for recreation from domestic visitors in Langkawi
The role of certain traditional herbs in the cure of certain cancers
The effectiveness of improved communication systems on the
productivity of airline catering workers
The effects of the growth of the component manufacturing industry
on rural-urban migration of women
The role of local universities in the use of the English language in
primary schools
(c) Thesis: A general statement which announces the major conclusions that
may be reached after a thorough analysis of all sources. The statement
should appear in the beginning of the research report (in the problem
statement); the main body of the report should explain, illustrate
(introductory stage), analyse (methodology sections), argue for, and in
some sense, prove the thesis (discussion and conclusion). The defence of the
If the thesis can be thought of early, then the researcher can easily limit the
reading in each source to just those passages that relate directly to the
thesis. However, this is not always possible. Thus, it will be most helpful to
think of the topic in terms of the possible thesis or hypothesis.
(d) Hypothesis: The predictions (of the eventual thesis), made sometime
before reading the sources, as to what the research will reveal about the
topic i.e. what answers are expected to be found for the major questions
raised by the topic. As can be seen, the hypothesis (educated guess) can
help the researcher to find exactly what information (data, methods) is
needed as quickly and efficiently as possible, by keeping attention focused
on a limited number of specific aspects of the topic. A carefully worded
hypothesis can greatly reduce problems of searching for sources and
extracting from them the most useful information. In other words, the
hypothesis points to the right direction by indicating the specific questions
that need answers. The information/answer that either agrees or disagrees
with the hypothesis will bring the researcher closer to the truth, which is
the thesis of the researcher.
Forming the hypothesis should be done while choosing the topic. This is
because the topic involves unanswered questions and the hypothesis
predicts the possible answers. The hypothesis can thus test the
thoroughness of the research. The hypothesis should not be defended by citing only those sources that support it; for the conclusions to be valid, different sources representing different viewpoints should be considered. The
mission of the research is to present readers with the full picture so that
they will have enough information to evaluate the conclusions.
SELF-CHECK 2.2
Once the problem has been identified, the researcher has to plan for the later
stages which are very much dependent on the type of problem identified.
In the planning phase, the researcher has to identify the methods of collecting
data, the techniques to analyse the data and the preparation of the report.
The researcher has to determine the design of the research because the design
will determine the type of data to be collected and the method of collecting
the data.
Creative design of the research will help in reducing the cost of the research.
INTRODUCTION
The review of literature is not properly understood by some learners. Some have
the opinion that literature review means collecting and compiling facts for the
research being undertaken. In fact, the literature review process needs analytical
thinking, critiquing ability and empirical approach. Review of literature is an
integral part of the entire research process. When you undertake a research
process, review of literature will help you to establish the theoretical roots of your
field of interest, clarify your ideas and develop your methodology. The review of
literature also helps you to integrate your findings with the existing body of
knowledge. You must remember that one of your important responsibilities in
research is to compare your findings with those of others, and that is why review
of literature plays a very important role in the research process.
For example, your literature review could justify whether your work is an
extension of what others have done. It could also indicate whether you are trying
to replicate earlier studies in a different context.
ACTIVITY 3.1
List some obstacles that learners may face in doing a literature review
for their theses or research reports. Discuss your answer during your
tutorial.
Figure 3.2 lists the main reasons why literature review is important.
It is fundamental that you know what others are doing in your field of
interest or similar topics as well as understand theories that have been put
forward and gaps that exist in the field.
SELF-CHECK 3.1
1. What is meant by literature review?
(a) Step 1: Search the Existing Literature in Your Research Area of Interest
Once you choose your topic of interest, make sure it is a well-researched and well-studied area, which will give you more research literature to choose from. Narrow your topic so that you can cover it in depth; comprehensiveness and narrowness of topic go hand in hand.
Now, you can proceed to search the existing literature. To effectively search
literature, have in mind some idea of the broad subject area and the
problem you wish to investigate. The first task would be compiling a
bibliography in your research area. Books and journals are the best sources
for literature in a particular research area. The sources include:
(i) Note the theories put forward, critiques, methods used (sample size,
data used, measurement procedure);
(iii) Find differences of opinions among researchers and jot down your
opinions about their validity; and
(a) The review is a mere description of various materials without showing the
relation between the studies and the main objective of the research topic.
(b) Students tend to cut and paste, which SHOULD NOT be encouraged.
Original works should be cited and quoted.
(c) Journals or reports that are included are not critically evaluated.
Critically evaluate the research questions, the methodology used and
recommendations made by the researchers.
There is some evidence to suggest that students sometimes do not read the
original works and instead take someone else's work and cite it as though they
had read the primary source.
SELF-CHECK 3.2
1. What are the procedures involved in the review of literature?
ACTIVITY 3.2
Select three journals in the research area you are interested in and
identify the main contributions of those papers.
Literature review shows what has been done in the research topic and how
the intended study relates to earlier research.
Literature review delimits the study, relates the methods used by other
researchers as well as recommendations of earlier works and provides the
basis for the intended research task.
Literature review can reveal methods of dealing with the research problem
that may be similar to the difficulties you are facing.
Literature review will increase your confidence in your research topic if you
find other researchers have an interest in this topic and have invested time,
effort and resources in studying it.
INTRODUCTION
This topic introduces strategies to collect primary data. The process of collecting
primary data must be identified properly based on the purpose and objectives
of the research. Data used to answer research questions must come from the
appropriate population in order to be useful. If data is not collected from
the people, events or objects that can provide the correct answers to solve the
problem, then the process of collecting the data is a waste. The process of
selecting the right individuals, objects or events for study is known as sampling.
In order to ensure that the data collected is representative, a few terms related to
the concept of sampling must be understood.
(g) Sampling Frame: The actual list of sampling units from which the sample is taken.
ACTIVITY 4.1
SELF-CHECK 4.1
1. What are the advantages and disadvantages of a census?
2. What are the reasons for sampling? When is a census appropriate?
Accuracy means the degree to which bias is absent from the sample. There is
no systematic variance in the data and no variation in measures due to some
unknown influences that cause the scores to lean in one direction more than
another. For example, the peak season for a tourist destination falls during the
long school holidays; if a sample is taken only during the school holidays to
collect data on congestion, then accuracy of the data will be reduced.
ACTIVITY 4.2
Although a researcher cannot get 100% accuracy in the research
findings, why is it still important to have a good sample design?
SELF-CHECK 4.2
If a suitable list does not exist, then the researcher has to compile his own
sampling frame. It is important that the list is unbiased, accurate and
current. There may also be organisations that specialise in selling lists of
names and addresses for surveys. If the researcher uses this sample frame,
he must make sure of the way the sample is to be selected as well as how
the list was compiled and when it was last revised.
and checking the data. Thus, the determination of the sample size within
this compromise is influenced by:
(i) The confidence level in the data, that is, the level of certainty that the characteristics of the data collected will represent the characteristics of the total population; and
(ii) The margin of error tolerated, that is, the level of accuracy required for any estimate made from the sample.
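As an illustration, these two factors feed into a commonly used formula for the minimum sample size when estimating a population proportion. This formula is a standard statistical convention, not one given in this module:

```python
import math

def sample_size_proportion(z: float, p: float, e: float) -> int:
    """Minimum sample size for estimating a population proportion.

    z: z-score for the desired confidence level (e.g. 1.96 for 95%)
    p: anticipated population proportion (0.5 is the most conservative)
    e: tolerated margin of error (e.g. 0.05 for +/- 5 percentage points)
    """
    return math.ceil(z ** 2 * p * (1 - p) / e ** 2)

# 95% confidence, worst-case p = 0.5, 5% margin of error
print(sample_size_proportion(1.96, 0.5, 0.05))  # -> 385
```

Note how the margin of error dominates the cost: tightening it from 5% to 3% at the same confidence level raises the required size from 385 to 1,068.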
SELF-CHECK 4.3
If Harun calculated that the adjusted minimum sample size was 439
and his estimated response rate was 30%, what would his actual
sample size be?
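After attempting the self-check yourself, the arithmetic can be sketched as follows, using the usual convention of dividing the adjusted minimum size by the expected response rate:

```python
import math

adjusted_minimum = 439   # adjusted minimum sample size from the self-check
response_rate = 0.30     # estimated 30% response rate

# To end up with at least 439 usable responses, enough people must be
# approached that a 30% response still meets the minimum.
actual_sample_size = math.ceil(adjusted_minimum / response_rate)
print(actual_sample_size)  # -> 1464
```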
Periodicity within the population may skew the sample and results. For
instance, assume the sampling fraction is k = 4, and the list contains the
names of every male followed by a female. If the first selection is a male,
then the sample will contain only male respondents. Consequently, the
sample will be biased. If the population list has a monotonic trend, listing
from the smallest to the largest element, a biased estimate will result based
on the starting point.
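The periodicity problem described above can be demonstrated in a few lines. The alternating list here is hypothetical, constructed only to mirror the example in the text:

```python
# Hypothetical population list that strictly alternates male (M) and
# female (F) names, as in the example above.
population = ["M", "F"] * 20   # 40 names
k = 4                          # sampling interval (sampling fraction k = 4)

start = 0                      # suppose the first selection is a male
sample = population[start::k]  # systematic sampling: every k-th element

# Because k is even and the list alternates, every selection is male.
print(sample)
```

Shifting the starting point by one would instead yield an all-female sample; either way, the periodicity biases the result.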
It often involves large samples since there must be sufficient data to stratify
or cluster the population. However, if the method is indiscriminately used,
it will increase costs.
SELF-CHECK 4.4
What are the factors influencing the choice of the following sampling
designs?
(a) The use of non-probability sampling can satisfactorily meet the sampling
objectives. Sometimes, the true cross section of the population may not be
the objective of the research. For instance, if there is no desire or need to
generalise the results to the population parameters, the sample does not
have to be representative of the population.
(b) Another reason for choosing non-probability sampling is the lower cost and
time factor. Probability sampling is time consuming and expensive. If the
non-probability sampling is carefully controlled, it can produce acceptable
results.
(d) The non-probability technique may be the only feasible method if the total
population is not available for the study or not known. In such cases, the
sampling frame will not be available to choose the elements. It may
not be possible to determine completely that the respondent of the mail
questionnaire is actually the person selected or the true cross section of the
population.
Quota sampling has several advantages over probability sampling. In particular, it is less costly and can be set up quickly. It does not require a sampling frame and may be the only feasible technique when others are not available. It is most useful when the population is large; since the sample size is governed by the need to have sufficient responses in each quota to enable subsequent analyses, the total sample size may exceed 2,000.
ACTIVITY 4.3
In a situation where the respondents live in rural areas, what is the
most effective type of sampling that can be used?
The sample size is the number of elements to be studied in the research project.
Determining the size is one of the great challenges of many junior researchers.
Some of the major considerations in sample size determination are:
(a) Importance of the decision (larger and representative sample size for
important decisions);
(b) The nature of the research (smaller size for exploratory);
(c) The number of variables (larger size if more variables are involved);
(d) The nature of data analysis (detailed and sophisticated statistical analysis
require larger random samples); and
(e) Resource availability.
The size can be determined statistically or non-statistically.
One of the commonly referred rules of thumb for determining sample sizes
especially in the exploratory research is the user-friendly model of Krejcie and
Morgan (1970). Many also have used their suggestion in determining sizes in
certain phases of the probability sampling (for example, determining the size of
each stratum in a stratified sampling). They simplified the sample size
determination based on the respective target population sizes. Table 7.3 shows
some of the suggested sizes.
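The suggested sizes in Table 7.3 come from Krejcie and Morgan's (1970) published formula, which can be sketched directly. The chi-square value of 3.841 (one degree of freedom at 95% confidence), P = 0.5 and d = 0.05 are the assumptions they used:

```python
def krejcie_morgan(n: int, chi_sq: float = 3.841,
                   p: float = 0.5, d: float = 0.05) -> int:
    """Krejcie and Morgan (1970) sample-size formula:
    s = chi^2 * N * P * (1 - P) / (d^2 * (N - 1) + chi^2 * P * (1 - P))

    chi_sq: table chi-square for 1 degree of freedom at 95% confidence
    p: population proportion assumed to maximise variance
    d: degree of accuracy expressed as a proportion
    """
    s = (chi_sq * n * p * (1 - p)) / (d ** 2 * (n - 1) + chi_sq * p * (1 - p))
    return round(s)

for n in (100, 500, 1000):
    print(n, krejcie_morgan(n))  # 80, 217 and 278 respectively
```

These outputs match the values in the published Krejcie and Morgan table for those population sizes.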
effect size, alpha, beta, and the population standard deviation in the calculation.
The sample size determination for confidence intervals presented in this manual
was adapted from Malhotra (1999) while the effect size approach was adapted
from Brewer (1996).
Alpha is the probability of rejecting the Null when the Null is indeed true.
Because the focus of hypothesis testing is to minimise the errors in making
a decision, an adequately small value of alpha is essential for the results to
be meaningful.
Power is the probability of correctly rejecting the Null. Since power refers to
correct rejection for the rejection to be meaningful, the power should be set
substantially high.
Effect size (ES) is the degree of association between the variables under
investigation. If the study is concerned with differences between two
populations, then the effect size refers to the magnitude of difference that
makes it meaningful. A small effect size will allow the researcher to detect
even a small effect of the phenomenon. For example if it is hypothesised
that there is a true difference between male and female employees in terms
of their job satisfaction levels, a small difference in the mean scores of these
two populations (if the null is rejected) is good enough to provide evidence
of practical importance if the effect size is set to be small. A small effect size
is able to detect even small 'true differences' if there is a difference between
the null and the alternate hypothesis.
Using this method, mainly in hypothesis testing, the minimum sample size
is defined as a function of alpha, power and the effect size (Brewer, 1996;
Cohen, 1977).
For one sample hypothesis testing, the minimum sample size is defined as
N = [(Z_alpha + Z_beta) / ES]^2
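Plugging in conventional values makes the formula concrete. The specific numbers below (Z = 1.96 for a two-tailed alpha of 0.05, Z = 0.84 for power of 0.80, and a medium effect size of 0.5) are standard choices, not values given in the text:

```python
import math

def min_sample_size(z_alpha: float, z_beta: float, es: float) -> int:
    """One-sample minimum size: N = [(Z_alpha + Z_beta) / ES]^2."""
    return math.ceil(((z_alpha + z_beta) / es) ** 2)

# Two-tailed alpha = 0.05 (Z = 1.96), power = 0.80 (Z = 0.84),
# medium effect size ES = 0.5
print(min_sample_size(1.96, 0.84, 0.5))  # -> 32
```

Note how a smaller effect size inflates the requirement: halving ES from 0.5 to 0.25 quadruples the minimum N.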
SELF-CHECK 4.5
SELF-CHECK 4.6
For each of the following research questions, it has not been possible to
obtain a sampling frame. Suggest the most appropriate non-probability
sampling technique to obtain the necessary data, giving reasons for
your choice.
(a) What can social services provide to homeless people?
(b) Which television advertisements were most remembered by the viewing
public last weekend?
(c) How are manufacturing companies planning to respond to the
introduction of highway tolls?
(d) Would users of a squash club be prepared to pay a 10% increase
in subscription fees to help fund two new extra courts (answer
needed by tomorrow morning)?
The logic of sampling is that there are similarities among the elements in a
population that can adequately represent the characteristics of the total
population.
Some of the elements may underestimate the true value of the population,
while others may overestimate it. The combination of these estimates gives
the statistic, which provides an estimate of the true population value.
A good sample must be precise; the sampling error is within acceptable limits
for the purpose of the study.
The choice of the sampling design depends on the objectives and the research
questions of the study.
The size of the sample depends on the accuracy of the results required, the
confidence level of the study and the resources available to collect and
analyse the data.
The probability sample design is the ideal design, since it allows the level
of error likely to be produced to be determined. However, it is often
time-consuming and expensive.
Census    Cluster sampling    Parameters    Population    Population case
INTRODUCTION
This topic begins with an explanation of conceptualisation and
operationalisation. The definition of concepts and the methods of measuring the
concepts will help the researcher to determine the methods of collecting and
analysing data. The process of defining concepts is important in research to
ensure that readers have the same understanding as the researcher; this
prevents any confusion or misunderstanding on the part of readers when
interpreting the meaning of a concept.
5.1 CONCEPTUALISATION
In research, we use concepts that vary in levels of abstraction, from simple
concepts such as shoes, tables and height, to the most abstract such as
satisfaction, marketability, love and stress. It is necessary to clarify the
meaning of the concepts used in order to draw meaningful conclusions about
them. Conceptualisation may differ among researchers, but definitions must be
specific and unambiguous. Therefore, even a reader who disagrees with a
definition has a good idea of how to interpret the results, because the
definition is clear and specific.
ACTIVITY 5.1
How do you define the concept of socio-economic status in terms of
nominal definition and operational definition?
5.2 OPERATIONALISATION
Once the concepts have been identified, the next step is the process of developing
the specific research procedures/operations that will result in empirical
observations representing those concepts in the real world.
Example 5.1
Operationalising the concept of an individual/person:
Variable: Individual
Attributes: Gender characteristics (male/female)
Nominal definition: An individual is either a male or a female
Operational definition: If B defines/represents an individual
5.3 VARIABLES
At the theoretical level, concepts and constructs are used; whereas at the
empirical level, the constructs are transformed into variables. Thus, variables are
the construct or property to be studied. A variable consists of logical groupings
or sets of attributes/values.
(e) Intervening Variable (IVV): Shows the link between the IV and DV; it acts
as a DV with respect to an IV and as an IV with respect to a DV.
ACTIVITY 5.2
What are the relationships between IV, DV and IVV? How does the
inclusion of MV change or affect the relationship?
5.4 MEASUREMENT
The concepts used in a research are divided into objects or properties. Objects are
things such as shirts, hands, computers, shoes, books and papers. Things that are
not so concrete such as genes, nitrogen, attitudes, stocks and peer-group pressure
are also included as objects. Properties or attributes, on the other hand, are the
characteristics of the objects.
Measuring the properties, or indicators, of the objects makes the measurement
of the objects or characteristics more sensible. It is easy to see that A is
older than B, and that C participates more than D in a group discussion.
Indicators such as age, working experience and number of reports completed can
be easily measured. Hence, they are so commonly accepted that the properties
are considered to be observed directly.
The accepted rules in using numbers to map the observation of the indicators
include:
(a) Order of numbers: One number is greater than, less than or equal to
another number;
(b) Difference between numbers: The difference between any pair of numbers
is greater than, less than or equal to the difference between any other pair
of numbers; and
(c) Unique origin: The number series has a unique origin indicated by the
number zero.
SELF-CHECK 5.1
Using these rules of order, distance and origin, data are classified into the
following types of scales:
SELF-CHECK 5.2
Example 5.2
Please indicate your preference among the types of examination designs
below by using the following scales:
Example 5.3
Using the scale below, please indicate your choice for each of the items
that follow, by circling the number that best describes your feeling.
The interval scale has equal magnitude of differences in the scale point. The
magnitude of difference represented by the space between 1 and 2 on the
scale is the same as the magnitude of difference represented by the space
between 4 and 5, or between any other two points. Any number can be
added to or subtracted from the numbers on the scale. Assuming the
magnitude of the difference is still retained, if 6 is added to all five points
on the scale, the interval scale will become 7 to 11; the magnitude of the
difference between 7 and 8 is still the same as the magnitude of the
difference between 10 and 11. Thus, the origin or the starting point could be
any arbitrary number.

The interval scale taps the differences, the order and the equality of the
magnitude of the differences in the variable. It is a more powerful scale
than the ordinal and nominal scales. It allows the measurement of central
tendency (mean) and dispersion (range, standard deviation and variance).
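The shift-invariance of interval-scale differences described above can be checked with a short Python sketch; the five-point scale and the constant 6 follow the text.

```python
scale = [1, 2, 3, 4, 5]            # original five-point interval scale
shifted = [x + 6 for x in scale]   # adding 6 to every point gives 7 to 11

# differences between adjacent points are unchanged by the shift
diffs = [b - a for a, b in zip(scale, scale[1:])]
shifted_diffs = [b - a for a, b in zip(shifted, shifted[1:])]

print(shifted)                  # [7, 8, 9, 10, 11]
print(diffs == shifted_diffs)   # True: the origin is arbitrary
```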
Highlights: each scale can be characterised by whether it captures difference,
order, distance and a unique origin, and by the measures of central tendency
and dispersion it permits.
Example 5.4
(c) How many books have you read in the last two weeks?
(d) How many times have you visited a shopping complex in the last
month?
The measures of central tendency of the ratio scale could be either the arithmetic
or the geometric mean; and the measure of dispersion could be the standard
deviation, variance or the coefficient of variation.
ACTIVITY 5.3
1. What is the meaning of measurement in a research study? Give
three steps of the measurement process.
SELF-CHECK 5.3
The category scale uses multiple items to elicit a single response; the
nominal scale is also used to measure the response.
Among the easy reading magazines listed below, which ones do you
like to read?
(i) Time
(ii) Reader's Digest
(iii) National Geographic
(iv) Far Eastern Economic Review
(v) Vogue
(vi) Family
(vii) Others (specify)
Bad Good
Fair Unfair
Clean Dirty
Modern Traditional
Bad 1 2 3 4 5 Good
Fair 1 2 3 4 5 Unfair
Clean 1 2 3 4 5 Dirty
Modern 1 2 3 4 5 Traditional
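Semantic differential responses such as those above are typically summarised as a mean score per adjective pair. A minimal sketch, assuming four invented respondents:

```python
# invented responses of four respondents on two 1-5 adjective pairs
responses = {
    "Bad-Good": [4, 5, 3, 4],
    "Clean-Dirty": [2, 1, 2, 3],
}

# mean score per adjective pair gives the object's profile
profile = {pair: sum(scores) / len(scores) for pair, scores in responses.items()}
print(profile)  # {'Bad-Good': 4.0, 'Clean-Dirty': 2.0}
```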
Room space ____
Room décor ____
Cleanliness ____
Price ____
Housekeeping service ____
Total points: 100
Please indicate how you would rate the restaurant with respect to each
of the characteristics mentioned below, by circling the appropriate
number.
Services     −3 −2 −1 +1 +2 +3
Cleanliness  −3 −2 −1 +1 +2 +3
Prices       −3 −2 −1 +1 +2 +3
Example 5.5
For each pair of national parks, place a check beside the one you most
prefer if you had to choose between the two.
Example 5.6
Please rank the following daily newspapers in order of subscription
preference, assigning 1 to the most preferred choice and 5 to the least
preferred.
Example 5.7
Compared to your previous visit to this holiday destination, your
present visit is:
1 2 3 4 5
Reliability and validity concern how closely the measures are connected to the
constructs, because perfect reliability and validity are impossible to achieve.
It is important to establish the truthfulness, credibility or believability of
findings, free of random or systematic errors. Thus, reliability and validity
are considered the scientific criteria of measurement.
(a) Will the measures give the same results on other occasions?
(b) Will similar observations be reached by other observers?
(c) Is there transparency in how sense was made from the raw data?
Validity reflects how well an idea about reality fits with actual reality: the
extent to which the empirical measurement adequately reflects the real meaning
of the concept. In other words, a valid measure measures what it is supposed
to measure. Major threats to validity include:
(a) History: If certain events or factors that have an impact on the
relationships occur unexpectedly while the study is being conducted, and this
history of events confounds the cause-effect relationship between the
variables, then the validity of the results may be affected.
(b) Maturation Effects: The passage of time can influence the cause-and-effect
relationship among variables and cannot be controlled. Maturation effects are
a function of processes operating within the respondents as a result of the
passage of time. Examples of maturation processes include growing older,
getting tired, getting bored and feeling hungry.
(c) Testing Effects: A pre-test given to the subjects in order to improve the
instruments used may actually affect the actual test or post-test; the very
fact that the respondents were exposed to the pre-test might influence their
responses.
Example 5.8
The relationship between reliability and validity can be shown with this
example: you use a bathroom scale to measure your weight. If the scale
measures your weight correctly, then the scale as a measuring tool is both
reliable and valid. If the scale has been tampered with and consistently reads
6 kg over your true weight every time it is used, it is reliable but not
valid. If the scale gives an erratic weight reading from time to time, it is
neither reliable nor valid.
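The bathroom-scale analogy can be made concrete with a small simulation; the 6 kg bias follows the example, while the true weight and the range of the erratic noise are invented for illustration.

```python
import random

TRUE_WEIGHT = 70.0  # assumed true weight in kg (illustrative)

def accurate_scale():
    return TRUE_WEIGHT              # reliable AND valid

def biased_scale():
    return TRUE_WEIGHT + 6.0        # reliable (consistent) but NOT valid: +6 kg bias

def erratic_scale(rng):
    return TRUE_WEIGHT + rng.uniform(-10, 10)  # neither reliable nor valid

rng = random.Random(0)  # fixed seed so the sketch is repeatable
erratic_readings = [erratic_scale(rng) for _ in range(3)]

print(biased_scale() - TRUE_WEIGHT)   # systematic error: always 6.0
print(erratic_readings)               # readings vary from use to use
```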
The measuring device should also be easy to administer; the design of the
instruments used should allow easy comprehension and have complete and clear
instructions. If the instrument is to be administered by people other than the
designer, then it must also be easy to interpret.
SELF-CHECK 5.4
What are the four major sources of measurement errors? Give an
example of how each source can affect the measurement results in a
face-to-face interview.
SELF-CHECK 5.5
Measurements usually use some type of scale to classify or quantify the data
collected.
Four types of scales are used in increasing order of power: nominal, ordinal,
interval and ratio.
The ratio scale indicates the magnitude and proportion of the differences.
The data become more precise as we move from the nominal to the ratio scale,
allowing the use of more powerful statistical tests.
INTRODUCTION
The type and amount of data collected depends on the nature of the study
together with its research objectives. If the study is exploratory, the researcher is
likely to collect narrative data through the use of focus groups, personal
interviews or observation of behaviour or events. These types of data are known
as qualitative.
Surveys are popular because they allow the collection of a large amount of data
from a sizeable population in a highly economical way. This data is standardised
and often obtained by using a questionnaire to allow for easy comparison. In
addition, the survey strategy is perceived as authoritative by people in general.
Every day, a news bulletin or a newspaper article reports the results of a new
survey indicating a certain percentage of the population that thinks or behaves in
a particular way. The reliability and validity of survey findings depend on
the quality of the instrument used.
Methods of collecting survey data fall into two broad categories: self-completion
and interviewer-administered.
On the other hand, questionnaires are used to collect quantitative data from a
large number of individuals in a quick and convenient manner. In this topic, the
focus will be on the survey technique used for data collection.
SELF-CHECK 6.1
Explain the difference between a questionnaire and an observation inventory.
Explain the use of these instruments by providing appropriate
examples.
SELF-CHECK 6.2
You have been asked by the management to carry out a study on
sexual harassment at the workplace after the female employees
expressed their concerns on the matter. Which method would you
choose to collect data?
Example 6.1
Probing questions can be used to explore responses that are of
significance to the research topic. They can be worded like open
questions but can also require a particular focus or direction. Examples
of this type of question include:
"How would you evaluate the success of this new marketing strategy?"
"Why did you choose a compulsory method to make redundancies?"
"What external factors caused the corporate strategy to change?"
perhaps an hour and a half. A rule of thumb for mail surveys is that they
should not exceed six pages.
Our image of the person who does business research is that of a typical
"dedicated scientist". Unfortunately, interviewers who are hired as
researchers do not necessarily live up to that image. Sometimes, interviewers
may "cut corners" to save time and energy. They may fake parts of their
reporting by "dummying up" part of or the entire questionnaire. Control over
interviewers is important to ensure that difficult and time-consuming
questions are handled properly.
(a) Cost
Personal interviews are generally more expensive than mail and telephone
interviews. The geographical proximity of respondents, the length and
complexity of the questionnaire, and the number of non-respondents can
affect the cost of the personal interview.
(c) Callbacks
When a person selected to be in the sample cannot be contacted on the first
visit, a systematic procedure is normally initiated to call him or her back at
another time. Callbacks are the major means of reducing non-response error.
Calling back on a sampling unit is more expensive per interview because
subjects who were initially not at home are generally more dispersed
geographically than the original sampling units.
Callbacks are important because individuals who are away from home at the
point of call (e.g. working women) may differ from those who are at home
(e.g. non-working women, retired people).
ACTIVITY 6.1
(a) Speed
In telephone interviewing, the speed of data collection is a major
advantage. For example, union officials who wish to conduct a survey on
members' attitudes towards a strike may conduct a telephone survey
during the last few days of the bargaining process. Rather than taking
several weeks for data collection by mail or personal interviews, hundreds
of telephone interviews can be conducted overnight. When the interviewer
enters the residents' answers directly into a computerised system, the rate
of data processing escalates.
(b) Cost
As the cost of personal interviews continues to increase, telephone
interviews are becoming relatively inexpensive. Telephone interviews cost
approximately 40 percent less than the cost of personal interviews. Costs
are further reduced, when travelling costs are eliminated and the
interviews are centralised and computerised.
(d) Cooperation
In some neighbourhoods, people are reluctant to allow a stranger to come
even to the doorstep. The same individual, however, may be willing to
cooperate in a telephone survey. Likewise, interviewers can be reluctant to
conduct face-to-face interviews in certain neighbourhoods, especially
during evening hours. Some individuals will refuse to participate and the
researcher should be aware of potential non-response bias. The likelihood
of an unanswered call and not-at-home respondent varies by the time of
day, the day of the week and the month of the year.
(e) Callbacks
Situations like an unanswered call, a busy signal or a respondent who is not
at home require a callback. Telephone callbacks are less expensive than
personal interview callbacks. Houses with telephone answering machines
are more common nowadays. Although their effect has not been studied
extensively, it is clear that many individuals will not return a call to help
someone conducting a survey. Some researchers argue that leaving a
proper message on an answering machine will produce return calls. The
message left on the machine should explicitly state that the purpose of the
call is not sales-related. Others believe no message should be left on the
machine because respondents can be reached eventually if the researcher
calls back. Many people do not allow their answering machines to record
100 percent of their calls. If enough callbacks are made at different times
many respondents can be reachable through the telephone.
ACTIVITY 6.2
Do you think that the interviewers can get accurate information from
telephone interviews? What if the respondents give biased answers?
How can the interviewers be certain?
SELF-CHECK 6.3
Questionnaires can also be distributed via fax machines. These fax surveys
eliminate the sender's printing and postage costs and are delivered and/or
returned faster than traditional mail surveys. Of course, most households do not
have fax machines. However, when the sample consists of organisations that are
likely to have fax machines, the sample coverage may be adequate.
Questionnaires are usually printed on paper but they can be programmed into
computers and distributed via e-mail or on the Internet. No matter how a self-
administered questionnaire is distributed to the members of the sample, it is
different from interviews because the respondent takes responsibility for reading
and answering the questions.
ACTIVITY 6.3
On the other hand, the respondent will not have the opportunity to
ask the interviewer questions. Problems or misunderstandings will
remain in a mail survey. Unlike in a face-to-face interview, probing cannot
be done to obtain additional information or clarification of an answer.
or surveys of retail buyers who regularly deal with the organisation via e-
mail. The benefits of this method include cheaper distribution and
processing fees, faster turnaround time, more flexibility, and less paper
chasing.
SELF-CHECK 6.4
(e) Internet
The Internet is a rapidly growing source of information. More people are
getting access to the Internet today and are using it as a quick reference.
However, like mass media, information from the Internet can be
questionable. The authenticity and credibility of the Internet source is an
issue. This is mainly because anyone could put up anything on the Internet.
Regular surveys are those undertaken repeatedly over time or at regular intervals
by various organisations. They may be used for comparative purposes,
monitoring purposes or general purposes by public organisations, non-
governmental organisations or private firms. The data may have gone through
detailed analyses and the results of the surveys may be kept in many different
forms. Data collected by certain private firms or organisations may not be
accessible to individual researchers if the information produced from the surveys
is sensitive in nature.
Ad-hoc surveys are usually one-off surveys and undertaken for specific
purposes. Organisations, government and independent researchers may carry
out the surveys on an ad-hoc basis. Because of the nature of ad-hoc surveys,
obtaining the relevant data requires a substantial search. The data from
ad-hoc surveys may be kept in aggregate form, and thus may have to be
reanalysed.
6.5.5 Triangulation
Triangulation entails using multiple sources of data to study the same
phenomenon. The concept is similar to that in physical science, whereby
multiple reference points are used to locate an object's exact position. The
concept has been adopted in research, whereby more than one data collection
method is employed in order to increase confidence in the findings.
Triangulation can be used in either quantitative or qualitative research.
Furthermore, combining quantitative and qualitative methodologies in one
research study is actually a way to triangulate the research findings.
ACTIVITY 6.4
1. What is secondary data?
2. What is the purpose of collecting secondary data?
3. Give three examples of different situations where secondary
data might be used.
Secondary Data

Advantages:
(a) Fewer resource requirements: saves cost and time; less expensive.
(b) Unobtrusive: quickly obtained and of higher quality.
(c) Feasible longitudinal study: compiled and recorded data collected using
comparable methods on regional and international bases.
(d) Comparative and contextual data: collected data can be compared with
secondary data to determine the representativeness of the population.
(e) Unforeseen discoveries: may lead to unexpected new discoveries.
(f) Permanence of data: permanent and available data can be easily checked by
other researchers and is open to public scrutiny.

Disadvantages:
(a) May not meet the purpose of the study: the data collected may differ, or
be inappropriate, irrelevant or outdated for the present study.
(b) Difficult or costly access: data mining for commercial purposes uses a
lot of time and money.
(c) Unsuitable aggregations and definitions: aggregation and inappropriate
definition of data cause difficulties in combining different data sets.
(d) No real control over data quality: data sets are not always of high
quality; the predispositions, culture and ideals of the original collector
influence the nature of the data.
(a) A researcher interested in small farm tractors finds that the secondary data
on the subject is broader, less pertinent in category and encompasses all
agricultural tractors. Moreover, the data was collected five years ago.
(b) An investigator wishing to study those who make more than RM100,000
per year finds the top-end category in a secondary study reported at
RM75,000 or more per year.
(d) The Daily Gold Index reports the stock market indicator series. This
secondary data source reflects the prices of 50 non-randomly selected blue
chip stocks. This data is readily available and inexpensive, thus the source
of information may not suit the needs of individuals concerned with the
typical companies listed on the KLSE.
ACTIVITY 6.5
Secondary data is abundant online. All one needs is a good search engine
and a little imagination. Many libraries have access to many search engines
that charge a fee to use them. Table 6.2 shows a few examples.
Sources and addresses:
Ministry of Agriculture: http://agrolink.moa.my
Department of Statistics, Malaysia: www.statistics.gov.my;
www.census.gov/main/www/stat_int.html
Bank Negara Malaysia: www.bnm.gov.my
Malaysia Industry, Investment, Trade and Productivity (MITI): www.miti.gov.my
Summary of Annual Fisheries Statistics: agrolink.moa.my/dof/statdof.html
Tourism Malaysia: www.tourism.gov.my/statistics/statistics.asp
Department of Civil Aviation Malaysia: www.dca.gov.my/homeng.htm
Malaysian Key Economic Indicators: jpbpo@stats.gov.my, hadi@stats.gov.my
Websites of National Statistical Offices and other national bodies dealing
with statistics (Department of Statistics, Ministry of International Trade):
www.planet-venture.de/seiten/stat.htm
SELF-CHECK 6.5
ACTIVITY 6.6
SELF-CHECK 6.6
Door-to-door personal interviews get high response rates but they are also
more costly to administer than the other forms of surveys.
However, not all households have telephones and not all telephone numbers
are listed in directories; this causes problems in obtaining a representative
sampling frame.
Absence of face-to-face contact and inability to use visual materials are other
limitations of telephone interviewing.
Mail questionnaires must be more structured than other types of surveys and
cannot be changed if problems are discovered in the course of data collection.
Questionnaires are now distributed electronically via e-mail, fax machine and
by sending computer disks by mail.
Surveys are also conducted using the Internet and interactive kiosks.
Secondary data is gathered and recorded prior to (and for purposes other
than) the current needs of the researcher. It is usually historical and already
assembled, and does not require access to respondents or subjects.
Primary data is data gathered for the specific purpose of the current research.
The main advantage of secondary data is that it is almost always less
expensive than primary data.
One of the main sources of secondary data for business research is internal
proprietary sources such as accounting records.
Due to rapid changes in computer technology, external secondary data sources
are now almost as easily accessible as internal data. Hence, the distribution
of multiple types of related data by single-source suppliers has radically
changed the nature of research using secondary data.
INTRODUCTION
According to Christensen, "research design refers to the outline, plan or
strategy specifying the procedure to be used in seeking an answer to the
research question. It specifies such things as how to collect and analyse the
data". The
design of an experiment will show how extraneous variables are controlled. The
design will determine the types of analysis that can be done to answer your
research questions and the conclusions that can be drawn from your research.
The extent to which your design is good or bad will depend on whether you are
able to get the answers to your research questions. If your design is faulty, the
results of the experiment will also be faulty. How do you go about getting a good
research design that will provide answers to the questions asked? It is not easy
and there is no fixed way of telling others how to do it. The best that can be done
is to examine different research designs and to point out their strengths and
weaknesses, and leave it to you to make the decision.
praising the pupils. You will find that their performance in mathematics is
significantly improved.
You conclude that praise increases the pupils' mathematics score. This design is
weak for the following reasons:
(a) Selection Bias: It is possible that the pupils you selected as subjects were
already good in mathematics.
(b) History: The school had organised a motivation course on mathematics for
Year 4 pupils. So, it is possible that it might influence their performance.
ACTIVITY 7.1
The three designs described are "weak" research designs because they do not
allow extraneous factors that might influence the outcome of the experiment
to be controlled within the research construct. For example, if attitude
towards mathematics and additional tuition classes in mathematics are not
controlled, it may not be possible to conclude that "praise" (treatment)
affects mathematics performance (dependent variable). Also, poor research
designs do not attempt to randomly assign subjects to the groups. This
introduces extraneous factors affecting the dependent measure.
Random assignment controls for both known and unknown extraneous
variables that might affect the results of the experiment.
SELF-CHECK 7.1
1. Identify the major differences between the one-shot design,
one-group pre-test post-test design and non-equivalent post-test
only design.
ACTIVITY 7.2
What is the difference between the two designs? The after-only design relies only
on a post-test while the before-after design (as the name suggests) relies on both
a pre-test and a post-test.
students in the control group were not taught using the inductive approach.
Instead, students in this group were taught the same science content using
the traditional didactic ("chalk-and-talk") approach.
In the above example, the experimental and control groups consist of two
different sets of students. This procedure is called a between-subjects design (also
sometimes known as an independent or unrelated design). One advantage of this
design is that the students are less likely to get bored with the study,
because each set of students is exposed to only one condition. In a similar
vein, the research is less susceptible to practice and order effects.
However, you will need
more students to participate in your research. There is also a need to ensure that
both groups of students are homogeneous in any confounding variables that
might affect the outcome of the study. This is because different students bring
different characteristics to the experimental setting. Even though we randomly
assign students to experimental and control conditions, we might allocate
students with one characteristic to one condition by chance, and this might
produce confusing results.
One obvious advantage is that you need fewer students to participate in your
research. Besides, you will have much greater control over confounding variables
between conditions because the same students are used in both conditions. By
and large, the same individual will bring the same characteristics to the
conditions.
However, it is not all rosy in the within-subjects design. First, since the same
students are exposed to different conditions, they might get bored by the time
they are given the experimental treatment in the later condition. Besides, there is
an increased likelihood of practice and order effects.
SELF-CHECK 7.2
1. What is the main strength of „true‰ experiments?
SELF-CHECK 7.3
1. What is the main advantage of using factorial design?
For example, sometimes it is not possible to assign students to groups which are
a requirement of strong experimental research. Due to logistic reasons, it is
challenging to randomly assign subjects to groups and so a whole class may have
to be used in the research. Is it still possible to do an experiment despite these
limitations? The answer is "yes": you can use a quasi-experimental design.
The fact that there is no random assignment means that subjects in the
experimental group and control group may not be equivalent on all variables.
For example,
you could have more poor performing students in the control group compared to
the experimental group. Hence, it may be difficult to establish whether the better
performance of the experimental group is due to the treatment or because there
are more high performing students in the group.
In the non-equivalent control-group design, both groups are given first a pre-test
and then a post-test (after the treatment is given to the experimental group). The
pre-test score and the post-test score are compared to determine if there are
significant differences.
When you cannot assign subjects randomly, you can be sure that extraneous
variables or factors will influence the experiment and threaten its internal
validity. Do you leave it alone, or do you take action against these threats?
Cook and Campbell proposed the following steps to enhance the internal validity
of the non-equivalent control-group design or quasi-experiments in general:
(a) Selection: Ensure that subjects in the experimental and control groups are
matched in terms of important variables that may affect the results of the
experiment. For example, match subjects in terms of academic ability, IQ,
attitudes, interests, gender, socioeconomic background and so forth.
(b) Testing: Ensure that the time period between the pre-test and post-test
is not so short that subjects are able to remember the questions given to
them earlier.
(c) History: Ensure that events outside the experiment do not affect the
experiment. The problem is most serious when only subjects from one of
the groups are exposed to such events (e.g. motivation talks, private
tuition).
(d) Instrumentation: Ensure that the pre-test and the post-test are similar. If a
different test is used, you should make sure that the two tests are
equivalent in terms of what they measure (i.e. high reliability and
validity).
A hypothetical example may illustrate how the interrupted time series design is
used. Say that you want to determine whether positive reinforcement encourages
slow learners to be more attentive. Identify a group of 11-year-olds who are slow
learners and persuade them to attend an experimental classroom for at least one
period each school day as in Figure 7.8.
This assessment reveals that the percentage of students who were attentive and
focused remained rather constant during the first three baseline class sessions, or
the class sessions prior to the implementation of the positive classroom
environment. After the implementation of the positive classroom environment,
the percentage of attentive behaviour increased gradually over the next three
class sessions, suggesting that the positive approach had a beneficial effect on
the behaviour of inattentive students.
SELF-CHECK 7.4
1. What is the meaning of non-equivalent in the non-equivalent control
group design?
2. How can you enhance the internal validity of quasi-experimental
research designs?
3. When would you use the interrupted time-series design?
Any researcher conducting an experiment must ensure that the dignity and
welfare of the subjects are maintained. The American Psychological Association
published the Ethical Principles in the Conduct of Research with Human
Participants in 1982. The document listed the following principles:
(a) In planning a study, the researcher must take responsibility to ensure that
the study respects human values and protect the rights of human subjects.
(b) The researcher should determine the degree of risk imposed on subjects by
the study (e.g. stress on subjects, subjects required to take drugs).
(c) The principal researcher is responsible for the ethical conduct of the study
and for the conduct of assistants or other researchers involved.
(d) The researcher should make it clear to the subjects before they participate in
the study regarding their obligations and responsibilities. The researcher
should inform subjects of all aspects of the research that might influence
their decision to participate.
(e) If the researcher cannot disclose everything about the experiment because it
is too technical or because disclosure would affect the study, then the
researcher must debrief the subjects after the experiment.
(f) The researcher should respect the individual's freedom to withdraw from
the experiment at any time, or refuse to participate in the study.
(g) The researcher should protect subjects from physical and mental
discomfort, harm and danger that may arise from the experiment. If there
are risks involved, the researcher must inform the subjects of that fact.
(h) Information obtained from the subjects in the experiment is confidential
unless otherwise agreed upon. Data should be reported as group
performance and not individual performance.
ACTIVITY 7.3
1. What are some ethical principles proposed by the American
Psychological Association with regard to doing experiments
involving human subjects?
1. Make a case for the superiority of true experimental designs.
2. What are the quasi-experimental research designs and how do they
differ from true experiments?
3. Discuss the circumstances in which researchers have to use intact
groups.
4. What can a researcher do to increase the equivalence of subjects in the
control and experimental groups in a quasi-experiment design?
Weak research designs do not allow for the control of extraneous factors that
might influence the experiment.
True experimental designs enable the researcher to maintain control over the
situation in terms of assignment of subjects to groups.
Examples of true designs are after-only research design, factorial design and
before-after research design.
INTRODUCTION
The term "qualitative research" is a general term that covers many different
methods used to understand and explain social phenomena with minimal
interference in the natural environment. Qualitative research begins by
accepting that there are many different ways of understanding and making sense
of the world. You are not attempting to predict what may happen in the future.
You want to understand the people in that setting (e.g. What are their lives like?
What is going on for them? What beliefs do they hold about the world?) In short,
qualitative research relates to the social aspects of our world and seeks to find
out answers for the following questions:
Why do people behave the way they do?
How are opinions and attitudes formed?
How are people affected by the events occurring in their surroundings?
How and why have cultures developed in the way they have?
What are the differences between social groups?
A research work deploying the case study method may have single or multiple
cases. Conclusions can be drawn from the similarities or differences among the
cases involved in the research. Figure 8.4 shows the sequence of a case study
(Yin, 1994) in a research work.
Case studies can follow a single or multiple case design. A single case design is
ideal for studying extreme cases in order to confirm or challenge a theory. It is
also used for cases to which researchers previously had no access. However, it
is important for a researcher to be careful in interpreting what is being observed.
A multiple case design is appropriate when a researcher wants to use more than
one case to gather data and draw a conclusion based on the facts. A multiple
case design allows evidence to be corroborated across cases, which enhances the
reliability and validity of the research.
8.2.3 Ethnography
Ethnography is a qualitative research method which involves description of
people and nature of phenomena. Atkinson and Hammersley (1994) suggested
that ethnography involves exploring the nature of phenomena, working with
unstructured data and analysing data through interpretation of the meanings
attributed by research respondents. This method involves primary observations
conducted by a researcher during a stipulated period.
Although it originates from social research, the method is now widely used in
other fields as well.
They also defined a "category" as something that emerges from the data and
may stand by itself as a conceptual element. The term "grounded" refers to the
idea that the theory emerging from a study is derived from, and "grounded" in,
data collected in the field rather than taken from the research literature.
Content analysis requires thorough planning from the very beginning: the
research problem or research questions must be specified at the outset.
SELF-CHECK 8.1
(a) Organise data into several forms (i.e. database, sentences or individual words);
(b) Peruse the data sets several times to gain a complete picture or overview of
what it contains as a whole. During the process, a researcher should jot
down short notes or summaries of the key points that suggest possible
categories or interpretations;
(c) Identify general categories or themes and classify them accordingly. This
will help a researcher to see a pattern or meaning of the data obtained; and
(d) Finally, integrate and summarise the data for the audience. This step also
may include hypotheses that state the relationships among those categories
defined by the researcher. The data summary could be represented by
table, figure or matrix diagram.
The stages in the analysis of qualitative data are shown in Figure 8.5. It usually
begins with familiarisation of the data, transcription, organisation, coding,
analysis (grounded theory or framework analysis) and reporting (though the
order may vary).
(a) Familiarisation
The first step of data analysis is familiarisation in which you listen to tapes
and watch video material, read and re-read field notes, and make memos
and summaries before formal analysis begins. This is especially important
when besides you, others are also involved in data collection. You have to
get familiar with the field notes they made (perhaps try to decipher their
handwriting!).
(b) Transcription
Almost all qualitative research studies involve some degree of transcription.
What is transcription? Transcription is the process of converting audio or
video-recorded data obtained from interviews and focus groups as well as
handwritten field notes into verbatim form (i.e. written or printed) for easy
reading. Why do you have to do this? If you were to analyse directly from an
audio or video recording, there is the likelihood that you may include those
sections that seem relevant or interesting to you and ignore others. With a
transcript of everything that you observed and recorded (audio, video or field
notes), you get the whole picture of what happened and the chances of your
analysis being biased are minimised.
(c) Organisation
After transcription, it is necessary to organise your data into sections that
are easy to retrieve. What does this mean? Say, for example, in your study
you interviewed 10 teachers (30 minutes each) on their opinion about the
leadership style of their principal. It is advisable that you give each teacher
a pseudonym (e.g. Elvis, Jagger, Dina, not their real names) or refer to each
by a code number (e.g. T1, T2, …, T10). You need to keep a file that links the
pseudonym or code number to the original informants which are to be kept
confidential and destroyed after completion of the research. Names and
other identifiable material should be removed from the transcripts.
The narrative data you obtained from the 10 teachers need to be numbered
depending on your unit of analysis. In other words, you have to determine
whether you intend to analyse at the word level, sentence level or
paragraph level and they have to be numbered accordingly. Make sure that
the unit of text you use can be traced back to its original context.
(d) Coding
Coding is the process of examining the raw qualitative data in the
transcripts and extracting sections of text units (words, phrases, sentences
or paragraphs) and assigning different codes. This is done by marking
sections of the transcript and giving a numerical reference, symbol,
descriptive words or category words. Most of the text (or transcript) will be
marked and given different codes which will be later refined or combined
to form themes or categories.
Grounded theory evolved from the work of sociologists Glaser and Strauss
(1967). It is an inductive method of qualitative research in which theory is
systematically generated from data. However, many studies in education,
business, management and in
the health field (especially in nursing), have adopted grounded theory as a
procedure for conceptualising and analysing data without taking on the whole
methodology. The appeal of grounded theory analysis is that it allows the
theory to "emerge" from the data through a process of rigorous analysis (see
Figure 9.1). The word "theory" is used to refer to the relationships that exist
among concepts generated from the data and to help us understand our social
world better (Strauss & Corbin, 1998).
The main feature of the grounded theory procedure is the use of the
constant comparison technique. Using this technique, categories or concepts
emerging from one stage of analysis are compared with those that emerged
from the previous stage. The researcher continues with this technique until a
situation called "theoretical saturation" is reached, that is, a point at which
no new significant categories or concepts emerge. The grounded theory
procedure is cyclical, involving frequent revisiting of the data as new
categories or concepts emerge during analysis. The theory being developed is
best seen as provisional until validated by further data.
This approach to data analysis allows the researcher to set the categories and
themes from the beginning of the study. However, it also allows for categories
and themes to emerge during the data analysis process that the researcher had
not stated at the beginning of the study.
Using the headings, you can create charts of your data so that you can
easily read across the whole data set. Charts can be either thematic for each
theme or category across all respondents (cases) or by case for each
respondent across all themes:
In the chart boxes, you could put line and page references to relevant
passages in the interview transcript. You might also want to include some
text e.g. key words or quotations as a reminder of what is being referred to
(see (i) and (ii)). For example, under the theme Psychological Causes, Case 2
talks about "stress in the workplace" while Case 3 talks about "business
failure".
Next, let us look at the data analysis spiral, as illustrated by Creswell (1998)
in Figure 8.6.
ACTIVITY 8.1
Qualitative research also involves fieldwork, where a researcher must participate
in the setting, especially for observation and interviews with respondents on the
research topic. Table 8.1 lists the differences between qualitative and
quantitative research.
                 Qualitative                        Quantitative
Focus            Quality (features)                 Quantity (how much, numbers)
Philosophy       Phenomenology                      Positivism
Method           Ethnography/Observation            Experiments/Correlation
Goal             Understanding, meaning             Prediction, hypothesis testing
Design           Flexible, emerging                 Structured, predetermined
Sample           Small, purposeful                  Large, random, representative
Data collection  Interviews, observation,           Questionnaires, scales, tests,
                 documents and artefacts            inventories
Analysis         Inductive (by the researcher)      Deductive (by statistical methods)
Findings         Comprehensive, detailed,           Precise, numerical
                 holistic description
Researcher       Immersed                           Detached
Qualitative evidence takes the form of the respondents' own communication,
extracts from research documents, and multimedia resources such as audio and
video recordings. These also support the findings of a study.
SELF-CHECK 8.2
Collective administration and mailed questionnaires are the two most widely
used techniques for distributing questionnaires to respondents.
INTRODUCTION
The goal of most research is to provide information. There is a difference
between raw data and information. Information refers to a body of facts that is in
a format suitable for decision-making, whereas data is simply recorded measures
of certain phenomena. Raw data collected in the field must be transformed into
information that will provide answers to the manager's questions. The
conversion of raw data into information requires that the data be edited and
coded so that it can be transferred to a computer or other storage medium. This
topic introduces the processes of data analysis. These comprise several
interrelated procedures that are performed to summarise and rearrange the data.
Researchers edit and code data to provide input that results in tabulated
information that will answer the research questions. With this input, researchers
could logically and statistically describe research findings.
ACTIVITY 9.1
What is raw data? How is it different from primary and secondary data?
(c) Assign to the item the mean value of the responses of all those who have
responded to that particular item.
(d) Give the item the mean of the responses of this particular respondent to all
other questions measuring this variable.
(e) Give the missing response a random number within the range for that scale.
(f) Assign the midpoint of the scale as the response to a missing item, for
interval-scaled items that have a midpoint.
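Option (c), replacing a missing item response with the mean of all respondents who did answer that item, can be sketched in a few lines of Python. The scores and the helper function below are illustrative assumptions, not part of the module.

```python
# Sketch of option (c): replace a missing (None) response with the mean
# of the respondents who did answer that item. Data are hypothetical.

def impute_item_mean(responses):
    """Replace None entries with the mean of the observed responses."""
    observed = [r for r in responses if r is not None]
    item_mean = sum(observed) / len(observed)
    return [item_mean if r is None else r for r in responses]

# Five respondents answered a 5-point Likert item; one response is missing.
item_scores = [4, 5, None, 3, 4]
completed = impute_item_mean(item_scores)  # missing value becomes 4.0
```

The same pattern adapts to option (d) by averaging across a respondent's other items instead of across respondents.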
9.2 CODING
If scanner sheets for collecting questionnaire data are used, such sheets facilitate
the entry of the responses directly into the computer without manual keying in of
the data. However, if this cannot be done, then it is perhaps better to use a coding
sheet first to transcribe the data from the questionnaire and then key in the data.
This method, in contrast to flipping through each questionnaire for each item,
avoids confusion especially when there are many questions and a large number
of questionnaires involved.
Responses could be coded either before or after the data is collected. If at all
possible, it is best to code them ahead of time. Coding means assigning a number
to a particular response so the answer can be entered into a database. For
example, if a five-point Agree-Disagree scale is used, then it must be decided if
Strongly Agree will be coded with a 5 or a 1. Most researchers will assign the
largest number to Strongly Agree and the smallest to Strongly Disagree; for
example; 5 = Strongly Agree and 1 = Strongly Disagree, with the points in
between being assigned 2, 3 or 4. A special situation arises when the researcher
has a two-category variable like gender. Some researchers use a coding approach
that assigns 1 = male and 2 = female. It is recommended that in such instances a
coding approach be used that assigns 1 to one of the categories and 0 to the other
category. This enables greater flexibility in data analysis and is referred to as
using dummy variable coding.
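The recommended 1/0 dummy coding for a two-category variable can be sketched as follows; the category labels and the helper function are hypothetical illustrations.

```python
# Sketch of 0/1 dummy variable coding for a two-category variable.
# Which category becomes the 0 (reference) category is the analyst's choice.

def dummy_code(values, reference_category):
    """Code the reference category as 0 and the other category as 1."""
    return [0 if v == reference_category else 1 for v in values]

gender = ["male", "female", "female", "male"]
gender_dummy = dummy_code(gender, reference_category="male")  # female = 1
```

With 0/1 coding, the mean of the dummy variable directly gives the proportion of the category coded 1, which is one reason it offers more flexibility than 1/2 coding.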
Human errors can occur while completing the questionnaire, coding it or during
keying in data. Therefore, at least 10 percent of the coded questionnaires, as well
as the actual database, are checked for possible coding or data entry errors.
Questionnaires to be checked are selected by a systematic, random sampling
process.
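Selecting roughly 10 percent of questionnaires for checking by systematic random sampling can be sketched like this; the questionnaire count and seed are hypothetical.

```python
# Sketch of systematic random sampling: pick a random starting point,
# then take every k-th questionnaire. IDs and seed are hypothetical.
import random

def systematic_sample(ids, fraction=0.10, seed=42):
    k = int(1 / fraction)                  # sampling interval, e.g. every 10th
    start = random.Random(seed).randrange(k)
    return ids[start::k]

questionnaire_ids = list(range(1, 201))    # 200 returned questionnaires
to_check = systematic_sample(questionnaire_ids)  # 20 IDs, evenly spaced
```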
Selective coding system is used as the basis for theory development. In this
coding system, categories are rearranged and reorganised in order to relate them
to a core concept. This core concept will form a framework or model to explain
the phenomenon being studied. The framework or model built upon categories
and subcategories is an important milestone for theory development because it
facilitates the process of further data collection to test the framework or model.
ACTIVITY 9.2
remains a mainstay for researchers who need to create a data file immediately
and store it in minimal space on a variety of media.
Voice recognition and response systems, while still far from mature, are
providing some interesting alternatives for the telephone interviewer. Such
systems can be used with software programmed to call specific three-digit
prefixes and generate four-digit numbers randomly, reaching a sample within a
set geographical area. Upon getting a voice response, the computer branches into
a questionnaire routine. Currently, the systems are programmed to record the
verbal answers but voice recognition systems are improving rapidly and soon
this system will be able to translate voice responses into data files.
Even with these time reductions between data collection and analysis, continuing
innovations in multimedia technology are being developed by the personal
computer business. The capability to integrate visual images, audio and data may
soon replace video equipment as the preferred method for recording an
ACTIVITY 9.3
People nowadays are attracted to SMS service provider advertisements,
be it for mobile ring tone services or contests. Even local TV stations use
this service to get information on certain survey questions. Why has this
phenomenon become so widely accepted by the public even when the
charges are expensive?
Data transformation can reduce bias when the ages of respondents are being
studied: instead of being asked their age directly, respondents are asked the
year they were born, and the researcher simply transforms the birth year to
obtain each respondent's age. Data transformation is also required
when the researcher wants to create a new variable by respecifying the data
according to logical transformation. In many cases, the Likert scales are
combined into a summated rating. Usually, the transformed variable involves
combining the scores (raw data) for several attitudinal statements into a single
summated score.
The researcher could also calculate an average summated score that involves
dividing the total summated score by the number of variables. For example, if
three 5-point statements are used, the summated score might be 4 + 4 + 5 = 13.
The average summated score is then 13/3 ≈ 4.3.
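The summated and average summated scores from the worked example above can be computed directly:

```python
# Sketch of combining three 5-point Likert items into a summated score
# and an average summated score, matching the worked example above.

def summated(scores):
    return sum(scores)

def average_summated(scores):
    return sum(scores) / len(scores)

item_scores = [4, 4, 5]
total = summated(item_scores)              # 4 + 4 + 5 = 13
average = average_summated(item_scores)    # 13 / 3, approximately 4.3
```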
SELF-CHECK 9.1
Suppose the researcher notices that the item in the data set does not have a good
spread (range) and shows little variability. The researcher can deduce that the
question may not be understood by the respondents due to improper wording or
the respondents may not fully understand the intent of the question. If the
respondents have given similar answers to all items, the researcher may want to
check for biases (e.g. if the respondents have stuck at only certain points on the
scale). The objective of descriptive analysis is to portray an accurate profile of
persons, events or situations. The analysis could be an extension of, or a
precursor to, a piece of exploratory research. Table 9.1 summarises data
presentation by data type.
To compare the frequency of occurrences of categories/values for two or more
variables so that the highest and lowest are clear: Multiple Bar Chart
(continuous data must be grouped; other data may need grouping).

To compare the trends for two or more variables so that conjunctions are
clear: Multiple Line Graph or Multiple Bar Chart.

To compare the proportions of occurrences of categories/values for two or
more variables: Comparative Pie Charts or Percentage Component Bar Chart
(continuous data must be grouped; other data may need grouping).

To compare the distribution of values for two or more variables: Multiple
Box Plot.

To compare the frequency of occurrences of categories/values for two or more
variables so that totals are clear: Stacked Bar Chart (continuous data must be
grouped; other data may need grouping).

To compare the proportions and totals of occurrences of categories/values for
two or more variables: Comparative Proportional Pie Charts (continuous data
must be grouped; other data may need grouping).

Source: Adapted from Saunders, M., Lewis, P., & Thornhill, A. (2003).
ACTIVITY 9.4
What are the factors that determine the choice of data analysis?
Compare your answers with those of your classmate.
(b) Mean
Mean is the arithmetic average. It is the sum of the observed values in the
distribution divided by the number of observations. It is the location
measure most frequently used for interval-ratio data but can be misleading
when the distribution contains extreme scores, large or small.
Formula:
x̄ = Σx / N
(c) Median
Median is the midpoint of the distribution. Half of the observations in the
distribution fall above and the other half fall below the median. When the
distribution has an even number of observations, the median is the average
of the two middle scores. The median is the most appropriate locator of
centre for ordinal data and has resistance to extreme scores, thereby making
it a preferred measure for interval ratio data particularly those with
asymmetric distributions.
(d) Mode
Mode is the most frequently occurring value. When more than one score
shares the highest frequency, the distribution is bimodal or multimodal.
When every score has an equal number of observations, there is no mode.
The mode is the location measure for nominal data and a point of reference,
along with the median and mean, for examining spread and shape.
(f) Variance
Variance is the average of the squared deviation scores from the
distribution's mean. It is a measure of score dispersion about the mean. If
all the scores are identical, the variance is 0. The greater the dispersion of
scores, the greater the variance. Both the variance and the standard
deviation are used with interval-ratio data. The symbol for the sample
variance is s2, and for the population variance it is the Greek letter sigma
squared (σ2).
Formula:
Sample variance: s2 = Σ(Xi − x̄)2 / (n − 1)
Population variance: σ2 = Σ(Xi − μ)2 / N
(g) Standard Deviation
Standard deviation is the square root of the variance.
Formula:
s = √Variance
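A quick numerical check of the variance and standard deviation formulas above, using Python's `statistics` module; the scores are hypothetical.

```python
# Numerical check: sample variance divides by n - 1, population variance
# divides by N, and the standard deviation is the square root of the variance.
import statistics

scores = [10, 12, 14, 16, 18]              # hypothetical data, mean = 14
sample_var = statistics.variance(scores)       # Σ(Xi − x̄)² / (n − 1) = 10
population_var = statistics.pvariance(scores)  # Σ(Xi − μ)² / N = 8
std_dev = statistics.stdev(scores)             # √(sample variance)
```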
(h) Range
Range is the difference between the largest and smallest score in the
distribution. Unlike the standard deviation, it is computed from only the
minimum and maximum scores. Thus, it is a very rough measure of spread.
With the range as a point of comparison, it is possible to get an idea of the
homogeneity (small std. dev.) or heterogeneity (large std. dev.) of the
distribution. For homogeneous distribution, the ratio of the range to the
standard deviation should be between 2 and 6. A number above 6 would
indicate a high degree of heterogeneity. The range provides useful but
limited information for all data. It is mandatory for ordinal data.
Studies with ranked data use this measure in conjunction with the median. It is
also used with interval-ratio data when the distribution is asymmetrical or for
exploratory analysis. Recall the following relationships: the minimum value
of the distribution is the 0th percentile and the maximum is the 100th
percentile.
The first quartile (Q1) is the 25th percentile; it is also known as the lower
hinge when used with box plots. The median, or (Q2), is the 50th percentile.
The third quartile (Q3) is the 75th percentile; it is also known as the upper
hinge. The IQR is the distance between the hinges.
Formula:
Q = (Q3 − Q1) / 2
The semi-interquartile range is always used with the median for ordinal
data. It is helpful for interval-ratio data of a skewed nature. In a normal
distribution, a quartile deviation (Q) on either side encompasses 50 percent
of the observations. Eight Qs cover approximately the range. Q's
relationship with the standard deviation is constant (Q = 0.6745s) when
scores are normally distributed.
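The quartiles and the semi-interquartile range Q = (Q3 − Q1)/2 can be computed with the `statistics` module; the scores below, and the choice of the "inclusive" quantile method, are illustrative assumptions.

```python
# Sketch of the quartile deviation (semi-interquartile range) using
# statistics.quantiles with the inclusive method; data are hypothetical.
import statistics

scores = [1, 2, 3, 4, 5, 6, 7, 8, 9]
q1, q2, q3 = statistics.quantiles(scores, n=4, method="inclusive")
quartile_deviation = (q3 - q1) / 2    # (7 - 3) / 2 = 2
```

Note that different quantile conventions (inclusive vs exclusive) can give slightly different hinges on small samples.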
One Sample
The mean brand preference score of male teachers aged 35 to 40 is 85.
Two Samples
There is no difference in the mean brand preference score between male and
female teachers aged 35 to 40.
The alternate hypothesis (or sometimes called the research hypothesis) is the
hypothesis that contradicts the null. It is commonly written as Ha. The alternative
hypothesis can indicate the direction of the differences or relationship, or assume
a neutral position. If direction is emphasised (indicated in the alternative
hypothesis), we call it a one-tailed test. Otherwise, the test will be a two-tailed
test. The following are examples of alternate hypotheses.
One Sample
The mean brand preference score of male teachers aged 35 to 40 is not equal
to 85.
Two Samples
There is a difference in the mean brand preference score between male and
female teachers aged 35 to 40.
In the latter case, the researcher would not be able to detect any significant
differences between the groups. It is important to understand that while the null
hypothesis may not be rejected, it is not necessarily accepted as true. The null
hypothesis typically is developed so that its rejection leads to an acceptance of
the desired situation. The alternative hypothesis represents what we think may
be correct.
SELF-CHECK 9.2
Extremely low levels of Type I error ( ) will result in a high level of Type II error
( ), thus it is necessary to reach an acceptable compromise between the two types
of error. Sample size can help control Type I and Type II Errors. Generally, the
researcher will select the sample size in order to increase the power of the test
and to minimise Type I error and Type II error.
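The meaning of the Type I error rate can be illustrated by simulation: when the null hypothesis is actually true, a test run at alpha = 0.05 should falsely reject about 5 percent of the time. The population parameters and trial counts below are illustrative assumptions.

```python
# Simulation sketch: z-tests on samples drawn from a population where H0
# is true should reject at roughly the alpha rate. Parameters hypothetical.
import random
import statistics

rng = random.Random(0)
critical_z = 1.96                 # two-tailed critical z at alpha = 0.05
trials, n, mu, sigma = 2000, 30, 50, 10
rejections = 0
for _ in range(trials):
    sample = [rng.gauss(mu, sigma) for _ in range(n)]
    z = (statistics.mean(sample) - mu) / (sigma / n ** 0.5)
    if abs(z) > critical_z:
        rejections += 1           # a Type I error: H0 is true here

type_i_rate = rejections / trials  # should be close to 0.05
```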
The trade-off between Type I and Type II errors has a practical dimension,
defined by the costs incurred for each error. Often a change in the status quo
carries a great cost (the risk of gambling the future of the firm on a new
technology, a new investment in equipment, etc.). Unless the change is clearly
beneficial, the risk associated with alpha should be kept very low. However, if it
is essential to detect changes from a hypothesised mean, the risk of a beta error
becomes more important; in that case, a higher, less critical alpha level would
be chosen.
(b) Make a judgment about the sampling distribution of the population and
then select the appropriate statistical test based upon whether you believe
the data is parametric or non-parametric.
(c) Decide upon the desired level of significance (p = .05, .01, or something
else).
(d) Collect data from a sample and compute the statistical test to see if the level
of significance is met.
(e) Accept or reject the null hypothesis. Determine whether the deviation of the
sample value from the expected value would have occurred by chance
alone (five times out of one hundred).
(a) Mean
Mean is also known as average. A mean is the sum of all scores divided by
the number of scores. The mean is generally used to measure the central
tendency or centre of a score distribution. For example, the mean of the
integers 3, 4, 5, 7 and 6 is 5.
As shown in Figure 9.3, when the scores are bunched together around the
mean, the standard deviation is small and the bell curve is steep. When the
scores are spread away from the mean, the standard deviation is large and
the bell curve is relatively flat.
To explore better what standard deviation means, refer to Figure 9.4. The mean is
20 and the standard deviation (SD) is 5. Figure 9.4 represents the score obtained
on grid test for two organisation terminals using cluster computing with the
same mean of 20.
(a) One standard deviation (SD = 5) from the mean in either direction on the
horizontal axis accounts for around 68% of the terminals in this group. In
other words, 68% of the terminals obtained optimal times between 15 and 25.
(b) Two standard deviations (5 + 5 = 10) away from the mean account for
roughly 95% of the terminals. In other words, 95% of the terminals
obtained optimal times between 10 and 30.
(c) Three standard deviations (5 + 5 + 5 = 15) away from the mean account for
roughly 99% of the terminals. In other words, 99% of the terminals
obtained optimal times between 5 and 35.
H0: There is no significant difference between the experimental group and the
control group in terms of enhancing network appliances detection.
To solve this, you may use the statistical approach called t-test to obtain the
t-value for independent means. In this case, independent means that the two
groups consist of different subjects. The t-test gives the probability that the
difference between the two means is caused by chance. For testing the
significance, you will need to set a risk level called the alpha level. In social
science research, the alpha level is commonly set at 0.05. This means that a
result significant at the .05 level could occur by chance only 5 times in
100 trials.
(a) Table 9.2 displays the obtained t-value of 2.65. If you are using statistical
software like SPSS or SAS, the probability value is given (i.e. p < 0.02). We
could also refer to the table of critical values to find out whether the t-value
is large enough to say that the difference between the groups is not likely to
have been a chance finding.
(b) We can also determine the degrees of freedom (df) for the test, which is the
sum of the terminals in both groups minus 2 (i.e. n1 + n2 − 2). Given the
alpha level, the df and the t-value, we can look up the critical t-value in
the table of critical values.
(c) Refer to Table 9.3. The obtained t-value (2.65) is bigger than the critical
value (2.1009) for 18 degrees of freedom (20 − 2 = 18). From this, we can
conclude that the difference between the means for the two organisations is
significant at the 0.05 level of significance.
df        p = 0.05    p = 0.01
17        2.1098      2.8982
18        2.1009      2.8784
19        2.0930      2.8609
(d) Please note the difference is NOT SIGNIFICANT at the 0.01 level of
significance because the t-value (2.65) is smaller than the critical value
(2.8784) for 18 degrees of freedom.
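The independent-samples t computation walked through above can be sketched in pure Python using the pooled-variance formula. The two groups of scores are hypothetical, not the data behind Table 9.2; only the critical value (2.1009 for df = 18) comes from Table 9.3.

```python
# Sketch of an independent-samples t-test (pooled variance, equal group
# sizes); the experimental and control scores are hypothetical.
import statistics

def independent_t(group_a, group_b):
    na, nb = len(group_a), len(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    se = (pooled * (1 / na + 1 / nb)) ** 0.5
    return (statistics.mean(group_a) - statistics.mean(group_b)) / se

experimental = [23, 25, 28, 30, 26, 27, 24, 29, 31, 22]
control      = [20, 22, 25, 21, 24, 19, 23, 26, 20, 22]
t_value = independent_t(experimental, control)
df = len(experimental) + len(control) - 2   # 10 + 10 - 2 = 18
critical_05 = 2.1009                        # from Table 9.3, df = 18
significant = abs(t_value) > critical_05
```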
H0: There is no significant difference between the pre-test mean and the post-test
mean in terms of network appliances detection enhancement.
Table 9.4: Means and Standard Deviation Obtained for the Pre-test and Post-test Scores
(a) By using the t-test for dependent groups, we can obtain the value of 1.94. In
this case, dependent means that the two means are obtained from the same
groups.
(b) From Table 9.5, we can see that for 9 degrees of freedom, the critical
value is 2.2622, which is larger than the t-value of 1.94. We can therefore
conclude that the means are NOT significantly different at the 0.05 level of
significance.
Table 9.5: Critical Values of t

df      p = 0.05    p = 0.01
8       2.3060      3.3554
9       2.2622      3.2498
10      2.2281      3.1693
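The dependent-groups test can be sketched the same way. The pre-test and post-test scores below are hypothetical (not the actual data behind Table 9.4); the point is that each pair of scores comes from the same subject, so `ttest_rel` is used instead of `ttest_ind`, and df = n - 1 = 9 pairs.

```python
# Paired (dependent-samples) t-test: pre-test vs post-test scores
# from the SAME ten subjects (hypothetical values).
from scipy import stats

pre  = [55, 60, 52, 58, 61, 57, 54, 59, 56, 53]
post = [57, 59, 55, 58, 59, 61, 55, 58, 58, 53]

t_stat, p_value = stats.ttest_rel(pre, post)

# With n = 10 pairs, df = n - 1 = 9; two-tailed critical value at 0.05.
critical = stats.t.ppf(1 - 0.05 / 2, 9)
print(round(critical, 4))        # 2.2622, as in Table 9.5
print(abs(t_stat) < critical)    # True means: fail to reject H0
```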
Table 9.6: Means and Standard Deviations for Three Types of Buffering Methods
From Table 9.8, we can conclude that there is no significant difference in
performance between Buffer Method 1 and Buffer Method 2. Buffer Method 3
performed significantly better at the 0.05 level of significance, and also
outperformed Buffer Method 1 at the 0.01 level of significance.
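An ANOVA of this kind can be run in a few lines. The figures below are hypothetical measurements for three buffering methods (not the actual data behind Tables 9.6 to 9.8); `f_oneway` performs a one-way ANOVA, and a small p-value indicates that at least one group mean differs, after which pairwise post-hoc comparisons show which methods differ.

```python
# One-way ANOVA comparing three hypothetical buffering methods.
from scipy import stats

method1 = [12.1, 11.8, 12.4, 12.0, 11.9]
method2 = [12.3, 12.0, 11.7, 12.2, 12.1]
method3 = [13.5, 13.8, 13.2, 13.6, 13.9]

f_stat, p_value = stats.f_oneway(method1, method2, method3)

# A p-value below 0.05 says at least one mean differs; pairwise
# post-hoc tests are then needed to identify WHICH methods differ.
print(p_value < 0.05)            # True for these illustrative data
```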
SELF-CHECK 9.3
SELF-CHECK 9.4
1. When would you use a longitudinal survey rather than a cross-
sectional survey? Discuss.
2. Discuss a situation of using Analysis of Variance (ANOVA) in a
research work (use an example with a research topic of your own).
3. Discuss some ethical issues you think could arise during survey
research.
After data is collected and before it is analysed, the researcher must examine
it to ensure its validity. Blank responses, referred to as missing data, must be
dealt with accordingly.
If the questions were pre-coded, the responses can be entered directly into a
database. If they were not pre-coded, a coding system must be developed so
that they can be included in the database.
The typical tasks involved are editing, dealing with missing data, coding,
transformation and entering data.
There are two possible types of error in statistical tests: Type I, rejecting a true
null hypothesis, and Type II, accepting a false null hypothesis.
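The meaning of a Type I error can be checked by simulation. In this sketch, both samples are always drawn from the same population, so the null hypothesis is true by construction; a test at alpha = 0.05 should nonetheless reject it in roughly 5 per cent of trials, and each such rejection is a Type I error. The population parameters and trial count are arbitrary choices for illustration.

```python
# Simulating the Type I error rate when H0 is TRUE.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
trials = 2000
rejections = 0
for _ in range(trials):
    a = rng.normal(50, 10, 30)
    b = rng.normal(50, 10, 30)   # same population, so H0 is true
    if stats.ttest_ind(a, b).pvalue < 0.05:
        rejections += 1          # wrongly rejecting H0: a Type I error

rate = rejections / trials       # should be close to alpha = 0.05
```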
There are two types of survey: cross-sectional surveys and longitudinal surveys.
Questionnaires are widely used because they are cost effective and easy to
administer.
The t-test is used to test for significant differences between means for
independent and dependent groups.
ANOVA is used when comparing the means of more than two groups.
ANOVA
Comparing Differences
Correlation
Descriptive Statistics
Inferential Statistics
Mean
Median
Mode
Normal Distribution
Standard Deviation
T-test
Variance
INTRODUCTION
Before embarking on a research project, students must have an overall research
plan that indicates the research problem to be studied, the research objectives,
the significance of the research, the strategies for obtaining answers to the
research problems, and the research project implementation schedule. This
overall research plan is called a research proposal. It is very important that
students write a good research proposal because without one, a student may not
get a research supervisor or even approval to carry out the research. The
student must also know about ethics in research, because ethics governs
researchers' behaviour in terms of what should and should not be done in the
research project.
(a) Introduction
(i) In this section, you should provide an overview of the issues that you
intend to study. The content of the introduction should be brief, precise
and straight to the point. After giving an overview of the scope of the
research, you will need to narrow it down to the specific area of your
concern.
(ii) Try to elaborate on the main theme that needs to be addressed.
Identify the gap that exists and the research question will surface
naturally.
(iii) Define the problem statement in your proposal. This gives the reader a
clear picture of the issue you are going to address. However, the problem
statement in the proposal may only be tentative at the point of proposal
preparation, since the research has not yet been carried out.
(iv) Use simple sentences and make sure you narrow down the research
issue to focus on a very specific field.
(ii) You can also provide justification if the methodology you are going to
use has some degree of novelty and your research would contribute
to new knowledge.
(iii) Some other criteria you can mention are the variables that you are
going to use and the expected outcomes of your research and its
influence on the model or design.
(f) Methodology
(i) Generally, it is not necessary to describe the methodology that you are
going to use in detail. However, you should justify why it was chosen
over other similar methodologies.
(ii) You may explain why you are using certain theories or models, and
whether your research approach will be qualitative, quantitative, or a
combination of both.
(iii) Describe variables, sampling techniques and data collection methods
as well.
(iv) Explain the type of data involved and what type of analysis and
testing will be performed.
(h) References
(i) The list of references must follow an accepted academic guideline and be
presented in a scholarly fashion. References convince the reader that
your proposal is comprehensive and demonstrate your understanding of
the particular field of interest.
(ii) Use the APA citation style (Publication Manual of the American
Psychological Association) or another bibliographical standard accepted
in ICT.
SELF-CHECK 10.1
(a) Familiarise yourself with word processing software (e.g. Microsoft Word,
OpenOffice) and learn its features, for instance inserting tables, graphs,
footnotes and other specially formatted elements. This will help you in
preparing the research report at the end of the process.
(c) While writing the first proposal, present your ideas by narrowing them
down sequentially, and focus on presenting the information in an interesting
manner. Remember that your proposal should be expressed clearly and
state an overview of your research intention.
(d) Write about and explain your research problem at the beginning of the
proposal (i.e. the Introduction section). This is important for capturing the
reader's attention, since the entire research process is driven by the research
problem.
(e) Write briefly and precisely about the methodology you are going to
implement. It is good practice to outline the methods and sources of data at
the proposal stage itself. This will put your proposal in a better position
when its worth and potential contribution are assessed.
SELF-CHECK 10.2
(c) Proposal Author
- The author does not have sufficient training or experience for the
  proposed research topic. Hence, it is important for you to choose a topic
  which you are good at and familiar with in the specified field.
- The author has insufficient time to devote to the research project.
- The author does not critically review the related works but simply
  rewrites the available literature on the particular research topic.
- The author does not quote and cite the necessary references based on
  academic journals and research papers.
SELF-CHECK 10.3
Privacy is accepted as the key to ethical issues that the researcher has to confront
in carrying out any research project. Almost all aspects of ethics, for example,
consent, confidentiality, participant reactions, and the effects of the way the
researcher uses, analyses and reports research findings have the capacity to
affect, or are related to, the privacy of participants.
ACTIVITY 10.1
In your opinion, why do ethical issues become a major concern in the
research process? Explain how to handle this issue.
SELF-CHECK 10.4
Tick True or False for each statement below:
The process of formulating and clarifying the research topic is the most
crucial part of the research process.
Writing a research proposal helps the researcher in organising ideas and can
also be thought as a contract between the researcher and the client.
The content of the research proposal should tell the reader what the
researcher wants to do, why he wants to do it, what he wants to achieve and
how he wants to achieve it.
Research ethics should be recognised and considered from the outset of the
research project and be used as one of the criteria in judging the proposal.
Ethical concerns are likely to occur at all stages of the research project, when
seeking access to data, during data collection, while analysing data and in
reporting the results.