
Bahir Dar University

Faculty of Humanities

Department of English Language and Literature

Course Title: Advanced Research Methods (TEFL 701)

A term paper on Qualitative Data Analysis Procedures

By: Dagninet Gebey

Submitted to: Birhanu S. (PhD)

Yenus N. (PhD)

January, 2023

Bahir Dar, Ethiopia


Table of Contents
Introduction.................................................................................................................................................1
1. Qualitative data analysis procedures...................................................................................................1
1.1 Approaches to Qualitative Data Analysis...........................................................................................2
1.1.1 Deductive Approach...................................................................................................................2
1.1.2 Inductive Approach.....................................................................................................................2
1.2 Qualitative Data Analysis Methods and Techniques..........................................................................2
1.2.1 Qualitative Content Analysis......................................................................................................2
1.2.2 Narrative Analysis......................................................................................................................3
1.2.3 Discourse Analysis.....................................................................................................................4
1.2.4 Thematic Analysis......................................................................................................................4
1.2.5 Grounded theory (GT)................................................................................................................5
1.3 Basic Principles of Qualitative Data Analysis......................................................................................6
1.4 Steps to Qualitative Data Analysis.....................................................................................................6
1.5 Stages to Analyze Qualitative Data....................................................................................................8
1.5.1 Familiarization............................................................................................................................8
1.5.2 Data Reduction...........................................................................................................................9
1.5.3 Data Display..............................................................................................................................11
1.5.4 Drawing Conclusions.................................................................................................................11
1.6 Categorization..................................................................................................................................12
1.7 Data management...........................................................................................................................12
1.8 Data cleaning...................................................................................................................................12
1.9 Credibility, Transferability, Confirmability, and Dependability........................................................13
1.10 Triangulation..................................................................................................................................14
1.11 Conclusion.....................................................................................................................................14
References.............................................................................................................................................15
Introduction
LeCompte and Schensul (1999) define analysis as the process a researcher uses to reduce the data
to a story and its interpretation. Data analysis is the process of reducing large amounts of
collected data to make sense of them. Patton (1987) indicates that three things occur during
analysis: data are organized, data are reduced through summarization and categorization, and
patterns and themes in the data are identified and linked.

Qualitative data are a source of well-grounded, rich descriptions and explanations of human
processes. With qualitative data, one can preserve chronological flow, see which events led to
which consequences, and derive fruitful explanations. Analyzing your data is vital, as you have
spent time and money collecting it. It is an essential process because you don’t want to find
yourself in the dark even after putting in so much effort. However, there are no set ground rules
for analyzing qualitative data; it all begins with understanding the two main approaches to
qualitative data analysis.

This paper discusses the basic points of qualitative data analysis. Specifically, it focuses on
approaches to qualitative data analysis, basic principles of qualitative data analysis, steps in
qualitative data analysis, categorization, data management, grounded theory, triangulation,
and so on.

1. Qualitative data analysis procedures


To begin, qualitative data are typically generated through the following data collection tools:

 Interview transcripts
 Surveys with open-ended questions
 Contact center transcripts
 Texts and documents
 Audio and video recordings
 Observational notes

Qualitative Data Analysis (QDA) is the range of processes and procedures whereby we move
from the qualitative data that have been collected into some form of explanation, understanding or
interpretation of the people and situations we are investigating.

Qualitative data analysis is a process of gathering, structuring and interpreting qualitative data to
understand what it represents. Qualitative data is non-numerical and unstructured. Qualitative data
generally refers to text, such as open-ended responses to survey questions or user interviews, but
also includes audio, photos and video.

According to Creswell (2014), data analysis in qualitative research will proceed hand-in-hand
with other parts of developing the qualitative study, namely, the data collection and the write-up
of findings. While interviews are going on, for example, researchers may be analyzing an
interview collected earlier, writing memos that may ultimately be included as a narrative in the
final report, and organizing the structure of the final report. This process is unlike quantitative
research in which the investigator collects the data, then analyzes the information, and finally
writes the report.

1.1 Approaches to Qualitative Data Analysis


According to Creswell (2014), there are two approaches to qualitative data analysis. These are:

1.1.1 Deductive Approach


The deductive approach involves analyzing qualitative data based on a structure that is
predetermined by the researcher. A researcher can use the questions as a guide for analyzing the
data. This approach is quick and easy and can be used when a researcher has a fair idea about the
likely responses that he/she is going to receive from the sample population.

Deductively, the researchers look back at their data from the themes to determine if more
evidence can support each theme or whether they need to gather additional information. Thus,
while the process begins inductively, deductive thinking also plays an important role as the
analysis moves forward.

1.1.2 Inductive Approach


The inductive approach, on the contrary, is not based on a predetermined structure or set ground
rules/framework. It is a more time-consuming and thorough approach to qualitative data analysis.
An inductive approach is often used when a researcher has very little or no idea of
the research phenomenon.

Qualitative researchers build their patterns, categories, and themes from the bottom up by
organizing the data into increasingly more abstract units of information. This inductive process
illustrates working back and forth between the themes and the database until the researchers have
established a comprehensive set of themes.

1.2 Qualitative Data Analysis Methods and Techniques


There are a wide variety of qualitative data analysis methods and techniques. According to
Kerryn (2020), the most popular and best known are the following:

1.2.1 Qualitative Content Analysis


As Kerryn (2020) notes, content analysis is possibly the most common and straightforward QDA
method. At the simplest level, content analysis is used to evaluate patterns within a piece of
content (for example, words, phrases or images) or across multiple pieces of content or sources
of communication.

With content analysis, you could, for instance, identify the frequency with which an idea is
shared or spoken about – like the number of times a Kardashian is mentioned on Twitter. Or you
could identify patterns of deeper underlying interpretations – for instance, by identifying phrases
or words in tourist pamphlets that highlight India as an ancient country.

Because content analysis can be used in such a wide variety of ways, it’s important to go into
your analysis with a very specific question and goal, or you’ll get lost in the fog. With content
analysis, you’ll group large amounts of text into codes, summarize these into categories, and
possibly even tabulate the data to calculate the frequency of certain concepts or variables.
Because of this, content analysis provides a small splash of quantitative thinking within a
qualitative method.
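
To make this quantitative flavour concrete, here is a minimal Python sketch, not drawn from any cited source, that counts how often a few analyst-chosen code phrases appear across a handful of invented text fragments; the documents and the code list are purely illustrative.

```python
from collections import Counter

# Illustrative only: tally how often analyst-chosen code phrases occur across
# a few invented text fragments (the documents and codes are made up).
documents = [
    "The staff were friendly and the service was quick.",
    "Friendly staff, but the waiting time was long.",
    "Service was quick although the menu felt limited.",
]

codes = ["friendly", "quick", "waiting time", "menu"]  # phrases of analytic interest

frequencies = Counter()
for doc in documents:
    text = doc.lower()
    for code in codes:
        frequencies[code] += text.count(code)

# Tabulate the counts, the "small splash of quantitative thinking" noted above
for code, count in frequencies.most_common():
    print(f"{code}: {count}")
```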

Naturally, while content analysis is widely useful, it’s not without its drawbacks. One of the
main issues with content analysis is that it can be very time consuming, as it requires lots of
reading and re-reading of the texts. Also, because of its multidimensional focus on both
qualitative and quantitative aspects, it is sometimes accused of losing important nuances in
communication.

Content analysis also tends to concentrate on a very specific timeline and doesn’t take into
account what happened before or after that timeline. This isn’t necessarily a bad thing though –
just something to be aware of. So, keep these factors in mind if you’re considering content
analysis. Every analysis method has its drawbacks, so don’t be put off by these – just be aware of
them!

1.2.2 Narrative Analysis


According to Kerryn (2020), narrative analysis is all about listening to people telling stories and
analyzing what those stories mean. Since stories serve a functional purpose of helping us make
sense of the world, we can gain insights into the ways that people deal with and make sense of
reality by analyzing their stories and the ways they’re told.

You could, for example, use narrative analysis to explore whether how something is being said is
important. For instance, the narrative of a prisoner trying to justify their crime could provide
insight into their view of the world and the justice system. Similarly, analyzing the ways
entrepreneurs talk about the struggles in their careers or cancer patients telling stories of hope
could provide powerful insights into their mindsets and perspectives. In other words, narrative
analysis is about paying attention to the stories that people tell – and more importantly,
the way they tell them.

Of course, the narrative approach has its weaknesses, just like all analysis methods. Sample sizes
are generally quite small due to the time-consuming process of capturing narratives. Because of
this, along with the multitude of social and lifestyle factors which can influence a subject,
narrative analysis can be quite difficult to reproduce in subsequent research. This means that it’s
difficult to test the findings of some of this research.
difficult to test the findings of some of this research.

Similarly, researcher bias can have a strong influence on the results here, so you need to be
particularly careful about the potential biases you can bring into your analysis when using this
method. Nevertheless, narrative analysis is still a very useful qualitative method – just keep these
limitations in mind and be careful not to draw broad conclusions.

1.2.3 Discourse Analysis


Kerryn (2020) explains that discourse is simply a fancy word for written or spoken language or
debate. Discourse analysis, then, is all about analyzing language within its social context, that is,
analyzing language such as a conversation or a speech within the culture and society it takes
place in. For example, you could analyze how a janitor speaks to a CEO, or how politicians speak
about terrorism.

To truly understand these conversations or speeches, the culture and history of those involved in
the communication is important. For example, a janitor might speak more casually with a CEO
in a company that emphasizes equality among workers. Similarly, a politician might speak more
about terrorism if there was a recent terrorist incident in the country.

So, as you can see, by using discourse analysis, you can identify
how culture, history or power dynamics (to name a few) have an effect on the way concepts are
spoken about. So, if your research aims and objectives involve understanding culture or power
dynamics, discourse analysis can be a powerful method.

Because there are many social influences in how we speak to each other, the potential use of
discourse analysis is vast. Of course, this also means it’s important to have a
very specific research question (or questions) in mind when analyzing your data and looking for
patterns and themes, or you might land up going down a winding rabbit hole.

Discourse analysis can also be very time consuming as you need to sample the data to the point
of saturation – in other words, until no new information and insights emerge. But this is, of
course, part of what makes discourse analysis such a powerful technique. So, keep these factors
in mind when considering this QDA method.

1.2.4 Thematic Analysis


As Kerryn (2020) states, thematic analysis looks at patterns of meaning in a data set, for example
a set of interviews or focus group transcripts. A thematic analysis takes bodies of data (which are
often quite large) and groups them according to similarities, in other words themes. These
themes help us make sense of the content and derive meaning from it.

With thematic analysis, you could analyze 100 reviews of a popular sushi restaurant to find out
what patrons think about the place. By reviewing the data, you would then identify the themes
that crop up repeatedly within the data – for example, “fresh ingredients” or “friendly wait staff”.
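
As a rough illustration of that grouping step, the Python sketch below assigns invented review excerpts to hypothetical themes using keyword lists an analyst might draw up after reading the data; it is a toy under stated assumptions, not a substitute for careful reading and coding.

```python
# Illustrative sketch: after reading the reviews, the analyst defines keyword lists
# for each emerging theme and groups excerpts under them. Reviews and keywords
# below are invented for demonstration.
reviews = [
    "The fish was incredibly fresh and the rice perfectly seasoned.",
    "Our waiter was friendly and checked on us often.",
    "Fresh ingredients, but the wait staff seemed rushed.",
]

themes = {
    "fresh ingredients": ["fresh", "seasoned"],
    "friendly wait staff": ["waiter", "friendly", "wait staff"],
}

grouped = {theme: [] for theme in themes}
for review in reviews:
    lowered = review.lower()
    for theme, keywords in themes.items():
        if any(keyword in lowered for keyword in keywords):
            grouped[theme].append(review)

for theme, excerpts in grouped.items():
    print(f"{theme}: {len(excerpts)} supporting excerpt(s)")
```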

So, as you can see, thematic analysis can be pretty useful for finding out
about people’s experiences, views, and opinions. Therefore, if your research aims and objectives
involve understanding people’s experience or view of something, thematic analysis can be a
great choice.

Since thematic analysis is a bit of an exploratory process, it’s not unusual for your research
questions to develop, or even change as you progress through the analysis. While this is
somewhat natural in exploratory research, it can also be seen as a disadvantage as it means that
data needs to be re-reviewed each time a research question is adjusted. In other words, thematic
analysis can be quite time-consuming – but for a good reason. So, keep this in mind if you
choose to use thematic analysis for your project and budget extra time for unexpected
adjustments.

1.2.5 Grounded theory (GT)


Kerryn (2020) explained that Grounded theory is used to create a new theory (theories) by using
the data at hand, as opposed to existing theories and frameworks. According to Charmaz (2006),
Grounded theory is a method of conducting qualitative research that focuses on creating
conceptual frameworks or theories through building inductive analysis from the data. Hence, the
analytic categories are directly 'grounded' in the data.

Grounded theory is a powerful qualitative analysis method where the intention is to create a new
theory (or theories) using the data at hand, through a series of “tests” and “revisions.” For
example, you could try to develop a theory about what factors influence students to watch a
YouTube video about qualitative analysis. The important thing with grounded theory is that
you go into the analysis with an open mind and let the data speak for itself, rather than dragging
existing hypotheses or theories into your analysis. In other words, your analysis must develop
from the ground up (hence the name).

In Grounded Theory, you start with a general overarching question about a given population –
for example, graduate students. Then you begin to analyze a small sample – for example, five
graduate students in a department at a university. Ideally, this sample should be reasonably
representative of the broader population. You’d then interview these students to identify what
factors lead them to watch the video.

After analyzing the interview data, a general hypothesis or pattern could emerge. For example,
you might notice that graduate students are more likely to watch a video about qualitative
methods if they are just starting on their dissertation journey, or if they have an upcoming test
about research methods.

From here, you’ll look for another small sample – for example, five more graduate students in a
different department – and see whether this pattern or this hypothesis holds true for them. If not,
you’ll look for commonalities and adapt your theory accordingly. As this process continues, the
theory develops. What’s important with grounded theory is that the theory develops from the
data – not from some preconceived idea. You need to let the data speak for itself.

So, what are the drawbacks of grounded theory? Well, some argue that there’s a
tricky circularity to Grounded Theory. For it to work, in principle, you should know
as little as possible regarding the research question and population, so that you reduce the bias in
your interpretation. However, in many circumstances, it’s also thought to be unwise to approach
a research question without knowledge of the current literature. In other words, it’s a bit of a
“chicken or the egg” situation.

Regardless, grounded theory remains a popular (and powerful) option. Naturally, it’s a very
useful method when you’re researching a topic that is completely new or has very little existing
research about it, as it allows you to start from scratch and work your way from the ground up.

1.3 Basic Principles of Qualitative Data Analysis


According to Bryman and Burgess (1994), there are several basic principles of qualitative data
analysis. These are mentioned below:

 People differ in their experience and understanding of reality (constructivist: many meanings).
 A social phenomenon cannot be understood outside its own context (context-bound).
 Qualitative research can be used to describe a phenomenon or to generate theory grounded in
data.
 Understanding of human behavior emerges slowly and non-linearly.
 Exceptional cases may yield insights into a problem or new ideas for further inquiry.

1.4 Steps to Qualitative Data Analysis


According to Creswell (2014), qualitative data analysis has the following steps.

Step 1. Organize and prepare the data for analysis. This involves transcribing interviews,
optically scanning material, typing up field notes, cataloguing all of the visual material, and
sorting and arranging the data into different types depending on the sources of information.
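
As a hedged illustration of this organizing step, the Python sketch below catalogues raw files by source type so that the material can be sorted before analysis; the folder name raw_data and the file extensions are assumptions made only for the example.

```python
from pathlib import Path

# Hypothetical sketch: sort raw material into types by source. The folder name
# and the extension-to-source mapping are assumptions for illustration.
RAW_DIR = Path("raw_data")

SOURCE_TYPES = {
    ".docx": "interview transcripts",
    ".txt": "field notes",
    ".mp3": "audio recordings",
    ".jpg": "visual material",
}

catalogue = {}
for path in RAW_DIR.glob("*"):
    source_type = SOURCE_TYPES.get(path.suffix.lower(), "other")
    catalogue.setdefault(source_type, []).append(path.name)

for source_type, files in sorted(catalogue.items()):
    print(f"{source_type}: {len(files)} file(s)")
```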

Step 2. Read or look at all the data. This step provides a general sense of the information and
an opportunity to reflect on its overall meaning. What general ideas are participants saying?
What is the tone of the ideas? What is the impression of the overall depth, credibility, and use of
the information? Sometimes qualitative researchers write notes in margins of transcripts or
observational field notes, or start recording general thoughts about the data at this stage. For
visual data, a sketchbook of ideas can begin to take shape.

Step 3. Start coding all of the data. Coding is the process of organizing the data by bracketing
chunks (of text or image segments) and writing a word representing a category in the margins
(Rossman & Rallis as cited in Creswell (2014)). It involves taking text data or pictures gathered
during data collection, segmenting sentences (or paragraphs) or images into categories, and
labeling those categories with a term, often a term based in the actual language of the participant
(called an in vivo term).

Tesch as cited in Creswell (2014) provided the following eight steps typically used in forming
codes.

1. Get a sense of the whole. Read all the transcriptions carefully.
2. Pick one document (i.e., one interview), the most interesting one, the shortest, or the one
on the top of the pile. Go through it, asking yourself, “What is this about?”
3. When you have completed this task for several participants, make a list of all topics.
Cluster together similar topics.
4. Now take this list and go back to your data. Abbreviate the topics as codes and write the
codes next to the appropriate segments of the text.
5. Find the most descriptive wording for your topics and turn them into categories.
6. Make a final decision on the abbreviation for each category and alphabetize these codes.
7. Assemble the data material belonging to each category in one place and perform a
preliminary analysis.
8. If necessary, recode your existing data.
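
The Python sketch below loosely mirrors steps 4 through 7 on a few invented segments: topic labels are abbreviated into codes and the material belonging to each category is assembled in one place. The segment texts, topics, and abbreviations are all hypothetical.

```python
# Hypothetical data: segments have already been read and given topic labels.
segments = [
    ("I reread the transcripts twice before writing anything.", "familiarization with data"),
    ("My supervisor suggested I start with the shortest interview.", "supervisor guidance"),
    ("Rereading helped me notice repeated phrases.", "familiarization with data"),
]

# Step 6 (roughly): decide an abbreviation for each category (invented here).
code_book = {
    "familiarization with data": "FAM",
    "supervisor guidance": "SUP",
}

# Step 7 (roughly): assemble the material belonging to each category in one place.
assembled = {}
for text, topic in segments:
    code = code_book[topic]
    assembled.setdefault(code, []).append(text)

for code, texts in assembled.items():
    print(code, texts)
```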

Step 4. Use the coding process to generate a description of the setting or people as well as
categories or themes for analysis. Description involves a detailed rendering of information about
people, places, or events in a setting. Researchers can generate codes for this description. This
analysis is useful in designing detailed descriptions for case studies, ethnographies, and narrative
research projects. Use the coding as well for generating a small number of themes or categories
— perhaps five to seven themes for a research study. These themes are the ones that appear as
major findings in qualitative studies and are often used as headings in the findings sections (or in
the findings section of a dissertation or thesis) of studies. They should display multiple
perspectives from individuals and be supported by diverse quotations and specific evidence.
Beyond identifying the themes during the coding process, qualitative researchers can do much
with themes to build additional layers of complex analysis. For example, researchers
interconnect themes into a story line (as in narratives) or develop them into a theoretical model
(as in grounded theory). Themes are analyzed for each individual case and across different cases
(as in case studies) or shaped into a general description (as in phenomenology). Sophisticated
qualitative studies go beyond description and theme identification and form complex theme
connections.
Step 5. Advance how the description and themes will be represented in the qualitative narrative.
The most popular approach is to use a narrative passage to convey the findings of the analysis.

This might be a discussion that mentions a chronology of events, the detailed discussion of
several themes (complete with subthemes, specific illustrations, multiple perspectives from
individuals, and quotations) or a discussion with interconnecting themes. Many qualitative
researchers also use visuals, figures, or tables as adjuncts to the discussions. They present a
process model (as in grounded theory), advance a drawing of the specific research site (as in
ethnography), or convey descriptive information about each participant in a table (as in case
studies and ethnographies).
Step 6. A final step in data analysis involves making an interpretation in qualitative research of
the findings or results. Asking, “What were the lessons learned?” captures the essence of this
idea (Lincoln & Guba as cited in Creswell (2014)). These lessons could be the researcher’s
personal interpretation, couched in the understanding that the inquirer brings to the study from a
personal culture, history, and experiences. It could also be a meaning derived from a comparison
of the findings with information gleaned from the literature or theories. In this way, authors
suggest that the findings confirm past information or diverge from it. It can also suggest new
questions that need to be asked, questions raised by the data and analysis that the inquirer had not
foreseen earlier in the study.

1.5 Stages to Analyze Qualitative Data


There are no universally agreed stages in the process of analyzing qualitative data; different
authors propose different stages. For instance, Scott and Usher (2004) conceive of a typical
qualitative analytical approach as consisting of five stages, whereas Bryman and Burgess (1994)
argue that the analysis of qualitative data has to follow six stages. Moreover, Creswell (2009),
contrary to the view of Bryman and Burgess (1994), believes that the process of qualitative data
analysis and interpretation is best represented by a spiral image, a data analysis spiral, in which
the researcher moves in analytic circles rather than using a fixed linear approach. There are
therefore always variations in the number and description of steps for qualitative data analysis
across authors. For the purposes of this paper, the process of qualitative data analysis is treated
as consisting of four stages, namely familiarization, data reduction, data display, and drawing
conclusions. The details of these analytical stages are described and illustrated as follows.

1.5.1 Familiarization
Before beginning the process of filtering and sorting data, the researcher must become familiar
with the variety and diversity of the material gathered. Even if the researcher did not collect the
data personally, it is necessary to form a sense of the key issues and emergent themes in the data
by considering their context. Essentially, familiarization involves immersion in the data: listening
to tapes, reading transcripts, studying observational notes and so on. According to Bryman and
Burgess (1994), in some cases it is possible to review all the material at the familiarization stage,
for example where only a few interviews have been carried out, or where there is a generous
timetable for the research. They further outline a number of features of the data collection
process to be considered when selecting the material to be reviewed, such as:

 The range of methods used
 The number of researchers involved
 The diversity of people and circumstances studied
 The time period over which the material was collected
 The extent to which the research agenda evolved or was modified during that time.

1.5.1.1 Identifying a Thematic Framework


In the familiarization stage, the researcher is not only gaining an overview of the richness, depth,
and diversity of the collected data, but also he/she starts the process of abstraction and
conceptualization. While reviewing the material, the researcher is expected to make notes, record
the range of responses to the questions posed, and jot down recurrent themes and issues which
emerge as important to the study participants themselves.

Theming refers to the drawing together of codes from one or more transcripts to present the
findings of qualitative research in a coherent and meaningful way. Thus, when the findings are
organized for presentation, each theme can become the heading of a section in the report or
presentation.

Bryman and Burgess (1994) stated that once the selected material has been reviewed, the
researcher returns to these research notes, and attempts to identify key issues, concepts and
themes according to which the data can be examined and referenced. That is, she or he sets up a
thematic framework within which the material can be filtered and sorted. When identifying and
constructing this framework or index, the researcher will be drawing upon a priori issues such as:

 Issues informed by the original research aims and introduced into the interviews using the
topic guide
 Emergent issues raised by the respondents themselves
 Analytical themes arising from the recurrence or patterning of particular views or
experiences.
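
Such a framework can be pictured as a simple index of themes and sub-themes against which passages are later referenced. The Python sketch below is only an illustration; the theme names, the numbering scheme, and the passage references are invented.

```python
# Illustrative thematic framework (index): main themes with numbered sub-themes.
# All names and numbers are invented for demonstration.
thematic_framework = {
    "1 workload": {
        "1.1": "planning time",
        "1.2": "marking load",
    },
    "2 support": {
        "2.1": "peer support",
        "2.2": "supervisor support",
    },
}

# Passages can then be referenced against the index so that material on the same
# issue can later be filtered and sorted together.
indexed_passages = [("interview_03", "2.1"), ("interview_07", "1.2")]

for source, index_code in indexed_passages:
    theme = next(t for t in thematic_framework if index_code.startswith(t.split()[0]))
    print(f"{source} indexed under {index_code} ({thematic_framework[theme][index_code]})")
```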

1.5.2 Data Reduction


As Huberman and Miles (1994) explain, data reduction refers to the process of selecting, focusing,
simplifying, abstracting, and transforming the data that appear in written-up field notes and
transcriptions. As they see it, data reduction occurs continuously throughout the life of a
qualitatively oriented project.

A qualitative research project is very likely to generate more data than its final write-up can
contain. Engaging in the data reduction process is therefore very helpful in order to edit the
data, summarize it, and make it presentable. We have to reduce our data to make things more
manageable and evident.
According to Huberman and Miles (1994) with data reduction, the potential universe of data is
reduced in an anticipatory way as the researcher chooses a conceptual framework, research
questions, cases, and instruments. Once actual field notes, interviews, tapes, or other data are
available, data summaries, coding, finding themes, clustering, and writing stories are all
instances of further data selection and condensation. Among the many possible ways to reduce
and organize data in a qualitative study, this paper looks into the coding of qualitative data and
the graphical mapping of concepts. These ideas give a useful starting point for finding order in
qualitative data.

1.5.2.1 Coding
Coding refers to the identification of topics, issues, similarities, and differences that are revealed
through the participants’ narratives and interpreted by the researcher. This process enables the
researcher to begin to understand the world from each participant’s perspective.
Saldana (2013) has argued that coding does not constitute the totality of data analysis, but it is a
method to organize the data so that underlying messages portrayed by the data may become clear
to the researcher. Charmaz (2006) describes coding as the pivotal link between data collection
and explaining the meaning of the data. A code is a descriptive construct designed by the
researcher to capture the primary content or essence of the data. Coding is an interpretive activity
and therefore it is possible that two researchers will attribute two different codes to the same
data. The context in which the research is done, the nature of the research and interest of the
researcher will influence which codes the researcher attributes to the data (Saldana, 2013).

During the coding process, some codes may appear repeatedly and that may be an indication of
emerging patterns. These emerging patterns or similarity among the codes may give rise to
categories. Coding is not only labeling, but also linking, that is, linking data to an idea. It is a
cyclic process. By incorporating more cycles into the coding process, richer meanings,
categories, themes and concepts can be generated from the data (Saldana, 2013).
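
To illustrate how recurring codes can hint at emerging categories, the following Python sketch tallies codes attached to invented interview excerpts; the excerpts, codes, and recurrence threshold are assumptions for demonstration only.

```python
from collections import Counter

# Codes attached to excerpts during a first coding cycle (all invented).
coded_excerpts = [
    ("I felt lost during the first week", "uncertainty"),
    ("Nobody explained the software to us", "lack of training"),
    ("I was unsure which module to start with", "uncertainty"),
    ("The manual we got was outdated", "lack of training"),
    ("I kept second-guessing my choices", "uncertainty"),
]

code_counts = Counter(code for _, code in coded_excerpts)

MIN_RECURRENCE = 2  # assumption: a code appearing at least twice is flagged
emerging = [code for code, n in code_counts.items() if n >= MIN_RECURRENCE]
print("Codes suggesting emerging categories:", emerging)
```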

1.5.2.2 Concept Mapping


It should be clear by now that qualitative data analysts spend a lot of time committing thoughts
to paper (or to a computer file), but this process is not limited to text alone. Often, we can think
out relationships among concepts more clearly by putting the concepts in a graphic format, a
process called concept mapping. Some researchers put all their major concepts on a single sheet
of paper, whereas others spread their thoughts across several sheets of paper, blackboards,
magnetic boards, computer pages, or other media (Strauss & Corbin, 1998).
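
A concept map can be held in a very simple form as a set of labelled links between concepts. The Python sketch below shows one such representation; the concepts and relationship labels are invented, and this is only one of many ways a map could be recorded.

```python
# A concept map as a simple adjacency mapping: each concept points to the concepts
# it is linked to, with a label for the relationship. All names are invented.
concept_map = {
    "exam anxiety": [("coping strategies", "is reduced by"),
                     ("peer comparison", "is intensified by")],
    "coping strategies": [("peer support", "include"),
                          ("time management", "include")],
}

# Print the map as "concept --relation--> concept" statements, the textual
# equivalent of sketching the diagram on paper or a whiteboard.
for source, links in concept_map.items():
    for target, relation in links:
        print(f"{source} --{relation}--> {target}")
```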

1.5.3 Data Display
According to Miles, Huberman, and Saldaña (2014), one of the major flows of
analysis activity is data display. Generically, a display is an organized, compressed assembly of
information that allows conclusion drawing and action. In daily life, displays vary from gasoline
gauges to newspapers to Facebook status updates. Looking at displays helps us to understand
what is happening and to do something—either analyze further or take action— based on that
understanding. The most frequent form of display for qualitative data in the past has been
extended text. As the authors note, text (in the form of, say, 1,000 pages of field notes) is
terribly cumbersome. It is dispersed, sequential rather than simultaneous, poorly structured, and
extremely bulky. Using only extended text, a researcher may find it easy to jump to hasty,
partial, and unfounded conclusions. Humans are not very powerful as processors of large
amounts of information. Extended text overloads our information-processing capabilities and
preys on our tendencies to find simplifying patterns. Or we drastically overweight vivid
information, such as the exciting event that jumps out of page 124 of the field notes after a long,
“boring” passage. Pages 89 through 123 may be ignored, and the criteria for weighting and
selecting may never be questioned. In the course of their work, Miles, Huberman, and Saldaña
have become convinced that good displays are a major avenue to robust qualitative analysis.

The displays they discuss and illustrate include many types of matrices, graphs, charts, and
networks. All are designed to assemble organized information into an immediately accessible,
compact form so that the analyst can see what is happening and either draw justified conclusions
or move on to the next step of analysis that the display suggests may be useful. As with data
condensation, the creation and use of displays is not separate from analysis; it is a part of
analysis. Designing displays, that is, deciding on the rows and columns of a matrix for qualitative
data and deciding which data, in which form, should be entered in the cells, is an analytic
activity. (Note that designing displays also has clear data condensation implications.) Miles,
Huberman, and Saldaña advocate more systematic, powerful displays and urge a more inventive,
self-conscious, and iterative stance toward their generation and use. As they put it, “You know
what you display.”
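
As a small illustration of a matrix display, the Python sketch below (assuming the pandas library is available) builds a case-by-theme table in which each cell holds a condensed piece of evidence; the participants, themes, and cell contents are invented.

```python
import pandas as pd

# Illustrative case-by-theme matrix: rows are participants, columns are themes,
# and each cell holds a condensed piece of evidence. All content is invented.
display_matrix = pd.DataFrame(
    {
        "workload": ["'no time to plan'", "'manageable'", "'overwhelming at term end'"],
        "peer support": ["weekly peer group", "works alone", "informal chats only"],
    },
    index=["Participant A", "Participant B", "Participant C"],
)

print(display_matrix)
```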

1.5.4 Drawing Conclusions


This last step in the analysis involves making meaningful statements about how your data
illustrates your topic of interest. As Huberman and Miles (1994) note, this step involves ‘drawing
meaning from displayed data’. The word ‘drawing’ should be taken quite literally here: you draw
the relevant meaning, structure or processes out of the data based on the type of analysis you
choose.

What meanings should be drawn from the analysis? The answer depends on your disciplinary
orientation. For example, we recognize certain studies as being sociological based on the way the
researchers made sense of their data, or the particular meanings they drew from their data using a
sociological orientation.

1.6 Categorization
According to Charmaz (2006), categorizing is the analytic step in grounded theory of selecting
certain codes as having overriding significance, or of abstracting common themes and patterns in
several codes into an analytic concept. As the researcher categorizes, he or she raises the
conceptual level of the analysis from description to a more abstract, theoretical level. The
researcher then tries to define the properties of the category, the conditions under which it is
operative, the conditions under which it changes, and its relation to other categories. Grounded
theorists make their most significant theoretical categories into the concepts of their theory.

1.7 Data management


Data management is the practice of collecting, organizing, protecting, and storing an
organization’s data securely, efficiently, and cost-effectively so that it can be analyzed and used
to inform decisions. As organizations create and consume data at unprecedented rates, data
management solutions become essential for making sense of the vast quantities of data. Data
management plays several roles in an organization’s data environment, making essential
functions easier and less time-intensive. It is also a crucial first step to employing effective data
analysis at scale, which leads to important insights that add value. With effective data
management, people across an organization can find and access trusted data for their queries.
The goal of data management is to help people, organizations, and connected things optimize the
use of data within the bounds of policy and regulation so that they can make decisions and take
actions that maximize the benefit to the organization. A robust data management strategy is
becoming more important than ever as organizations increasingly rely on intangible assets to
create value.

1.8 Data cleaning

Data cleaning is an important early step in the data analytics process. This crucial exercise,
which involves preparing and validating data, usually takes place before your core analysis. Data
cleaning is not just a case of removing erroneous data, although that’s often part of it. The
majority of work goes into detecting rogue data and (wherever possible) correcting it.

Data cleaning consists primarily in implementing strategies to prevent errors before they occur.
However, error-prevention strategies can reduce but not eliminate common errors, and many data
errors will be detected incidentally during activities such as the following:

 When collecting or entering data
 When transforming/extracting/transferring data
 When exploring or analyzing data
 When submitting the draft report for peer review

Data cleaning involves repeated cycles of screening, diagnosing, treatment and documentation of
this process. As patterns of errors are identified, data collection and entry procedures should be
adapted to correct those patterns and reduce future errors.
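
The screening and documentation cycle can be sketched in code. The following Python example, with invented records and assumed field names, flags missing participant identifiers and possible duplicates and keeps a simple log of what was found.

```python
# Hypothetical screening pass over interview metadata: flag missing participant IDs
# and duplicate records, and document each problem found in a simple log.
records = [
    {"id": "P01", "date": "2022-11-02", "transcript": "..."},
    {"id": "", "date": "2022-11-03", "transcript": "..."},
    {"id": "P01", "date": "2022-11-02", "transcript": "..."},  # duplicate of the first
]

cleaning_log = []
seen = set()
for i, record in enumerate(records):
    if not record["id"]:
        cleaning_log.append((i, "missing participant id"))
    key = (record["id"], record["date"])
    if key in seen:
        cleaning_log.append((i, "possible duplicate record"))
    seen.add(key)

for index, problem in cleaning_log:
    print(f"record {index}: {problem}")
```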

1.9 Credibility, Transferability, Confirmability, and Dependability


According to Mackey and Gass (2016), in analyzing qualitative data, researchers must pay
attention to four concerns that arise as part of the research: credibility, transferability,
confirmability, and dependability.

In terms of credibility, because qualitative research can be based on the assumption of multiple,
constructed realities, it may be more important for qualitative researchers to demonstrate that
their findings are credible to their research population.

For transferability in qualitative research, the research context is seen as integral. While
qualitative research findings are rarely directly transferable from one context to another, the
extent to which findings may be transferred depends on the similarity of the context.

Important for determining similarity of context is the method of reporting known as “thick
description,” which refers to the process of using multiple perspectives to explain the insights
gleaned from a study, and taking into account the actors’ interpretations of their actions and the
speakers’ interpretations of their speech.

Davis as cited in Mackey and Gass (2016) distinguishes three essential components of thick
description:

1. Particular description (representative examples from the data);
2. General description (information about the patterns in the data); and
3. Interpretive commentary (explanation of the phenomena researched and interpretation of the
meaning of the findings with respect to previous research).

For confirmability, researchers are required to make available full details of the data on which
they are basing their claims or interpretations. This is similar to the concept of replicability in
quantitative research, with the point being that another researcher should be able to examine the
data and confirm, modify, or reject the first researcher’s interpretations.

For dependability, researchers aim to fully characterize the research context and the relationships
among the participants. To enhance dependability, researchers may ask the participants
themselves to review the patterns in the data. Electronically recorded data can help to recreate
the data collection context and allow the researcher to make use of all interpretive cues necessary
to draw inferences and evaluate the dependability of the inferences that have been drawn.

Recordings can also help research participants and other researchers working in similar contexts
to assess whether dependable inferences have been drawn from the data.

1.10 Triangulation
Creswell (2014) states that triangulation involves using multiple research techniques and
multiple sources of data in order to explore the issues from all feasible perspectives. In addition,
according to Patton (1999), triangulation refers to the use of multiple methods or data sources in
qualitative research to develop a comprehensive understanding of phenomena. Triangulation is a
method used to increase the credibility and validity of research findings. Credibility refers to
trustworthiness and how believable a study is; validity is concerned with the extent to which a
study accurately reflects or evaluates the concept or ideas being investigated. Using the technique
of triangulation can aid in credibility, transferability, and dependability of qualitative research.
Different types of triangulation have been identified, including theoretical triangulation (using
multiple perspectives to analyze the same set of data), investigator triangulation (using multiple
observers or interviewers), and methodological triangulation (using different measures or
research methods to investigate a particular phenomenon). The most common definition of
triangulation, however, is that it entails the use of multiple, independent methods of obtaining
data in a single investigation in order to arrive at the same research findings.
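
As a simple illustration of methodological triangulation, the Python sketch below compares theme sets derived independently from three hypothetical data sources and treats themes found in every source as converging evidence; all theme names are invented.

```python
# Themes derived independently from three data sources (all invented).
themes_by_source = {
    "interviews": {"workload pressure", "peer support", "assessment anxiety"},
    "observations": {"workload pressure", "peer support", "classroom routines"},
    "documents": {"workload pressure", "assessment anxiety"},
}

# Themes appearing in every source are read as converging evidence.
convergent = set.intersection(*themes_by_source.values())
print("Themes supported by every source:", convergent)

# Themes found in only some sources may point to gaps or to genuine differences
# between what people say, what they do, and what is written down.
all_themes = set.union(*themes_by_source.values())
for theme in sorted(all_themes - convergent):
    sources = [s for s, themes in themes_by_source.items() if theme in themes]
    print(f"{theme}: only in {', '.join(sources)}")
```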

1.11 Conclusion
Qualitative data offers a source for deep understanding of behaviors, actions, thoughts, and
experiences. Human nature makes each qualitative study unique. It is the distinctive qualities of
the participants, researchers, and settings that bring nuance to qualitative data analysis. Although
the results are less generalizable than those produced from experimental studies, qualitative
findings play an important role in building evidence for public health programs, policies, and
interventions. Qualitative data analysis can be complex, challenging, and even intimidating to the
novice researcher. Quantitative researchers may be especially apprehensive about conducting
qualitative data analysis.

The process of analyzing qualitative data varies from one study to another, depending on how the
researcher is guided by the research questions, the theoretical framework of the study, and the
appropriateness of the techniques for making sense of the data. The purpose of analysis is to
interpret and hence convert the data into a story that describes the phenomena or the
participants’ views, using an emic perspective.

The process typically involves collecting data that will inform the study, breaking the data down
into various categories, making connections between these categories in terms of the
relationships among them, and then visually displaying the interpretation and writing it up for
dissemination.

References
1. Bryman, A., & Burgess, R. (1994). Analyzing qualitative data. Routledge.
2. Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative
analysis. Sage.
3. Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods
approaches (3rd ed.). Sage Publications.
4. Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods
approaches (4th ed.). Sage Publications.
5. Dey, I. (1993). Qualitative data analysis: A user-friendly guide for social scientists. Routledge.
6. Huberman, A. M., & Miles, M. B. (1994). Qualitative data analysis (2nd ed.). Sage
Publications.
7. Kerryn, W. (2020). Qualitative data analysis methods 101: The “big 6” methods + examples.
https://gradcoach.com/qualitative-data-analysis-methods/
8. LeCompte, M. D., & Schensul, J. J. (1999). Analyzing and interpreting ethnographic data.
AltaMira Press.
9. Mackey, A., & Gass, S. M. (2016). Second language research: Methodology and design
(2nd ed.). Routledge.
10. Miles, M. B., Huberman, A. M., & Saldaña, J. (2014). Qualitative data analysis: A methods
sourcebook (3rd ed.). Sage.
11. Patton, M. Q. (1987). How to use qualitative methods in evaluation. Sage.
12. Saldana, J. (2013). The coding manual for qualitative researchers (2nd ed.). Sage.
13. Scott, D., & Usher, R. (2004). Researching education: Data, methods, and theory in
educational inquiry. Continuum.
14. Strauss, A., & Corbin, J. (1998). Basics of qualitative research. Sage Publications.
15. Tableau. Data management. Retrieved December 30, 2022, from
https://www.tableau.com/learn/articles/what-is-data-management
