
Government Information Quarterly 39 (2022) 101456


Open data innovation: Visualizations and process redesign as a way to bridge the transparency-accountability gap

Sora Park a,b,*, J. Ramon Gil-Garcia a,b,c

a Center for Technology in Government, University at Albany - State University of New York, USA
b Rockefeller College of Public Affairs and Policy, University at Albany - State University of New York, USA
c Universidad de las Americas Puebla, Mexico
* Corresponding author at: Rockefeller College of Public Affairs & Policy, 1400 Washington Ave, Albany, NY 12222, USA. E-mail address: spark22@albany.edu (S. Park).

https://doi.org/10.1016/j.giq.2020.101456
Received 1 November 2017; Received in revised form 26 September 2019; Accepted 6 January 2020; Available online 17 May 2021
0740-624X/© 2020 Elsevier Inc. All rights reserved.

ARTICLE INFO

Keywords: Open data; Innovation; Transparency; Accountability; Open government; Visualization

ABSTRACT

Open government data has led to public policy innovation in pursuit of various expected benefits. One of the intended goals of open data innovation is to improve transparency and accountability. However, our current understanding of open data innovation and its ability to generate transparency and accountability is limited, particularly with regard to empirical evidence. In this paper, we describe how a state agency redesigned its organizational processes around visualization tools and how such efforts helped bridge the transparency-accountability gap by enhancing the understandability and usability of open government data. We conclude that open data innovation does not stop with the adoption of an open data policy, but rather involves an ongoing cycle of improvements through which the organization responds to its various stakeholders' use of open data, thereby increasing the usefulness of those datasets and, subsequently, improving overall accountability.

1. Introduction

Open government data has become the latest global phenomenon, leading to policy innovation in the public sector over the past several years (Chatfield & Reddick, 2018; Dawes et al., 2016; Jetzek, Avital, & Bjorn-Andersen, 2014; McBride, Aavik, Toots, Kalvet, & Krimmer, 2019; Mergel, 2015; Ruijer, Grimmelikhuijsen, & Meijer, 2017; Wang & Lo, 2016; Zuiderwijk, Helbig, Gil-García, & Janssen, 2014). Open data is expected to bring not only economic (e.g., efficiency) and social (e.g., democratic values) benefits, but also advanced forms of policy decision-making through data analytics (Kalampokis, Tambouris, & Tarabanis, 2013) and knowledge sharing capabilities for various stakeholders in both the public and private sectors (Davies & Edwards, 2012; Zuiderwijk, Janssen, & Davis, 2014). Open data relates to many societal-level advantages, but it also refers to innovation of technical infrastructure and integration of multiple, complex elements at different levels of government for its effective use (Chatfield & Reddick, 2018; Zuiderwijk, Janssen, & Davis, 2014). However, there is a lack of empirical evidence about how open data innovation meets its intended goals and what challenges block the ability to achieve the ultimate goal of creating public value.

The fundamental assumption underpinning the global trend of open data innovation suggests that computer-mediated transparency will promote public accountability by providing citizens with access to government decision-making processes and outcomes (Bertot, Gorham, Jaeger, Sarin, & Choi, 2014; Grimmelikhuijsen & Meijer, 2012; Meijer, 2009; Parycek, Hochtl, & Ginner, 2014). Similarly, Zuiderwijk, Janssen, and Davis (2014) note that the transformation of data processing from open data innovation creates public value by facilitating transparency, efficiency, and accountability. However, the creation of public value is only possible when the outcomes of open data innovation align with the intended goals of providing useful information to the public. In reality, only a small number of citizens are able to understand the underlying meanings and implications of open data and thus fully realize its benefits (Fierro & Gil-Garcia, 2012; Janssen, Charalabidis, & Zuiderwijk, 2012). As a result, an increasing number of researchers have raised questions about "a broken link" between transparency and accountability (Shkabatur, 2012), indicating that data producers cannot anticipate that their diverse stakeholders will all have the same level of understanding about the nature of the data (Dawes & Helbig, 2010; Helbig, Gil-Garcia, & Ferro, 2009; Zuiderwijk & Janssen, 2014).


The potential benefits and challenges of implementing open data policy raise the following research question: How do government organizations attempt to bridge the gap between transparency and accountability through the implementation of visualizations and the corresponding changes in organizational processes? We observed the open data innovation process at a state health agency in which the broken link between transparency and accountability was identified and partly resolved as the agency made an effort to assess and enhance the outcomes of its open data innovation in collaboration with a team of computer programmers and analysts from a university research center. We conducted semi-structured interviews with various internal stakeholders at different levels of the organization in order to better understand the complex dynamics between the agency and its stakeholders within its legal, political, and technological contexts (Bannister & Connolly, 2011; Meijer, 2013), as well as the agency's organizational structures and processes (Lourenço, 2015).

We found that innovation does not stop with the adoption of an open data policy, but instead involves an ongoing cycle of improvements to organizational processes through which the transparency-accountability gap can be reduced. Notably, middle managers and non-managers played an essential role in reducing this gap by evaluating and enhancing data understandability and usability from both the data producers' and end users' perspectives. Although this is a single case study, we contribute to the extant literature by providing an empirical grounding of open data innovation, especially in connection with the ambiguous relationship between transparency and accountability, and a basis for future theory-testing studies. We also highlight the role of middle managers and non-managers in the continuous process of open data innovation, while prior studies have primarily focused on the role of top management support in open data adoption (Barry & Bannister, 2014; Chatfield & Reddick, 2018).

This paper is organized in six sections, including the foregoing introduction. In the next section, we first identify two of the expected benefits of open data policy: transparency and accountability. While these two terms are often assumed to be closely related, we found it necessary to develop a clear understanding of the conceptual link between transparency and accountability, as their ambiguous relationship has been continuously debated (Fox, 2007; Janssen et al., 2012; Mayernik, 2017; Yu & Robinson, 2012). We then describe the open data innovation process and the observed outcomes from it, followed by our analysis of how the transparency-accountability gap is created and how it could be reduced through wider organizational participation in the assessment process and the implementation of visualizations and their respective changes in organizational processes. Finally, drawing on the concepts identified earlier, we found that the agency's effort to assess and enhance its policy outcomes improved the level of accountability in relation to responsiveness.

2. Literature review

This section explores the relationship between transparency and accountability in the context of open data innovation. While those two concepts are often represented as one of the intended goals of open data policy, a number of researchers have argued that transparency does not automatically produce increased accountability (Janssen et al., 2012; Yu & Robinson, 2012). In exploring the various factors that can bridge the transparency-accountability gap, we conceptualize that open data innovation is not only related to making data available (e.g., visualizations), but also redesigning organizational processes and practices to meet the users' needs and improve their understanding of the data.

2.1. The concept of transparency

Transparency is considered to facilitate government accountability as citizens are provided with the necessary information to make informed decisions (Meijer, 2015; Lourenço, 2015) and to monitor government decision-making processes and outcomes (Bertot et al., 2014; Harrison & Sayogo, 2014; Grimmelikhuijsen & Meijer, 2012; Meijer, 2009; Parycek et al., 2014). Transparency in that sense is understood as the "visibility of the process" and specifically as "the availability of information…to monitor the workings or performance" of government (Meijer, 2013, p. 430). While the general concept of transparency is focused on information availability, researchers have increasingly noted that the concept of transparency has different dimensions and nuances when considering the diverse needs of relevant stakeholders (Fierro & Gil-Garcia, 2011, 2012). For example, Heald (2006) conceptualized different dimensions of transparency based on the information receptor's perception and capability to process the information made available. Drawing on external stakeholders' perceptions of usefulness and understandability, he differentiated nominal transparency from effective transparency. Similarly, Michener and Bersch (2013) noted that the concept of transparency expands beyond simple visibility. They argue that transparency includes "inferability," or the degree to which information is user-friendly and verifiable so that it can help the user reach accurate conclusions (p. 236).

In other words, data availability does not necessarily translate into an increased level of transparency (Janssen et al., 2012) unless open data is designed and implemented in a way that meets different users' expectations. While the concept of open data is associated with multiple benefits, previous studies found many challenges to using open data. One of the most widely discussed challenges is that citizens have a hard time making sense of raw data (Dawes, 2010; Graves & Hendler, 2013; Hellberg & Hedström, 2015). These researchers suggest that only a small number of citizens are capable of understanding the underlying statistical meanings and implications of open data and fully realize its benefits. Janssen et al. (2012) defined this problem as one of the "myths" of open government data, arguing that the use of open data in reality requires that the end-user has a certain level of knowledge.

2.2. The concept of accountability

Since several researchers have argued that transparency does not automatically lead to accountability, here we explore the concept of accountability independently. Drawing on the public administration literature, the concept of accountability concerns the interpersonal, organizational, and political aspects of bureaucratic systems. For example, Romzek and Dubnick (1987) provide typologies of accountability systems based on the degree and sources of agency control, which address government agencies' constraints and obligations to satisfy a wide range of issues in modern society. Accountability also includes norms, expectations, and interorganizational and interpersonal behavior, although they are considered informal (Romzek, LeRoux, & Blackmar, 2012). Koppell (2005) explores the five dimensions of accountability in the context of public administration. Those dimensions include transparency, liability, controllability, responsibility, and responsiveness. While most of the dimensions are closely related to political and legal arrangements (e.g., Congress sets rules and the organization faces consequences for its performance), the responsiveness dimension is left to the discretion of individual agencies, as it involves the question of whether the organization fulfilled its substantive expectations. These expectations can relate to citizen demands or to the organizational need for an assessment of policy goals.


In particular, the opening of data in the public sector requires an understanding of the complex dynamics between government agencies and their stakeholders within their legal, political, and technological contexts (Bannister & Connolly, 2011; Meijer, 2013), as well as the agency's organizational structures and processes (Lourenço, 2015). For example, government agencies have to understand various frameworks within state and national contexts while they assess the realities of implementing open government data initiatives by weighing the costs of such a mandate (e.g., technological and organizational capacity for handling additional tasks) against other potential benefits for various stakeholders outside the organization. On the one hand, government agencies as data producers cannot expect their diverse stakeholders to all have the same level of understanding about the nature of the data (Dawes & Helbig, 2010; Helbig et al., 2009; Zuiderwijk & Janssen, 2014). On the other hand, government agencies need to develop data systems that go beyond their own needs in order to communicate data errors and improve the overall quality of the data (Dawes & Helbig, 2010). Integrating and linking data across organizational boundaries is challenging, especially when it is expected to benefit all their users and relevant stakeholders (Heise & Naumann, 2012; Hellberg & Hedström, 2015; Lourenço, 2015).

Given the challenges of implementing open data policy, the concept of accountability addresses government agencies' constraints and obligations when attempting to satisfy a wide range of expectations surrounding open data policy. Drawing on Romzek and Dubnick's (1987) typologies of accountability systems, the government's role as a producer of open data relates to its ability and willingness to adjust its norms, or interorganizational and interpersonal behavior (Romzek et al., 2012), in response to the data user's needs (Fox, 2007), particularly in relation to the understandability of data. As Koppell (2005) also suggested, accountability is multidimensional. That is, the concept of accountability not only relates to political and legal arrangements (e.g., Congress sets rules and the organization faces consequences for its performance), but it also involves the discretion of individual agencies in fulfilling the substantive expectations of the public, as well as their own organizational need to assess policy goals and outcomes.

2.3. The transparency-accountability gap in open data policy

While the ultimate goal of open data policy is to achieve both the enhancement of government performance and the creation of public value (Harrison et al., 2012), current open data practices are often implemented under the assumption that transparency will be one of the results and that it will almost automatically lead to accountability. However, transparency alone may not be sufficient to facilitate accountability, due to the sheer volume of data and their complex functionalities, as well as the challenges arising from existing organizational structures and processes (Lourenço, 2015). Moreover, transparency itself can take various forms based on complex dynamics between governments and stakeholders within their legal, political, and technological contexts (Bannister & Connolly, 2011; Meijer, 2013).

Yu and Robinson (2012) questioned the ambiguous notion of the term "open government data," indicating that data transparency (i.e., an emphasis on "data") is not necessarily linked to accountability (i.e., an emphasis on "open government"), depending on what policymakers prioritize. Fox (2007) raised questions as to under what conditions transparency leads to accountability, and what types of transparency promote what types of accountability. He provides typologies of transparency and accountability: his conceptions of transparency include dissemination of and access to information and institutional answerability, while his conceptions of accountability relate to the capacity to sanction or compensate. Drawing on these closely related concepts of transparency and accountability, Mayernik (2017) developed a model that shows the multi-dimensional relationship between the two.

While there have been numerous attempts to create open data assessment frameworks (e.g., the open data barometer, the open data maturity initiative; also see Vetrò et al., 2016), open government policy does not provide an assessment framework or guidelines as to what kind of data disclosure is useful and usable and how agencies can evaluate their implementation plans and outcomes (Bertot, McDermott, & Smith, 2012; Lourenço, 2015; Harrison et al., 2012). For example, one of the major challenges of transparency and open data policy is citizens' ability to understand data, or their level of technical knowledge (Fierro & Gil-Garcia, 2011, 2012). Janssen et al. (2012) find that the opening of data itself has little value and thus does not automatically yield benefits unless the user sees value in using the data. Thus, simply making data available does not necessarily translate into accountability (Harrison et al., 2012; Janssen et al., 2012), and a gap can be identified between the two (see Fig. 1). Fig. 1 visually shows how we conceptualize the transparency-accountability gap, which can be created by the misalignment between the intended benefits of open data policy and its actual outcomes.

Fig. 1. The transparency-accountability gap.

2.4. Bridging the transparency-accountability gap

The gap between transparency and accountability highlights both conceptual and practical challenges to open data policy. Since data users often encounter difficulties in making sense of government data due to their limited technical knowledge (Hellberg & Hedström, 2015; Lourenço, 2015; Janssen et al., 2012), visualization is often considered an approach that can bridge the transparency-accountability gap (Artigas & Chun, 2013; Graves & Hendler, 2014). Visualizations, particularly in the health data context, can help technical and non-technical users understand the meaning of the data in the form of numbers, ratings, and symbols (Graves & Hendler, 2014; Sopan et al., 2012; Tu & Lauer, 2009).

Visualizations are often proposed as a solution to bridge the transparency-accountability gap, but they are just one component of making data usable. The transparency-accountability gap requires a holistic approach to the practice of open data policy because making useful and usable data available involves a full understanding of data, information technology, management structure and organizational processes, and relevant laws and regulations (Gil-Garcia & Pardo, 2005). Consequently, open data innovation involves not only the transformation of technical infrastructure, but also other social factors that enable the effective use of open data, such as changes in organizational processes in support of a better user experience. For example, the redesign of organizational processes and communication channels may be necessary to enable the effective implementation of visualizations. As seen in Fig. 2, we propose that open data innovation includes various efforts that can help bridge the transparency-accountability gap: the development of visualizations and the redesign of organizational processes.

Fig. 2. Open data innovation as a way to bridge the transparency-accountability gap.

3. Research design and methods


This section briefly describes our research design and methods, including case selection, data collection, and data analysis. Overall, this study is based on semi-structured interviews with open data producers and users in a state agency.

3.1. Case study methodology

This research used a case study approach to examine open data innovation at a state health agency. While a case study provides a small number of observations, it provides a vivid picture of a phenomenon and enables in-depth analysis of the research problems (Creswell, 2009; Eisenhardt, 1989; Yin, 2009). This project investigated how the organizational process redesign for the implementation of visualization tools enhanced the understandability and usability of open government data and thus helped bridge the transparency-accountability gap.

3.2. Case selection

This study focused on a state health agency that has been actively making its data publicly available and usable. This agency has maintained a large number of databases for various health-related policy areas. Among many program areas, the state agency chose to update a specific dataset relating to the quality of care provided by managed care plans.

This dataset contains providers' performance information on nationally standardized measures and state-specific measures, all validated by licensed auditors. Most of those quality measures follow the guidelines from the National Committee for Quality Assurance (NCQA)'s Healthcare Effectiveness Data and Information Set (HEDIS). The HEDIS data set provides standardized performance measures on a set of public health issues such as cancer, smoking, and asthma, as well as standardized surveys of consumers' access to care and their experiences with care. These data provide a good foundation for assessing the outcomes of open data policy because they specifically relate to consumer information and they are collected and published on a regular basis.

The state health agency requires all managed care plans to report their performance data annually. Once they submit their data, the Quality Assurance Reporting Requirements (QARR) Unit at the agency collects, validates, and publishes the data in both its annual QARR reports and on the official agency website. The QARR data available online is called "eQARR." Essentially, one of the major challenges of using eQARR involves the format of data presentation, because data are shown in a static table format. The user is directed to the webpage shown in Fig. 3. There is no additional description of those numbers on the same page. Additionally, healthcare plans cannot be sorted or re-ordered based on user preference.

Fig. 3. Current view of the data (New York State Department of Health, 2016).

3.3. Research design

The pre- and post-test research design is a common assessment method for measuring the change in outcomes before and after an experimental manipulation (Creswell, 2009). Since the purpose of this study is to conduct an in-depth analysis of the open data innovation process, we chose to conduct semi-structured, qualitative interviews to examine not only the changes in user experience, but also the broader organizational changes that accompanied the implementation of the visualization tools. Even within the agency, internal users have varying degrees of experience with open government data, as they are involved in different stages of data processing (e.g., collecting, validating, publishing, updating, etc.) and therefore play diverse roles around visualizations. As the researchers identified the individuals involved in different stages of data processing, we assessed the specific changes in each stage before and after the adoption of visualization tools. This allowed a deeper understanding of organizational change from the various perspectives of data producers and internal users. We relied on semi-structured interviews to provide detailed perspectives of individuals about their roles in the organization (Crotty, 1998; Strauss & Corbin, 1998).

Drawing on selected visualization assessment frameworks (Graves & Hendler, 2014; Sopan et al., 2012) and open data quality measurement frameworks (Ubaldi, 2013; Vetrò et al., 2016), this study developed a set of questions to assess user experience and organizational change before and after the adoption of visualization tools. A general assessment of user experience was based on the data characteristics and information systems. Interview questions were composed of two parts: first, a general assessment of user involvement in data processing and their user characteristics, such as the level of technical knowledge and interaction with the open data on a daily basis; second, a specific assessment of user experience with the open data before and after the implementation of new visualizations.


A total of nine individuals participated in the pre-test interviews from May 31 to June 15, 2016. The same nine interviewees participated in the post-test interviews from August 8 to August 12, after the implementation of specific visualization tools. Most interviews were one-on-one meetings, except for two meetings where the interviewer met with two participants from the same office. Interview participants were from various program areas in the state agency with different priorities, yet one of their responsibilities dealt with analyzing QARR data in order to monitor managed care providers and their quality of care. Although they are familiar with the raw datasets, their familiarity with eQARR, the online version of the data, varied greatly based on their level of involvement with eQARR data processing or their job responsibilities, which may require extensive data analysis. Each interview lasted from 40 min up to 1.5 h. The interview questions remained the same for the pre- and post-test interviews in order to compare any changes in user experience.

3.4. Data analysis

All interviews were recorded and notes were taken, which were later transcribed for coding. Key terms, issues with data use, and other major emergent themes were coded and compared (Strauss & Corbin, 1998). Codes were developed based on the review of the literature discussed earlier. Those codes were evaluated continuously as more interviews were conducted because they can help the researcher understand the key ideas and further investigate their deeper meanings (Strauss & Corbin, 1998). In addition, the codes from pre-test interviews were compared and contrasted with those from post-test interviews so as to assess the role of visualization tools in relation to perceived transparency (see Table 1).

Examples of emergent codes are as follows, with sub-codes in parentheses: data characteristics of open data before and after visualization (content, format, relevance, usefulness, perceptions of quality and trust), difficulties with using the open data before visualization (ease of navigation, display/format/organization of the data), user satisfaction after visualization (usability, efficiency, cost savings), organizational factors (performance goals, management support), political and regulatory environment (federal and/or state laws that may have an impact on availability, accessibility, and quality of open data).

Table 1
Comparison of codes before and after visualization.

The user description of features and characteristics of the data
  Pre-test: Not intuitive; Multiple steps; Time consuming; A quick reference
  Post-test: Intuitive; Responsive; Easily comparable; User-friendly; Allows for analysis

Factors relating to the data system
  Pre-test: Burdensome to access the data; Static
  Post-test: A shortcut; Interactive

Factors relating to data processing and organizational process
  Pre-test: Tedious, inefficient data processing; Limited server capabilities and the lack of technical expertise
  Post-test: Implementation of data visualization tools in response to the user's needs; Improvement in data processing and management

4. Analysis and results

In our analysis, we first identified the various factors that caused the transparency-accountability gap. Second, we discuss how the transparency-accountability gap was reduced through major changes in the organizational processes associated with the implementation of visualization tools. The new visualization tools obviated the production of dozens of web pages and thus required a significant change in the current organizational process around the "opening" of data. While visualization may prompt some changes in data processing, it may not automatically lead to broader organizational change. The state agency in this case put extra effort into enhancing not only the availability of open data, but also its understandability, by aligning internal users' needs and the functions of the visualization tools.

4.1. The primary causes of the transparency-accountability gap

4.1.1. The gap between "data" quality and data "system" quality

In comparing user experiences before and after the implementation of data visualization tools, we found that users saw little value in prioritizing data availability over data usability, while the goal of an open data policy could be seen as achieving both at the same time. In particular, an emphasis on transparency can result in data that users cannot understand. Since most open data policies are mandated by legislative bodies without specific guidelines as to how data should be published in a user-friendly manner for public consumption, public managers often focus on data availability rather than on its usability in practice, given the resource constraints they have for data enhancement. What is worse, there are no standardized metrics for evaluating the quality of open government data. This in turn creates a gap between data quality and data system quality, where users may not be able to use data to its full potential, leading to a misalignment between user expectations and agency goals for open data policies.

In this case, we found that the selected dataset was relevant, reliable, and complete in a timely fashion for the purpose of quality control and for statewide planning of health program initiatives. All interview participants trusted the selected dataset because the data must be validated by certified auditors when individual health plans report to the state agency. Participants understood that health plans try to make sure that the data is a true and accurate representation of their performance. On the other hand, participants expressed concerns about the current electronic display of the dataset (as opposed to the raw data) due to some accessibility, usability, and navigation issues. One of the major challenges they faced in using the data was its static representation format. Most participants expressed frustration that quality measures (e.g., colon cancer screening, adult BMI assessment, flu shots for adults, etc.) were categorized under different domains (e.g., adult health, behavioral health, child and adolescent health, etc.), which was not intuitive even for experienced users and required them to follow multiple steps to find the data they want.

As mentioned before, the original open data format on the website not only lacked user-friendliness, but also had some navigation issues. For example, the user had to choose one among commercial HMO, commercial PPO, and Medicaid, which would take her to another web page with multiple quality measures under different disease categories. Since this categorization system was not intuitive, the user often had to click back and forth several times to get access to the data she wants, and it was almost impossible to compare information across different categories at once. In fact, even an experienced user might get confused because similar measures exist under different disease categories (e.g., preventive care under adult health as opposed to preventive care under women's health).

As one participant stated, "It is burdensome to click multiple domains. It takes several clicks to get to where you need to be. For example, the colorectal cancer screening measure is in a different measurement domain from the breast and cervical cancer. Colorectal cancer is in preventive care while breast and cervical cancer are in women's health. It takes several clicks to toggle between those different measures. I wish I could identify the measures I would look at and display them on the same page."


Another commented, "If you want to look at health plan performance on the asthma medication ratio for ages 5-18, you have to look that up under children and adolescent health. If you want to see the performance in Medicaid for asthma measures, you look that up under adult health. It's the same measure in different age categories. They are the same measure but in more than one section just because it is broken out by different age groups. It's not like we're dividing categories by disease. I think it would be cumbersome to do it that way. We would want to look at the results of the measure across the whole spectrum of ages instead of separating out age groups."

The confusion over the groupings of health quality measures was mainly caused by the way the dataset was originally implemented. The data production unit is made up of research scientists whose primary job responsibilities are to analyze health data rather than to enhance the presentation of data. Due to their lack of expertise and data representation capabilities, members of the unit chose to reduce the already heavy burdens associated with data processing (e.g., collecting, validating, updating, and publishing) by following a prescribed, simple approach to data display. The unit ran a preset SAS program which would produce a number of HTML pages, each containing the data in a table format. The user then had to navigate more than 60 web pages to explore various measures under different categories and domains. This approach is not convenient even for an internal user, but the unit also had to take additional steps to make the data available to the public in accordance with the agency's open data policy goals, thereby spending a considerable amount of time every year creating web pages for the public website.

Once they created the web pages and validated all of the data, they forwarded the web pages to the Public Information Administration Office, which then made minor revisions to coloring and shading on the web pages in order to make them consistent with other content on the agency website. Although the Public Information Administration Office managed the agency website in general, it had limited server capacity and lacked technical expertise in data visualization; under these constraints, the HTML data format was the best option available for meeting the goals of open data policy.
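To make the old publication pattern concrete, the sketch below shows a minimal static-page generator of the kind just described: one run emits a separate HTML table page per domain, so any correction to the underlying data forces a regeneration and a manual re-upload. The agency's actual pipeline was a preset SAS program; this TypeScript rendering, including the record fields and the one-page-per-domain layout, is only an assumed illustration of the pattern, not the agency's code.

```typescript
// Illustrative sketch only: the agency's real pipeline was a preset SAS
// program. The field names and file layout here are assumptions.
import * as fs from "fs";

interface MeasureRecord {
  domain: string;   // e.g., "Adult Health"
  measure: string;  // e.g., "Colorectal Cancer Screening"
  plan: string;     // managed care plan name
  rate: number;     // reported performance rate
}

// Emit one static HTML page per domain. This mirrors the "more than 60
// web pages" problem: every regeneration reproduces the full page
// inventory, and each page still has to be uploaded by hand.
function publishStaticPages(records: MeasureRecord[], outDir: string): void {
  const byDomain = new Map<string, MeasureRecord[]>();
  for (const record of records) {
    const rows = byDomain.get(record.domain) ?? [];
    rows.push(record);
    byDomain.set(record.domain, rows);
  }
  for (const [domain, rows] of byDomain) {
    const tableRows = rows
      .map((r) => `<tr><td>${r.plan}</td><td>${r.measure}</td><td>${r.rate}</td></tr>`)
      .join("\n");
    const html = `<html><body><h1>${domain}</h1><table>${tableRows}</table></body></html>`;
    fs.writeFileSync(`${outDir}/${domain.replace(/\W+/g, "_")}.html`, html);
  }
}
```

The cost of the design is visible in the loop itself: the page inventory grows with the number of domains, and nothing in the static output can be filtered, sorted, or compared by the user.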

4.1.2. The misalignment between agency goals and user expectations

The selected data in this case are originally mandated as part of State Public Health and Social Services Laws. As a public reporting system, the dataset is designed to monitor managed care plan performance and to improve their quality of care. The state health agency further utilizes individual health plans' quality performance outcomes in calculating financial incentives and organizing its focus areas for statewide health improvement plans. While participants saw a great deal of potential for promoting government transparency and accountability through the collection and publication of this data, they admitted that their needs were not being fully addressed in the current system. While they had dynamic needs for information, their understanding of the data was very limited even at the agency level in terms of who created the data, their involvement in data collection, what their expectations were for its use, and how the data were actually used in practice.

In developing a full understanding of user experiences and expectations, this study found that there were different types of users engaged with open data processing and actual data use. While some of the participants preferred to use open data as a quick reference for validation purposes, the rest of the participants did not find open data relevant to their job tasks because of its limited functionality. For example, the open data only showed quality performance information for each individual health plan compared to the state average for a given year. Some users, however, wanted to see all of the health plans' performance across different years so as to easily compare a specific health plan's performance to its peers. Since the current data format did not allow for a deeper level of analysis, many participants chose to personally download the original dataset for their own analysis instead of referring to the public website.

4.1.3. Organizational inefficiencies created from the misalignment

The misalignment of agency goals and user expectations also created organizational inefficiencies around data processing because users could not make use of the data to its fullest potential. While those who were heavily involved in data processing saw a lot of value in data use and data transparency, the costs outweighed the potential benefits of using it. As data producers, members of the unit realized that data processing required a considerable amount of their time, which reduced the time they had available for data analysis. On the other hand, they recognized that the dataset often did not allow for a deeper level of analysis for other researchers and end-users. One participant from the data production unit expressed her concerns:

"We are researchers who really understand how the data is collected and how to use the data, but we must start with how to best display it. That's not our expertise. We miss the mark a lot. I think it's better suited of our time to work in partnering with other agencies to do that for their business. We can help provide the context and institutional knowledge about the data and build off of their expertise on how it can be used by consumers in making decisions."

From a management perspective, another participant in the unit suggested that data enhancement was outside the scope of their work in practice:

"I think it's actually inappropriate for my staff to create webpages. That's not what they are hired for and it's not their expertise. I think it's not good use of their time to be programming web pages from datasets. I want my staff very focused on integrity of the data, data checking, validating, benchmarking, and hand it off to somebody to worry about the display and have them do more of using it to inform policymakers and help them make decisions where we need to look and focus resources and redirect quality improvement activities. I don't want them spend much time on how to make the data visually appealing. That's not what they are hired for."

4.2. Closing the transparency-accountability gap

4.2.1. Queryable and customizable visualization tools meet user expectations

As most participants in the pre-test interviews expressed a preference for queryable tools such as drop-down menus, the newly developed visualization tools met their expectations and helped them find the relevant information in a more effective way. As Fig. 4 shows, users can now produce individually customizable visualizations on one web page by selecting different payers, domains, subdomains, and years in the menu option, rather than having to go back and forth on multiple web pages. Users can also sort columns in ascending or descending order. Moreover, the web page provides multiple visualization tools, including bar charts, line charts, and scatter plots.

Fig. 4. Visualization tools.
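Stripped of the charting layer, the interaction pattern described above reduces to querying a single flat dataset: filter by the selected payer, domain, subdomain, and year, then sort the chosen column. The sketch below illustrates that pattern under assumed field names; it is not the agency's implementation.

```typescript
// Illustrative sketch of a queryable view over one flat dataset.
// The QarrRow fields are assumptions, not the agency's actual schema.
interface QarrRow {
  payer: string;      // e.g., "Medicaid", "Commercial HMO"
  domain: string;
  subdomain: string;
  year: number;
  plan: string;
  rate: number;
}

type RowFilter = Partial<Pick<QarrRow, "payer" | "domain" | "subdomain" | "year">>;

// Keep rows matching every selected drop-down value, then sort one column
// in the requested direction.
function queryRows(
  rows: QarrRow[],
  filter: RowFilter,
  sortBy: keyof QarrRow,
  ascending = true
): QarrRow[] {
  return rows
    .filter((row) =>
      Object.entries(filter).every(([key, value]) => row[key as keyof QarrRow] === value)
    )
    .sort((a, b) => {
      const x = a[sortBy];
      const y = b[sortBy];
      const cmp =
        typeof x === "number" && typeof y === "number"
          ? x - y
          : String(x).localeCompare(String(y));
      return ascending ? cmp : -cmp;
    });
}

// Example: all Medicaid plans' rates for 2016 in one view, best performer first.
// const view = queryRows(data, { payer: "Medicaid", year: 2016 }, "rate", false);
```

Because every selection is just another filter over the same dataset, measures no longer have to live on separate pages, which is precisely what the interviewees asked for.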
In the post-test interviews, participants agreed that the availability of various data visualization tools helped them bring new insights to the data. If the user wanted to compare quality performance information across different health plans, years, or measures, the visualization tools provided an easy comparison by displaying all of the selected information. Interestingly, the data format and its user-friendliness were the major deciding factors for user satisfaction, because participants already had a high level of trust in the data, which is validated by state-certified auditors based on nationally standardized measures. Knowing that the information content is regularly validated by a trusted source, participants paid more attention to how easily the data could be manipulated for their needs rather than whether the system provides comprehensive information. Some of the burden on the agency to build trust was eased because the data were collected and validated based on a set of highly standardized guidelines, ensuring their integrity. Generally, internal stakeholders, who may have a limited understanding of healthcare quality measures and the meaning of statistical data, were highly satisfied with the new data format. One stated, "I think it is easier to use because it allows the user to customize it a little bit more. Probably the data I would be taking from this, I could find it in the old system, but it's presented nicer. I like that you can compare the different payers."


Another commented, "It was what we wanted. Being customizable and user friendly was our goal and it was met." However, data visualization may not meet the diverse needs of all stakeholders who are impacted by open data policies, particularly when they are external to the agency. A future study is necessary to identify the needs of external users and the role of data visualizations in open data ecosystems at large.

4.2.2. New organizational processes surrounding visualizations reduce organizational inefficiencies and improve data quality

As discussed earlier, one of the major challenges for the data producer was the unit's lack of technical expertise to enhance the visual representation of the data. Consequently, they perceived their data management-related job task as non-essential, although it took up a considerable amount of their time over the course of a year. If a staff member in the data production unit found an error on the current open data web page, she would have to make a correction in the raw dataset, rerun the SAS program to create an HTML web page, and ask another staff member in the Public Information Administration Office to upload the HTML web page to the agency website. Instead, the newly developed data visualization tools now directly link a data repository with the agency website through the use of JavaScript code, which can be easily manipulated and reproduced for other types of data. In that sense, the data visualization tools helped bridge the gap in technical expertise at the state agency and increased the level of flexibility and convenience around validating and publishing data. As a result, participants anticipated that they would see benefits associated with data processing and use.
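A minimal sketch of this new pattern, assuming a hypothetical JSON endpoint and record shape rather than the agency's actual service, shows why it removes most of the manual steps: the page reads whatever is currently in the repository at load time, so appending next year's rows, or correcting an erroneous one, updates the website with no SAS rerun, no HTML regeneration, and no upload request.

```typescript
// Illustrative sketch only: the endpoint URL passed in and the record
// shape below are hypothetical placeholders, not the agency's service.
interface PlanRate {
  payer: string;
  measure: string;
  plan: string;
  year: number;
  rate: number;
}

// The website pulls directly from the open data repository, so the page
// always reflects the current state of the published dataset.
async function loadRates(endpoint: string): Promise<PlanRate[]> {
  const response = await fetch(endpoint); // e.g., a JSON view of the dataset
  if (!response.ok) {
    throw new Error(`Data repository request failed: ${response.status}`);
  }
  return (await response.json()) as PlanRate[];
}

// Corrections follow the same path as annual updates: fixing a row in the
// raw dataset fixes every page that renders it.
async function renderRateTable(endpoint: string, table: HTMLTableElement): Promise<void> {
  const rows = await loadRates(endpoint);
  table.innerHTML = rows
    .map((r) => `<tr><td>${r.plan}</td><td>${r.measure}</td><td>${r.year}</td><td>${r.rate}</td></tr>`)
    .join("");
}
```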
First, the data production unit was the most likely to benefit from the new visualization tools because they would save 1–2 months of full-time staff work hours, while also eliminating inefficiencies in the current organizational process around data reporting.

One data producer said, "Not having to recreate these [web] pages probably saves me weeks of work…I would say two actually. Creating the pages, making sure everything runs smoothly, getting them up on the website, validating them, and coordinating with other colleagues to have them go through and check all the webpages and work with the public website team to make sure that they get rid of any HTML errors that I might have missed…I think that's two solid weeks of work. So that's 75 hours of work over the course of a month, easy…"

Another commented, "All the levels of approvals…every time we want to publish something on the website, even things like [this dataset] would take years. We still have to print out PDFs and go through the Executive Deputy clearance to publish them on the website. However, in the new process we don't have to do that because the dataset is up there and we just have to append the data to it every year, so we are not changing anything and we don't have to get any additional permissions. So that saves time because it takes several days for just the permission to come back."

Similarly, the Public Information Administration Office could see a substantial increase in efficiency around managing the agency website. One staff member stated, "Instead of us having to reproduce this content, in theory when they update the listing on Health Data, it will automatically update it here [agency website] because it should be pulling the data as it gets updated from the data site…There's basically no maintenance to it. Some of the other stuff we have to produce, by not having to produce anymore, it's gonna save us a considerable amount of time. Some of those directories we have a [web] page for every county. So we're looking at 62 [web] pages and some would take a little while to update."

Participants in other offices anticipated cost savings associated with reduced time spent finding relevant information to complete their job tasks. While some participants noted that the use of open data is not required to complete their job tasks, it can certainly help them as a quick reference. For some participants, open data was an "invaluably wonderful shortcut" as they tried to quickly check the state's progress (e.g., comparing statewide averages to national averages) in formulating business arguments. In this case, open data was a reliable source and it saved the staff's time otherwise spent on downloading the raw data set, selecting certain measures of interest, and calculating quality performance rates. Since the visualization tools reduce the time spent navigating different web pages (e.g., information under different payers, domains, subdomains, and measures) to access the information that the user wants, any user could anticipate cost savings associated with data accessibility and usability.

5. Discussion and implications

As open data policy has led to innovations in the public sector across the world due to its transformative power (Zuiderwijk, Helbig, et al., 2014), some posited that transparency would automatically promote accountability (Lourenço, 2013; Lourenço, Piotrowski, & Ingrams, 2017; Ruijer et al., 2017).

However, an increasing number of researchers have cast doubt on the ambiguous relationship between transparency and accountability (Fox, 2007; Mayernik, 2017; Wong & Welch, 2004; Yu & Robinson, 2012). They argue that information-based transparency, focusing on accessibility rather than the quality of data, creates a gap between the intended goals and actual outcomes of open data policy. Thus, the goal of this study was to examine what creates the transparency-accountability gap and what efforts can help bridge such a gap in the open government data context.

Based on our observation of a state health agency, we found that middle managers and non-managers played an essential role in identifying the transparency-accountability gap, as they were heavily involved in multiple stages of data processing. For example, they were responsible not only for managing data on a regular basis (i.e., collecting, cleaning, publishing, and validating) to meet the open data policy guidelines, but also for using data to perform their job tasks as research scientists (i.e., analyzing data for policymaking). In that light, publishing open data itself could be an onerous task for them when they did not have the technical capability to recreate and reuse the data in a meaningful way.

The role of middle managers and non-managers in open data innovation has three implications. First, because middle managers and non-managers are involved in data processing, they are often aware of the effectiveness of open data policy both from the data producers' and the end users' perspectives. Non-managers frequently serve the dual role of publishing and using open data at public organizations, and thus they can play an essential role in identifying the transparency-accountability gap in open data innovation. Yet there is a paucity of research investigating the role of middle managers and non-managers in open data innovation, with prior studies primarily focused on the role of top management support in open data adoption (Barry & Bannister, 2014; Chatfield & Reddick, 2018).

Second, non-managers' dissatisfaction with data processing reveals the organizational inefficiencies caused by the misalignment between agency goals and user expectations. Here, "user expectations" include not only those of citizens, but also those of internal users at government agencies. Achieving the goal of transparency alone does not translate into an increase in public value if the internal users themselves do not see the value of open data beyond simply complying with the policy guidelines. Although the open government rhetoric is organized around democratic values (e.g., participation, transparency) and economic efficiency (Kornberger, Meyer, Brandtner, & Höllerer, 2017), public organizations are often left with few options but to prioritize data availability and accessibility over data quality (e.g., usefulness and usability) when they are seldom equipped with the appropriate resources or methods to begin with (Pasquier & Villeneuve, 2007).

Lastly, although transparency and accountability goals seem complementary, promoting one does not automatically lead to the other. Transparency is not always positively related to user expectations, mainly due to data quality issues concerning fitness for use (Thompson, Ravindran, & Nicosia, 2015; Wang & Strong, 1996). While previous studies found that data quality (e.g., usefulness) is an essential component in evaluating the effectiveness of open data policy (Dawes, 2010; Lourenço, 2015), it was often limited to technical features. Visualization tools can certainly help enhance data quality, yet their role in open data innovation has not been fully explored in relation to the impact on the organizational processes that are necessary to implement them and the resulting improvements in accountability. As internal users at the state health agency distinguished the benefits of the usability of the data from those of the data system, the redesign of data processing was necessary to allow for the successful implementation of visualization tools, which improved the overall quality of available data.

Open data innovation involves continuous adjustments for the effective use of open data, including evaluation of the "fitness" of the data from a non-technical end user's point of view (Lourenço, 2015) and the efficiency of data processing from the data producer's point of view. In extending the concept of accountability to the data producer's ability to meet the data user's needs, regardless of whether it is voluntary or forced by compliance (Fox, 2007), accountability relates not only to structural elements such as formal authority (Matheus, Janssen, & Maheshwari, 2018) but also to ongoing efforts to identify and reduce the transparency-accountability gap. Open data innovation, hence, relates to a comprehensive understanding of how the information can be recreated and reused, both by ordinary citizens and by the public organization that is responsible for the implementation of open data policy, and of what organizational efforts are appropriate to support the effective use of open data. Making open government data more useful and usable could potentially contribute to reducing the gap between transparency and accountability.

6. Conclusions

Acknowledging that transparency does not automatically promote accountability in the context of open data policy, this paper first explored the relationship between transparency and accountability. Drawing on both the public administration and open data literatures, we argue that the various efforts that can help bridge the transparency-accountability gap could be conceptualized as open data innovation. We then provided some empirical evidence about how data visualizations, and the corresponding redesign of organizational processes, could bridge the transparency-accountability gap by realigning agency goals with user expectations. While enhancing technical capabilities such as visualization tools can be useful to ordinary citizens, the government agency, as the main actor responsible for disclosing data to the public, could not realize the value of open data unless visualizations were complemented with organizational efforts to identify inefficiencies arising from the existing processes and structures around the open data policy.

We contribute to the existing literature on open data innovation by showing how the redesign of organizational processes around the implementation of visualization tools can enhance the understandability and usability of open government data and thus help bridge the transparency-accountability gap. We have also identified the role of middle managers and non-managers in closing the transparency-accountability gap and in understanding the role of visualizations in relation to the efficiency gains from redesigning organizational processes.

However, there are a few limitations of our current study that we hope to address in future research. First, this study relies on a single case and a small number of observations about the internal use of open data at the individual organizational level, and the results cannot be generalized beyond that setting. We suggest that future research should further examine what other efforts constitute open data innovation beyond the organizational context. For example, open data innovation may involve interorganizational- and institutional-level efforts in which the transparency-accountability gap could also be identified.

Second, we were able to focus on organizational factors (e.g., organizational processes) that might cause or reduce the transparency-accountability gap without much concern about data quality (such as accuracy and reliability), given the fact that the datasets we observed were nationally standardized and audited. In many other cases, however, we understand that data quality can also create or at least contribute to the transparency-accountability gap. As previous research has established, transparency is not always positively related to perceived trustworthiness of government organizations due to data integrity and quality issues (Grimmelikhuijsen & Meijer, 2012). Thus, we hope that future research tests our findings by exploring other cases, including non-standardized datasets.

Acknowledgements

This study was partially supported by funding opportunity number PR-PRP-13-001 from the U.S. Department of Health and Human Services (HHS), Centers for Medicare & Medicaid Services.


the New York State Department of Health.

References

Artigas, F., & Ae Chun, S. (2013, June). Visual analytics for open government data. In Proceedings of the 14th Annual International Conference on Digital Government Research (pp. 298–299). ACM.
Bannister, F., & Connolly, R. (2011). The trouble with transparency: A critical review of openness in e-government. Policy & Internet, 3(1), 1–30.
Barry, E., & Bannister, F. (2014). Barriers to open data release: A view from the top. Information Polity, 19(1–2), 129–152.
Bertot, J. C., Gorham, U., Jaeger, P. T., Sarin, L. C., & Choi, H. (2014). Big data, open government and e-government: Issues, policies and recommendations. Information Polity, 19(1–2), 5–16.
Bertot, J. C., McDermott, P., & Smith, T. (2012). Measurement of open government: Metrics and process. In Proceedings of the 45th Hawaii International Conference on System Sciences (HICSS) (pp. 2491–2499). IEEE.
Chatfield, A. T., & Reddick, C. G. (2018). The role of policy entrepreneurs in open government data policy innovation diffusion: An analysis of Australian federal and state governments. Government Information Quarterly, 35(1), 123–134.
Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks, CA: SAGE.
Crotty, M. (1998). The foundations of social research: Meaning and perspective in the research process. Thousand Oaks, CA: SAGE.
Davies, T., & Edwards, D. (2012). Emerging implications of open and linked data for knowledge sharing in development. IDS Bulletin, 43(5), 117–127.
Dawes, S., & Helbig, N. (2010). Information strategies for open government: Challenges and prospects for deriving public value from government transparency. Electronic Government, 50–60.
Dawes, S. S. (2010). Stewardship and usefulness: Policy principles for information-based transparency. Government Information Quarterly, 27(4), 377–383.
Dawes, S. S., Vidiasova, L., & Parkhimovich, O. (2016). Planning and designing open government data programs: An ecosystem approach. Government Information Quarterly, 33(1), 15–27.
Eisenhardt, K. M. (1989). Building theories from case study research. Academy of Management Review, 14(4), 532–550.
Fierro, A. E., & Gil-Garcia, J. R. (2011). Más allá del acceso a la información: El uso de tecnologías de información para fomentar la transparencia, la participación y la colaboración en el sector público. Documento de trabajo, 262. http://repositorio-digital.cide.edu/handle/11651/727
Fierro, A. E., & Gil-Garcia, J. R. (2012). Transparency websites as tools for decision making in a democratic government. Paper presented at the Transatlantic Conference on Transparency Research. Utrecht, Netherlands: Utrecht University.
Fox, J. (2007). The uncertain relationship between transparency and accountability. Development in Practice, 17(4–5), 663–671.
Gil-Garcia, J. R., & Pardo, T. A. (2005). E-government success factors: Mapping practical tools to theoretical foundations. Government Information Quarterly, 22(2), 187–216.
Graves, A., & Hendler, J. (2013, June). Visualization tools for open government data. In Proceedings of the 14th Annual International Conference on Digital Government Research (pp. 136–145). Quebec City, Canada: ACM.
Graves, A., & Hendler, J. (2014). A study on the use of visualizations for open government data. Information Polity, 19(1–2), 73–91.
Grimmelikhuijsen, S. G., & Meijer, A. J. (2012). Effects of transparency on the perceived trustworthiness of a government organization: Evidence from an online experiment. Journal of Public Administration Research and Theory, 24(1), 137–157.
Harrison, T. M., Guerrero, S., Burke, G. B., Cook, M., Cresswell, A., Helbig, N., Hrdinova, J., & Pardo, T. (2012). Open government and e-government: Democratic challenges from a public value perspective. Information Polity, 17(2), 83–97.
Harrison, T. M., & Sayogo, D. S. (2014). Transparency, participation, and accountability practices in open government: A comparative study. Government Information Quarterly, 31(4), 513–525.
Heald, D. (2006). Varieties of transparency. In C. Hood & D. Heald (Eds.), Transparency: The key to better governance? (pp. 25–43). Oxford, UK: Oxford University Press for The British Academy.
Heise, A., & Naumann, F. (2012). Integrating open government data with Stratosphere for more transparency. Web Semantics: Science, Services and Agents on the World Wide Web, 14, 45–56.
Helbig, N., Gil-Garcia, J. R., & Ferro, E. (2009). Understanding the complexity of electronic government: Implications from the digital divide literature. Government Information Quarterly, 26(1), 89–97.
Hellberg, A. S., & Hedström, K. (2015). The story of the sixth myth of open data and open government. Transforming Government: People, Process and Policy, 9(1), 35–51.
Janssen, M., Charalabidis, Y., & Zuiderwijk, A. (2012). Benefits, adoption barriers and myths of open data and open government. Information Systems Management, 29(4), 258–268.
Jetzek, T., Avital, M., & Bjorn-Andersen, N. (2014). Data-driven innovation through open government data. Journal of Theoretical and Applied Electronic Commerce Research, 9(2), 100–120.
Kalampokis, E., Tambouris, E., & Tarabanis, K. (2013). Linked open government data analytics. In M. A. Wimmer, M. Janssen, & H. J. Scholl (Eds.), Proceedings of the 12th IFIP WG 8.5 International Conference, EGOV 2013 (pp. 99–110). Koblenz, Germany: Springer.
Koppell, J. G. (2005). Pathologies of accountability: ICANN and the challenge of "multiple accountabilities disorder". Public Administration Review, 65(1), 94–108.
Kornberger, M., Meyer, R. E., Brandtner, C., & Höllerer, M. A. (2017). When bureaucracy meets the crowd: Studying "open government" in the Vienna City Administration. Organization Studies, 38(2), 179–200.
Lourenço, R. P. (2013). Data disclosure and transparency for accountability: A strategy and case analysis. Information Polity, 18(3), 243–260.
Lourenço, R. P. (2015). An analysis of open government portals: A perspective of transparency for accountability. Government Information Quarterly, 32(3), 323–332.
Lourenço, R. P., Piotrowski, S., & Ingrams, A. (2017). Open data driven public accountability. Transforming Government: People, Process and Policy, 11(1), 42–57.
Matheus, R., Janssen, M., & Maheshwari, D. (2018). Data science empowering the public: Data-driven dashboards for transparent and accountable decision-making in smart cities. Government Information Quarterly. https://doi.org/10.1016/j.giq.2018.01.006 (in press).
Mayernik, M. S. (2017). Open data: Accountability and transparency. Big Data & Society, 4(2), 1–5.
McBride, K., Aavik, G., Toots, M., Kalvet, T., & Krimmer, R. (2019). How does open government data driven co-creation occur? Six factors and a 'perfect storm'; insights from Chicago's food inspection forecasting model. Government Information Quarterly, 36(1), 88–97.
Meijer, A. (2009). Understanding modern transparency. International Review of Administrative Sciences, 75(2), 255–269.
Meijer, A. (2013). Understanding the complex dynamics of transparency. Public Administration Review, 73(3), 429–439.
Meijer, A. (2015). Government transparency in historical perspective: From the ancient regime to open data in the Netherlands. International Journal of Public Administration, 38(3), 189–199.
Mergel, I. (2015). Opening government: Designing open innovation processes to collaborate with external problem solvers. Social Science Computer Review, 33(5), 599–612.
Michener, G., & Bersch, K. (2013). Identifying transparency. Information Polity, 18(3), 233–242.
New York State Department of Health. (2016). eQARR - An online report on quality performance results for health plans in New York State.
Parycek, P., Hochtl, J., & Ginner, M. (2014). Open government data implementation evaluation. Journal of Theoretical and Applied Electronic Commerce Research, 9(2), 80–99.
Pasquier, M., & Villeneuve, J. P. (2007). Organizational barriers to transparency: A typology and analysis of organizational behaviour tending to prevent or restrict access to information. International Review of Administrative Sciences, 73(1), 147–162.
Romzek, B. S., & Dubnick, M. J. (1987). Accountability in the public sector: Lessons from the Challenger tragedy. Public Administration Review, 47(3), 227–238.
Romzek, B. S., LeRoux, K., & Blackmar, J. M. (2012). A preliminary theory of informal accountability among network organizational actors. Public Administration Review, 72(3), 442–453.
Ruijer, E., Grimmelikhuijsen, S., & Meijer, A. (2017). Open data for democracy: Developing a theoretical framework for open data use. Government Information Quarterly, 34(1), 45–52.
Shkabatur, J. (2012). Transparency with(out) accountability: Open government in the United States. Yale Law & Policy Review, 31(1), 79–140.
Sopan, A., Noh, A. S. I., Karol, S., Rosenfeld, P., Lee, G., & Shneiderman, B. (2012). Community Health Map: A geospatial and multivariate data visualization tool for public health datasets. Government Information Quarterly, 29(2), 223–234.
Strauss, A., & Corbin, J. (1998). Basics of qualitative research techniques. SAGE.
Thompson, N., Ravindran, R., & Nicosia, S. (2015). Government data does not mean data governance: Lessons learned from a public sector application audit. Government Information Quarterly, 32(3), 316–322.
Tu, H. T., & Lauer, J. R. (2009). Designing effective health care quality transparency initiatives. Issue Brief (Center for Studying Health System Change), 126, 1–6.
Ubaldi, B. (2013). Open government data: Towards empirical analysis of open government data initiatives (OECD Working Papers on Public Governance, No. 22). Paris: OECD Publishing. https://doi.org/10.1787/5k46bj4f03s7-en
Vetrò, A., Canova, L., Torchiano, M., Minotas, C. O., Iemma, R., & Morando, F. (2016). Open data quality measurement framework: Definition and application to open government data. Government Information Quarterly, 33(2), 325–337.
Wang, H. J., & Lo, J. (2016). Adoption of open government data among government agencies. Government Information Quarterly, 33(1), 80–88.
Wang, R. Y., & Strong, D. M. (1996). Beyond accuracy: What data quality means to data consumers. Journal of Management Information Systems, 12(4), 5–33.
Wong, W., & Welch, E. (2004). Does e-government promote accountability? A comparative analysis of website openness and government accountability. Governance, 17(2), 275–297.
Yin, R. K. (2009). Case study research: Design and methods. Thousand Oaks, CA: SAGE.
Yu, H., & Robinson, D. G. (2012). The new ambiguity of "open government". UCLA Law Review Discourse, 59, 178–208.
Zuiderwijk, A., Helbig, N., Gil-García, J. R., & Janssen, M. (2014). Special issue on innovation through open data: Guest editors' introduction. Journal of Theoretical and Applied Electronic Commerce Research, 9(2), i–xiii.
Zuiderwijk, A., & Janssen, M. (2014). Barriers and development directions for the publication and usage of open data: A socio-technical view. In M. Gascó-Hernández (Ed.), Open government: Opportunities and challenges for public governance (Vol. 4, pp. 115–135). New York, NY: Springer.
Zuiderwijk, A., Janssen, M., & Davis, C. (2014). Innovation with open data: Essential elements of open data ecosystems. Information Polity, 19(1–2), 17–33.
Sora Park is a Doctoral Candidate in Public Administration at Rockefeller College of
Public Affairs & Policy, University at Albany, State University of New York. Her research is
at the intersection of organizations, management, and technology. Her dissertation involves a theoretical and empirical study of sensemaking in regulatory agencies over technology-related disruptions in markets.

J. Ramon Gil-Garcia is an Associate Professor of Public Administration and Policy and the Research Director of the Center for Technology in Government, University at Albany, State University of New York (SUNY). Currently, he is also an Affiliated Professor at the Universidad de las Americas Puebla. In 2009, he was considered the most prolific author in the field of digital government research worldwide and in 2013 he was selected for the Research Award, which is "the highest distinction given annually by the Mexican Academy of Sciences to outstanding young researchers." Dr. Gil-Garcia is the author or co-author of articles in prestigious international journals in Public Administration, Information Systems, and Digital Government, and some of his publications are among the most cited in the field of digital government research worldwide. His research interests include collaborative digital government, inter-organizational information integration, smart cities and smart governments, adoption and implementation of emergent technologies, digital divide policies, and multi-method research approaches.