with Accenture
Digital Trust:
Supporting Individual Agency
WHITE PAPER
FEBRUARY 2024
Images: Getty Images, Unsplash
Contents
Foreword
Executive summary
Introduction
Conclusion
Appendix
Contributors
Endnotes
Disclaimer
This document is published by the World Economic Forum as a contribution to a project, insight area or interaction. The findings, interpretations and conclusions expressed herein are a result of a collaborative process facilitated and endorsed by the World Economic Forum but whose results do not necessarily represent the views of the World Economic Forum, nor the entirety of its Members, Partners or other stakeholders.
Foreword
Kathryn White
Executive Fellow, Centre for the Fourth Industrial Revolution, World Economic Forum; Global Principal Director, Responsible Emerging Technology and Innovation, Accenture, USA

Daniel Dobrygowski
Head, Governance and Trust, Centre for the Fourth Industrial Revolution, World Economic Forum
At the 2024 World Economic Forum Annual Meeting in Davos, stakeholders convened under the meeting's theme of Rebuilding Trust. At that meeting, participants continued to endorse the Forum's guidance that trust in technology – especially new and emerging technologies like artificial intelligence (AI) – must be earned through responsible decision-making while emphasizing that individuals and society have agency over the future of technology.1

At the Forum, the Centre for the Fourth Industrial Revolution helps stakeholders harness the full potential of technological progress for the equitable and human-centred transformation of industries, economies and societies. With thematic areas ranging from AI to immersive technologies and quantum technologies to space, improving governance is a top priority. To improve governance across technological domains, the Forum launched the Digital Trust Initiative in 2021 to establish a global consensus among key stakeholders regarding what digital trust means and what measurable steps can be taken to improve the trustworthiness of digital technologies.

Specifically, the Forum's Digital Trust Initiative, through a multistakeholder approach, has defined digital trust as "individuals' expectation that digital technologies and services – and the organizations providing them – will protect all stakeholders' interests and uphold societal expectations and values."2 This definition has informed the publication of a decision-making framework, high-level implementation roadmap3 and guidance on pre-implementation steps4 and how to measure digital trust5 for organizations' leaders. This document builds on these groundbreaking publications, ensuring individuals are at the centre of technology. It further guides business and government leaders, as well as individuals, on how to better recognize the role of people as vital stakeholders in digital trust. With individuals increasingly interacting with new and rapidly developing technologies like generative AI, it has never been more relevant to support human-centric technology.

This report closely examines the digital trust dimensions of transparency, privacy and redressability according to the perspectives of organizations, governing bodies and individuals – all with the central lens of supporting individual agency. Business and government leaders are encouraged to prioritize the individual's perspective throughout the technology life cycle and take a by-design approach, especially to transparency and privacy. With all actors doing their part, we are hopeful that the benefits of emerging technologies can be fully realized while building trust among all stakeholders.
Protecting individual choice and agency is essential to any trustworthy system. The world is navigating the complexities of modern technology roughly a decade into the Fourth Industrial Revolution. People are experiencing a social, political and economic shift from the digital age of the late 1990s and early 2000s to an era of embedded connectivity – a fusion of the digital, biological and physical worlds,7 where an individual's digital experience can be more embodied, immersive and ever-present.8 In this context, individual agency, which enables individuals to navigate their digital lives in an informed, self-sufficient and protected manner, is vital. Supporting individual agency includes activities and decisions related to the deployment and development of digital systems. Support for individual agency requires upholding human rights and, as such, is a necessary component for durable trust – and durable digital trust is critical to a fairer, safer and more sustainable digital economy. People deserve to be able to make choices for themselves and on their own behalf. Enabling individual agency further supports responsible innovation, which is the only sustainable and fair way to ensure the adoption of Fourth Industrial Revolution technologies such as artificial intelligence and spatial computing.

Digital trust is the expectation by individuals that digital technologies and services – and the organizations providing them – will protect all stakeholders' interests and uphold societal expectations and values.9 Pursuing this objective, the World Economic Forum created the Digital Trust Initiative10 and published its foundational work on the subject, Earning Digital Trust, which focuses on how organizations can make more trustworthy decisions regarding technology. This paper – Digital Trust: Supporting Individual Agency – discusses the much larger group of stakeholders who use or otherwise interact with digital technologies. For digital trust to be effectively established and maintained, these individuals must be able to recognize themselves as stakeholders with the agency to self-navigate digital technologies.

This paper raises examples from the data management space, a sector at the forefront of navigating digital trust considerations, to explore how organizations can make decisions that enable individual agency and foster digital trust. The paper applies by-design principles to transparency, digital literacy and privacy, offering suggestions for how these concepts can be embedded into the user's experience of a digital product or service. Among the various dimensions of digital trust, three of the most relevant are highlighted due to their direct relationship with individual agency: transparency, privacy and redressability (this order is not meant to suggest an order of importance). This exploration covers important considerations, including the responsibility of organizations to ensure clarity and trust, how privacy mechanisms can support individual agency, and the tiers of mechanisms available for redress when harm does occur. The efforts summarized in this paper supporting individual agency are not a cure-all. Instead, they are a piece of a larger puzzle of organizational protection and support. Likewise, the ordering of interventions in this paper does not represent an order of preference or application. Rather, the concepts here are described in order of the most cooperative and preventative approaches first, with post-hoc resolution mechanisms at the end.

Individual agency by design

Digital trust requires a by-design approach to technology that emphasizes the need for principles that protect and support individuals from inception, putting their needs and values first from the earliest possible stage of development. First popularized through the "secure-by-design" concept, the by-design approach has influenced several other approaches to designing and developing technologies, including privacy by design, accessibility by design and sustainability by design.11

This methodology, wherein user protection and online harm prevention are baked into the technology, translates into a digital product or service that supports individual agency and is more trustworthy.12 The idea is not to take choices away from individuals or increase the burden of responsibility on everyday people but to ensure they are presented with fair options within a safe, secure and trustworthy environment that other organizational safeguards enable.
As a crucial first step in recognizing individual agency, organizations seeking to cultivate digital trust must provide sufficient "light" for individuals to feel empowered to make their own choices and act in their own best interests in the digital environment. Building and maintaining trust is an ongoing endeavour requiring organizations to consistently demonstrate a commitment to consumer protection across their policies, products and services. Specifically, organizations aim to ensure that users maintain control by building transparency into their products and services, offering effective digital literacy programmes and making transparency tools more accessible, available and intuitive. Having consumers represented in the design process bolsters the decision-making that goes into achieving such transparent ends. Consumer advocates seek to work with organizations and regulators to ensure transparency efforts are pervasive and beneficial for consumers. Such efforts result in appropriate choices in the marketplace, consumer-friendly online choice architecture and corresponding default settings.

Digital literacy and transparency, crucial elements in the design and use of technology, work in harmony to support individual agency and earn digital trust. Their interplay forms the baseline for accountability for a given technology and an organization's accountability culture while providing a more reliable and secure digital environment for the individual.

Visibility into data flow

The use and movement of individuals' data offers a helpful example of digital literacy and transparency at work. Data underpins a person's interactions with digital technologies. As the digital economy has expanded and applications increasingly collect data on behalf of corporations, advertising has become a way to offer digital products and services without erecting paywalls. In exchange for the use of technology services, consumers are incentivized to share their personal data. As this dynamic has intensified, consumer data has become more valuable and collection methods more thorough. This has amplified the scale of the opportunities and the potential harms for both users and organizations.26
Opportunities:

– Improved services: With insights from data collected, companies can refine their services, leading to better user satisfaction.
– Personalized experiences: Users can receive recommendations and content tailored to their preferences, enhancing the user experience.
– Access to subscription-free services: Access to services that would otherwise be behind a paywall may be granted to users in exchange for sharing their data.
– Enhanced product development: Companies can develop or refine products based on user feedback and data insights.
– Strategic decision-making: Analysing user data can lead to more informed decisions regarding product development, marketing strategies and user experience enhancements.
– Predictive analytics: By understanding patterns in data, organizations can predict future trends, user behaviours and market demand.
– Personalized marketing: Organizations can tailor advertisements and promotions to specific user preferences, leading to increased sales and user engagement.

Potential harms:

– Privacy and data breaches: Data collection without consent infringes on individual privacy. Furthermore, there is the ever-present risk of data being accessed unlawfully, whether through cyberattacks, inadequate security or internal malpractice.
– Over-personalization: Overly personalized marketing can feel invasive, giving users the impression that their online (and offline) actions are constantly monitored and exploited.
– Data misuse: Even with initial consent, a risk remains that data might be sold, shared or accessed by third parties without the user's knowledge.
– Default settings: Users may be unaware of functionality that is programmed by default. Because defaults are set by the product and service providers, they may skew in favour of the interests of those providers.
– Security breaches: Holding vast amounts of data increases the risk of data breaches and the scale of the resulting harms, which can be financial, reputational and legal in nature.
– Regulatory penalties: Non-compliance with data protection regulations can lead to significant fines and sanctions.
– Market perception: Inappropriate data use can erode brand loyalty and reputation.
– Misinterpretation: Incorrect analysis of data can lead to flawed strategic decisions.
The profound amount of data that companies possess about users and their engagement with digital technologies may risk misaligned incentives. Individuals are increasingly cognizant of this potential risk. A study has revealed that, as transparency and digital literacy have increased, 86% of consumers care about data privacy, signalling a significant shift in consumer sentiment and a burgeoning demand for enhanced protection and control over personal data, all of which underscore the urgent need for organizations to adapt and address these concerns.27 In this environment, better transparency lays the groundwork for effective adherence to individuals' choices and expectations. As consumer awareness and expectations have shifted, so have the capabilities of organizations, including ethics and compliance programmes and the like.

Transparency in digital interfaces refers to providing clear visibility into the technology's characteristics, including what data it collects, how it processes the information and the purposes for which the data is used. Such transparency supports individual agency by respecting the individual's right to understand and control their interactions with technology. It is about ensuring that technology operates as the user expects, doing only what the user has granted permission for. For example, in artificial intelligence, resources like the Foundation Model Transparency Index can provide information that may help users understand the ramifications of their activities as they increasingly engage with generative AI.28 This comprehensive assessment of the transparency of foundation model developers uses 100 transparency indicators.29 They report on the transparency of foundation models, the resources required to build them and their role in the ecosystem.30 Such transparency regarding responsible AI seeks to promote trustworthiness and engender trust.

Effective transparency requires that digital products and services be both well-explained and easily understandable. Regarding data collection and management, a variety of methods are employed today, each with its own advantages and disadvantages (see the table on the advantages and disadvantages of common methods of transparency in data collection and management in the appendix). Drawing from current best practices in data collection transparency, helpful practices include:
– Algorithm transparency – Clearly explain how data influences algorithmically driven system processes and decisions
– Educational tools – Offer concise tutorials or FAQs to clarify data practices
– Feedback-driven updates – Continuously adapt strategies based on user feedback and comprehension studies
– Third-party audits – Periodically validate data practices through external reviews
– Clear action steps when an individual's experience is out of alignment with their expectations of trustworthiness

Digital literacy

In an age of information overload, digital literacy acts as the individual's compass, providing insight as they seek to understand how digital technologies and services work so they can make informed decisions.31 Literacy makes transparency action-oriented – without an understanding of what is being shared, transparency is merely theatre. Alongside transparency, digital literacy enables individuals to effectively navigate these interfaces and comprehend the technological controls in place. As users enhance their digital literacy, they can make more informed decisions.

The synergy of transparency and digital literacy fosters a culture of accountability that promises a more trustworthy future. With transparency providing clear insight into a technology's operations and digital literacy enabling an understanding of these insights, users can maintain a healthy dialogue about their end-user needs with technology providers. This accountability is mutually beneficial and ensures that digital tools respect consumer rights and expectations and work in the users' best interests. This accountability aspect becomes pivotal in maintaining digital trust, as technology developers, being the "least cost avoiders",34 can effectively prevent and remediate online harms.35 Well-defined and clearly assigned responsibilities, paired with feedback mechanisms, significantly enhance the trustworthiness of digital technologies.

Taken together, digital literacy and transparency guide technology to be more user-centred, shaping a digital landscape that is understandable, explainable, controllable and accountable – supporting individual agency and building digital trust. Box 1 provides an example of a transparency tool in support of individual agency.
Privacy by design appears in a wide variety of regulations, such as the EU's General Data Protection Regulation (GDPR)37,38 and India's Digital Personal Data Protection Bill.39 Major technology developers use similar approaches (Google's Privacy and Security Principles,40 Microsoft's Responsible AI Standard,41 IBM on Operationalizing Trustworthy AI42). These concepts aim to proactively incorporate principles into a product or service in a manner that ensures they will be prioritized in the development of the integral components that shape the user's experience and interaction with technology.

As trust- and confidence-building measures, these protections enable users to interact with technologies more confidently and engage more authentically.
Default protections underscore the organization's focus on a human-centric approach that prioritizes respect for the individual, which in turn strengthens digital trust. They send a message that the user's safety, privacy and rights are a priority, regardless of the user's ability to set up these protections themselves. Examples of a privacy-by-design approach include:

– Making significant investments in privacy settings and controls
– Managing trust-enhancing features over time by monitoring users' engagement with them, testing to confirm understanding and adoption, and adjusting and improving these features over time
– Implementing organization-wide digital trust programmes (see the World Economic Forum's Digital Trust guidance43 and briefing on implementation44)
– Creating internal guidelines, principles and standards (e.g. Microsoft's comprehensive Responsible AI Standard45)
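To make the idea of default protections concrete, the sketch below models a hypothetical settings object in which the most protective value is the default for every field, so a user who never opens a settings screen is still protected. All names here (PrivacySettings and its fields) are illustrative assumptions, not any particular product's API.

```python
from dataclasses import dataclass, asdict

# Hypothetical privacy-by-design settings: every field defaults to the
# most protective option, so protection does not depend on user action.
@dataclass
class PrivacySettings:
    personalized_ads: bool = False     # opt-in, not opt-out
    location_sharing: str = "off"      # "off" | "while_using" | "always"
    data_retention_days: int = 30      # shortest retention by default
    third_party_sharing: bool = False  # no sharing unless the user enables it
    telemetry: str = "essential"       # only what the service needs to run

def non_default_choices(settings: PrivacySettings) -> dict:
    """Return only the fields the user actively changed - useful for a
    transparency view ('here is what you have opted into')."""
    defaults = asdict(PrivacySettings())
    current = asdict(settings)
    return {k: v for k, v in current.items() if v != defaults[k]}

# A new user gets protective defaults without configuring anything.
fresh = PrivacySettings()
assert fresh.third_party_sharing is False

# A user who explicitly opts in can see exactly what they changed.
opted = PrivacySettings(personalized_ads=True)
print(non_default_choices(opted))  # {'personalized_ads': True}
```

The design choice worth noting is that the burden of action sits on opting out of protection rather than opting into it, which mirrors the "fair options within a safe environment" framing above.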
Google Privacy Dashboard46: Google provides a dashboard where users can understand, track and control permissions and data access across Google applications (apps). Such interactive visuals and transparency tools allow users to understand data use and tailor their privacy settings.

Apple's app "Privacy Nutrition Labels"47: Apple introduced mandatory labels in all App Stores that provide a summary of the app's privacy policies, similar to food nutrition facts. This mechanism offers users a clear understanding of the app's data processing (i.e. collection, analysis, secondary use). Such a concise transparency mechanism enables easy comparison between different apps regarding how they process data. Additionally, Apple offers users an App Privacy Report48 that provides visibility into how apps use the privacy permissions users have granted them and the corresponding network activity.
Redressability, the possibility of obtaining recourse where technological processes, systems or data uses have negatively affected individuals, groups or entities,49 can take many forms. Organizational leaders can take steps to support redressability by making decisions regarding prevention, engagement and resolution. Firstly, a by-design approach allows for the implementation of solutions such that the risk of the need for redress is minimized (for example, regulatory tech, supervisory tech, enforcement tech). Secondly, allowing individuals to directly engage with an organization and its representatives can support simple resolutions to redress scenarios (such as fair credit reporting). However, prevention and engagement activities may not be sufficient in all cases, so opportunities should exist for a specific individual or broad consumer groups to take action to seek redress.

Enabling harm prevention

Technology crafted under a redressability-by-design approach will automate regulatory compliance (RegTech), regulatory supervision (SupTech) or, in certain instances, seek to enforce consumer protections (EnforceTech). Together, these technologies fortify the architecture of individual agency and enhance transparency. Supporting individuals in understanding that, if harms occur, redress is possible is important in cultivating an environment of digital trust.

Specifically, RegTech can aid firms in adhering to compliance requirements, ensuring organizations efficiently and effectively align their digital practices with legal regulations thanks to risk management, monitoring and reporting functionalities.50 SupTech can help an organization ensure its digital systems are compliant, namely with respect to data collection and analytics.51 Related to SupTech, EnforceTech encompasses innovative technologies that can support consumer advocate agencies in fulfilling their objectives of protecting consumers.52 These types of technologies, while still emerging, promise to play an increasingly pivotal role in preserving individual agency in the digital domain.

However, automated regulation, supervision and enforcement will likely be insufficient to fully offer appropriate redress and will always require a human in the loop. Nevertheless, these technologies are useful tools in the broader toolkit to support individual agency and digital trust.

Enabling redress procedures

Regulatory regimes relating to new and emerging technologies are a patchwork of existing and proposed policies, so there is room for consideration of how to apply the best practices of and principles behind consumer protections to the digital realm.

Redress, however, is not merely an issue facing technology. Existing means of redress – including those related to the misuse of individuals' data – may provide useful illustrations of how redress can work in digital technologies. In the US, the Fair Credit Reporting Act53 (Title VI of the Consumer Credit Protection Act) embodies several important protections, including a user's right to request their credit score, be informed if information in their file has been used against them, access the contents of their file, and request that information in their file be corrected. Moreover, consumer reporting agencies are mandated to rectify or remove inaccurate, incomplete or unverifiable data. Through these provisions, the regulation improves transparency by ensuring users can access and understand their data. It also provides an avenue for companies to prioritize and support individual rights, fostering a proactive culture of user-centricity. Importantly, in situations where discrepancies arise, these regulations also offer a clear redress mechanism for consumers, solidifying their trust in the system. These regulations hold credit reporting agencies to a high standard of promoting individual agency, which goes a significant way towards cultivating trust in the system as a whole.

Additionally, in the financial sector, the European Commission has proposed regulations to improve consumer protection in a way that empowers consumers to share their data to enable better and cheaper financial products and services.54 Such examples from the financial sector may be relevant for digital economies globally.
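As a simplified illustration of the RegTech monitoring functionality described above, the sketch below flags stored records that have exceeded a retention limit, the kind of automated check that reduces the need for after-the-fact redress. The retention period and record format are assumptions for illustration, not any specific regulation's requirements.

```python
from datetime import date, timedelta

# Hypothetical RegTech-style check: flag records kept past a retention
# limit so they can be reviewed or deleted before harm (or a fine) occurs.
RETENTION_LIMIT = timedelta(days=365)  # assumed internal policy, not a legal rule

def overdue_records(records: list, today: date) -> list:
    """Return IDs of records stored longer than the retention limit."""
    return [r["id"] for r in records
            if today - r["collected_on"] > RETENTION_LIMIT]

records = [
    {"id": "u1", "collected_on": date(2023, 1, 10)},
    {"id": "u2", "collected_on": date(2024, 1, 5)},
]
# u1 is more than a year old on 2024-02-01; u2 is not.
print(overdue_records(records, today=date(2024, 2, 1)))  # ['u1']
```

In practice such a check would feed a reporting or deletion workflow; the point is that the compliance rule runs continuously rather than being discovered during a dispute.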
Illustrating the potential for such collaborations, Singapore has pioneered the Sunlight Alliance for Action, a public-private collaboration initiative launched in 2021 to bridge the digital safety gap. The initiative operates through workstreams including research, victim support and public education, as mentioned in the Forum's Earning Digital Trust: Decision-Making for Trustworthy Technologies insight report.58

Furthermore, Singapore has implemented a four-star system for rating the security of smart devices across different providers, giving people an easy-to-understand framework for selecting products and offering a marketing incentive for tech companies to elevate their security standards. This system has earned bilateral recognition with countries like Germany and Finland and may potentially gain future recognition from international bodies like the International Organization for Standardization (ISO).59
Upholding digital trust and individual agency is a shared responsibility that requires effective collaboration across the public and private sectors. As the world advances into the digital future, globally recognized, shared and verified standards can serve as powerful tools to bolster digital trust. They will enhance consumer protection and promote consensus between the public and private sectors, demonstrating the inherent value in collaborative efforts to reinforce individual agency.

The critical interplay between individual agency and digital trust anchors the digital ecosystem in a human-first approach. Organizations aiming to cultivate this trust have crucial areas of best practice to focus on – design methodologies that actively protect, inform and enable individuals. This commitment to individual agency requires broad-based collaboration across both the public and private sectors, anchoring the digital future on globally recognized standards that promote respect for individual autonomy and create a trusted digital ecosystem.
United Nations Consumer Protection Principles for Good Business Practices (All)60 | Digital trust goal | Digital consumer protections: alignment title (summarized & paraphrased)
Education and awareness-raising | Accountability and oversight | Digital literacy and education – Assistance to consumers to understand the choices available to them and the consequences of those choices
Protection of privacy | Security and reliability; Inclusive, ethical and responsible use | User privacy and consent – Consumer understanding and control of the collection and use of personal data
Consumer complaints and disputes | Accountability and oversight | Recourse and redress – Consumer access to simple and effective recourse; Consumer access to fair redress
Fair and equitable treatment | Inclusive, ethical and responsible use | Inclusion and protection from harm – Consumer access to an affordable, good quality and reliable internet connection and essential digital services
Commercial behaviour | Accountability and oversight | Responsible business conduct – Effective governance and accountability, including consumer representation in relevant processes
Disclosure and transparency | Accountability and oversight | Access to information – Consumer access to accurate and meaningful information about digital products and services
[Figure: The digital trust framework. The dimensions Privacy, Transparency, Cybersecurity, Redressability, Safety, Auditability, Interoperability and Fairness surround the goal of digital trust, grouped under "Security and reliability", "Accountability and oversight" and "Inclusive, ethical and responsible use".]
Privacy policies and terms of service
– Advantages: Standardized: universally accepted format. Comprehensive: covers all aspects, clauses and potential scenarios related to data usage; optimized for regulatory compliance.
– Disadvantages: Lengthy: most users don't read them due to their length and complexity. Complex language: often written in legalese, making it hard for an average user to understand.

Interactive privacy dashboards
– Advantages: User-friendly: interactive visuals and tools allow users to understand and control data use.
– Disadvantages: Overwhelming: too many options or poorly designed interfaces can confuse users.

Just-in-time notifications
– Advantages: Contextual: offers information when it's most relevant (e.g. asking for location data when a relevant feature is activated).
– Disadvantages: Interruptive: can disrupt the user experience if not appropriately timed; often ignored or rapidly dismissed by users.

Privacy "nutrition" labels
– Advantages: Simplified overview: gives users a quick snapshot of how an app uses data, similar to nutrition labels on food.
– Disadvantages: Limited detail: might not convey the depth of data interactions.

Regular data usage reports
– Advantages: Transparency: shows users exactly how their data has been used over time.
– Disadvantages: Might not be seen: relies on regular user engagement.

Data access & portability requests
– Advantages: Data access: individual has direct access to data. Agency: gives the individual the ability to take direct action.
– Disadvantages: Security: potential for security breaches. Knowledge constraint: utility depends on the level of user expertise and contextual understanding of the data.

Embedded device indicators (e.g. internet of things (IoT) devices)
– Advantages: Real-time indicators: immediate data activity notifications. On-device: direct transparency available on the device itself.
– Disadvantages: Space constraints: limited screen may restrict comprehensive information. Misinterpretation: users may misinterpret indicator meanings.

User experience surveys (e.g. research initiatives)
– Advantages: Scope clarity: explicit boundaries of data collection. Consent: achieves explicit user consent.
– Disadvantages: Dynamic limitations: limited utility in continually changing contexts. User dependence: relies heavily on user willingness.
Sources:
Cranor, L.F. (2023, 6 May). Necessary But Not Sufficient: Standardized Mechanisms for Privacy Notice and Choice. Harvard University Privacy Tools Project. https://privacytools.seas.harvard.edu/presentations/necessary-not-sufficient-standardized-mechanisms-privacy-notice-and.
Gage Kelley, P. et al. (2009). A "Nutrition Label" for Privacy. SOUPS '09: Proceedings of the Symposium on Usable Privacy and Security, pp. 1-12.
Genaro Motti, V. and Caine, K. (2015). Users' Privacy Concerns About Wearables. In Financial Cryptography and Data Security. Edited by M. Brenner, N. Christin, B. Johnson, K. Rohloff, pp. 231-244, Springer.
Hay Newman, L. (2020, 14 December). Apple's App 'Privacy Labels' Are Here—and They're a Big Step Forward. Wired. https://www.wired.com/story/apple-app-privacy-labels/.
Nissenbaum, H. (2011). A Contextual Approach to Privacy Online. Dædalus. https://www.amacad.org/publication/contextual-approach-privacy-online.
Porter Felt, A. et al. (2012). Android permissions: user attention, comprehension, and behavior. SOUPS '12: Proceedings of the Eighth Symposium on Usable Privacy and Security, no. 3, pp. 1-14.
Schaub, F. et al. (2015). A Design Space for Effective Privacy Notices. SOUPS '15: Proceedings of the Symposium on Usable Privacy and Security, pp. 1-17.
Tiell, S. and Pesce Ares, L. (2022, 28 June). Principles to practice: Using ethical spectrums to guide decision-making. Atlantic Council. https://www.atlanticcouncil.org/in-depth-research-reports/issue-brief/principles-to-practice-using-ethical-spectrums-to-guide-decision-making/.
Van Alstyne, M. and Paul, S. (2016, 10 November). Platform Strategy and the Internet of Things. MIT Sloan Management Review. https://sloanreview.mit.edu/article/platform-strategy-and-the-internet-of-things/.
Wang, N. et al. (2014). Designing the default privacy settings for Facebook applications. CSCW Companion '14: Proceedings of the companion publication of the 17th ACM conference on Computer supported cooperative work & social computing, pp. 249-252.
World Economic Forum. (2018). Data Policy in the Fourth Industrial Revolution: Insights on personal data. https://www.weforum.org/publications/data-policy-in-the-fourth-industrial-revolution-insights-on-personal-data.
Kathryn White
Executive Fellow, Centre for the Fourth Industrial
Revolution, World Economic Forum; Global
Principal Director, Responsible Emerging
Technology and Innovation, Accenture, USA
Acknowledgements
Indre Raviv
Senior Vice-President, Marketing, Cujo AI